Yes, America has learned something in recent years about gender equality, because it has done a lot better in recognizing women's rights in the workplace. Sexual harassment is no longer something that is privately tolerated; rather, we all agree that women should be able to work as equals in the workplace.
Gender equality has come a long way over the last few decades. What was once a taboo subject is openly discussed. From the office to social gatherings, the effects of a move toward gender equality can be clearly seen.
There are many individuals who take a hostile position toward even the concept of gender equality, but I believe the trend for most people is toward acceptance and accountability when it comes to promoting or ensuring equitable treatment regardless of one's gender. Certainly, progress has been made, and there is more work to be done.
I think America has learned a great deal about gender equality in recent years. In spite of the women's movement and strong media support, we never really embraced gender equality in the workplace. In just the last few years, women have been promoted to leading positions in some of the world's largest, most successful companies, and these women have been remarkably successful.
I believe America has learned something in recent years about gender equality. I think changes tend to come slowly, and many of those changes have evolved over the last decade or so. I believe gender equality still needs some tweaks; we do need to understand that the genders are slightly different in some ways. Equality is one thing, but realizing you're different at the same time is enlightening.