
Are gender roles changing in the U.S.? If so, how and why?

Asked by: zxcv961
  • Yes, and we're better off without them.

    Gender roles are slowly going away, and they offered very little real benefit: diversity among individuals is far greater than the average differences between the sexes, so basing a person's role on their sex is counterproductive and repressive. If you still think the genders are treated equally, go to any school in the country, spread a rumor that a boy and a girl had sex, and I think you'll find that they get called very different names.

  • Unfortunately, yes, they are.

    For some reason I cannot understand, it is now looked down upon for a woman to be a stay-at-home mom and housewife. People say it somehow makes her weaker, and that she is not doing her gender justice unless she selfishly focuses on a career and making money rather than raising a family. The sad part is that many low- and middle-income families cannot be supported on the husband's income alone. Still, it is plain to see that the further we get from the traditional family, the worse off the United States is. It's funny, though, that people push and shove for all this backwards liberal thinking, and then those same people stand around and say, "Wow, what's happening to our country?"

  • Yes, and it's great

    Gender roles aren't just changing; they're disappearing. As a society, we're starting to value people for their goals, their skills, and what they can do and want to accomplish, and who they are determines that, not what genitals they were born with. It's great. If you want to stay at home, you should, no matter your gender. The same goes for working, and so on.

  • Men & Women Following "Gender Standards"

    -Women in today's society have a voice. In social media, women are seen as independent, especially on sites like the popular blogging platform Tumblr.
    -Men share personalities and traits that, in the past, would only have been associated with women. For example, today we accept that not all men are masculine.
    Not all women are feminine either.

  • Yes, of course.

    Gender roles have been changing since the American Civil War. Since the 1970s this change has accelerated, and in the last five years it has accelerated again. We are in a very liberal swing as far as gender roles go (politically, the country is turning more conservative). I predict that when the change in gender roles slows or stops, it may cement the gains for women and the losses for men. We may be looking at a female-dominated society. Nobody will "want" this; rather, it will simply be the result of the trend line playing out.

  • Yes, they are

    I'll be blunt: they are. Everyone can see the change, and I wouldn't argue that it isn't happening; it's obvious to everyone. Women are earning more in their jobs and taking on more stressful ones. This is just one of the many changes we will see during this time.

  • They definitely are

    Whether it is for the better is another question, however. I would say that, generally, traditional gender roles work best for people with children, since the sexes evolved for those roles and for the purpose of bringing up children. Females, being the ones who give birth, breastfeed, and tend to be more nurturing, are better suited to raising children. Does this mean that all females should be put into this role? Absolutely not. Gender is a spectrum; not all females are suited to traditional motherhood, and they should not be forced into it.

  • No responses have been submitted.
