Feminism is one of the most hypocritical movements in American history. Take its response to sexual harassment cases. Rather than recognizing that rapists know they are breaking the law when they commit these crimes (hence why they force silence), and that groping is completely different from rape, feminists push the idea that men have been "raping" women by talking about how hot they are, when this is a perfectly normal occurrence during adolescence and young adulthood.
Feminists say that they stand for gender equality, but really they're just trying to get people to believe that men are the scum of the earth.
If this movement continues, I'll wish I had been born 30 years earlier, back when women didn't care about these things as much, since the situation for women then was the same as it is now.
Men like women, and women like themselves, because of their femininity; but without masculinity there is no reason for women to like men, nor for men to like themselves. This creates the "rape society" that they complain so much about, not to mention the high rate of male suicide. You also get tons of druggies, couch potatoes, men with anger issues, and so on.
And when men believe this, they act accordingly, doing all sorts of unhealthy things nonstop. As a result they have no muscle and hardly know how to move their arms and legs, which is completely against what nature intended. All the while, women are constantly told by this movement to be manly so they can "prove" that women can do everything as well as men. That's why we constantly see stories and videos about physically strong women beating up these types of men. Any time masculinity is even slightly "disproven," the media blows it up like wildfire and exaggerates like crazy. Yet when anything masculine happens, it is downplayed, if anybody ever even hears about it.
Women are liked and respected by men because of their femininity, but if masculinity doesn't exist, there is no reason why women should like or respect men. All this does is make men feel inadequate, hate themselves to no end, and become jerk-offs, druggies, and couch potatoes, which further "proves" the ideal that the media pushes so hard. Then, when a woman beats up one of these lowlifes, who hardly knows how to move his arms and legs and doesn't have a muscle on his body after a lifetime of nothing but drugs and TV, the internet explodes with "proof" that women are as masculine as men. Yet if a man with the natural athletic ability nature intended for him beats up a room full of professional female fighters, nobody ever hears about it.
As a female myself, I am beginning to see that feminism in this day and age is no longer about equality, because feminists continuously put women first and not men. Feminism is getting to the point where a man is judged just for being a man, and if a man complains about feminism, he is called sexist. If feminism were about equality, it would have allowed a real man to speak his mind. What has feminism done so far for America?
Feminism has convinced people that all women are victims, when we're not, and whenever feminists are asked about male victims, they are completely silent. They think they should put women first, before men, in order to create equality.
Feminism has raised divorce rates because men are now starting to see that the court system is unfair toward them, even when they have done nothing wrong, simply because they are male. Because of this, men are slowly losing the desire to marry, knowing they will lose everything if divorced by a woman. Many divorces ruin men financially, sometimes to the point of suicide.
Feminism has also made it seem like every man is a rapist. Because of this, a woman can falsely accuse a man of rape even when he didn't do it, and even when she lacks the evidence to prove it.
Feminism is also getting to the point where feminists think they're speaking for every woman, when really they're not. There are women saying they don't need feminism because they have sense enough to know they already have their rights. We can vote, we can go into any career we want, we can become the breadwinner if we choose, and so much more. So what more is there to want?
Also, women against feminism have sense enough to know that, instead of using the government to cheat their way to the top like feminists have done, they are willing to work their way to the top fairly, just like I'm doing.
Feminists think every female needs to feel empowered, but we don't. They say females who speak against them don't know what's best for themselves, but they fail to realize they are coming off as too controlling, even to us, when they act like they know what's best for every female. Females who are against feminism don't want someone speaking on their behalf when feminists never took the time to ask what they think; that's why they're speaking against it, and so am I.
If feminism were about equality, feminists would have mentioned men being abused by their wives, they would have fixed the unfairness of divorce between a man and a woman, and they would have opened up and heard what both men and women have to say about their movement. Yet they continue to lie and say it's about equality. By the way, feminism is literally starting to depress everyone.
All people have the same basic rights, or at least should according to the law of the land. Women were oppressed for quite some time and are now simply ensuring that the rights that they do have are not going to be further infringed upon. The feminist movement is no different than any other pro-social movement that is currently going on.
Feminists are not destroying America. They are helping and have helped millions of American women to have the same opportunities as men in education, voting, employment, and even athletics. Feminists have paved the way for women such as former Secretaries of State Condoleezza Rice and Hillary Clinton and former Supreme Court Justice Sandra Day O'Connor. These women, in turn, inspire a new generation of American women.
They are not destroying America, but they are very annoying. They preach that they want equality and want to improve society. However, I feel that they believe they are entitled to everything, and that they actually create a new perception of females. In some respects, I feel they are doing more harm than good for America. They are in no way destroying America, though.
Feminists are not destroying America, as they are helping improve the country. Gender inequality still exists, whether it's in the form of the corporate glass ceiling or in pay inequality, among other things. Sure, women have made great strides in the last 100 years, but true equality is not there, yet.