Rights for women aren't debatable. Of course they should have the same rights as men! But when they suddenly start viewing men as the enemy, it gets out of hand. They believe that anyone who celebrates International Men's Day (Nov. 19) is sexist and needs to be shot in the head. They believe that men who are more successful in life are only successful because of their gender. They also like to shut down anyone who disagrees with them. Single parents, for example, are labeled "selfish" and "uneducated". If you don't agree with modern feminists' beliefs, you will be labeled a (fill in the blank)-ist and attacked by the media. In other words, many (not all) modern-day feminists are using insults and labels to get what they want. Even though feminism started out with a good purpose, it needs some changes.
Third-wave feminists are a toxic movement in our society. They don't believe in free speech for those who disagree with them (MRA activists, for example). They don't truly stand for equality (again, take MRAs: feminists show up in droves to cancel their events). And the icing on the cake is that the majority of them have a condescending, arrogant personality. America, and the West in general, would be better off without this movement.
Judging by this question, feminism has and always will ruin the U.S.A. because their triggered all the time, their beliefs suck, their goals are illegitimate, and they all need to be shot by cops, every single one of them. Just like Donald Trump said let's make America great again. Yes!
Feminists are doing this because men didn't know how to respect them in the first place! Learn some respect and you'll do better! Seriously, what kind of society do you live in? Is it just assumed that men are the innocent ones here? Men need to step down a little and learn how to treat women.
There's a difference between a feminist and a radical feminist, like a Republican and a conservative, or a Democrat and a liberal. There's a side, and then there's the extreme side. I can easily see why you might dislike one side, but don't say feminists are ruining the country. If a list were compiled of what is ruining the U.S., they wouldn't even make the top 100. You've got to be kidding. What kind of pathetic human being wants to blame a nation's declining well-being on a movement for equal rights? On the same note, I understand why one would be upset with the extreme view, because those aren't even feminists; they're probably in the same boat as you: somebody who's lonely and takes his frustration out on people. And if I come off cold or stupid, it's because I'm answering at the same caliber as this question.
Feminists are people who believe in equality for all genders. There are very extreme people who think that men are evil. Those people like to call themselves feminists, but they really aren't. Today's society gives feminism a negative connotation because of that small, extreme group of people, but real feminism is making sure that all genders are seen as equal.