Feminism brought about women's suffrage, opened the workplace to women, and has made great strides in curbing our patriarchal tendencies.
Domestic abuse is down, and women's enrollment in higher education is soaring. Feminism has done wonders for creating a more equal, more fair society.
Cultures that exploit women seldom last. It's time we honored all people equally.
No question, feminism in the past fought for equal rights, wanting the same rights as the opposite sex. But it's pretty obvious that the last twenty-plus years of feminism have had nothing to do with giving up the many special rights and privileges reserved only for women. You do not see women willingly offering to give up the laws, rights, privileges, offerings, and chivalry they solely command in this generation.
Feminists are making the world a worse place for both men and women. They say that women are victims of men and therefore need men to become feminists to help them. They also portray masculinity as "toxic" and femininity as "perfect." And they celebrate abortion; while I support the right to choose a safe and legal abortion, it is not something to be celebrated.