Women are portrayed as "victims" of society, even though men and women are capable of doing the same things. A woman can hit a man in public and society will treat it as acceptable, but when a man defends himself by hitting back, he is seen as abusive and short-tempered.
I believe that, at the very least, the major Western countries like the US and UK are very gynosympathetic, and that this is why feminism has been able to become such a powerful social force.
What privileges do women benefit from?
- More likely to succeed in school, and more likely to attend university
- Longer life expectancy
- More social awareness of, and concern for, women's issues
- Less likely to be the victim of assault or murder
- Young women now get paid more than young men as a result of affirmative action
- Greater say over their children (e.g. if a woman is pregnant and wants the baby, she can have it even if the man does not want to be a father, and she can choose not to have it even if he does)
In general, Western society is becoming much more feminine. And it is likely female privilege will continue to expand.
Some people believe in "never hitting a woman". Others believe she just has to hit you first. Some people believe that only men should work in the trades and won't hire women. Openly favoring one sex over the other is frowned upon, though.
Most of these "privileges" exist out of respect, not out of an intent to discriminate.
If a man says something like "Women should stay at home and provide a better upbringing for their children," the statement is considered outrageous and bigoted against women, when in reality he was just offering a traditional viewpoint that he thought was best for America's younger generation. If a woman said the same thing, it would be brushed off as a joke. So I do think female privilege exists, but it's a bit like making a kid do child labor for eight years and then bringing him to America and treating him like a rich kid. It's twisted, but that is how it gets justified.