• Feminism Forever

    Feminism will always be needed because, even with laws to protect women, men have remained rapists and supporters of the hierarchy. Without feminism, men will continue to think it is acceptable to rape women, because this is what they are taught from birth.
    As long as men are raised this way, feminism will be needed, as will anti-racism and LGBT advocacy.

  • Women are paid 20% less than men in Western society. Is it not clear enough to you yet?

    Any argument that liberation movements are no longer relevant is ridiculous. Even in an ideal world without discrimination, liberation movements would still be useful, to ensure that this ideal does not slip.
    However, this is not the case! Women are paid less than men, fewer than 10% of rape cases result in a conviction, and the word 'woman' is used as an insult. Is my case not clear enough?
    I assume that anyone who voted in any other way self-defines as a misogynist.

  • Feminism is still relevant, because our society is still run unfairly by men.

    Women still make less than men, on average, in every career field. Violence towards women gets worse with each passing day. The greatest insult to any man is to claim he is somehow effeminate. Prejudices against gay men are so strong primarily because gay men are perceived as effeminate, or as less than a man. It is still the ultimate goal to be as masculine as possible. Men are not the only ones who are culpable: women are just as guilty of using the word "girly" as an insult to men. Feminism will always be relevant, especially when we still have so far to go.

    Posted by: SelfishSalvatore79
  • Feminism = Equality. I don't see equality just yet.

    Feminism will be relevant for as long as there is inequality in this WORLD, not just in America. The fact of the matter is that equality is far from being reached; that is obvious. Consider women in sports, the workplace, wages, opportunities, overall expectations, etc. Women's worth is placed in their appearance and body image rather than in their accomplishments.

  • Feminism will always be relevant!

    Feminism is not (and never has been) about female superiority or taking away the rights of men. It's about EQUALITY. It's not just about voting rights or getting jobs. It's about the way women are treated in everyday life. When GENDER itself is no longer a detriment to someone's ability to do something (e.g. "females can't handle leadership"), then feminism will no longer be needed.

    When women are no longer blamed for being raped, or automatically expected to want children, or actively discouraged from traditionally male fields (math, science, politics, etc.), then feminism will no longer be needed. When women are treated the exact same way a man would be in ANY situation, then feminism will no longer be needed.

    And FYI, if you're someone who thinks feminism isn't necessary or that women have true equality, then you are part of the problem.

  • Everyone is Supported With Feminism.

    Just glancing at the negative side proves that women are being degraded and objectified in this culture. We need to teach one another to treat each other as equal human beings. Men also need feminism: they can be raped as well, and they are oppressed by a culture that teaches them what they must do to be "manly".

  • Because sexism still exists.

    Feminism is a stance of empowerment for everyone. In a world where women still earn 77 cents on the dollar in comparison to men, it is of extreme importance that the feminist movement live on. There is still much work to be done, as patriarchy dominates so many aspects of our society.

  • I Had to Vote YES when I saw all the "NO" comments - WE NEED IT.

    In response to this user:
    "Women are inferior!!!!!!!1 According to a recent pewresearch.org poll the average American female is inferior at understanding the world. (http://www.pewresearch.org/quiz/science-knowledge/) --you must take the pole to see the results. The pole shows that females lack, comparatively to their male counterparts, the ability to understand science and the world around us. In some form or another they should be stripped of some rights. Women need to be barred from holding positions of power (Congress). They have poor decision making skills and should not be able to vote or own property. I'm extremely disappointed in how we have made it to this point. Feminism has only harmed us."

    I have to say this:
    It's a poll, not a "pole", sweetie. I would blame your country's education system, not the women. Women need to be in equal positions of power if only to prevent extremists of ANY gender, race, etc. from creating biases. The only way to provide a genuinely equal view is to split it equally.

    In terms of people wanting to know if feminism means you can "hit girls now": first, you might want to look at your priorities if that's the first thing you ask when we mention equal rights. Second, if you're being assaulted and you need to defend yourself, then yes. Men, women, trans, genderqueer... people of the world: if you are being assaulted, do what you can to defend yourself and GET AWAY. On that note, ladies, if a man says "stop" and you're hitting him, I don't care if you're teasing, STOP IT. If you keep going, you're giving them license to keep going when you say stop.

    And speaking of feminism harming us, US Bureau of Justice Statistics data showed that 90% of homicide offenders were male, and that females had the lowest offending rates across racial and age groups. Check your facts. We don't want to take over the world; we just want to be equal in it.

  • If we ignore history, it is doomed to repeat itself. We can't ignore inherent qualities in men.

    Less than one hundred years ago, women did not have the right to vote, weren't able to have jobs, and, in the event that their husbands died, were left with nothing and no way to care for their children. Men were allowed to beat women, and there are still many, many more women than men being raped and kept as sex slaves.

    The movements and protests are not as needed, because the issues are not as serious as they were (in the U.S.; the Middle East is a completely different story), but this history still must be a part of education, because men, for as long as history has been recorded, have been violent, domineering, and controlling, and any research will show that men make up the majority of the prison system and commit the majority of the crimes.

    I'm not saying men are all evil, and I'm not saying women are better. I'm saying that men are obviously aggressive and controlling by nature, and that we can't allow ourselves to fall back into that kind of barbaric way of living. We need to keep our children educated. Why is teaching children from a young age not to hit one another still relevant? For the same reason that teaching people that women have rights, and shouldn't be downtrodden because they are physically weaker than men, is.

  • It is still needed

    Feminism is the belief that women should have the same rights as men. Just that? I think not. Feminism does mean that, yes, but I believe that women like me are using it more as a way of saying: we want respect! We don't want men hitting our butts because "they have the right to." We are judged daily based on how we look; how you look determines whether a man is going to sleep with you or not. Surprisingly, we aren't tools you find in your toolbox... So don't use us like them. People think that equality has been achieved, but women are still not respected. Women are paid an average of 20% less than men across careers. If men would notice this, we would stop needing feminism. WE WANT RESPECT!!

  • Women should not be empowered as they are not as smart as men.

    Women have too many rights as it is - the right to vote (which most of them don't use), the right to drive (which was obviously a bad decision), and the right to be employed under affirmative action. Next they are going to want to be president. Feminism is a bunch of nonsense.

    Posted by: P0nyTobi
  • Experience and Knowledge

    It's clearly irrelevant at this point in time. I mean, what can't a woman do that a man can, other than stand while peeing? Now turn that question backwards: what can't a man do that a woman can?

    Women get favored in everything. Every time I turn on the TV, something is being advertised for women. Look at it this way: my friend was once grabbed in the crotch by this girl, and he didn't say anything (surprise, right?). When he tried to grab her breast, all of a sudden all hell broke loose. After he explained the situation, she fessed up about doing it. We both wondered: why can't he touch your breast? She says, "Because I'm a girl." Oh, OK. What? No.

    How about the girls who take advantage of the no-hitting rule? Who will always get in your face and constantly be on you like you're the scum of the earth? It happens to me in algebra sometimes; I get really close, but I don't hit her. I just completely deflect her every time she tries to hit me. What about the girls who fool you into getting them things, and then trick you later and leave you?

    I cannot believe you still think women cannot have the same opportunities as men! My mom is single, and she has been the principal of over 4 schools; she provided well for us, and I am very grateful to her. She has had the same education chances as any male! Paid less? Get out with that crap; at one point in time she was making more than my uncle, who was also a principal, IN THE SAME SCHOOL DISTRICT!

    You're upset about the negative image of women in the media? Girls who get raped? Or the way women get depicted?

    NEWSFLASH: women who are strippers CHOOSE to be that. Granted, it may be a last resort, but you know that something else happened that led to that, and it IS NOT ALWAYS A MAN'S FAULT.
    Women in media are PAID to do those things and act that way on TV; they are actors. Those ladies chose to be depicted that way, since they are on there! If they really didn't want to be on TV that way, THEY WOULDN'T DO IT!! Take that media argument up with the actors who do it.

    Rape is something that no one can ever really stop. It's like trying to stop a 2-year-old from crying: they are always going to do it; it's going to happen. What you could do is stop yelling at all men. Even though I am not fully grown, I have never done anything like that to a girl. The worst I've done is maybe felt a girl's hair without permission.

    The abortion thing I do agree with though. Won't even argue there.

    If you're fighting for women's rights in other countries, take it up with them. I just don't wanna get yelled at for something that hardly ever occurs here in the States anymore.

  • Women totally suck

    Women are clearly not as intelligent as men. Women should only exist to serve the sexual needs of men and to look after children. This is what biology intended for women and it is ridiculous of feminists to argue with centuries of biological fact. Feminism is a joke. Everybody knows this.

  • Women are inferior!!!!!!!1

    According to a recent pewresearch.org poll the average American female is inferior at understanding the world. (http://www.pewresearch.org/quiz/science-knowledge/) --you must take the pole to see the results. The pole shows that females lack, comparatively to their male counterparts, the ability to understand science and the world around us. In some form or another they should be stripped of some rights. Women need to be barred from holding positions of power (Congress). They have poor decision making skills and should not be able to vote or own property. I'm extremely disappointed in how we have made it to this point. Feminism has only harmed us.

  • Feminism has already become irrelevant, because women already have more rights than men!

    Feminists say that the average working woman doesn't make enough money, and that is due to the fact that a good portion of women don't work as hard as most men! Employers have to give women workers 9 or more months of maternity leave and still keep the position they had in that job open so they can get it back. And don't get me started on how the law greatly benefits women: in most cases, who gets custody of the children? Women. Who gets alimony when a couple gets divorced? Women. Who, I ask, acts sneaky and pokes holes in condoms to get pregnant and then charges the male with rape or makes him pay child support... Women.

  • Feminism is no longer relevant in America today because women have equal rights alongside men.

    Feminism is no longer relevant in America because, while it helped women gain rights equal to men's in our country, those rights are now standard. Women can now choose careers and motherhood without condemnation from the business world and society. Women now get the same education and business opportunities as men and even run some of America's top businesses. While feminism was needed many decades ago, its goal has been accomplished.

    Posted by: BouncyWalter37
  • They talk a lot without saying anything.

    Modern feminists have never seen real oppression and don't even know what equality means. It's just a whole lot of whining and misandry at this point. They got their rights while managing to escape certain responsibilities (e.g. Selective Service). They're in many ways the preferred gender, especially in divorces and domestic violence cases. They still cling to concepts like chivalry, "never hit a woman", etc., without holding themselves to the same standards.

    Posted by: Quan
  • Absolutely not still relevant, it never was.

    It's about as equal as things can get. I don't believe any more should be given to women until they can show that they deserve it.

    I see plenty of women complain about the under-representation of women in politics, but I also notice how under-represented they are in the street-cleaning industry, yet I hear no complaints about that.

    Women want equality... but only the good parts of it. (See how the Minister for Women and Equalities reacted when car insurance prices for women were finally brought up to the same level as men's.)

    Personally, I don't think feminism was a movement that should ever have formed, women should not be in any positions of power simply because I don't believe they are capable of handling it.

    For the record, I have never had bad experiences with women. I have a loving mother and sister; what I said is based on simple logic and fact. Can't argue with that... though I'm sure "butthurt" feminists will try.

  • Women already have equal rights as men!

    Women have the same rights as men. I do not see women being discriminated against anywhere. Women already have equal rights, so feminism is irrelevant.

  • Feminism in Western culture has lost its focus, is one-sided, and neglects the scientific studies/evidence needed to support its claims, which has rendered it irrelevant.

    Firstly, the main problem with feminism, and the social sciences in general, is that they are characterized by a distinct lack of scientific evidence and proof; instead, they rely heavily on theoretical bases that may or may not find an accurate reflection in what really happens in reality.

    What I mean by that is that feminism not only ignores but downright denies the impact of BIOLOGY on gender and race, even though there are many studies proving it wrong.

    Instead, they interpret the various social phenomena based on their own experiences and ideologies.

    The fact that such a scientifically inaccurate movement has thrived in the academic space makes me wonder: Why has the patriarchy allowed this? Does the patriarchy really exist?

    So if feminism does not rely on science to reinforce its positions, then what does it rely on?


    Feminism is no longer just about gender equality (if it ever was).
    It has expanded to race, class, age, religion, etc., with the same feeling of self-righteousness and bitterness it had when it started. However, if this kind of suspicion was once a strength of feminism, it is now its weakness.

    Feminism's need to constantly find enemies has led it to duplicity and ideological narrowness.
    It is no longer about truth but about advancing an ideological agenda, powered not by cold, hard evidence but by theorems and opinions, as well as clearly biased interpretations of reality.

    On another note, feminists claim that feminism is about equality.

    That is not really the case.

    Feminism, as far as gender is concerned, is about granting more privilege to women. A noble cause, since women are known to have been oppressed throughout history.

    But does it end there?

    In areas where women are underprivileged, supporting them and giving them more power will indeed lead to equality; this is something that is greatly needed, but it has also largely been achieved in most countries of Western culture.
    However, in areas where women are not underprivileged, or are even equally privileged, doing so just causes another imbalance, which immediately leads to discrimination, this time against males.

    This kind of one-sided and absolute view of oppression is distinctive of feminism, and not only with regard to gender:

    The opinion that, if a specific group is on average more privileged than another, then members of the former group CANNOT be oppressed or treated unfairly by the latter, is quite popular with feminists, especially in the academic space.

    They demonise masculinity and have come up with the term "hegemonic masculinity" to reinforce this idea.

    Since feminism has never ACTED in support of underprivileged straight white males against overprivileged females in cases where discrimination against men is really serious and prevalent, this alone tells me that feminism is really, really far from egalitarian.

    These three reasons (loss of focus, bias, and scientific neglect) are why feminism is not only irrelevant but also harmful to Western culture.
