
Are women respected as much in Hollywood today as they were back in the early days of cinema?

  • Feminism, feminism, feminism...

Feminists are never content with their 'women domination' ideology. Simple things like Star Wars and Ghostbusters have all been ruined and transformed into feminist propaganda, yet people are still not content. There is no proof that women are being discriminated against; if anything, it is males who are being discriminated against.

Women already have a strong presence in the movie industry, and it pains me to see already gender-equal movies turned rotten by a feminist agenda. This has nothing to do with 'pay' or 'roles'. Hollywood may have been rotten from the start, but that does not mean either gender is being discriminated against.

  • Women are gaining respect in Hollywood

Women are gaining respect in Hollywood. Although early figures show that many women worked in the industry during the 1920s, as the industry matured, women were pushed out of writing and directing roles. That is no longer the case, with more women assuming these roles and older female actors winning parts.

  • Hollywood's Imperfect Image

Well, for starters, men are paid more for their roles in movies than women, even when both are playing roles of equal prominence.
    Secondly, women are expected to have perfect figures and maintain a certain "image". Recently, I read something that stated that Hollywood has a tendency to hire those with a certain perfection to them. Also, haven't you noticed some of the revealing clothing women are made to wear in some films? Some of it is relevant to the role, but much of it is there for the visuals.
    Finally, yes, age has been proven to be a barrier for female acting roles, yet older men are still cast in films.
    I am in no way stating that men get all the respect. Rather, I am stating that women are not always respected.

  • There is a pay gap.

If women were respected as much as men in Hollywood, there would be equality in pay. Hollywood tends to be liberal, so you would think it would insist on equal pay for equal performance. Unfortunately, Hollywood doesn't respect its women, because it still pays them significantly less than men make for similar work.

  • Women are not respected enough in any occupation.

In any career or occupation, women are neither paid enough nor respected enough. In today's Hollywood, women are seen merely as sexual objects, and the older they are, the less likely they are to get work. Men have the main roles, and most women are supporting actresses. That said, there are women taking on more production work in film, in roles such as producer, director, and writer, which do earn respect.

  • No, women need more respect in Hollywood.

I believe that women are not as respected in Hollywood as they were in the early days of film. While there are many wonderful, talented women acting, even the top-earning women in film are still making less than their male counterparts for similar films. In addition, I feel there is a lack of roles for women over a certain age or women from diverse backgrounds. If a man creates a role or a franchise, he is allowed to play that role until he no longer wants to, while I feel the same isn't true for women.

