  • Can't watch anything anymore without homosexuality being shoved in your face

    Many characters in movies and series don't appear in 'romantic' scenes, with a partner, or talking about a partner, simply because it doesn't fit the plot. So technically they could be hetero- or homosexual and nobody would care, and I honestly wish nobody did, so they could just be whatever they are, because I don't care about a character's sexual relations when it has nothing to do with the plot or character development. But homosexual characters, it seems, have to be specifically portrayed as gay, meaning they get e.g. a very graphic sex scene to show that they are homosexual.

    I watch series on Netflix and pretty much all of them include homosexual relations/sex: Orange Is the New Black (obviously), The Good Wife, Ozark, Sense8, Ray Donovan, even Narcos. Sometimes it fits the plot, but often I feel it's being shoved in my face, which I also don't like with heterosexual sex scenes, by the way, only I don't come across those too often anymore. To me it seems like a token thing, and by now it seriously annoys me. So the last time I started watching a series where every character was shown in situations such as work, with their parents, hanging out with friends, and then, of course, the lesbian couple had to have a very 'the more you see the better' sex scene, I stopped watching. I want to see a series, not agenda pushing, thank you very much! It also seriously annoys me that 'open' and 'tolerant' apparently only apply to you (or that's what is being suggested, imo) if you enjoy people shoving their tongues down each other's throats and more. I just don't like that; I think anything beyond holding hands and kissing (in an affectionate, non-sexual manner) belongs in people's private homes.
    I was a rather left-wing, non-religious Democrat and traveller all my life, but with everything that's going on, with people now calling everybody a hateful homophobe who doesn't think their own kids would be so much cooler if they were gay, I feel myself drifting more and more to the right.

  • All media supports and promotes homosexuality

    You'd have to be a giant IMBECILE not to realize that all media in general promotes homosexuality. Every major network from A to Z, even the news networks, promotes it like hell. Can anyone name me a show that does not have a gay character in it? Here's the thing: America today isn't the America our parents grew up in. This is now a totally secularised, godless nation that's gone to the toilet. Everything common sense says is wrong has today been turned into right. How ironic that more things are going wrong today in this country alone than ever before, and we're left with morons who ask "why does God allow bad things to happen?" How about because this nation, along with the rest of the nations of this world, has completely rejected God and His holy word, the Bible. Enough said; no one cares or listens anyway.

  • Hollywood definitely promotes it!

    I get that they want us to "accept it," but they don't need to have homosexuality in every TV show and movie. Honestly, I'm getting tired of watching them. I don't support their gay lifestyle, but I do respect them. I have my beliefs, and I feel like their views and opinions are shoved down my throat so I HAVE to accept them. People should be allowed to express and hold on to their beliefs without being called "homophobic." I also see that Hollywood is treating this as a fashion trend. I just saw a TV show about a girl coming out where another guy commented to a friend, "Yeah, the gay thing was two years ago; now it's transgender." I'm like, wtf?? Hollywood sees it as a trend because it's selling, and it will keep putting gay/transgender scenes on TV until people get fed up with it because it's played out, and then Hollywood will move on to the next trend. That's just the way it is. Until then, they will make it seem that being gay is BETTER than being straight. That's not okay!

  • What happened to all these new movies that are blown up in advertisements to convince you to watch the homosexuality Hollywood is promoting?

    The past four movies I have watched have all had some kind of homosexuality in them. Don't get me wrong, I have nothing against it, but why has Hollywood made this such a trend? What is the reason for Hollywood to promote homosexuality? The last four movies, which were Hot Tub Time Machine, The Interview, and some other ones I can't seem to remember, display how homosexuality is cool, funny, and just better. Again, I have nothing against homosexuality, but when clearly every movie I watch promotes it, I tend to question whether I want to visit the movie theatre and watch another movie that I know will contain propaganda meant to make us how they want us to be.

  • Every show must have a gay episode

    It seems every series (yes, even NCIS) has to promote the gay lifestyle. There have been gays for centuries, and I have no problem with that. What I don't support is Hollywood promoting the lifestyle so that children having difficulty in social situations feel the solution is to become gay. In many cases, it is the lack of a supportive family.

  • Yes it does

    Gays are shown everywhere nowadays: in movies, in TV series, and in a lot of other programs. It seems like gays are present in this world more than normal/straight people. All the gay people are presented as perfect, sensitive, helpful, and loving in nature. Sometimes it feels like we are the minority. They say that we are homophobic and that we are not giving them their BASIC HUMAN RIGHTS, but in fact they are the ones who don't even let us express our views and are constantly imposing their aggressive thoughts on us. They are spreading like an infectious disease in the media. I don't know if Hollywood is paying celebrities or they are sponsored by some other agencies, but you can clearly notice that celebrities with wide fan followings, like Matt Bomer and Jim Parsons, are COMING OUT OF THE CLOSET every day, and people are so afraid to say anything about it because they will be labelled narrow-minded. I am an MBBS doctor. I remember we used to study homosexuality in the section on perversion, and it was considered an abnormality. But kids won't listen, because LGBT is in fashion now.

  • Yes, Hollywood promotes the homosexual lifestyle.

    Yes, whether or not people agree with its doing so, Hollywood definitely promotes the homosexual lifestyle. In reality, only around two percent of people are homosexual. However, in TV shows and movies, a higher percentage of people are (and they are always shown in a positive light). It is similar to how Hollywood has underrepresented minorities. They give us an inaccurate view of what the country is really like and, in this instance, make it appear that more people are homosexual than really are, which promotes it.

  • HW def promotes homo lifestyle

    Homosexual roles are always displayed in a positive light, and there are many people, mostly children, who imitate and look up to actors. Hollywood, mainly TV, creates the fads and "cool" things to do, and in my opinion "homosexuality" is the new fad. Hollywood promotes the homosexual lifestyle by making it look desirable, as if only homosexuals can dress well, be popular, or be superior in the arts.

  • Hollywood promotes homosexuality

    Hollywood does indeed promote the homosexual lifestyle. In fact, they are doing so now more than ever before. Turn on the TV and you will see homosexual individuals more often than not. There are even gay dating shows, and many more gay characters in series and in movies as well.

  • Hollywood Has Helped It Become More Acceptable

    Yes, Hollywood does promote the homosexual lifestyle – but only because it’s the newest, hottest trend. Hollywood seldom does anything for the good of the world – it’s usually all about glamour, popularity and money. But in this case, Hollywood has done a lot to make the homosexual lifestyle more acceptable to the American public and that is a very good thing. After all is said and done, the sensationalism of Hollywood is sometimes the catalyst society needs to make any drastic attitude changes, so that is not a bad thing. Everything has a purpose.

  • No, they aren't "promoting" in the sense people think.

    Hollywood does show homosexuality in its material, but showcasing homosexuality is not promotion in the sense people think. I haven't seen any movies or shows that try to convince heterosexuals to experiment with homosexuality. Gay people should be able to show themselves in public without being accused of trying to convert the straight populace.

  • People are gay or straight (or whatever else), regardless of what they see on TV

    Regardless of whether Hollywood says it's OK to be gay, such "promotion" begs the question: it assumes that being gay is something people choose. People with a homosexual orientation will be just as attracted to their own gender regardless of what they see on TV, and those same people will be just as sexually unattracted to the opposite gender regardless of what they see on TV. The only difference TV generally makes is whether or not they accept what they feel, and whether or not they realize what's actually going on.

