• Hollywood distorts the truth

    Movies like U-571 definitely increase anti-Americanism around the world. The event the movie was based on occurred before the US even entered the war and involved British sailors. Yet the movie came out with Americans portrayed as heroes and the Brits not even mentioned. Real people on that sub fought and died, and the movie makes a mockery of it. When this is pointed out to some Americans, they note that it's only entertainment and ask who really cares. No Brit would ever make a movie showing the British flag being hoisted over Iwo Jima, but can you imagine the outcry if they did?

  • Hollywood paints Americans in a negative light

    Hollywood grossly distorts stories and paints Americans as something they are not. Movies are merely entertainment, and that should be recognized. They are very unrealistic and do not reflect the lifestyle, views, or opinions of really anybody. The only exception would be the elite, who are often depicted accurately in movies.

  • Yes, with all the America-ranting stars, it is no surprise

    It has become commonplace to be lectured by Hollywood stars about how unequal and unjust America is. Hollywood is constantly telling the public about America's exploitation of the poor, racism, and inequality, even as these stars engage in the same activities from their five-star hotels. I believe most people around the globe see through this, thanks to the generosity of many everyday Americans; however, Hollywood's rants are probably fueling anti-Americanism in Parisian cafes.

  • No, it is just entertainment.

    While people abroad get many erroneous notions about America from what they see on television and in movies, ultimately it is just entertainment. The real reason there is anti-American sentiment is the policy agenda that we pursue. We engage in extrajudicial killings, we bully other countries into giving us what we want, and we constantly act as though our culture is superior. There is a special form of American hubris, and that is why so many people dislike us.

  • No, Hollywood hasn't fueled anti-Americanism abroad.

    People might say that some Hollywood movies show a hint of an America-first policy or a tendency to despise certain countries, contributing to the development of anti-Americanism abroad. But because of most Hollywood movies, people feel closer to America and develop an understanding of it. Therefore the advantages of Hollywood movies outweigh their disadvantages overall.