  • Yes, the Oscars still represent film achievement.

    The Oscars have always been, and continue to be, an important part of American culture. Each year at the awards ceremony, achievements in the film industry are recognized, and actors, directors, and other professionals are honored. Winning an Oscar remains one of the greatest achievements possible within the film industry.

  • No, the Oscars are not relevant anymore.

    The process of awarding Oscars has become too political for them to reflect the quality of the movies they are given to. It seems that big money is the driving force behind who gets the nominations, with very few exceptions. Awards such as the People's Choice Awards are a much better measure of a movie's crowd appeal and quality.

  • No, the Oscars are not still relevant.

    It is my belief that the Oscars are just an excuse for the Hollywood elite to dress up in free designer clothes, jewelry, and other accessories and flaunt them to their millions of fans. It's not even about the movies anymore; it's just a big fashion show.

  • We don't care.

    I keep seeing promos from Good Morning America claiming they know I can't wait to hear who the Oscar nominees are. I live in middle America, and Californians think poorly of me because of where I live, so I do not care which of them is nominated for an Oscar. They do know we don't watch movies anymore, right?
