I think that we as an American people are sick. We watch reality TV that is immoral, and we live lives that are immoral. Because of our influence on the rest of the world, we are succeeding in making it a very sick place, that's for sure.
I hate to be this way, but I do believe that exporting our culture is probably the worst thing the United States does to the world these days. Our culture is based primarily on greed, every man for himself, and a consumerism that ignores all notions of moderation and morality. It's a horrible thing for us to be unleashing upon the world.
American culture is an amalgam of many cultures. Any culture that comes in contact with American culture is made better by the exposure. Americans have unparalleled freedoms and an entrepreneurial spirit that is unmatched in the world. Of course we have some negative aspects to our culture, as do other countries. We are not forcing anyone to adopt anything from our culture.
American culture is a melting pot of many cultures that has formed into something new. What it has become should have little to no effect on how culture thrives in other countries. If a global culture or the culture of a specific country were destroyed because of the United States, then that culture wasn't very strong to begin with.