Should the United States transform into an imperial empire? Posted by: Rezamee
Should the U.S. rapidly expand its territory throughout the world, through military force and diplomacy of course? (United Countries/States of Terra, anyone?)
Yes. For freedom, liberty, and justice for all citizens of Earth!
No. For the nation-state, diversity, and tradition!
I may sound a bit crazy, but when you look at most nations on Earth, you don't exactly see much tolerance, egalitarianism, or liberty. Some nations still execute people for being gay, or force 12-year-olds to give birth against their will. And I don't even need to remind you of the ones that kill others based on race or religion. For this reason I would support an imperialist power whose goal is to establish a set of laws for the rest of the world that will protect vulnerable groups.
If they want to come here, we won't let them in. Instead we'll go to them and run their countries ourselves. Problem solved.
Every non-American (a.k.a. the vast majority of the planet) is reading this and facepalming right now.
The imperialist Americans have already established an empire. They are a threat to the security and prosperity of the Glorious Democratic People's Republic of Korea.
An empire can't turn into an empire. It's already one.
We already control most of the world through indirect means. We would gain more problems than solutions if we turned into an all-out empire.