This country has always been a bully in the global community. Everyone in America is arrogant and full of themselves. The US abuses its military power to force every other country to bend to Uncle Sam's greedy ways. This country has never had the intent of helping any country but itself.
I used to believe the US was the greatest country. As I grew older, I realized this isn't true anymore. Perhaps it was long ago, back when most other countries were monarchies ruled by harsh royalty, but I don't know; there's a chance it was never the greatest country at all. I'm planning on leaving the US.