I believe that the U.S. is becoming increasingly fascist. Everywhere I look, we are losing more of our civil liberties. Although the president is not a dictator, some presidents essentially rule like kings. The government keeps growing and taking away more of our rights.
Calling the United States "increasingly fascist" is a scare tactic used by those on the right to smear the current political regime. Ask anyone from a country that has lived under true fascism, ruled by a true dictator, whether America is even remotely similar. The idea is laughable.