The United States of America is definitely an empire. Granted, it has done a tremendous job of hiding the fact, but it is one nonetheless. First, the definition of an empire should be established: an empire is a nation that acquires territory through military force. The United States has done this numerous times throughout its history, primarily through military actions against Native Americans.

The other "conquests" are where it gets tricky. The land gained from the Mexican-American War was "technically" a transaction, but it's safe to say that if Mexico had refused, the U.S. would simply have declared the land its own. What could Mexico have done about it? The U.S. had just defeated them decisively; they had no means to resist. The transaction was essentially a cover for the fact that the territory was gained through military force. The island of Cuba was received through a peace treaty, so again, through military action.

As a colonial power, the U.S. claimed that its holdings were territories, not colonies, despite the fact that most of the other major industrial powers at the time were also imperial powers, the U.S. included. When it boils down to it, imperialism and colonialism are essentially the same thing; the only real difference is that colonialism involved settlers ("colonists") being sent to live in the new area. Both had the same purpose: to control territory and exploit its resources in the interest of the imperial or colonial power in question.
No, it is not a colonial power. Just because it was formerly a colonialist regime does not mean it still is one: colonialism is all but over, and post-colonialism and nationalism are the next step. Most former American territories are now free or have been granted political autonomy. So no, it is not a colonialist power.