Yes, World War I should have remained a European war. The conflict drew in all of the major powers of Europe at the time, and the United States lacked a clear strategic interest in entering the fray between two blocs bound together by treaties and ententes. The result was an unnecessary expenditure of money and lives that might have been better devoted to shoring up the nation's financial health.
War is never an answer, and there is never a true winner at the end of a war. Even though the United States may have proven instrumental in ending the First World War, some historians believe the conflict would have ended without American involvement.
Historians are in general agreement that the involvement of the United States in World War One was decisive in the victory of the Allies (Britain, France, Russia, and ultimately the United States) over the Central Powers (Germany and Austria-Hungary). Absent the involvement of the United States, the bloody war would have dragged on far longer. Moreover, the aggressor nations of Germany and the Austro-Hungarian Empire might have prevailed in dominating Europe.
While it was not necessarily America's war to fight, World War I would have been a greater disaster without American involvement. Germany and the other members of the Central Powers committed some of the greatest atrocities in the history of the world. The presence of hundreds of thousands of American troops shored up the morale of the Allied nations and gave them a boost in armaments. America paid a small price in lives lost compared to many of the other Allied nations and spent only about a year fighting in the war, but it tipped the scales and helped end it all the same.