The meaning of this question isn't clear. But if it means, "Should the US have involved itself in World War I?" then the answer is "Yes." Allowing the Kaiser to seize all of Europe would have made for an even worse 20th century than the one that actually happened. "Good old-fashioned German militarism" was in effect an early form of Nazism.

It is probably true, however, that if the US had not joined the Allies in WWI, Lenin would never have toppled the Russian monarchy. But then what? We'd just have had to deal later with two brutal monarchies, German and Russian!

If the question means, "Should Americans be informed about World War I?" then the answer is also "Yes." It is always better to be informed and thoughtful about the past so that we can help avoid repeating its mistakes.
We had to. The Black Tom explosion was a direct attack by Germany on American soil.