The war in Iraq needed to happen. Did everyone forget about September 11th? If we had not fought back, the United States would have been seen as weak, and terrorists everywhere would have made us a target. There is no question in my mind that Iraq had weapons of mass destruction; to think otherwise is naive.
The war in Iraq did nothing but further tarnish the already damaged image of the United States in global politics. Nothing was gained from it, and no one benefited except a few wealthy business executives. An entire country was destabilized for the sake of spreading democracy, and that is absolutely not worth it, since democracy is not something that can simply be handed to a people.