Americans having free health care!


We should have free health care because America is supposedly the "land of the free." We already have other free public systems, so why hasn't the government had the common sense to provide free health care like England or France? Don't people see that these countries not only have better overall health, but also higher life expectancy than Americans, even the wealthy ones? It just doesn't make sense to me at all!
