Should the United States require citizens to have health insurance?

  • The United States should require citizens to have health insurance.

    The United States should require citizens to have health insurance. It puts too much strain on our hospital system to have thousands of people come in every day requiring treatment they could have gotten much earlier. Thousands of people die every year because they lack health insurance and developed a condition that was totally preventable.

  • Health Insurance Usually Doesn't Pay Off

    The United States government should not be forcing people to pay for health insurance. Most of the time, it is a waste of your money because you will usually have no need for it. Also, since health insurance is not a public plan, the United States government should not force you to sign a contract for it. That is unfair and unnecessary.
