Health insurance mandates: Should employer-based health insurance be mandated?

  • Yes, employer-based health insurance should be a right for US citizens.

    Citizens should expect health care as a right, and if the country is not going to provide it, all employers should offer health care. To make this affordable, the government should offer aid, especially to small businesses, so that every company has the means to provide each employee with health insurance.

  • The USA is a Democracy

    I do not believe that employer-based health insurance should be mandated. For one, I believe it is the individual's right to decide whether or not they want health insurance, and an employer should only have to offer health insurance to individuals who want it, not be made to provide it to employees who are not even interested in having it.
