Is health care reform needed in the United States?

  • Yes it is.

    Health care reform is needed in the US. The system as it stands is not good and has become even more expensive for people who cannot normally afford it. The current system expects everyone to have coverage and pay for it, even if they cannot afford to.

  • Wage Reform Needed Too

    Health care reform is only part of the problem. We need education and wage reform as well. People making minimum wage can't live well in the United States. More employers seek college degrees from applicants, but fewer people are enrolling in college because it is cost prohibitive. In order for the quality of life to improve, America needs better health care, better wages, and more education.

  • Good how it is

    No, I do not think that health care was a problem here in the United States, nor do I think that it needed to be reformed. Most people in the country had health care, and those who did not usually had a way to get it if they wanted.

  • We Need Cost Control

    I believe it is far more important for the United States to focus on cost controls within the medical field rather than more health care reform. The United States just implemented the Affordable Care Act, so further reform would be troublesome. To me, the big issue, and the one that was completely overlooked, is the unreasonable prices charged by hospitals, clinics, and doctors themselves.

  • We were better off before.

    No, health care reform is not needed in the United States, because we had the best system in the world. In the old system, there was still an incentive for producers to make and sell instruments and goods for medical services. People were able to choose which services they wanted covered among insurers. It was superior.
