The American health care system needs reform. Anyone who has had to pay for doctor's office visits, emergency room care, tests, exams, and the subsequent pharmacy expenses knows that the need for help with medical costs is dire. Too often insurance does not cover what you actually need, and prices are out of reach for those who cannot afford the coverage they require.
The USA is the wealthiest country in the world, and it's not even close. And yet health outcomes are not commensurate with the level of American spending on health care. We need reforms to make health care more affordable, increase access, reduce waste, and improve outcomes.
Yes, the American healthcare system really needs reform, because a person should not be dependent on their employer for health care. People should have their own health insurance and be able to take it with them if they lose their job or change jobs. That would give people peace of mind about their health care.
I believe the American health care system does need reform. The industry is obsessed with profits when it should be focused on providing care to society and making life better. Health care is not meant to be a profit-seeking industry; it is supposed to exist for the welfare of the people.