I think to completely dismiss the government as a legitimate player in public health care would be ridiculous. To say that the government has no place in health care commits someone to the position that Medicare and Medicaid should be eliminated. Programs like Medicare and Medicaid make it possible for people who otherwise might not have access to health care to take care of their families. To say that it should be left up to the states is also a non-argument, as the states are still part of the government of the United States; Medicaid is jointly funded by the states and the federal government and managed by the states. Should those programs be eliminated with nothing in the private sector to replace them? It makes no sense to deny people the security of their lives and continued health care because some people don't trust the government.
The government should make sure that all of its citizens have access to health care. The Affordable Care Act provides this, and it is one of the most significant things to happen in the United States in the last thirty years. Government regulation of public health is the only way to provide all citizens with equal treatment.
Without a doubt, the government needs to take an active role in promoting public health and protecting people against harmful substances. Otherwise, the federal government isn't doing its job; a government that doesn't help care for its people is doing them a disservice. With that in mind, the government needs to influence public health.
If you think health care costs are bad with government regulation, imagine how out of control they'd be without it. The government sticks its hand into plenty of things it shouldn't, but in the case of public health, it needs to play a somewhat active role for there to be even a shred of fairness involved.
Yes, the government should have a role in public health, but that does not mean it should micromanage our lives. If there is a major outbreak of disease, the government is the appropriate agency to step in and coordinate a response. But that does not mean the government has any business taxing potato chips.
The government should have a role in public health. If there is anything the government can do to lower the overall cost of health insurance for its citizens, it should do it. Many citizens do not get the treatment they need because they cannot afford to go to the doctor.
Nearly all developed countries (and some developing countries) have universal health care. The United States needs it now. How it should happen is a matter of debate, but the majority of Americans support health care as a right.
If public health means preventing disease and promoting the health of society, then the government needs to ensure that everyone has accessible health care.
Unless the federal government is going to pay for our doctor's visits, it has no business playing a role in our public health. I think this should be state run, with the decisions made by the people. The state should have an allowance for public health care, and the people should all pay a co-pay every time they come in. We don't need the federal government.