I do not believe there is any freedom in being forced to purchase insurance, and I do feel it goes against freedom of choice. I have to assume this question is referring to the Affordable Care Act, which I feel a little differently about. While the act does say you must buy insurance, you have to consider the reason this became a requirement. The end objective is to provide health coverage to every citizen, not just those who can afford it. I think we'll see a lot of changes to this plan over the years, and it will eventually become second nature. This policy works toward the overall good, in that it tries to pull everyone into health care. It could have been put together better, but we must deal with what we have.
I think there are some things in the world that should be required. If everyone has insurance, it will open the door to a more progressive health care program. There will be less of a tax burden from uninsured people using and scamming hospitals. I think it's a good idea, honestly.
Sorry, not sorry, but you don't have a right to make your emergency health costs the burden of taxpayers by not having health insurance. That means I think you should be forced to buy health insurance. Apparently the courts have found the same thing, since as of this writing we're one month into Obamacare and that provision stands.
This country was founded on freedom, and we should have the freedom to choose for ourselves whether or not we want insurance, Social Security, or to pay into programs that support other people. If someone has a great immune system and sees insurance as more of a cost than a benefit, then let them choose. It should be the individual's choice to spend or save that money. The problem is that we're brainwashed by insurance companies into thinking accidents occur often, but relative to the entire population, they don't. The majority of people lose much more on health insurance than they get back. And not just health insurance. Because of insurance, companies raise their prices tremendously (prescription meds, auto repairs, home repairs, etc.). Why? For one, these companies are getting paid and can get away with it; and two, people aren't given the option to choose their meds based on cost. You just get what the doctor prescribes.
Health insurance coverage in America should be optional yet affordable. No one should be required to purchase anything in America as we use free market principles to drive commerce. There is nothing wrong with affordable health care. But there is plenty wrong with forcing citizens to have health insurance or face a fine on income tax returns.
Even though I do think people should have some form of insurance just to play it safe, it really should be their choice. At the end of the day, some people look at insurance as a complete waste of money and simply another monthly bill they have to pay.