Do insurance companies control the business of health?

  • Yes, and it comes down to money

    Insurance companies would not exist if they did not turn a profit. Profit is the main goal. Any health care you receive under insurance must be approved by the company, and the company will not approve it unless it makes the most financial sense for them, not because it is in your best interest. Billing is set up this way as well. It is in the company's best interest to pay out less (i.e., keep you healthy), but even then, you are subject to what they think will keep you healthy.

  • They decide for you.

    Yes, insurance companies control the business of health, because they get to say which procedures will be covered and which will not. The doctor does not get to decide which tests to run, because the insurance companies have already decided what you will and won't get.

  • They Soon Will

    In my opinion, the Affordable Care Act hands the health industry over to the insurance companies. This was not the case before the Affordable Care Act, but it will be now and in the future, at least in the US. I think we will end up wasting a lot of money by including this middleman.

  • No, insurance companies do not control the business of health.

    No, insurance companies do not control the business of health. They are actually subject to the business of health. One can use health insurance to pay for medical services, but in no way is insurance required to receive care. Since care can be obtained without insurance, insurance companies cannot control the business of health.
