Intellections Discussion

What’s Wrong With Health Insurance in America?

What has been the effect of mandating coverage for every insurance plan offered in the United States?

Responses

Costs have increased, though more people have access.

Was it a missed opportunity to encourage health-promoting behaviors (diet, exercise)? Many illnesses are unavoidable, but many are not.

I heard on an NPR radio program that knee replacement surgeries have increased by 32% since the mandate. That suggests people had been forgoing medical care because they didn't have health insurance. True, costs have gone up, but so have executive salaries, with some executives making more in a day than the average person makes in a year.