
What’s Wrong With Health Insurance in America?


Reforming health insurance in this country begins with redefining our understanding of what insurance is and what it is supposed to cover. Insurance isn’t for routine or predictable expenses. Yet over time, we have come to expect all of our health care to be paid for through insurance, and covering more has helped make health insurance cost more.

DISCUSSION QUESTIONS

  1. What has been the effect of mandating coverage for every insurance plan offered in the United States?
  2. How free is the market for health insurance in the United States?