What’s Wrong With Health Insurance in America?

Reforming health insurance in this country begins with redefining our understanding of what insurance is and what it is supposed to cover. Insurance isn’t for routine or predictable expenses. Over time, however, we have come to expect all of our health care to be provided through insurance, and covering more has helped make health insurance cost more.