A Brief History of Insurance

Last week, we started a series of articles to explain health care jargon and contextualize the information so all Minnesotans can engage in the discussion. The first article in the series covered health centers that serve the Minnesotans who need care the most.

The largest policy debate happening in the health care world right now is about the role of government in insuring Americans. More generally, the question is whether health insurance can or should be treated as a completely free market. Proposals range from completely gutting public insurance programs to embracing universal coverage, but the overall debate revolves around the extent of our government's involvement in health care. Free market ideology has dominated the U.S. health care system since the early 1900s, but federal and state governments have periodically intervened to address market failures, which routinely exclude those who need health care the most. This article will provide a brief history of public health insurance in the U.S. and Minnesota.

The first steps toward our current health care system occurred in the late 1920s and 1930s. At the time, hospital care was largely reserved for the rich. The system was purely fee-for-service, meaning anyone who couldn't afford those fees relied on charity care in public hospitals (this included the majority of working people). This began to change in 1929, when Baylor University offered a $6 per year plan that guaranteed professors up to 21 days of hospital coverage annually. The "Baylor plan," along with a prepaid plan covering miners and lumber workers in the Pacific Northwest, would become the company now known as Blue Cross Blue Shield, and the model for employer-based health care in the United States.

This program was popular among physicians and employees of companies that offered coverage, but the unemployment rate was exceptionally high throughout the 1930s (sometimes rising above 20%!), meaning millions of Americans still could not afford hospital care. President Roosevelt proposed a national health program resembling single-payer several times, but a variety of factors kept it from being implemented, particularly once the Second World War broke out in 1939. Roosevelt tried to rekindle the reform effort in the 1940s, but he would not live to see a plan put into place, as he died in 1945.

President Truman also proposed a national insurance program, but he too failed to overcome the political hurdles. President Eisenhower barely touched on health care, and all the while third-party payers (health insurance companies) grew to become the dominant payment structure by the 1960s. This system worked very well for the burgeoning working and middle classes and was immensely profitable for insurance companies and doctors, but it failed two key demographics: the elderly and the poor.

A study conducted by congressional leaders in 1959 found that a vast majority (85-90%) of elderly people had to spend down their savings and Social Security, or rely on their children or the hospital, to pay for health care, per Blumenthal and Morone. Worse yet, the study found that 56% of seniors had no insurance at all, and only one in four seniors had comprehensive health insurance. At the time, insurance companies had no obligation to offer coverage to people they saw as bad investments, so they wouldn't. Instead, insurance companies and physician organizations acted out of financial self-interest, ignoring both the elderly and the poor as bad investments.

Kennedy campaigned on this issue, as his father was often ill and spent thousands of dollars on treatment. He went on to make an impassioned speech at Madison Square Garden in 1962 to push for a national insurance program for the elderly, and his efforts were continued and realized after his death by President Johnson. In 1965, Johnson signed the Social Security Act Amendments known as Medicare and Medicaid, against vehement opposition from the American Medical Association and health insurance companies. These programs guaranteed hospital insurance to seniors, poor families, and the disabled. They were (and still are) vital as a safety net across the country, but they had almost no cost controls to speak of, and they did not change the insurance companies' practices at all. This reform simply addressed the market's failure of senior citizens and poor people.

Despite its opposition to reform, the AMA quickly capitalized on Medicare's incredibly generous reimbursement rates. At the time, the government would reimburse any procedure so long as the cost was "reasonable," despite not having any operational definition of what that might be. For the next 15 years, doctors used that ambiguity to maximize profits, performing expensive and unnecessary procedures to bill the government whatever they wanted. Insurance companies still avoided covering poorer and sicker people by declining those with pre-existing conditions. Additionally, when the uninsured began turning to emergency departments because primary care was unavailable, hospitals turned them away, sometimes without even stabilizing the patient.

This changed in the 1980s, when the government radically restructured reimbursement: rather than paying whatever a hospital billed, it would pay a fixed amount based on what it expected a diagnosis to cost. For example, if a patient was diagnosed with dehydration, the government would pay for fluids and a single day in the hospital, even if the patient ended up needing two or three days. President Reagan signed this into law to cut costs, but it totally flipped the incentive structure for hospitals. The government also passed EMTALA, which required hospitals to stabilize patients in emergency situations regardless of their ability to pay. Again, the federal government acted to counter market forces that had made the practice of medicine too business-oriented at the expense of those excluded from the system.

In the early 1990s, President Clinton tried to pass reforms that guaranteed primary care coverage to all Americans. This would have mandated that all individuals buy a qualified health plan with a required minimum benefit set and a cap on out-of-pocket expenses. Those below a certain income would receive financial aid to pay for the plan, or the government would pay for them outright, but the plan was torpedoed by the health care industry.

President Clinton did recover, however, and in 1997 he signed the Children's Health Insurance Program (CHIP). This program provided federal matching funds for children whose families earned too much to qualify for Medicaid but not enough to afford family coverage. The program still exists today, and it covers nearly all children below 200% of the Federal Poverty Limit (FPL). Minnesota expanded coverage up to 275% FPL, though most of these children are covered by MinnesotaCare, a state-run basic health plan that covered 6,240 kids in August 2017. In 2015, about 30% of Minnesotans were covered by a public insurance program like CHIP, MinnesotaCare, Medicaid, Medicare, or a combination of these programs.

In 2003, President Bush signed the Medicare Modernization Act to combat the growing costs of prescription drugs (an issue our own Senator Paul Wellstone fought on until his death in 2002). Pharmaceutical companies had increasingly raised the prices of prescription drugs, and employers refused to cover drugs for retired workers, leaving many seniors unable to pay for basic life-sustaining medications. The reform established "Part D" in Medicare, allowing seniors to pay into a plan that would cover 75% of drug costs until they hit the "donut hole." The donut hole was an intentional gap in coverage in which the senior paid 100% of costs until spending reached a catastrophic level (well over $6,000). Once the patient had paid thousands of dollars in prescription costs, the federal government would cover 95% of the costs until the next year.

This was largely a disaster for seniors across the US. With high drug costs and a system that incentivized flipping beds more than caring for patients, high medical costs with poor outcomes became the norm. McAllen, Texas, for example, saw enormous costs, exacerbated by doctors overutilizing imaging and diagnostic procedures and performing unnecessary elective surgeries, per Dr. Atul Gawande.

While insurance companies grew into multi-billion dollar industries, regular patients paid the price. In 2007, at least 62% of bankruptcies were medical in nature. For those who did have insurance, coverage often was not very comprehensive, and many plans carried massive deductibles that enrollees couldn't afford. This necessitated further reform, which President Obama implemented via the Affordable Care Act. Guaranteed issue of insurance, subsidies for Americans buying insurance on their own, and government-run marketplaces for insurance have greatly improved access, but health care is still very expensive.

This piece hopefully highlights the fundamentally different goals of the American people and the American health care system, and that the result of that difference has historically been the exclusion of the sick, the poor, and the elderly. Much of the world treats health care as a human right, while in America it is treated like a commodity. Know that the reason public insurance options exist at all is because the private market refused to cover almost one third of Minnesotans. Know that without these programs, many Minnesotans (including children, pregnant women, and the elderly) would be forced to go without health care by default, and that these programs will expand as the industry systematically capitalizes on the sick and vulnerable.

For further reading on this topic, The Social Transformation of American Medicine and Remedy and Reaction by Paul Starr, and The Heart of Power by David Blumenthal and James Morone, are great options. Next, we'll get into women's health services in Minnesota.