Jan. 10, 2018 at 4:12 p.m.
Texas Gov. Greg Abbott on Wednesday posted a despicable lie on Facebook, saying that Democrats want to bring “socialized medicine” to Texas. I can’t find any cohesive or coherent information on the Web that backs Abbott’s assertion. Not that it’s essential to do so. This hyper-hypocritical politician has been part of the disinformation factory for a long time. And there is more to openly disdain about Abbott: He knows full well the real definition of socialism, and he knows full well that the American health system will not be “socialized” any time soon.
What Abbott is doing is stirring up his base, trying to demonize the Affordable Care Act and a potential “Medicare for All” universal insurance plan. Remember, though, Abbott’s base believed, like Donald Trump’s followers, that Obamacare was different from the ACA; they supported the ACA when it was explained to them but still opposed Obamacare.
There is little doubt that the nation will be discussing health care policy for a long time. It’s a daunting subject — one that Trump by his own admission didn’t know was so complicated. As broad and as deep a subject as it may be, how the United States has gotten to this point in health care policy can be explained. So can the economic principles that distinguish this industry from others in the American economy. Making sense of health care policy is a bit like that cliché question, “How do you eat an elephant?” Answer: One bite at a time.
I have been writing about health care public policy and health economics as a journalist for some 20 years — and that’s after a master’s in hospital administration and 25 years in the health care industry. I believe I can help people make sense of this if they are willing to take the time to read somewhat long posts. The first two bites in this three-post series will provide some history. The final bite will be a discussion of health economics.
A bit of history
To fully understand the evolution of health care policy, it is important to go back as far as the end of World War II. The historical context begins with several mid-20th-century developments.
Recognize that the structure of the system was that the hospitals were predominantly nonprofit and so there was a certain lack of “business” pressure with respect to performance and the bottom line. In addition, the people with the financial responsibility on boards of directors were the moneyed movers and shakers of the community. They could, and sometimes did, cover operating deficits. Physicians, while having plenty of interest in the hospitals’ functioning well, were neither employees nor contractors. They were free agents within the hospital organization.
In some communities, working with the medical staff was like herding cats. It’s not that the doctors were evil. It’s that they wanted some things the hospitals sometimes couldn’t give — like more technology, for example. The entire structure was akin to the professional autonomy found in universities.
At the same time, the population was growing, demand for a full range of services was rising and the federal government was there to help. The passage of the Hill-Burton Act spurred a boom in hospital construction that lasted from the late 1940s until the mid-1970s. While the main purpose was to build hospitals in communities that didn’t have them — read rural — like most federal programs, people who played the game well got the money and hospitals sprang up everywhere. Or, existing hospitals got new wings.
Policymakers came to realize, however, that supply created its own demand. If a hospital built the beds, ORs and x-ray suites and so on, they had to be used to cover the costs and debt service. At the micro-level, the community had to pay for the over-building and cover the costs of the resources. At the macro level, pumping patients through the system to fill beds and finance the growth added up to higher costs for the country.
And guess what came along to help fill them? Medicare and Medicaid. Those programs created another revenue stream for the system. In economic terms, they created demand. Remember, though, that some of that demand wasn’t need that was based on clinical judgment and evidence-based medicine. The model for paying doctors by the procedure and service still prevailed in the early days of those huge programs. Although these programs had economic consequences, they also did something valuable for Americans. They took care of sick people who otherwise would have suffered whatever resulted from not getting medical care.
They filled hospitals and clinics and added to the total medical and health care costs for the nation. Combine this influx of revenue with the supply creating its own demand, and medical costs skyrocketed. By the way, some researchers noticed that all of these medical resources didn’t vastly change what were then considered key indicators of a nation’s health — infant and maternal mortality, life expectancy and other measures of morbidity and mortality.
Around the mid-1960s, scientific advances and medical technology were having a greater effect on medical care. The first kidney transplant had already occurred in Boston in the 1950s. By 1967, Dr. Christiaan Barnard stunned the world with the first heart transplant. The first CAT scanner came along in the 1970s, but the people who came up with the idea had developed it in the 1960s. And so it went.
But, for all the good the technology did, it also drove up the costs. And it was around the mid-1960s we began to hear the word “crisis,” which we’re still hearing today. Some of those people talking about “crisis” were members of Congress. One would think that if they could call it a crisis for 50-plus years, they might have found a way to solve it, but that’s another story.
One of the answers Congress had for cutting costs in the 1970s was to control capital expenditures. It passed a series of laws establishing coordinated planning for communities that required states to institute programs to assure that expensive capital expenditures could be justified by community need and not duplicate services. This was called Certificate of Need, or CON, and remains on the books in some states. If a hospital wanted a new CAT scanner, or someone wanted to build a new nursing home, they had to get it approved through a process that justified the expense. The theory was that if need was proven, the utilization would be optimized.
Some hospital administrators understood this was a good thing for the community, the nation and the system. But as a group, they fought CON. Then they got smart and began to work with their doctors and gamed the system — except in two-hospital towns. Behind the scenes, medical equipment and hospital construction interests (contractors, architects, etc.) loved CON. They had things to build and equipment to sell. Despite the political nature of the local competition for new toys, many felt that the legislation provided for a rational distribution of resources, preventing duplication of services and, to some degree, holding back the growth of health care costs, if only a little.
As far back as the late 1970s, the American Public Health Association and others proposed treating the entire health care system as a public utility. Health care is a public good, the argument went, so regulating it as we did with gas, electric and water providers would give us a rational model for resource usage and controlling health care costs.
Policymakers would never know. The programs were not in place long enough to fully test them because in 1982, something happened to change the landscape drastically once again.
Part 2 — So-Called Consumer-Driven Health Care