
A Pound of Cure

Saturday, June 27, 2009 at 05:12 PM EDT

Technology Review, published by MIT, has an interesting article on the use of information technology in health care. As the article points out, the administration’s economic stimulus package includes $19 billion for health-care IT spending; the aim is to provide an incentive for hospitals and doctors to switch to electronic medical records. There is some good evidence that such a switch would have benefits, both in reducing administrative costs and in reducing errors (such as adverse drug interactions). But it isn’t clear that cost is the major obstacle to getting this done:

The truth is that these folks could have digitized the whole industry ages ago. The technology has been around for a long time: Wall Street began phasing out physical stock certificates over 35 years ago. Even the cash-strapped airline industry has gone ticketless, removing huge labor and overhead costs. These industries started using electronic records because they believed it would save money.

I suspect that part of this is a kind of cultural aversion among doctors to being seen as motivated by “management” concerns like efficiency. But as the author, Andy Kessler, suggests, there may be something more fundamental holding things back.

It is an often-cited and depressing fact that the United States spends more money per person on medical care than any other country on Earth, yet gets poorer results in many standard measures of public health (e.g., infant mortality). And it should not come as a surprise to anyone that the cost of health care is an issue of great concern to people generally, as well as being one of the primary factors behind personal bankruptcies.

As Mr. Kessler points out, the way the health care system in the US works, in the majority of cases, is that doctors get paid for treating people who are sick. Other things being equal, doing more tests, treating more patients, and performing more procedures increases the medical practitioner’s income. Now I am emphatically not saying that doctors are all just money-grubbing opportunists, but it is hard to believe that the structure of financial incentives has no impact on medical decisions. In fact, doctors generally do argue that financial incentives have a lot to do with the behavior of insurance companies and malpractice attorneys; I am not convinced that attending medical school makes one completely immune to this kind of consideration.

Furthermore, the current haphazard state of medical record keeping is a significant problem in trying to move toward what has been called “evidence-based medicine” — that is, a regime where tests and treatments are selected on the basis of solid evidence about what works and what doesn’t:

In those medical records lie the ugly truth about the business of medicine: sickness is profitable. The greater the number of treatments, procedures, and hospital stays, the larger the profit. There is little incentive for doctors and hospitals to identify or reduce wasteful spending in medicine.

The amount of unnecessary spending is huge. In a project that analyzed 4,000 hospitals, the Dartmouth College Institute for Health Policy and Clinical Practice estimated that eliminating 30 percent of Medicare spending would not change either access to health care or the quality of the care itself. The Congressional Budget Office then suggested that $700 billion of the approximately $2.3 trillion spent on health care in 2008 was wasted on treatments that did not improve health outcomes.

The really big payoff from better record keeping could be improved health outcomes at lower cost. Better record keeping would also facilitate preventative care, because it would be easier to identify people with risk factors for a disease, or those taking a drug with a newly discovered adverse side effect.

In a way, the issue here is trying to move away from the view of the doctor as an individual artisan, and toward a more prevention-focused, team approach to health care.