Computers and Health Care

In recent years, there’s been a lot of talk about computing technology as a potential enabler of major cost reductions and quality improvements for healthcare.

A recent study by the Harvard Medical School suggests that results with hospital computer systems so far are disappointing, to put it mildly.

4 thoughts on “Computers and Health Care”

  1. The comments on that post are the most interesting part. There are things that IT systems can do very well, but the point about grafting computers onto a paper-based system is important. There have been a few successes in using clinical decision support to design user-friendly systems.

    One spectacular example was at Intermountain Health Systems a decade ago. Adult Respiratory Distress Syndrome (ARDS) is a final common pathway for many critical illnesses. The patient is respirator-dependent and caught in a vicious cycle of infection, failing lung function and systemic collapse. In 1995, when this story began, the Mass General reported an 85% mortality rate, and it should be noted that these are, for the most part, young, healthy trauma victims. IMS was considering buying a membrane oxygenator system, ECMO (extracorporeal membrane oxygenation), to treat these patients. It costs a minimum of $100,000 to use the system on one patient, so they were quite concerned about choosing which cases to treat.

    They decided to optimize ICU care of these cases by setting up a computer-based clinical decision support system. A small group came up with a set of algorithms, and the computer system wrote a default set of orders. Treating physicians could change the orders but had to sign their changes. They held frequent meetings to look at the data and alter the algorithms based on outcomes. I haven't seen a report on this in the literature, but I attended a session on it at the American Society for Medical Informatics about a decade ago and am going on memory for the details. By the end of the first month, they had arrived at a set of algorithms whose orders were accepted by 95% of the staff as generated by the system. After a few more months, they had gotten the overall mortality of ARDS down to a third of the MGH results WITHOUT ECMO.

    When I was on the faculty at UC Irvine, I found that their physician order entry portal was used by the medical docs for 90% of orders, but on the surgical service only 37% of orders were entered by the docs. Why? Well, one reason may be that internal medicine is practiced by writing orders, while orders are mostly incidental to surgical practice. They could have changed this but chose not to. I suggested (I was there for other purposes, so I had no authority) that they set up default orders for the surgery staff and residents using their standard orders. Most surgeons write the same orders for most patients, either pre-op or post-op. They have personal preferences, but this is a great place to use an IT system: have standard orders for each physician and let them make changes as desired. If the default set of orders is OK, they have only to add an electronic signature (a rough sketch of this kind of per-surgeon default order set appears at the end of this comment). Had they done that, I suspect surgeon use of the system would have gone way up, but there always seems to be a punitive motive behind many administrators' rulings. "Who do they think they are?" Anyway, it didn't happen.

    The big variation in surgeons' orders is in the ICU. Here is another opportunity for an IT system: enter the diagnosis and the name of the treating physician, and the system could generate a standard algorithm. If Deming taught us anything, it was that random variation is the source of most errors. It should be pretty easy to write standard algorithms analogous to the IMS ARDS system.

    The same could apply to primary care. Dartmouth, when I was there, had a system for capturing routine patient information for office visits. The patient was given a booklet with pages enclosed in plastic and a grease pencil. The pages had a series of questions about health status and other matters. Beneath each question was a row of face symbols, from smiling to frowning, serving as yes-to-no or good-to-bad responses, with a bar code for each symbol. When the patient returned the booklet to the receptionist, she scanned the bar codes, and a sheet was generated for the physician or other provider to use when interviewing the patient later. This was efficient, saved time and was reproducible.

    There are lots of ways that computers improve efficiency and quality, but I fear these account for a small proportion of their use. My wife (I'm now retired from practice) still works as an operating room supervisor, and her stories suggest that very little of the potential is realized. For example, the daily staffing schedule for the operating rooms of a fair-sized suburban hospital used to be made up by an experienced clerk, who could check with her about big cases and the other details that were not routine. About 18 months ago a new director arrived, actually a consultant who was supposed to improve efficiency. Soon thereafter, she had two nurses working all day on the schedule the clerk used to do. They made many more mistakes, so two FTE nurses were doing a clerk's job, and not as well.

    Fortunately, the efficiency expert has moved on. A couple of months ago they hired the OR supervisor from Cook County Hospital, and he seems more savvy than the person he replaced. Hospitals have to be the worst-run institutions outside of Congress.
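
    To make the default-order suggestion concrete, here is a rough sketch in Python. The surgeon name, orders, and fields are purely illustrative, not UCI's actual system; the point is just how little machinery per-surgeon defaults plus an electronic signature would take.

    ```python
    # Hypothetical sketch of per-surgeon default order sets with an
    # electronic-signature accept/override step. Names, orders, and fields
    # are illustrative only; they do not describe any real hospital system.
    from dataclasses import dataclass, field


    @dataclass
    class OrderSet:
        surgeon: str
        context: str                      # e.g. "post-op"
        orders: list[str] = field(default_factory=list)


    # Each surgeon registers the orders they write for most patients.
    DEFAULT_ORDER_SETS = {
        ("Dr. Example", "post-op"): OrderSet(
            surgeon="Dr. Example",
            context="post-op",
            orders=[
                "NPO until bowel sounds return",
                "IV fluids at 100 mL/hr",
                "Incentive spirometry hourly while awake",
            ],
        ),
    }


    def generate_orders(surgeon: str, context: str) -> OrderSet:
        """Pull the surgeon's standard orders as the default starting point."""
        return DEFAULT_ORDER_SETS[(surgeon, context)]


    def finalize(default: OrderSet, changes: list[str], signature: str) -> dict:
        """If the defaults are acceptable the surgeon only signs;
        any additions are recorded alongside the signature."""
        return {
            "orders": default.orders + changes,
            "modified": bool(changes),
            "signed_by": signature,
        }


    if __name__ == "__main__":
        defaults = generate_orders("Dr. Example", "post-op")
        print(finalize(defaults, changes=[], signature="Dr. Example (e-signature)"))
    ```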

  2. Re the Intermountain case: Is there anything about this particular disease that makes it exceptionally difficult to choose the best treatment methods? Or are there lots of other diseases that would also be good prospects for the same kind of decision support system?

    The Dartmouth health-questionnaire scanning system sounds good. Back when IBM was first getting going in the 1920s and 1930s, one of their primary selling points for punched-card systems was that data could be entered *once* and then used in a wide variety of ways. That point seems to have been largely lost: there is a *lot* of redundant data entry going on in organizations of all types, and much of it would be easy to fix given a little creative, problem-centric thought (a small sketch of the capture-once idea follows below).
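
    Just to illustrate the capture-once point, here is a sketch of what the bar-code booklet workflow might look like in Python. The codes, questions, and response scale are invented for illustration, not Dartmouth's actual scheme; the point is that one scan pass yields a structured record that can feed the interview sheet, the chart, and any later reporting.

    ```python
    # Hypothetical sketch of "enter data once, reuse it everywhere":
    # each scanned bar code maps to a (question, response) pair, and the same
    # structured record drives the provider's interview sheet. All codes and
    # question text are invented for illustration.

    # Five face symbols per question, encoded as short bar-code strings.
    RESPONSE_SCALE = {"1": "very good", "2": "good", "3": "neutral",
                      "4": "poor", "5": "very poor"}

    QUESTIONS = {"Q01": "Overall health since last visit",
                 "Q02": "Pain interfering with daily activities",
                 "Q03": "Sleep quality"}


    def decode_scans(scans: list[str]) -> dict[str, str]:
        """Turn raw scans like 'Q01-2' into a {question: response} record."""
        answers = {}
        for scan in scans:
            question_id, response_code = scan.split("-")
            answers[QUESTIONS[question_id]] = RESPONSE_SCALE[response_code]
        return answers


    def interview_sheet(answers: dict[str, str]) -> str:
        """Render the pre-interview summary sheet from the same record."""
        return "\n".join(f"{q}: {a}" for q, a in answers.items())


    if __name__ == "__main__":
        scans = ["Q01-2", "Q02-4", "Q03-3"]   # the receptionist's scan session
        print(interview_sheet(decode_scans(scans)))
    ```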

    ARDS is a disease in which multiple systems are failing, and the treatment consists of multiple interventions that all affect each other. The result is a matrix calculation, hour by hour. The measurements are physiologic data points like urine flow (measured with a drop counter), respiratory measures like oxygen saturation and the pressure needed to move the lung, blood pressure, venous pressure, cardiac output, systemic vascular resistance and a bunch of others. These get entered into the system, which then runs rules-based decision loops to optimize the critical values. It can run lots of iterations, and the supervising physicians can modify the algorithms as they see the results. It took a month to optimize the algorithms, and thereafter the system ran pretty much without further changes. (A toy sketch of such a rules-based loop appears after the abstract below.)

    I think this is one of the first papers from that program:

    Sittig DF, Gardner RM, Morris AH, Wallace CJ. Clinical evaluation of computer-based respiratory care algorithms. International Journal of Clinical Monitoring and Computing. 1990 Jul;7(3):177-85. Department of Medical Informatics, University of Utah/LDS Hospital, Salt Lake City.

    Abstract: A collection of computer-based respiratory care algorithms were implemented as a prototype computer-based patient advice system (COMPAS) within the existing HELP hospital information system. Detailed medical logic recommended ventilator adjustments for 5 different modes of ventilation: assist/control (A/C), intermittent mandatory ventilation (IMV), continuous positive airway pressure (CPAP), pressure controlled inverted ratio ventilation (PC-IRV), and extracorporeal carbon dioxide removal (ECCO2R). Suggestions for adjusting the mode of ventilation, fraction of inspired oxygen (FiO2), positive end-expiratory pressure (PEEP), peak inspiratory pressure, and several other therapeutic measures related to the treatment of severe arterial hypoxemia in adult respiratory distress syndrome (ARDS) patients were automatically presented to the clinical staff via bedside computer terminals. COMPAS was clinically evaluated for 624 hours of patient care on the first 5 ARDS patients in a randomized clinical trial. The clinical staff carried out 84% (320/379) of the computerized therapy suggestions. In response to a questionnaire distributed to clinical users of the system, 86% judged the system to be potentially valuable. Through implementation of COMPAS, a computer-based ventilatory therapy advice system, we have laid the groundwork for standardization of ventilator management of arterial hypoxemia in critically ill ARDS patients.
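
    For anyone curious what a rules-based loop of this kind looks like in code, here is a toy sketch in Python. The thresholds, step sizes, and variables are invented placeholders, not the actual COMPAS/HELP logic, and a real system considers far more inputs and ventilation modes than this.

    ```python
    # Toy sketch of an hour-by-hour, rules-based advisory loop. Thresholds,
    # step sizes, and variable names are invented placeholders, NOT the actual
    # COMPAS/HELP algorithms; real ventilator management is far more involved.
    from dataclasses import dataclass


    @dataclass
    class VentSettings:
        fio2: float        # fraction of inspired oxygen, 0.21-1.0
        peep: float        # positive end-expiratory pressure, cm H2O


    @dataclass
    class Observation:
        spo2: float        # oxygen saturation, %
        map_mmhg: float    # mean arterial pressure, mmHg


    def advise(settings: VentSettings, obs: Observation) -> tuple[VentSettings, list[str]]:
        """Apply simple if-then rules to the latest measurements and return
        suggested settings plus human-readable suggestions. Clinicians accept
        or override each one, and the rules themselves get revised as outcome
        data accumulates."""
        suggestions = []
        new = VentSettings(settings.fio2, settings.peep)

        if obs.spo2 < 90 and new.fio2 < 1.0:
            new.fio2 = min(1.0, new.fio2 + 0.1)
            suggestions.append(f"Raise FiO2 to {new.fio2:.2f} (SpO2 {obs.spo2}%)")
        elif obs.spo2 > 96 and new.fio2 > 0.4:
            new.fio2 = max(0.21, new.fio2 - 0.05)
            suggestions.append(f"Wean FiO2 to {new.fio2:.2f}")

        if obs.spo2 < 90 and obs.map_mmhg > 65 and new.peep < 15:
            new.peep += 2
            suggestions.append(f"Increase PEEP to {new.peep:.0f} cm H2O")

        return new, suggestions


    if __name__ == "__main__":
        settings = VentSettings(fio2=0.5, peep=8)
        hourly = [Observation(spo2=88, map_mmhg=72), Observation(spo2=93, map_mmhg=70)]
        for hour, obs in enumerate(hourly, start=1):
            settings, advice = advise(settings, obs)
            print(f"Hour {hour}:", advice or ["no change suggested"])
    ```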

Comments are closed.