Autopilot


September 2011

Analogies abound about health care and the airline industry. If the IOM is to be believed, medical errors in hospitals cause almost 100,000 avoidable deaths a year. In the analogy constantly proffered, those deaths are the equivalent of a 747 crashing every day.

Another analogy: Pilots use checklists when flying. As a result, airline travel is considered safer than your morning drive to work. We now see many attempts to bring checklists into our daily practice, especially in ICUs and operating rooms. While that effort has its critics, the results have largely been positive for our patients.

So not surprisingly, given the present state of technology and the current pressures to bend the cost curve, many are calling for the continued systemization of the practice of medicine. The great hope is that evidence-based medicine can become embedded into our daily practice via said checklists and “smart” CPOE. The idea is simple: It would be much safer to fly through the hospital if doctors were on autopilot. It is not hard to imagine the day in which we enter the analogous coordinates of our patient and await liftoff (admission) and landing (discharge).

Granted, many of us may feel like we already practice on autopilot, especially if we see high volumes. Chest pain? Troponins, EKG, stress, home. Or some permutation thereof, probably with more tests and consultants.

But this is not the autopilot I was beginning to allude to. If IBM’s Watson can take down Ken Jennings, I don’t think it would break a sweat diagnosing a rare vasculitis or genetic disorder.

Watson, or Dr. Blue to you, may prove to be my “overlord,” as Ken Jennings so aptly referred to him after being humbled in defeat, especially if Dr. Blue is programmed with evidence-based guidelines that have been incorporated into errorless algorithms.

Take the bane of the human doctor’s existence: CHF. Too wet, then too dry, before you can say “Lasix 40mg q8, and I will see you in the morning.” But what if Dr. Blue incorporated all data instantaneously, drawing on our accumulated medical knowledge and the constant input of vitals, labs, urine output, exam findings (input done by humans for now, but probably not for long) and patient symptoms? Dr. Blue could then calculate the probability of a safe discharge.

Patient says, “Please, can I stay one more day?”

Dr. Blue: “There is an 88.45% probability that you will not return within 30 days if discharged on the following regimen, which is within CMS’s perfect-care discharge confidence interval. It is 2:30 AM; your ride has been notified.”

Sounds too good to be true, other than the fact that human medical knowledge Version 1.0 might now be as useless as my old friend, the Walkman. Flash forward 20 years, and we may all be selling pencils. Oh right, who needs those anymore?

But then I came across an article about the investigation of two recent airline tragedies. I learned that pilots fly almost completely on autopilot from very shortly after takeoff to landing. (Even those flight bookends may soon be fully automated.)

In the investigations of an Air France disaster and a doomed flight to Buffalo, both planes apparently lost lift and stalled. The problem was a defective speed sensor on the Air France plane and icy conditions in Buffalo.

Apparently, and counterintuitively to the non-pilot, rule No. 1 when losing lift is this: You need to throttle forward and dip the nose of the plane. This initially causes the plane to increase its descent, while the acceleration allows the pilot to regain control. In both instances, the pilots throttled back, causing the planes to spiral out of control. It’s hard to believe that a fully engaged pilot who was manually flying a plane could ever violate one of the most basic rules of piloting.

I took away at least two potential lessons from this report. One, never disengage autopilot. In both cases, the pilots apparently went with their instincts rather than with what the on-board computer was telling them. But my greater concern is lesson two: What happens if we all become dulled by autopilot?

That perhaps leads you back to lesson No. 1 if you are a pilot. But given that patients don’t always follow the same laws of aerodynamics, the dulling of physicians’ skills due to the presence of Dr. Blue may have a downside.

This assumes that we will be dulled by the automation of medicine. I do believe this is a legitimate concern as we inexorably progress toward a Dr. Blue health care delivery system.

Let me just add that I don’t fear technology. In fact, I think we need more of it in health care, not less, especially if technology never figures out how to craft a humorous, insightful health care blog. But we would do well to consider the implications of using technology every step of the way. If we become too complacent, a rusty doctor may be at risk of throttling back when the medical condition calls for us to throttle forward.

Of course, no offense intended to you, Dr. Blue. I suspect that it is especially important to get on your good side now, as I understand you may soon be downloading the senior administrator upgrade.