Protecting patients from errors: a look at three initiatives
by Bonnie Darves
Published in the May 2004 issue of Today's Hospitalist
In health care, one challenge facing the patient safety movement has been how to move discussions about quality from quiet conference rooms to the hectic front lines of care.
Since the Institute of Medicine released its headline-grabbing reports, “To Err Is Human” and “Crossing the Quality Chasm,” detecting and preventing errors has become a top priority at many health care facilities. Organizations that try to depart from the theoretical and implement real-world strategies, however, often find the road is rocky and littered with cultural land mines—namely, the culture of silence that has long surrounded medical errors.
Perhaps that’s why many of the programs created to improve patient safety have started with the goal of trying to change the blame-and-shame culture entrenched in U.S. health care.
In this article, we take a look at three institutions that have tackled patient safety by focusing on the basics of communication. From convincing physicians to take a fresh look at their notions of safety to developing systems that encourage physicians and other health care professionals to be more open about reporting errors, these institutions are finding ways to make real progress in patient safety.
They are also beginning to appreciate the role that hospitalists can play in these efforts as the primary drivers of safety-improvement initiatives. Because hospitalists see patients—and systems of care—from admission to discharge, they confront the elements that are “broken” on a daily basis.
“No one has been in a position to take this role before because no one has spent as much time in a hospital as us,” explains Janet Nagamine, MD, a hospitalist who is leading the patient safety effort at Kaiser Permanente in Santa Clara, Calif. “Other doctors come in and round and leave for the office, but the hospital is our office. So we have a vested interest in taking this safety improvement role. We also have a unique vantage point.”
The VA: “retail patient safety”
When John Gosbee, MD, teaches physicians patient safety, he doesn’t lecture about the Institute of Medicine’s “To Err Is Human” report. He doesn’t recite statistics about the causes of medical errors, and he doesn’t present hypothetical situations in which physicians or other providers make a wrong move that results in patient injury.
Instead, he often begins with an interactive exercise that involves, of all things, one of those new hard-to-open mint boxes or a juice bag favored by small children. While both of these products can be found in just about any grocery store, they are notoriously hard to use. That, Dr. Gosbee says, makes them ideal tools to demonstrate some of the problems that physicians regularly encounter in health care.
Dr. Gosbee, a human factors engineering and health care specialist at the Department of Veterans Affairs’ National Center for Patient Safety in Ann Arbor, Mich., asks participants to open and use the mint box or juice bag, imagining that it’s a medication dispenser. While the request initially elicits laughter, the response invariably changes to frustration after participants have spent a few minutes wrestling with the containers.
Participants who aren’t familiar with the juice bag, for example, often struggle as they try to open it without spewing juice all over the place. Once participants have observed themselves or other participants experiencing what Dr. Gosbee calls a “slip” while trying to open one of the containers, he engages them in a discussion about how the mint box or juice bag might have been better designed.
“People think it’s ridiculous until they actually try it,” says Dr. Gosbee. “It really gives them a chance to grapple with the safety issues, and it moves them to a mentality of redesign rather than saying, ‘What we need is a national program to teach people how to use mint dispensers.’ I call this retail patient safety.”
Once he has “warmed up” the participants, Dr. Gosbee might ask them to use a defibrillator they’ve never seen before, or he might ask them to quickly turn an oxygen cylinder on and off again. That exercise invariably leads participants to talk about times they’ve been stymied by devices or equipment—paracentesis trays are a common complaint—they have encountered in hospitals.
The point of the exercise is twofold. First, it underlines the extent to which processes, systems and a lack of familiarity with equipment can contribute to errors and near misses. It also illustrates the idea that even “smart people,” particularly in the absence of any checks and balances, can perform a process incorrectly in ways that lead to patient harm.
The curriculum Dr. Gosbee developed has been delivered in scores of VA facilities as part of a pilot project begun in 2002. It’s based in part on human factors science, the study of the interrelationships among humans, the tools they use, and the environment in which they work.
Long used in aviation, manufacturing and other so-called “high-reliability” settings, human factors science has only recently begun making its way into medicine. In part, that slow embrace can be chalked up to health care’s longstanding tradition of assigning error to individuals, not systems.
The curriculum has been well-received. Not surprisingly, Dr. Gosbee notes, many of the physicians who sign up for the workshops in the interest of imparting its principles to residents are hospitalists. “They’re the ones who spend a lot of their time fixing broken systems,” he says, “who see soup to nuts what’s going on in their institutions.”
Dr. Gosbee makes no bones about the challenges organizations face when they try to translate the curriculum’s principles into actual practice. For one, the course is designed to be taught by medical staff using a train-the-trainer model. New instructors must be committed to a culture of safety and be willing to vocally abandon the blame-and-shame model. They must be willing to stick up for anyone who is willing to report a near-miss, even in proceedings with their superiors if necessary.
If an instructor still has one foot stuck in the culture of blame, the course “will lack true meaning,” Dr. Gosbee says. As a result, one module is devoted entirely to developing trust. The goal is to create an environment in which residents feel comfortable talking about their close calls, knowing that they won’t be penalized.
“That’s the sort of trust we want the faculty who are teaching this to understand,” Dr. Gosbee says. “That this is real, and it isn’t like learning about five new rashes or five new medications they [residents] may never prescribe.”
Kaiser Permanente: open communication
At Kaiser Permanente in Santa Clara, Calif., an ICU-focused patient-safety pilot begun in 2001 has helped change both the overall culture and specific patient care processes.
The program relies on human factors training, provides communication tools to help prevent or mitigate errors, and has created patient-safety teams. It has already significantly increased near-miss reporting and improved working relationships among ICU staff.
The human factors training entails a four-hour course in which participants learn error theory and specific techniques to mitigate errors. The course is preceded and followed by a safety attitudes questionnaire adapted from the aviation industry.
Three ICU patient-safety teams are composed of a physician representative, a department manager and between three and six frontline staff members. The teams meet monthly in open-forum sessions that are facilitated by the medical center’s risk manager and patient safety leader.
The approach has produced results. Three years ago, fewer than 10 near misses were reported each quarter. Today, that number is closer to 200.
Dr. Nagamine notes that reporting has increased in part because of the Med-Match system Kaiser developed. It allows staff to anonymously report medication errors by completing a simple one-page form and placing it in a designated box in the ICU.
The pilot also helped contribute to a new insulin protocol, as well as improved allergy screening and follow-up of patients with abnormal test results. The intensive insulin protocol standardized both glucose measurement times and dosing to reduce the variability that often occurs with insulin administration.
“Staff had complained that some physicians were checking fingersticks every hour and others every two hours, which was too long for some patients,” explains Dr. Nagamine, who developed the pilot. “The insulin increments were quite variable, too, and that posed a risk.”
While these initiatives have all helped improve Kaiser’s safety efforts, the most important result may be the open, ongoing communication about safety issues. Based on anecdotal reports, she says, the organization’s openness has saved lives in Kaiser’s ICU and improved working relationships among staff.
“What the managers say is that the whole dialogue and tone of the staff meeting has changed, from ‘We’re working too hard and not getting paid enough’ to really articulating what the problems are,” says Dr. Nagamine. “They’re now taking that next step and venturing into the solution.”
Based on the success in the ICU, the pilot has been expanded into the OR. Dr. Nagamine says it is just one example of how organization-wide interest in safety training has grown substantially.
Dr. Nagamine now teaches a monthly class on human factors at the medical center, which typically draws up to 30 participants from a wide range of departments. In addition, the nursing education department has incorporated human factors modules into its regular curriculum.
“The human factors training has really been embraced by the nursing educators,” Dr. Nagamine says. “They are saying that these skills are as important or more important than some of the technical skills we teach. It’s being recognized as a positive in the workforce environment.”
Thomas Jefferson University: near-miss conference
Rachel Sorokin, MD, associate medical director for clinical effectiveness at Thomas Jefferson University Hospital in Philadelphia, is mildly amused that the 1999 Institute of Medicine report spawned new lingo and a bona fide “movement” for the work that health care quality professionals have been doing behind the scenes for decades. But she also is gratified by the momentum that is building when it comes to examining systems and processes of care, and in understanding their contribution to errors.
“The patient safety movement has brought a deeper understanding of the organizational structure that underlies having a good outcome for a patient, which has changed so much in the last 20 years,” Dr. Sorokin says. “The complexity of care is so much higher now that we need to think more about organizational issues.”
Those organizational issues and challenges often emerge as focal points during the Philadelphia hospital’s monthly “near miss” conferences. The gathering of internal medicine residents, faculty and other interested parties encourages open, “blame-free” discussion of near misses, defined as events or errors that didn’t harm patients but that might have led to injury.
Begun in 2000, the conference is led by the chief medicine resident and the treating residents involved in the near-miss event and typically focuses on two near-miss cases. The goal is to uncover the factors that contributed to each near miss and to identify how they might be addressed to prevent a recurrence. Cases range from missed diagnoses and treatment delays to medication errors and incorrectly performed procedures.
One recent case, for example, involved a patient with liver disease who was scheduled to receive IV calcium infusion based on a verbal order. Instead, the patient’s roommate received the infusion.
Neither patient suffered an adverse outcome from the error, which was traced to a not-so-unusual communication mishap. Participants at the conference, however, suggested that the problem might have been avoided with verbal read-back and spelling of the patient’s name.
Although faculty are present and sometimes offer their observations or answer questions, the conference tends to be dominated by the residents’ discussion. That, says Dr. Sorokin, is the key to the program’s success.
The candid format of the conference has produced both expected and unexpected benefits. In addition to shifting hospital culture away from the blame-and-shame model—an objective from the start—the conference has also contributed to safety-improving processes and policy changes.
Dr. Sorokin says that although it’s difficult to translate the conference into “distinct operational changes” that have occurred in the hospital, she is certain that it has helped reduce the “silo” mentality among various clinical departments. The conference also has provided a valuable setting for identifying educational deficits and reinforcing policies, she says.
One of the most promising results of the conference, in Dr. Sorokin’s view, is that it has evolved to include a growing number of physicians. If a case involves a patient who became hypoglycemic because of a near miss involving insulin dosing, for example, an endocrinologist is called in to provide an educational perspective. In a similar manner, surgeons are often asked to attend when cases involve post-surgical patients. “Whoever I call has always said, ‘I will come,’” Dr. Sorokin says.
Last year, the near-miss forum was expanded in certain months to become a joint conference with emergency medicine that focused on cases involving both emergency and inpatient care.
“We chose cases that had plenty of clinical issues, and where the opportunities to improve care were on all sides, not one side,” Dr. Sorokin recalls. “That reduced the finger-pointing. Putting the medical attendings and both sets of residents in the same room helps each group better understand the other group’s issues.”
Bonnie Darves is a freelance writer specializing in health care. She is based in Lake Oswego, Ore.