
Translating evidence into local practice

An evidence-based center tailors literature reviews to local needs

Published in the February 2016 issue of Today’s Hospitalist

PLENTY OF GROUPS—from the international Cochrane Collaboration to quality improvement committees in local hospitals—are trying to figure out how to translate evidence into practice and close the knowing-to-doing gap. In 2006, the University of Pennsylvania Health System in Philadelphia decided to go big in that effort, launching its Center for Evidence-based Practice (CEP). Co-founded by hospitalists trained in epidemiology, the center has supplied nearly 300 systematic reviews of evidence in response to requests from Penn’s clinical and administrative leaders.

In a study posted online in October by the Journal of Hospital Medicine (JHM), the center analyzed the reports produced during its first eight years. According to the article, 24% of reports focused on medications and 19% involved devices. In addition, 12% focused on care processes, while another 12% informed computerized decision-support interventions.

“If just a few of your projects have a sizeable impact, you have paid for your whole group.” 

— Craig A. Umscheid, MD, MSCE, University of Pennsylvania Health System

In the study, the center also gave itself a report card, surveying the requestors of reports from the last four years. Most said the reports “answered the question” they’d asked in a timely manner and “informed their final decision,” even though only 7% said they had “changed” their minds based on report findings. Among the top reasons given for being satisfied with the CEP process were the staff’s “objectivity” and “evidence-synthesis skills.”

Moreover, the study found that the center’s focus has evolved over time, with a relative increase in the number of report requests coming from clinical departments. Hospitalist Craig A. Umscheid, MD, MSCE, who directs the center, talked to Today’s Hospitalist about the center’s work and why hospitals and health systems might consider supporting such a center at their own institutions.

How does a local center avoid duplicating national and international efforts to write guidelines?

A benefit of local centers is that they can incorporate local data into a systematic review. Here’s a good example: We were asked to review the evidence on the use of telemedicine to decrease ICU length of stay and mortality. Our health system was interested in expanding the telemedicine program we already have on select units.

We reviewed the literature and found a national systematic review, but all the studies in it focused on community hospitals. Those findings were not generalizable to us because community hospitals often don’t have residents, fellows and other physicians in critical care units at night, whereas ours do. In our report, we took the national findings and put them in the context of our health system with our own mortality and length-of-stay data for our ICUs with telemedicine and those without.

So local centers can adapt national evidence to their specific needs. Another benefit of doing this work locally is speed. Our research found that CEP reviews required on average two months to complete, while standard systematic reviews can take between 12 and 24 months. That speed comes from our ability to narrowly focus our reviews on the information most relevant to our requestors’ decisions.

Who can request a CEP report?
Accessibility is really important, and we have a low bar for people to reach out to us. People are often surprised, but we don’t have a formal route to submit a request. People e-mail me, and I then clarify their question and ask them when they need a report to make a decision.

Then we make sure that what we do is accessible to all who might use it. We just finished a report examining the literature on addressing disruptive or violent patients in the emergency department. We sent the final report to the requestor, then to all relevant leaders in our health system’s emergency departments.

We got an e-mail a couple of days later from one of our EDs saying it had just started to work on this issue—and didn’t know the requesting ED was working on it as well. Those two individuals have now met to work together. We didn’t set out to serve this bridging function, but it is an important part of what we do.

In addition, we post the reports to our intranet site, and most are included in a resource called the Health Technology Assessment Database. That is part of the Cochrane Library and accessible around the globe.

How do people making requests use your reports?
In the paper, we note that 12% of reports have informed computerized decision-support interventions. For almost every decision support we have deployed—whether an order set or an alert—we study how people use it.

That’s another benefit of having a local center. A national center can synthesize generalizable evidence and put it out there. A local center can grab that evidence from the cloud and adapt it to the local setting, then work with local people to implement the evidence and measure its impact.

So the center is involved in implementing projects as well?
The JHM study focused on our rapid evidence reviews, but that isn’t the only thing we do. We spend a lot of time integrating evidence into practice using electronic health records and clinical pathways. We were asked, for example, to create a prediction rule in the electronic health record for patients at high risk of readmission. We first looked to the literature to determine the risk factors for readmission, then created the prediction rule and implemented it in the EHR.

We also educate staff and students on how to do evidence-based quality improvement. We teach people the PICO framework (Population; Intervention; Comparison; Outcome) to define a question, then how to search the literature, adapt that material to their local setting and measure the impact of implemented solutions.

Is this something only large academic health systems can do?
Formal hospital evidence-based practice centers remain uncommon in the U.S., but you don’t have to do it at the scale we do. Our center started with one FTE split between two people. We now have six FTEs, including three full-time research analysts and a number of physician liaisons across the health system who come to our weekly meetings, bring issues to us and disseminate the work we do. We have also purchased part of the time of a faculty biostatistician and a health economist, and we work closely with our librarians, informatics staff and quality analysts.

If just a few of the projects you do have a sizeable impact, you have paid for your whole group, particularly given the current environment with value-based purchasing and pay for performance. If you are able to improve hospital-acquired infection rates or inform an intervention to reduce readmissions or address preventable mortality, these benefits can add up quickly.

Deborah Gesensway is a freelance writer who covers U.S. health care from Toronto.
