
You think the diagnosis is correct, but are you right about being right?

June 2005

Published in the June 2005 issue of Today’s Hospitalist.

When you make a diagnosis, how well can you judge whether you’re correct? If you’re like the physicians who took part in a recent study, your perception will be wrong about one-third of the time.

Researchers found that physicians who were given challenging clinical scenarios could accurately sense if their diagnoses were correct about two-thirds of the time. The other third of the time, physicians were either wrong when they thought they were right, or they were right when they thought they were wrong.

Knowing whether a diagnosis is or is not correct is important for a number of reasons. For one, if you think you’re right but you’re actually wrong, you won’t know to seek help, either from a colleague or a textbook or electronic reference.

How well physicians can assess their decision-making is important for another reason: With the current push for information systems to make health care safer and more effective, it's critical to know when physicians want, or know they need, more information than they currently possess.

Aligning confidence and correctness

In the study, which appeared in the April 2005 Journal of General Internal Medicine, researchers gave lengthy synopses of medical cases to three groups: senior medical students, third-year medical residents and faculty internists who had been out of residency for at least two years.

Charles P. Friedman, PhD, the study's lead author and a professor at the University of Pittsburgh, says that the cases were quite difficult. The case descriptions often left out information (biopsies or the results of diagnostic imaging tests, for example) that would lead directly to the correct diagnosis.

After physicians had made their diagnosis, researchers asked them to rate on a scale of 1-4 whether they would ask for further help with the case. This measure was used as a proxy to indicate how confident physicians were in their diagnosis, or their perceived likelihood that their diagnosis was correct.

How did the physicians do? Dr. Friedman says that as a reflection of the difficulty of these cases, the subjects in all three groups identified the right diagnosis only 40 percent of the time.

Not surprisingly, he adds, correctness improved with clinical experience. While residents correctly diagnosed 44 percent of cases, faculty correctly diagnosed 50 percent.

What was more central to this study, however, was not just how often subjects reached the correct diagnosis, but how often they believed they needed further help, and whether these beliefs were valid. To assess that, researchers measured what they call the "alignment" between subjects' confidence in their diagnosis and the correctness of the diagnosis.

If physicians thought a diagnosis was correct and it was in fact right, their confidence and their correctness were aligned. If they thought they were wrong and they were wrong, they were similarly “aligned.”

If, however, physicians thought they were right when they were actually wrong, or thought they were wrong when they were actually right, those cases were defined as "unaligned."
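To make the alignment idea concrete, here is a minimal sketch, in Python, of how aligned, overconfident, and underconfident cases could be tallied. The case list and the binary confident/not-confident judgment are illustrative assumptions, not the study's actual 1-4 rating data.

```python
# Illustrative tally of "alignment" between confidence and correctness.
# The cases below are made up; the study used a 1-4 help-seeking scale,
# not a simple yes/no confidence judgment.

cases = [
    # (confident_in_diagnosis, diagnosis_was_correct)
    (True,  True),    # aligned: thought right, was right
    (False, False),   # aligned: thought wrong, was wrong
    (True,  False),   # unaligned: overconfident
    (False, True),    # unaligned: underconfident
]

aligned = sum(1 for confident, correct in cases if confident == correct)
overconfident = sum(1 for confident, correct in cases if confident and not correct)
underconfident = sum(1 for confident, correct in cases if not confident and correct)

print(f"aligned: {aligned} of {len(cases)} cases")
print(f"overconfident: {overconfident}, underconfident: {underconfident}")
```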

Overconfidence vs. underconfidence

Overall, physicians’ confidence and correctness were aligned 68 percent of the time, a result the study describes as “moderately” aligned.

While more experienced physicians tended to score better than residents when it came to making a correct diagnosis, they fared about the same in knowing whether their diagnoses were right or wrong. Faculty internists correctly assessed whether they were right or wrong in 64 percent of cases, while residents were similarly "aligned" in 63 percent of cases.

(Medical students were aligned in 78 percent of cases, a number that is significantly higher than residents or faculty, but Dr. Friedman says there is a simple explanation: The cases were so hard that the students were out of their league, and they knew it. As a result, the medical students correctly predicted that a large number of their diagnoses were wrong.)

Interestingly, researchers found that physicians were more often "underconfident" (they thought their diagnosis was wrong when in fact it was right) than "overconfident" (they thought it was right when in fact it was wrong).

Residents, for example, were “unaligned” in 37 percent of cases. Of those cases, they were overconfident 41 percent of the time. In the other 59 percent of cases, they were underconfident.

The numbers were not much different for faculty internists, who were unaligned in 36 percent of cases. In just over 35 percent of these cases, they were overconfident; and in just over 64 percent, they were underconfident.
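To put those conditional figures in terms of all cases, a rough back-of-the-envelope calculation (using the rounded percentages above, not numbers reported by the study itself) multiplies each group's unaligned rate by its over- and underconfidence shares: roughly 15 percent of all resident cases were overconfident versus about 22 percent underconfident, and about 13 percent versus 23 percent for faculty.

```python
# Rough conversion of the article's conditional percentages into shares of
# ALL cases. The 41%/59% and 35%/65% splits are rounded from the text.

groups = {
    "residents": {"unaligned_rate": 0.37, "overconfident_given_unaligned": 0.41},
    "faculty":   {"unaligned_rate": 0.36, "overconfident_given_unaligned": 0.35},
}

for name, g in groups.items():
    over = g["unaligned_rate"] * g["overconfident_given_unaligned"]
    under = g["unaligned_rate"] * (1 - g["overconfident_given_unaligned"])
    print(f"{name}: ~{over:.0%} of all cases overconfident, ~{under:.0%} underconfident")
```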

Impact on patient care

What do the results mean? According to Dr. Friedman, who is currently on leave from the University of Pittsburgh as a senior scholar at the National Library of Medicine, the bottom line is simple: “We can’t count on physicians to know when they’re right and when they’re wrong.”

While “overconfidence” poses obvious problems to patient care, Dr. Friedman says the “underconfident” scenario can also be problematic.

“When physicians ask for help because they think they’re wrong when in fact they’re right,” he explains, “this may actually do harm. The additional information or advice they seek and receive may, for many reasons, undo an assessment that is actually correct.”

The bigger concern, however, is the group of physicians who believe they are correct, and that they have all the knowledge they need to reach an appropriate decision, when in fact their assessment of a case is incorrect.

As the article notes, physicians who incorrectly think they’re right may not look for additional information to support or change their position. Even more importantly, they may be unreceptive to suggestions or evidence that counter their belief.

Dr. Friedman says his research is of particular interest to informaticians and software engineers interested in building decision-support systems. These developers often ask whether they should automatically give physicians information, or whether their programs should "wait" for physicians to ask for help.

“That’s a big issue,” he explains, “because you don’t want to be giving people advice all the time. You want to tune software so people only get advice when they really need it. That raises a question: Do physicians really know when they need help?”

When physicians need help

If the answer is yes, Dr. Friedman says, decision-support systems probably don’t need to proactively give help to physicians. He talks instead about what he calls an “infobutton” approach in which physicians would call for help only when they thought they needed it. (The concept, which was first introduced by James J. Cimino, MD, from Columbia University, is similar to pushing the OnStar button in a car to ask for directions.)

But Dr. Friedman quickly adds that his study seems to indicate that waiting for physicians to realize they need help and then ask for assistance is probably a “suboptimal” approach. “You probably shouldn’t always assume that physicians know when they need help,” he says.
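As a rough illustration of the push-versus-pull trade-off described in the preceding paragraphs, the sketch below contrasts an "infobutton"-style pull model with a proactive push model. The function names, the error-risk score, and the threshold are hypothetical; they are not drawn from the study or from any real decision-support product.

```python
# Hypothetical contrast of the two decision-support strategies discussed above.
# Nothing here reflects a real clinical system; names, scores, and the
# threshold are illustrative only.

def infobutton_strategy(clinician_requested_help: bool) -> bool:
    """'Pull' model: show reference material only when the clinician asks."""
    return clinician_requested_help

def proactive_strategy(estimated_error_risk: float, threshold: float = 0.3) -> bool:
    """'Push' model: offer advice whenever the system's own estimate of
    diagnostic-error risk crosses a threshold, whether or not help was requested."""
    return estimated_error_risk >= threshold

# An overconfident physician would not press the button, so the pull model
# shows nothing; a push model keyed to case risk could still surface advice.
print(infobutton_strategy(clinician_requested_help=False))  # False: no help shown
print(proactive_strategy(estimated_error_risk=0.45))        # True: help offered anyway
```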

“We’re embarking in informatics as a field on a very interesting quest for the right balance point between overwhelming people with advice, much of which they don’t need, and waiting for them to ask and doing nothing unless they ask for it,” he adds. “I think what this paper contributes is a suggestion that always waiting for the physician to ask for help is not where we should be, even though it may be the most genteel, professional thing to do.”

And while the question of when to give physicians help in medical decision-making has always perplexed informaticians and software engineers, Dr. Friedman says it is more pertinent now than ever.

“It’s a new way of thinking about practice,” Dr. Friedman continues. “We’re bathed in this world of information, and if only our mental process was such that we always knew when we were wrong, then we always could make ourselves right by finding the information we need.”