Doctors Don't Know It All

The dangers of cognitive biases in medical practice.

Posted Jun 10, 2020

Brielle A. Marino

He gestured toward the window with a slight tic of his head. His hands were still held in a relaxed clasp behind his back. I followed him.

“You see the sun there?” He pointed. The tip of his finger turned white from being pressed so firmly against the glass.

“Yes, I do,” I replied.

“I can move it with my eyes,” he stated. “Watch.” His voice was soft and serious. He stared intently at the sun in the distance. After a moment passed in silence, he turned to me with child-like glee and sheer pride stretched across his face. “Did you see it?”

“I did not,” I replied honestly, “but I believe that you saw it.”

He returned his gaze to the window with the same unwavering smile, seemingly unfazed by my contradiction.

The man in this story was diagnosed with schizophrenia. His predominant symptoms were grandiose delusions: specifically, he believed he had magical powers that enabled him to move the sun with his eyes. He spent many years on various psychiatric wards before doctors discovered the truth: He had a pituitary tumor, and the tumor was responsible for his psychiatric symptoms. By the time he received an accurate diagnosis, it was nearly too late.

How could this misdiagnosis have happened?

As modern neuroscience suggests, what we see around us is not a direct readout of reality. Our brains reconstruct our physical world, largely through intricate, Bayesian-like processes that combine sensory input with probabilistic expectations. Essentially, our brains put forth their best guess to interpret and make sense of the information around us, based on what is most likely.
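This Bayesian framing can be sketched with a toy calculation. The diagnoses and numbers below are entirely hypothetical, chosen only to illustrate the mechanism: a posterior belief combines a prior expectation with the likelihood of the observed evidence, so a sufficiently strong prior can dominate the conclusion regardless of the evidence.

```python
# Toy sketch of Bayesian-like inference (all numbers hypothetical):
# the posterior belief in each hypothesis combines a prior expectation
# with the likelihood of the observed evidence under that hypothesis.

def posterior(priors, likelihoods):
    """Apply Bayes' rule and return normalized posterior probabilities."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# A strong prior expectation of a psychiatric cause...
priors = {"psychosis": 0.95, "tumor": 0.05}

# ...paired with evidence that is equally likely under either cause:
equal = posterior(priors, {"psychosis": 0.5, "tumor": 0.5})
print(equal)  # the posterior simply mirrors the prior: 0.95 vs. 0.05

# Even when the evidence modestly favors the tumor,
# the anchored prior still dominates the conclusion:
skewed = posterior(priors, {"psychosis": 0.3, "tumor": 0.7})
print(skewed)  # roughly 0.89 vs. 0.11, still favoring psychosis
```

In other words, when the prior is lopsided enough, updating on ambiguous or even mildly contradictory evidence barely moves the conclusion, which is one way to formalize the anchoring effect discussed below.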

But what happens when we guess wrong?

From a clinical perspective, these “wrong guesses” can be illustrated in psychosis, wherein the brain interprets ambiguous stimuli as noxious, or generates a stimulus from expectation alone, such as hearing a voice or one’s name being called when no such sound was actually produced. “Wrong guesses” are not limited to the clinical world, however. In fact, we are all susceptible to them in the form of cognitive biases.

Cognitive biases are systematic errors in thinking that arise from the brain’s attempt to simplify the decision-making process. None of us is immune to these thinking errors, and in fact many psychologists argue that these errors possess an adaptive quality for facilitating swift judgments. This is particularly beneficial considering memory and attention are finite resources. Therefore, cognitive biases allow for mental shortcuts thought to assist problem-solving. These biases, however, are particularly salient, and potentially most detrimental, in medical practice.

Doctors use heuristics in clinical decision-making—specifically, diagnostic decision-making. Considering every possibility when making a decision would take an exorbitant amount of time and would thus be inefficient; the same is true of considering every diagnostic possibility. Physicians are therefore more likely to identify a small subset of diagnostic possibilities when assessing patients. In doing so, they are particularly susceptible to a variety of cognitive biases—namely, confirmation bias, availability bias, and anchoring bias.

Confirmation bias occurs when we overvalue and attend to information consistent with our theory or belief and ignore information that is incongruent with it. Physicians are thus more likely to defend their initial diagnoses—and more likely to be blind to aspects of the clinical presentation that are inconsistent with their hypotheses—when evaluating patients. This issue is further complicated by the implicit hierarchy embedded within Western culture, which views the physician as “all-knowing” and “expert.” This viewpoint not only discourages physicians from reconsidering their own potential biases, but also makes it unlikely that a patient or caregiver will contradict them.

The narrowed pool of available diagnostic possibilities is often dictated by the physician’s clinical experience and recent memory: Herein lies the availability bias. If the clinical presentation is similar to those presented in recent memory, the physician is more likely to suggest a diagnosis consistent with these recent examples. This bias in particular, however, is apt to elicit misdiagnoses in rare or complex medical cases, as the physician is less likely to have been exposed to prior examples of them.

Similarly, anchoring bias occurs when we rely too heavily on one initial piece of information. While it may seem like common sense to generate hypotheses from a primary symptom, doing so prematurely excludes a full array of diagnostic possibilities. In the case illustrated above, the man’s behavior appeared so consistent with psychosis that other medical diagnoses were excluded almost at the outset.

Medicine is not without subjectivity. Qualitative assessments of patients are subconsciously guided by the physician’s pre-determined hypothesis. Added variables of race, class, and gender further exacerbate this issue, as individuals from minority groups are disproportionately subject to cognitive biases that hinder their medical care.

In unresolved, complex, and rare medical cases, physicians need to be prepared to reconsider their initial clinical impressions and potential biases—something that may be easier said than done in a society that perpetuates the myth of physician omniscience.

While peer review is, at face value, an appropriate way to mitigate the effects of these biases, herd mentality often undermines it. This is often seen in group medical practices, wherein constant collaboration and case conferences among allied practitioners are thought to facilitate improved medical outcomes. More often than not, however, these conferences elicit groupthink, which further prevents physicians from considering alternative diagnostic possibilities.

While cognitive biases are a normal, and occasionally beneficial, aspect of the human cognitive experience, they can be disastrous in high-stakes medical decision-making, particularly in chronic and complex medical cases.

Expectation guides perception. When relying on mental shortcuts and “best guesses,” we foster efficiency at the cost of efficacy. To improve medical outcomes, physicians must be willing to reflect on their potential biases, consult with outside sources, and acknowledge their fallibility.

References

Saposnik, G., Redelmeier, D., Ruff, C. C., & Tobler, P. N. (2016). Cognitive biases associated with medical decisions: A systematic review. BMC Medical Informatics and Decision Making, 16(1), 138.

White, A. A., & Stubblefield-Tave, B. (2017). Some advice for physicians and other clinicians treating minorities, women, and other patients at risk of receiving health care disparities. Journal of Racial and Ethnic Health Disparities, 4(3), 472–479.