
The Biggest Mistake Patients Make

Why the thought process matters more than the answer it produces

The biggest mistake patients make isn't what you think. It isn't turning down tests or treatments their doctors recommend. Nor is it deciding not to take the medicines their doctors prescribe. It isn't insisting on getting a test or beginning a treatment their doctors recommend against, either, and it isn't failing to exercise, stay out of the sun (or use sunscreen), quit smoking, or lose weight. No, the biggest mistake patients make is thinking anecdotally rather than statistically.

We all tend to arrive at our beliefs about how frequently things occur not from statistical analysis but from the ease with which we can recall examples of their happening, a tendency Daniel Kahneman calls the availability heuristic in his book Thinking, Fast and Slow. So if we've recently heard a story of an airplane crash in the news, we'll believe the likelihood that the airplane in which we're flying might crash to be greater than it actually is. Or if a friend tells us about a complication he suffered following surgery, we'll believe the likelihood of that complication happening after our surgery to be greater than statistics suggest.

We all tend to believe stories more than facts. And when faced with the need to make a decision—to start a medication, to have surgery—far more often than not (and mostly without consciously realizing it) we rely on our emotional beliefs about the risks and benefits. And our emotional beliefs come mostly from our experience and the stories we tell ourselves about it. "My wife's sister's boyfriend took that pill and had a terrible reaction. There's no way I'm going to take it!" one patient tells me. "Dr. X operated on a friend of mine and he's been in pain ever since. No way I'm letting that guy touch me!" says another. "I've seen that drug advertised on television. What do you think about me trying that one instead?" a third asks.

We believe thinking this way leads us to make wise decisions, but it doesn't. We hear about a friend or relative suffering a known complication of a surgery (one that more than one doctor has told us we need ourselves) and decide as a result of hearing that story that we don't want the surgery—even though the statistical likelihood of such a complication happening to us is less than one percent and has, in fact, never happened to any of our own surgeon's patients. Or we read about the side effects of a drug our doctor recommends and decide we don't want to take it, even though studies show that the risk of those side effects is far lower than the likelihood that the drug will treat our symptoms or even prolong our life.

Sometimes our intuition actually serves us well. Sometimes the recommendations doctors make are based on nothing more than their clinical judgment and a presumption that they know better than their patients what their patients should do. And while the former is unavoidable (much of what we do in medicine requires judgment because studies that provide clear-cut answers haven't been done), the latter represents a mistake that often leads doctors to have greater faith in the value of their recommendations than is warranted. But even when we doubt that our doctor knows what's best for us, we shouldn't automatically dismiss his advice if it runs counter to our inclinations, for doctors have a crucial advantage over the patients for whom they care: the ability to think dispassionately about the choices their patients must face.


I'm not advocating that you surrender your judgment to your doctor. I'm saying that when deciding upon the best course of action to take, you need to critique your own thought process mercilessly. Most of us make our decisions emotionally. And while bringing emotion into decision making isn't wrong per se (how do we place value on something, after all, if not with our hearts?), our feelings can easily mislead us if they're not based on sound reasoning. And allowing our fears to be swayed by anecdotes rather than statistics is about as far from sound reasoning as you can get.

To think statistically is to calculate the true likelihood that something bad—or good—will happen to us. And in far more cases than most would believe, we have information that allows us to do so accurately—and not only in the medical arena. We know, for example, that the likelihood of any one of us becoming the victim of a terrorist attack is about one in twenty million. But think: is your fear of terrorism proportional to that statistic, or to the fact that you hear about terrorism almost weekly on the news?

Another example: you should be far more afraid of driving a car than flying in an airplane. Not only are car accidents statistically more likely than airplane crashes, but most of us drive far more often than we fly, exposing ourselves to the risk of a car accident far more often than to the risk of an airplane crash. But how often do you worry about getting into an accident when you get in your car? I'm not suggesting you need to; in fact, I'm arguing the opposite: because of our exposure to anecdotes, we often worry far more than we should about things whose statistical likelihood is actually small (and, conversely, not enough about things whose statistical likelihood is actually large). So the next time you contemplate refusing a recommended test or treatment, notice the source of your anxiety: is it a calculated statistic that yields the true likelihood of harm, or a story that stirs you up far out of proportion to the real risk?
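For readers who want to see the exposure arithmetic made concrete, here is a minimal sketch of the calculation behind the driving-versus-flying comparison. The probability of at least one bad outcome over n independent exposures to a per-event risk p is 1 − (1 − p)^n, which is roughly n × p when p is small. The per-trip risk figures below are hypothetical placeholders chosen only to illustrate the logic; they are not real accident statistics.

```python
# A rough sketch of "thinking statistically": how per-event risk
# compounds with how often we're exposed to it. The per-trip figures
# are hypothetical placeholders, not real actuarial data.

def cumulative_risk(per_event_risk: float, n_events: int) -> float:
    """Probability of at least one bad outcome over n independent events."""
    return 1 - (1 - per_event_risk) ** n_events

# Hypothetical per-trip risks (illustrative assumptions only):
p_drive = 1e-7   # assumed risk of a serious accident per car trip
p_fly = 1e-8     # assumed risk of a crash per airline flight

# Exposure matters: suppose you drive twice a day but fly twice a year.
drives_per_year = 2 * 365
flights_per_year = 2

print(f"Driving: {cumulative_risk(p_drive, drives_per_year):.2e} per year")
print(f"Flying:  {cumulative_risk(p_fly, flights_per_year):.2e} per year")

# Even if the per-event risks were identical, 730 exposures versus 2
# would dominate: for small p, cumulative risk is approximately n * p.
```

The point of the sketch is not the particular numbers but the structure of the reasoning: a rare event encountered hundreds of times a year can easily pose more total risk than a slightly scarier event encountered twice, which is exactly the comparison our anecdote-driven intuition gets backwards.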

Dr. Lickerman's new book The Undefeated Mind: On the Science of Constructing an Indestructible Self is available now. Please read the sample chapter and visit Amazon or Barnes & Noble to order your copy today!
