
The Two Kinds of Belief

Why infants reason better than adults


"I don't want to give them up," my patient told me.

"Why not?" I asked.

"I've been reading some articles on the Internet that say they might cure me."

Tragically, he was referring to vitamin supplements, which he'd somehow come to believe would cure him of Stage 4 metastatic colon cancer. I'd suggested he stop taking them because they seemed, by his own report, to be making him nauseated.

In the kindest tone I could muster, I told him I knew of no reliable studies that showed vitamins have any effect on colon cancer. And though I certainly didn't object to people taking them in general, if they were indeed the cause of his nausea, they were compromising the quality of however much life he had left while almost certainly providing him no real benefit. I urged him to experiment by stopping them to see if his nausea receded. If it didn't after two weeks or so, I argued, he could always start them back up again. In fact, I said, even if his nausea did cease once he stopped them, he could start them back up again to see if it came back in order to produce more definitive evidence that the vitamins were in fact his nausea's cause.

He paused, and after a few moments reluctantly agreed.

WHAT IS BELIEF?

Put simply, a belief is an idea or principle that we judge to be true. When we stop to think about it, this is functionally no small thing: lives are routinely sacrificed and saved based simply on what people believe. Yet I routinely encounter people who believe things that are not just unproven but have been definitively shown to be false. In fact, I so commonly hear people profess complete certainty in the truth of ideas with insufficient evidence to support them that the alarm it used to trigger in me no longer goes off. I'll challenge a false belief when, in my judgment, it poses a risk to a patient's life or limb, but I let far more unjustified beliefs pass me by than I stop to confront. If I didn't, I wouldn't have time to talk about anything else.

What exactly is going on here? Why are we all (myself included) so apparently predisposed to believe false propositions?

The answer lies in neuropsychology's growing recognition of just how irrational our rational thinking can be, according to an article in Mother Jones by Chris Mooney. We now know that our intellectual value judgments—that is, the degree to which we believe or disbelieve an idea—are powerfully influenced by our brains' proclivity for attachment. Our brains are attachment machines, attaching not just to people and places, but to ideas. And not just in a coldly rational manner. Our brains become intimately emotionally entangled with ideas we come to believe are true (however we came to that conclusion) and emotionally allergic to ideas we believe to be false. This emotional dimension to our rational judgment explains a gamut of measurable biases that show just how unlike computers our minds are:

  1. Confirmation bias, which causes us to pay more attention to, and assign greater credence to, ideas that support our current beliefs. That is, we cherry-pick the evidence that supports a contention we already believe and ignore evidence that argues against it.
  2. Disconfirmation bias, which causes us to expend disproportionate energy trying to disprove ideas that contradict our current beliefs.

Accuracy of belief isn't our only cognitive goal. Our other goal is to validate our pre-existing beliefs, beliefs that we've been building block by block into a cohesive whole our entire lives. In the fight to accomplish the latter, confirmation bias and disconfirmation bias represent two of the most powerful weapons at our disposal, but simultaneously compromise our ability to judge ideas on their merits and the evidence for or against them.

EVIDENCE VS. EMOTION

Which isn't to say we can't become aware of our cognitive biases and guard against them—just that it's hard work. But if we really do want to believe only what's actually true, it's necessary work. In fact, I would argue that if we want to minimize the impact of confirmation and disconfirmation bias, we need to reason more like infants than adults.

Though many people think belief can occur only in self-aware species possessing higher intelligence, I would argue that infants and animals believe things too, the only difference being that they're not aware they believe them. That is, they do indeed judge certain ideas "true," if not with self-aware minds, then with minds that act on the truth of those ideas nonetheless. Around 8 to 12 months of age, infants learn that objects don't cease to exist when placed behind a curtain, a belief called object permanence, which scientists infer from the surprise infants of this age exhibit when the curtain is lifted and the object has been removed. Animals will run from predators because they know (that is, believe) they will be eaten if they don't. In this sense, even protozoa can be said to believe things (e.g., they will move toward energy sources rather than away because they know, or "believe," that engulfing such sources will continue their existence).

Infants and animals, however, are free of the emotional biases that color the reasoning of adults because they haven't yet developed (or won't, in the case of animals) the meta-cognitive abilities of adults, i.e., the ability to look back on their conclusions and form opinions about them. Infants and animals are therefore forced into drawing conclusions I consider compulsory beliefs—"compulsory" because such beliefs are based on principles of reason and evidence that neither infants nor animals are actually free to disbelieve.

This leads to the rather ironic conclusion that infants and animals are actually better at reasoning from evidence than adults. Not that adults fail to form compulsory beliefs when incontrovertible evidence presents itself (e.g., if a rock is dropped, it will fall), but adults are so mired in their own meta-cognitions that few facts absorbed by their minds escape being attached to a legion of biases, often creating what I consider rationalized beliefs: "rationalized" because an adult's judgment about whether an idea is true is so often powerfully influenced by what he or she wants to be true. This is why, for example, creationists continue to disbelieve in evolution despite overwhelming evidence in support of it, and activist actors and actresses with autistic children continue to believe that immunizations cause autism despite overwhelming evidence against it.

But if we look down on people who seem blind to evidence that we ourselves find compelling, imagining ourselves to be paragons of reason immune to the influence of our own pre-existing beliefs, more likely than not we're only deceiving ourselves about the strength of our objectivity. Certainly, some of us are better at managing our biases than others, but all of us have biases with which we must contend.

What then can be done to mitigate their impact? First, we have to be honest with ourselves in recognizing just how biased we are. If we only suspect that what we want to be true is influencing what we believe is true, we're coming late to the party. Second, we have to identify, with merciless precision, the specific biases we've accumulated. And third, we have to practice noticing how (not when) those specific biases are exerting influence over the judgments we make about new facts. If we fail to practice these three steps, we're doomed to reason, as Jonathan Haidt argues, often more like lawyers than scientists, arguing backward from our predetermined conclusions rather than forward from evidence.

Some evidence suggests we're less apt to become automatically dismissive of new ideas that contradict our current beliefs if those ideas are presented in a non-worldview-threatening manner or by someone who we perceive thinks as we do. Knowing, for example, that my patient was more predisposed to consider ideas if they came from me, his doctor (whom he trusted had his best interests at heart), I felt obligated to wield that power for the good, to challenge any ideas that had the potential to cause him more harm than good. (Though an argument could be made that I shouldn't have challenged his misguided belief in the power of vitamins to treat colon cancer, when he stopped taking them, his nausea did indeed resolve.)

Despite my reluctance to challenge beliefs that people hold more strongly than evidence justifies, the inconvenient truth is that what one of us believes has immense power to affect us all (think of the incalculable harm the smallest fraction of us have caused because they believe if they die in the act of killing infidels, they'll be surrounded by virgins in the afterlife). As a society, therefore, we have critically important reasons to reject bad (untrue) ideas and promulgate good (true) ones. When we speak out, however, we must realize that reason alone will rarely if ever be sufficient to correct misconceptions. If we truly care to promote belief in what's true, we need to first find a way to circumvent the emotional biases in ourselves that prevent us from recognizing the truth when we see it.

If you enjoyed this post, please feel free to visit Dr. Lickerman's home page, Happiness in this World.
