Confirmation May Be the Root of Most Biases
Understanding how fundamental beliefs shape the strength and impact of bias.
Posted December 2, 2024 | Reviewed by Michelle Quirk
Key points
- Biases can shift depending on the context and our experience within that context.
- Motivated reasoning intensifies when beliefs are connected to our personal identity.
- Addressing biases requires rethinking the underlying beliefs, not just suppressing biased responses.
Bias and cognitive bias have received significant attention from researchers and practitioners over the past half-century. Wikipedia’s List of Cognitive Biases provides an overwhelming array of categories and specific biases.1 At their core, biases represent nothing more than simple tendencies—patterns in how we process information, make decisions, and draw conclusions. If you look through the list of cognitive biases, you’ll see tendency appears repeatedly.2 So, biases guide the types of information we rely on and the strategies we use to decide.
Recently, Oeberst and Imhoff (2023) argued that many biases likely trace back to a combination of some fundamental belief plus belief-consistent information processing. In other words, we possess a set of fundamental beliefs (see Figure 1), and we process information in ways that are consistent with those beliefs (i.e., confirmation bias).
I originally thought about providing a breakdown of Oeberst and Imhoff’s perspective, but Smets (2024) already did a fantastic job of that. Instead, I want to dive more deeply into some of the implications of tying so many biases to a much smaller set of foundational beliefs. Three specific implications warrant further mention:
- Bias strength is dependent on the strength of fundamental beliefs.
- The strength of fundamental beliefs is likely to vary across contexts.
- Motivated reasoning is likely more prominent when fundamental beliefs are more strongly connected to our sense of self.
Bias strength is dependent on the strength of fundamental beliefs
Oeberst and Imhoff (2023) defined beliefs as “hypotheses about some aspect of the world that come along with the notion of accuracy—either because people examine beliefs’ truth status or because they already have an opinion about the accuracy of the beliefs.” In other words, a belief is something we hold to be true, whether or not it is justified. And while Figure 1 lists some fundamental beliefs, the list is simply meant to demonstrate how multiple biases become manifestations of the same fundamental belief.
The Oeberst and Imhoff perspective aligns well with the one proposed by Bach and Schenke (2017), who argued that biases serve as predictions we make about a given situation based on our prior experiences and salient situational cues. The more confident we are in those predictions or beliefs, the stronger the bias will be. For example, if you’re shopping for a new car and strongly believe foreign cars are the better choice, you’re likely to consider only information that aligns with that belief.3 But if your belief is not as strong, you’ll be more willing to give information about American cars reasonable consideration.
The key issue is not whether we hold a fundamental belief but how strongly we hold it. The stronger the belief, the stronger the bias.
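To make that relationship concrete, here is a minimal toy sketch (my own illustration, not a model from Oeberst and Imhoff’s paper). It assumes belief-consistent processing can be caricatured as a weighting scheme: evidence that supports the belief is up-weighted, and evidence that contradicts it is down-weighted, in proportion to how strongly the belief is held.

```python
def weigh_evidence(belief_strength, evidence):
    """Weigh mixed evidence through a belief of a given strength.

    belief_strength: float in [0.5, 1.0]; 0.5 means no prior leaning.
    evidence: floats in [-1, 1]; positive values support the belief,
    negative values contradict it.

    Belief-consistent items are up-weighted and inconsistent items
    down-weighted in proportion to belief strength -- a crude stand-in
    for belief-consistent information processing.
    """
    total = 0.0
    for e in evidence:
        weight = belief_strength if e > 0 else 1.0 - belief_strength
        total += weight * e
    return total

# Identical, nearly balanced evidence read through beliefs of
# increasing strength.
evidence = [0.6, -0.5, 0.4, -0.7]  # raw sum: -0.2
for strength in (0.5, 0.7, 0.9):
    print(strength, round(weigh_evidence(strength, evidence), 2))
# 0.5 -0.1   (near-neutral reading)
# 0.7 0.34   (leans confirmatory)
# 0.9 0.78   (strongly confirmatory)
```

With a neutral belief (0.5), the mixed evidence nets out near zero; at 0.9, the very same evidence reads as strongly confirmatory. That is the sense in which the stronger the belief, the stronger the bias.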
The strength of fundamental beliefs is likely to vary across contexts
In the prior example, there was a specific belief at play (about foreign cars). But that specific belief is connected with #1 and #2 in Figure 1. There must be a reason you hold the belief, and that reason likely comes back to experience (either direct or vicarious). And only if you strongly believed that assessment was correct would the bias have risen to the level of a strong bias.4
These beliefs were specific to the context of purchasing a new car. But when faced with other decisions—buying a vacuum, computer, or dishwasher—you might rely on different experiences and beliefs. The strength of your bias will shift depending on how much experience you have in each context and how strongly you hold the relevant beliefs.
And this is an important implication of the argument Oeberst and Imhoff put forth. Even if the list in Figure 1 were complete, the strength of those fundamental beliefs is likely to vary with the specific situational context. We may have strong fundamental beliefs in some contexts and much weaker ones in others, depending largely on how much experience we have in each.
Motivated reasoning is likely more prominent when fundamental beliefs are more strongly connected to our sense of self
One of Oeberst and Imhoff’s central claims is that motivated reasoning is not a necessary element of their framework. Although I was originally skeptical of this claim, after reading their argument I understood their point. As I argued previously, motivated reasoning occurs when our goals and/or values skew the decision-making process toward conclusions that align with those goals/values. This is typically framed as having a desired conclusion and then reasoning our way into it (if possible).
But this assumes that fundamental beliefs are necessarily connected to our goals/values, and that assumption would be faulty. No motive is required to believe that our experience is a reasonable reference or that we make correct assessments of the world. A mere generalization from our own experiences (e.g., I rely on my experience a lot, and I generally make reasonably good decisions) would be sufficient to produce both beliefs.
For example, your experience might tell you you’ve successfully navigated traffic patterns for years. You’ve driven successfully through multiple conditions (e.g., adverse weather, congestion) and maintained a reasonably good driving record. Over time, you’ve come to believe you’re a good driver. This belief didn’t develop because of any specific value or goal; it’s simply a generalization from your past experiences.
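As a minimal sketch of that point (my own illustration, with made-up numbers), a simple beta-binomial update shows how a driving record alone can generate strong confidence; no goal or value enters the computation anywhere.

```python
# Motive-free belief formation: a beta-binomial update over a
# hypothetical driving record. Confidence is a generalization from
# outcomes alone; no value or goal appears in the computation.

successes, incidents = 4_980, 20  # made-up trips without/with a problem

# Flat Beta(1, 1) prior updated by the observed record.
alpha = 1 + successes
beta = 1 + incidents
confidence = alpha / (alpha + beta)  # posterior mean of "I'm a good driver"

print(round(confidence, 3))  # ~0.996: a strong belief, no motive required
```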
However, the belief that you’re a good driver can still bias your decision-making. You might overestimate your ability to handle risky driving conditions or underestimate the possibility of making a mistake, even if plenty of evidence suggests you should drive more cautiously. The belief isn’t connected to a value like safety,5 but it still leads to biases in how you assess your driving skills and risks on the road.
Yet, this bias becomes much stronger if you attach personal significance to being a good driver—if it’s connected to your identity or how you want to be perceived. Here, the bias shifts from a simple generalization to motivated reasoning, making it harder to override, even in the face of conflicting evidence.
Wrapping things up
Human decision-making is often framed using a dual-process conceptualization, even though, as I previously argued, such a perspective is an overly simplistic and inaccurate view of how we make decisions. In that conceptualization, more conscious, analytical thinking is the prescribed method for overriding biased conclusions. However, one of the important points Oeberst and Imhoff made in their article is that thinking more consciously about a biased conclusion—unless it concerns a small-world problem6—is typically not going to change the conclusion we reach.
This is because the bias is not the problem. The bias is merely a logical manifestation of some fundamental underlying belief(s). So, if we want to change the biased response, we need to be willing to rethink (or at least temper) the strength of those underlying beliefs.
This is why many debiasing strategies fall short. They attempt to correct the bias without addressing the fundamental beliefs that support it. If we focus only on trying to suppress biased conclusions without questioning the underlying beliefs, we’re unlikely to see meaningful or lasting change. Moreover, when these underlying beliefs connect to our identity—whether related to political, social, or personal values—motivated reasoning can make it even harder to adjust those beliefs. In such cases, we’re not just challenging a cognitive bias, but a belief that is intertwined with our sense of self, making change much more difficult.
To overcome biased conclusions, we must move beyond analytical thinking and develop an awareness of our underlying beliefs, remaining open to re-examining them. This involves questioning the foundations of our thinking, a difficult but necessary step in reducing the kinds of bias that may lead to error.7
References
1. Some of these confound biases and heuristics, but they aren’t the same, as I argued here.
2. As of October 21, 2024, the word tendency appeared 135 times on that page, more than half as many times as the word bias, which appeared 225 times. But if you factor out its use in headings, footnotes, the “See also” section, references, and external links (i.e., count just the text proper), bias was used only 152 times.
3. You may not bother looking at information about other types of cars, or you may reject/underweight information about American cars that might actually suggest a different conclusion.
4. Which raises the question of whether #2 is a relatively universal belief underlying strong biases.
5. Though, admittedly, it’s unlikely most people set out to drive carelessly.
6. These are decisions where there is clearly a limited number of choices—as in most lab studies of human decision-making—a description that often doesn’t apply to more complex everyday decisions.
7. Remember, not all biases lead to error. When the underlying belief is true, a bias might very well be appropriate. But even if the belief is not something that can be proven, this doesn’t mean the bias will lead to error. It could still very well result in an ecologically rational choice, such as choosing to consider only foreign cars.