Source: DonkeyHotey/flickr

When Peter Wason coined the term “confirmation bias”, it didn’t mean what it means today. He was suggesting that, whenever people consider a hypothesis, no matter what that hypothesis is, they are naturally inclined to look for reasons that support it and to overlook reasons that count against it.

For example, if you were asked to consider the hypothesis “burning fossil fuels contributes to global warming”, the suggestion was that, no matter what you previously thought about the matter, you would naturally find yourself looking for evidence that burning fossil fuels contributes to global warming. And you would naturally overlook evidence against it.

And, if the hypothesis were changed to “burning fossil fuels has no effect on long-term global temperatures”, you would notice the evidence for that claim and ignore the evidence against it.

If people really did have this kind of confirmation bias, it could be really good or really bad. On the one hand, it would be nice to be able to get people to see both sides of an issue simply by changing how the hypothesis is worded. On the other hand, it would also be extremely easy to manipulate people into supporting your side of an issue if that was your intention.

For good or for ill, that’s not how things work.

Today, when researchers use the term “confirmation bias”, they are using it the way Wikipedia defines it:

“Confirmation bias . . . is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses.”

For example, if you already believe that burning fossil fuels contributes to global warming, you will tend to search for, interpret, favor, and recall information that supports the idea that this is true, and you will tend to overlook or avoid evidence that burning fossil fuels does not contribute to global warming -- regardless of how the hypothesis is stated.

And, if you are initially skeptical of man-made global warming, you will tend to seek out the evidence that supports your skepticism and avoid the evidence that counts against it.

In other words, we tend to notice and seek out evidence that supports “our side”, and overlook or avoid evidence that supports “their side”. And, in fact, many psychologists today are avoiding the term “confirmation bias” and calling it “myside bias” instead.

So, what else can we say about “myside” bias?

Well, first, it seems to be a real thing. And to see that, all you have to do is start a conversation about a controversial topic (such as gun control or global warming) in a mixed group on social media and see how people argue. Chances are those who oppose gun control will have statistics supporting their position at their fingertips, and will have trouble seeing the relevance of the statistics shared by the pro-gun-control people. And vice versa.

Second, myside bias has not escaped the notice of our best writers, philosophers, poets and musicians.

In the Divine Comedy, written in the early 14th century, Dante observed:

"[O]pinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind"

Three centuries later, Francis Bacon wrote:

“The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects.”

Roughly three centuries after that, Tolstoy wrote:

“I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.”

And less than a century after that, Paul Simon penned the words:

“. . . a man hears what he wants to hear, and disregards the rest.”

Now you might have your own opinions about “myside” bias already. Perhaps you think it’s a bad thing.  And perhaps you even blame myside bias for a good chunk of the extreme political polarization we see in the world today.

If so, then, in all fairness, we really should consider some reasons against this hypothesis, some reasons that myside bias might be a good thing after all.

Is Myside Bias Always Bad?

In The Enigma of Reason, Hugo Mercier and Dan Sperber defend the idea that myside bias is not always a bad thing. In fact, they think myside bias is one of Reason’s essential features. And they argue that human reasoning works just fine when used in its natural context.

So what is the proper function of Reason? And what is Reason’s “natural context”? And how can they say myside bias is a good thing, given all the obvious problems we see with it?

According to Mercier and Sperber, the proper function of Reason is to help us find reasons that will persuade others and allow us to justify ourselves to them. And the natural context for reasoning is cooperative in-group discussion.

It’s easy to see how reasoning with myside bias can be good for the individual. If we spend time searching for reasons that count against our cause, we will build a weaker case for ourselves. And, if we present evidence on both sides, we might even undermine our own case. If our goal is to justify one of our actions to the group, then it will often make sense to present only the reasons why our action was a good thing to do, and to leave out any reasons why it was a bad thing to do.

But Mercier and Sperber go further than this. They suggest that sometimes myside bias is actually good for the group as a whole. And the reason for this is that it facilitates an efficient division of cognitive labor.

If you think we should hunt antelope in the morning, and I think we should hunt antelope in the afternoon, and if it’s easier to evaluate reasons than to generate them, then the group gets more reasons to work with if you only present reasons for hunting in the morning, and I only present reasons for hunting in the afternoon.

Suppose each of us is capable of coming up with six reasons in a given time frame. If we both try to be unbiased, each offering three reasons for hunting in the morning and three reasons for hunting in the afternoon, we might come up with the same reasons, and then the group will have only six reasons in total to work with.

If, on the other hand, you spend your reason-generating energy coming up with six reasons for hunting in the morning, and I use my reason-generating energy to come up with six reasons for hunting in the afternoon, then the group will have twelve reasons to consider, and will presumably be in a position to make a better decision.

But the key here is that people have to be willing to change their minds if the weight of all the reasons goes against them.

And that brings us to the topic of the natural context for reasoning.

If we are in a late night bull session with trusted friends, we sometimes feel free to push our ideas further than we normally would, even to the point of silliness, while others knock them down. And that’s a good thing, because at the end of the evening, we are perfectly OK with giving up our position if it doesn’t hold up. We trust our friends not to hold the idea against us, and that frees us up both to push the idea as far as it can go and to change our minds if the arguments go against it.

And that’s key. Late night bull sessions, like tribal hunting meetings, are cooperative ventures. We’re all searching for the best way forward, we all care about each other, and we’re all mutually persuadable to some degree.

And under those conditions, not only is myside bias not so bad, it’s arguably very good.

Unfortunately, we’re often not in Kansas anymore.

Myside Bias and Social Media

In a mixed group on social media discussing a controversial topic, we often find people presenting only one side of the case to each other. That’s nothing out of the ordinary. We see the same thing in late night bull sessions and tribal hunting meetings. What’s different in the case of social media is that people are often in an environment where they don’t feel safe being mutually persuadable.

Political debates on social media are not generally cooperative in-group dialogues. They are competitive debates between people in our in-group, and people in our out-group. And, because we can’t even see our opponents face-to-face while we argue, the biggest in-group, the group we share with all living, breathing human beings, isn’t even particularly salient. This isn’t a cooperative venture between people who care about each other. It’s tribal warfare.

The nature of politics doesn’t help, either. The other tribe wants one policy, and we want a different, incompatible policy. And the way the game is set up, whichever tribe loses the debate has to live with the other tribe’s policy. And, because they’re not in our tribe, their policy doesn’t take our interests into account very well. These are high stakes.

And so our natural tendency to search for reasons on one side of an issue is not countered very well by our ability to recognize good counterarguments when we see them. In the late night bull session there is some social pressure toward being persuadable. In the mixed social media political debate, most of the social pressure is toward resisting the other tribe’s reasons.

In fact, changing your mind can be seen by your side as disloyal, or (since this is war) even treasonous.

And so we do the social media political discussion dance. We pretend to engage in open dialogue, and everybody comes away more convinced of their own positions than when they started.

What Now?

OK, so is there anything to do about this? Can we make our social media discussions more enlightening? Can we change the tide on political polarization and start working together again?

As a unilateral policy, trying to be less biased might be good for our souls, but it probably won’t do much to change the behavior of others. They’ll just take our even-handed treatment as a sign of capitulation.

A more promising approach is to try to change the context of reasoning from competitive to cooperative. And we can do that by emphasizing our common humanity, describing how we came to hold our views, and asking the other side to do the same.

I say more about that here.
