Motivated Reasoning
Confirmation Bias and Stigma
Contrary information disrupts the status quo.
Posted July 21, 2015
Confirmation bias is our well-known tendency to see things as we expect them to be. Its effects are ubiquitous, ranging from interpreting others’ behaviors according to the categories we put them in, to ignoring aspects of ourselves that don’t fit our master narratives, to hearing our spouses say what they usually say instead of what they actually say. Its advantages are obvious, even beyond the advantage of feeding our fears to keep us safe. It’s the perceptual and cognitive equivalent of using a map to get around a city: useful until the map is wrong because of construction, rerouted streets, or some other change in the terrain. Confirmation bias frees us to devote our energies to other, more variable aspects of the situation, like two friends strolling on a sidewalk and focusing on their conversation rather than on the footing. Or vice versa: I assume that friends on uncertain terrain exhibit more confirmation bias regarding what the other person is saying.
Stigma, in Erving Goffman’s well-known account, maintains group cohesion and cultural norms by marginalizing aspects of the system that breach those norms. Systemically, we define situations in a manner that allows us to know what role we and others are playing. Knowing the roles has obvious advantages, both with respect to pulling off one’s own role and knowing what to expect from others. We hide our role departures, our discrediting conduct and identity elements, to preserve our own role. We stigmatize others to pressure them to maintain theirs.
It seems to me that the two phenomena are intricately linked. I was reflecting on my certainty, without having heard any of the evidence, that the Aurora theater shooter would be convicted. My colleague Kim Gorgens said on TV that jurors sometimes become pariahs when they return unpopular verdicts. Pariahs are people who have been discredited from the performance of any social role. I wonder if the main thing that happens in a jury room is a contest between identification with the norms of the jury and identification with the people back home whom you will have to face. The evidence in a particular case can have more or less impact depending on how much or how little its implications conflict with either group’s norms.
Goffman defines “tact” as the tendency to ignore discrediting information and conduct in order to support the performances of others, like a benign audience facilitating a play by ignoring the age difference between the character Juliet and the actress playing her. Of course, younger actresses who failed to get the part might not be so quick to ignore this difference, so confirmation bias is also a function of whether you want the world to be as you think it is or whether you want it to be different. White privilege, in this context, can be seen as a confirmation bias based on experiencing a situation as essentially functional. This is why I say that the shape of knowledge is a spear: new and contrary information punctures the master narrative and the status quo.
We learn tact as children when we are taught to ignore discrediting performances by parents and other people who have power over us. We also learn to appreciate tact when family members graciously overlook our own discrediting acts—although some people conclude from other people’s tact that they really are all that.
As strong as the motivation toward confirmation bias may be to keep the world as it is, an even stronger motivation is to keep ourselves as we are. Each of us is a system unto ourselves, and each of us has stigmatized the identity elements and past acts that don’t fit with the definitions of ourselves that we are trying to portray. For most people, this portrayal may be simplified as “a good person”; for some, it’s “a powerful person,” “a lovable person,” “a free spirit,” or whatever. Confirmation bias is fueled by the desire to keep being that good person, which requires a world that reflects back to us a successful portrayal. When the world doesn’t oblige, we hear it anyway. Humble people focus on the few audience members who weren’t engaged; narcissists ignore them and focus on the few admirers.
A great deal of what we know about cognition may actually be a function of group relations. A politician who is accused of being out of touch is actually in touch with people who see things as he does. When he doesn’t know how it sounds to say that people should work more, it’s because in his subculture, blaming the less fortunate for their misfortune keeps him a full-fledged member of his group (a group that explains their own good fortune as deserved, whether because of industry, intellect, or God’s grace). When scientists look at the available evidence and disagree, it’s often a function of which group they will face after reporting their conclusions. Eminent researchers don’t like to be told that they don’t understand psychotherapy; eminent therapists don’t like to be told that they don’t understand Bayes’ Theorem. Only a culture that privileges evidence over performance can hope to find out what’s true—to develop statements that are useful guides and lead to what Skinner called effective action. Privileging evidence and logic is what the culture of science is supposed to do, but the culture of science is populated by people, each with a relational stake in the outcome.
Soon, I will blog about the connection between this sort of confirmation bias—confirming not just expectations about the percept but also those relating to the sort of world we live in and our role in it—and the psychoanalytic concept of defense mechanisms.