Robert J King Ph.D.

Hive Mind

Dissonant Cognitions

"Cognitive dissonance" probably does not mean what you think it means.

Posted Feb 24, 2020

No, gawd-dammit internet, no. You aren’t having this one. You might have turned “beg the question” into “prompts the question” (grrr) and created a new coffee type called an “Expresso” (snort) but you aren’t wrecking the meaning of “cognitive dissonance." Not while I have breath in my body. It’s too damned useful, and powerful, a concept.

You see the term “cognitive dissonance” bandied about quite a lot these days, but often wrongly. It does not, and I want to be really, really clear about this, mean “believing two dumb things at once,” or “believing two contradictory things at once.” I have seen people online looking at contradictory statements by other people online and declaring “the cognitive dissonance is strong here,” or similar.

No. No it isn’t. That’s the whole point. We have a word for doing those things, fortuitously coined by that expert in propaganda, George Orwell. That word is “doublethink,” and it’s an excellent word. Doublethink is the suppression of the natural human tendency to be bothered by logical contradiction. Orwell used it to highlight Voltaire’s horrifying insight that those who can make you believe absurdities can make you commit atrocities. In Orwell’s nightmare 1984, one of the totalitarian horrors was to make people say and believe contradictory things such as “War is peace” and “Freedom is slavery.” That’s “doublethink.”

But “doublethink” means almost the precise opposite of “cognitive dissonance.” Why so cross, Rob? It’s just another bit of sloppy internetting. Why does this particular bit of misuse make your blood boil, your ears steam, and cause phones to be hurled across the room?

I’m glad you asked.

Because the real meaning of cognitive dissonance is one of the most explanatorily powerful, robustly measurable, and downright kewl mechanisms in the whole of behavioral science. It marries together insights from psycho-therapy, evolution, cognitive neuroscience, and social psychology, in one testable package.

The term was coined by Leon Festinger, the sort of pioneering psychologist that used to be possible in the 1950s, who would pick up areas of interest to himself and just make up the methods of testing as he went along. One of his most famous exploits was to join a doomsday cult, because he wanted to know how the believers would adjust when the predicted end of the world didn’t happen. Would they just shrug and go about their business? Join a new cult?

What he found was that those most invested in the cult—some having left families and jobs, or sold all their possessions—were the most likely to concoct the most preposterous reasons why the predicted end of the world had been postponed. Maybe the world had been saved by the extreme piety of the cultists? Maybe god was really impressed by a small bunch of chanting robe-wearers? However, as interesting as this was, perhaps it was simply that the loopiest cultists were the most inclined to concoct loopy stories? What Festinger needed was a way to manipulate a situation experimentally.

What he came up with was simultaneously one of the most mind-numbingly tedious, yet brilliantly insightful, social experimental manipulations of all time. He also found a way to experimentally demonstrate the defense mechanisms that therapists such as Anna Freud had hypothesized—something worth telling to the sort of bore who snootily proclaims that talking therapy isn’t scientific. But, back to the experiment. I’ll deal with the snooty bores at a later date.

The first thing was to get participants to do something mind-numbingly dull for an hour (or more). Turning pegs in a board was one such task. Now came the experimental MacGuffin: you tell the (bored) participant that the person who normally coaches the peg-turners is “away,” and ask whether they would be good enough to try to persuade the next person in line that the task is actually fun.

They all agree to do this, which tells you that not only should you never trust a social psychologist, you can’t trust experimental participants either. Anyway, they all happily lie to the next poor boob who has to come along and be bored for an hour.

How happily, you ask? Now—this is the really clever bit, and, it’s the crack in reality through which torrents of human misery, delusion, and political careers, flow like raw sewage. Some participants were given $20 to lie, quite a lot of money back in the 1950s, and some were only given $1.

Now, without peeking, who was the happiest? The $20 people, or the $1 people?

[Figure: results from the original study. Source: Festinger and Carlsmith (1959)]

It was the $1 people. Let that sink in for a second, because it’s counter-intuitive. But, once you realize the implications, an awful lot of human life starts to make more sense. The $20 people knew that they were bored, but that’s OK—they were paid $20 to be bored. “It’s called work because they have to pay you to do it,” and all that. But the $1 people had a problem. An internal problem, that is. They had just wasted an hour of their lives, and got very little back for it.

And, this is the crucial part, it is this tension that is “dissonant.” Not believing two contradictory things at once—it’s the internal urge to reduce the tension caused by the world backing you into a corner where you appear to have to believe two contradictory things at once. Two things such as “I’m a smart person who values their time” and “I’ve just had my time wasted.” Something has to give, and what gave, in the original Festinger study, was the assessment of how enjoyable the task was. Not what they told the poor boob next in line—they all lied to them—but what the participants told themselves. The lower-paid participants were even more likely than the higher-paid ones to want to take part in similar experiments in the future.

The implications of this are hard to overstate. Humans like to keep a consistent internal narrative of themselves going and, when this is threatened, they have to perform some mental chicanery to keep that story coherent—at least, coherent to that audience of one. They can deny reality, alter their previous feelings, project their anger onto the world. The whole panoply of things that therapists are well aware of, and call “defense mechanisms,” in other words.

And, if you force compliance with something nasty, such as unpleasant hazing rituals, you can increase the love for the group that has done this to you. The alternative is to face the fact that you have been bullied. The lure of “forbidden fruit,” which can all too easily turn into “sour grapes,” all makes sense in the light of reducing dissonance. From believing in the Emperor’s new clothes, to thinking that a bottle of wine could possibly be worth over $300,000, a lot of the potty things that we humans do are driven by a desire to reduce that internal appreciation of dissonance-causing absurdity. And don’t ask how long I managed to keep coming up with clever reasons to keep myself smoking cigarettes, none of which had anything to do with being addicted to nicotine, of course. It’s embarrassing.

Incidentally, that last example illustrates that just knowing about the power of cognitive dissonance does not necessarily protect you from its effects. You can shoot a ballistics expert, after all. What you can’t do is make them like it.

References

Festinger, L., & Carlsmith, J. M. (1959). Cognitive consequences of forced compliance. Journal of Abnormal and Social Psychology, 58(2), 203–210.