Cults and Cognition: Programming the True Believer
How do cognitive processes contribute to bizarre—and lethal—cult beliefs?
Posted October 2, 2020 | Reviewed by Devon Frye
In 1978, a messianic figure, Jim Jones, persuaded more than 900 of his followers to commit suicide, which the victims termed a “revolutionary act.” Revolution against whom or what was not specified.
In 1993, Branch Davidians followed another messianic leader, David Koresh, into a compound at Waco, Texas, with misplaced faith and automatic weapons. Many of the faithful were killed in an epic standoff with law enforcement.
And in 1997, a former music professor named Marshall Applewhite persuaded 38 people to commit suicide with him, with plastic bags and purple cloths over their heads and Nike sneakers on their feet, in an effort to reach a UFO hiding behind Comet Hale-Bopp. This UFO was to take them, via what they termed Heaven’s Gate, to an unspecified interplanetary reward.
“Nike” is Greek for “victory.” Dying with your head in a plastic bag doesn’t seem especially victorious.
Now, if you survived even high school physics, you know that a UFO can’t hide behind a comet; basic orbital dynamics make the whole idea impossible. It just won’t work.
Yet a large number of modern adults put Victory Nikes on their feet and purple blankets over their faces and killed themselves, in shifts, for God’s sake, to become a post-mortem away team. (That really was the term they used—they were actually wearing Star Trek-inspired Away Team patches on their unisex black uniforms).
And of course, they didn’t go anywhere. They just died.
Star Trek is fiction. So was the UFO behind Comet Hale-Bopp, and so was all the amazing nonsense that Applewhite, and Jones, and Koresh, and many others have used to persuade people to die for them. Most adults know that fiction is not reality. Yet many adults have died for the sake of carefully fabricated fictions.
The obvious question: How does cult psychology work? How is it possible to persuade human adults to enter a weird cognitive landscape with no basis in reality? To enter a fantasy realm so profound that they’ll willingly die for whomever has been selected as the local Messiah?
A complete answer to this crucial question is beyond our scope or available space, so in this and the next two Forensic Views, we’re going to focus on three specific cognitive cult dynamics: dissociation, group psychology, and cognitive dissonance.
Today, we’ll take up cognitive dissonance (e.g., Festinger et al., 1956), which manifests itself in the tendency to overvalue anything in which we’ve invested too much—money, time, emotional energy, whatever. Cognitive dissonance essentially means that the more you’ve paid, the better you like it. Whether it makes any sense or not.
Enter Marian Keech.
Keech (a pseudonym) believed she had received messages from the space aliens of Planet Clarion (seriously—that’s what she said) that large chunks of the Earth and its population were to be destroyed by flood at dawn on December 21, 1954.
Keech managed to persuade her followers that the apocalyptic flood was going to happen, that they should join her to await the disaster, and that they would be rescued by the Clarion Flying Saucer. As for everybody else, who could drown—well, the hell with them.
Now, since the world was basically ending, there wasn’t much point in holding on to property, careers, or perhaps even relationships. So many of Keech’s Believers gave these things up. You can imagine how the relevant marital conversations went, when the Believers explained that they were now unemployed and they’d sold the house or car for a few bucks and a smile.
Keech’s followers paid tremendously for their participation in her cult. Property, careers, spouses—all of those were flushed away.
The devotion of Keech’s believers to her nonsense seems bizarre, but it actually makes perfect sense when you think about it in terms of cognitive dissonance. They’d paid so much that there was no way they could believe they’d been wrong.
When December 21 came, of course, there was no flood. And no space aliens. Nothing happened at all. Keech started to cry.
But then, Keech received a "message" by means of automatic writing. (“Automatic writing” happens when you write a message to yourself and pretend [or believe?] that somebody else, who's invisible, is controlling the pencil. This happened to Keech a lot.) But anyway, the message, apparently from God himself, was that He was so impressed by Keech and Company’s devotion that He’d decided to stand down and not cap much of the world's population after all.
Do you think you could sell obvious nonsense like this even to a child?
But Keech’s adult followers had invested so much, their cognitive dissonance was so great, that not only did the majority of them believe Keech’s story, they became even more devoted to these beliefs, proselytizing Keech’s message like never before (Festinger et al., 1956).
Festinger's methods, and his profound reliance on the concept of cognitive dissonance, have been criticized. Not everyone agrees with his perspective, and it is clear that there were other psychological factors involved; but before you decide that the Keech effect must have happened because people in the 1950s were stupid or something, consider this:
My lab conducted studies of beliefs in the much-anticipated 2012 Mayan-Calendar-End-of-the-World, also scheduled for December 21, the winter solstice that year. (Something about solstices seems very important to some people. It's hard to say why. We have a winter solstice every year. We can’t help it. It's pretty much standard on this planet.) But anyway, we found that about 10 percent of our study population thought the apocalypse was certainly going to happen (Sharps et al., 2013).
And after nothing happened at all, 10 percent were still certain that the Mayan apocalypse was definitely going to happen (Sharps et al., 2014). Physical reality has nothing on cognitive dissonance, and related psychological dynamics, as determinants of human behavior. And these effects continue into the modern world.
In our next Forensic View, we’ll continue with other cognitive determinants of beliefs that are damaging or even lethal to our own welfare, cognitive mechanisms which program the true believer.
Festinger, L., Riecken, H.W., & Schachter, S. (1956, reprinted 2011). When Prophecy Fails. Blacksburg, VA: Wilder Publications.
Sharps, M.J., Liao, S.W., & Herrera, M.R. (2013). It's the End of the World, and They Don't Feel Fine: The Psychology of December 21, 2012. Skeptical Inquirer, 37, 34-39.
Sharps, M.J., Liao, S.W., & Herrera, M.R. (2014). Remembrance of Apocalypse Past. Skeptical Inquirer, 38, 54-58.