
Cults and Cognition III: Programming the True Believer

Part 3: Dissociative responses help us to live in the imaginary worlds of cults.


In our last two Forensic Views, we examined cognitive dissonance and social factors, which tend to maintain and enhance cult and cult-like beliefs, such as those of Heaven's Gate or of the Manson family. But how do such beliefs get started, for the individual, in the first place?

Human beings are very social creatures, so it’s important that we be cognitively capable of going along with a given crowd; but when that crowd is racing headlong toward something irrational or self-destructive, like the Keech Space Alien effect we saw previously (Festinger et al., 1956), how does the individual mind manage to go along? The personality and socioeconomic influences that may result in destructive mass movements have been well-documented (e.g., Hoffer, 1951). But what, in the individual human mind, allows us to believe in the unreal, in fantasies in which the rules of logic and evidence are suspended?

In a word: dissociation.

We’re not talking about the terrible psychiatric condition of dissociative identity disorder. We’re talking here about subclinical dissociation, probably experienced by everybody at times, in which a sense of diffusion or unreality gives rise to anomalous perceptions of individual experience (see Sharps et al., 2014). Some people experience this subclinical dissociative state more than others; and it can be devastating.

My laboratory deals with eyewitness issues. We’ve learned a lot by looking at the extremes of such issues; and there’s almost nothing more extreme than an eyewitness experience of Bigfoot, or of UFOs or ghosts from the afterlife. Lots of people report these things; and most of these people are also subject to relatively high levels of dissociation. These people are not mentally ill. They just experience a bit more of the sense that things are a bit unreal, a bit like a waking dream; and as a result, not only do the dissociated believe more in Bigfoot, ghosts, and space aliens, they are also more likely to see these things in everyday stimuli. And if we change the social environment to be more persuasive, to provide a cognitive framework that emphasizes UFO aliens or ancient mysticism, we can get people to see things that aren’t there at all (e.g., Sharps et al., 2020).

This is exactly how cult thinking operates. A cognitive framework is established that initiates novice believers into fantasy landscapes; dissociative processes are instrumental in helping the believers buy into the bits that don’t make sense; and believers defend their beliefs through the cognitive dissonance we’ve previously discussed (Sharps, 2020, October 2).

But what allows dissociation to operate in this system? We introduced (e.g., Sharps, 2003) a continuum of cognitive processing from Gestalt (G) processing, in which we rapidly size up situations without much attention to the details within them, to Feature-Intensive (FI) processing, which is slower but attends to the internal features of situations. This is important here because it’s easier to believe bizarre or paranormal things in G processing mode than in FI mode (Sharps et al., 2016).

FI processing effectively forces you to confront absurdities head-on. Feature-intensive physics means the Heaven’s Gate starship couldn’t really hide behind a comet. Charles Manson's history of petty crime was inconsistent, on an FI basis, with his being the Messiah. And so on.

We’re beginning to get a better picture of the cognitive systems which help to create and maintain cult-like and other bizarre beliefs. Cognitive dissonance, and human tendencies to obey and conform, help to maintain such beliefs; and dissociative processes help to set them up in the first place.

Well and good; but what can we do about it? How can we prevent these sinister chains of cognitive events in the first place?

If people never start on the road to cult beliefs, they won't have the opportunity to obey cult leaders or to conform to cult norms, and cognitive dissonance will take care of itself; if there’s no investment, there will be no cognitive dissonance (Festinger et al., 1956). But what can be done to counter dissociation itself?

In several experimental venues, we have found that FI processing reduces the tendency to enter into bizarre beliefs. For example (Sharps et al., 2020), we found that asking questions about paranormal phenomena in FI terms significantly reduced paranormal beliefs in favor of more prosaic answers. Additional FI processing might therefore have been very helpful in getting a given Heaven’s Gater, for example, to consider explicitly the concept that suicide can't really beam you to a starship, which can’t really be hiding behind a comet anyway.

So FI processing, which reduces the Gestalt thinking conducive to cult behavior, may be an important answer. But there is a problem of delivery: as Festinger et al. (1956) observed, the sources of facts can be questioned, and logic can be ignored. And in our research to date, we have found that FI processing only seems to reduce paranormal beliefs when it is generated by the respondent.

In other words, telling people outright that the Heaven’s Gate Space Aliens don’t exist won’t be particularly effective. However, if you ask specific questions that force individuals themselves to consider the issues in FI terms, attending to the real-world details of an apparently paranormal situation, people generally respond with much more prosaic, real-world thinking that tends to reduce the vague, amorphous thinking necessary for cult cognition (e.g., Sharps et al., 2020).

We don’t have a complete panacea for cult behavior, or for the dissociation that helps to initiate and maintain it. Such a solution will require expertise from many areas of psychology beyond cognitive science. However, we are at least beginning to understand the cognitive underpinnings of cult behavior, and what to do about them.

References

Festinger, L., Riecken, H.W., & Schachter, S. (1956, reprinted 2011). When Prophecy Fails. Blackburn, VA: Wilder Publications.

Hoffer, E. (1951). The True Believer: Thoughts on the Nature of Mass Movements. New York: Harper and Brothers.

Sharps, M.J. (2003). Aging, Representation, and Thought: Gestalt and Feature-Intensive Processing. New Brunswick, NJ: Transaction.

Sharps, M.J., Liao, S.W., & Herrera, M.R. (2014). Remembrance of Apocalypse Past. Skeptical Inquirer, 38, 54-58.

Sharps, M.J., Liao, S.W., & Herrera, M.R. (2016). Dissociation and Paranormal Beliefs: Toward a Taxonomy of Belief in the Unreal. Skeptical Inquirer, 40, 40-44.

Sharps, M.J., Nagra, S., Hurd, S., & Humphrey, A. (2020). Magic in the House of Rain: Cognitive Bases of UFO 'Observations' in the Southwest Desert. Skeptical Inquirer, 44, 46-49.
