Cults and Cognition II: Programming the True Believer

In the world of cults, our strong social affinities can have a dark side.

Posted Oct 23, 2020


In our last Forensic View (Sharps, 2020), we discussed the importance of cognitive dissonance in the maintenance and even enhancement of cult beliefs. Many of the ideas we considered are bizarre by normal standards; it’s difficult to see how a person could come to endorse such beliefs, and unless that person were relatively dissociated (see our next Forensic View), he or she probably wouldn’t. Not while thinking as an individual.   

But what if that person is a member of a group? 

A great research psychologist once told me of his studies of monkeys. He was having trouble distinguishing a given monkey under observation when it mixed in with its fellows, so he dabbed a little drab paint, hardly visible, onto that monkey’s hindquarters.

When this painted monkey was introduced back into its troop, the psychologist anticipated no problem; the paint was practically invisible. But the other monkeys completely freaked out. They ran away screaming, then huddled together, then spread into a semi-circular formation to stalk the painted monkey (it would have been better if they’d been snapping their fingers like the Sharks or the Jets, but they weren’t); and then, as a group, leaped homicidally (monkeycidally?) onto their painted fellow.

But they didn’t kill him. Instead, they ripped all the painted fur out of his backside. After that, they still wouldn’t let him fully back into the troop, at least not until his fur grew back; but apparently, a bald and bleeding monkey was better than a painted monkey, and when his fur grew back, there were no further problems—he was just like them.

Now, human beings obviously aren’t monkeys, but our ancestors evolved in the same environments as some of theirs, and we retain many of the same hypersocial characteristics. One of these is an extreme tendency toward conformity.

Enter Solomon Asch.

Asch conducted experiments involving the matching of stimulus items of various lengths (Asch, 1955), but we’re going to skip over the methodological details here and cut to the chase: Asch basically asked people to tell him which of two sticks, one longer and one shorter, was actually longer.

Pretty much everybody told him the longer sticks were longer. 

Until Asch put a few other people in the situation. These other people were actually working for him, but the poor experimental subject didn’t know that. The subject had correctly stated that stick A was longer than stick B. But then Asch’s confederates all picked the wrong one, basically sending a false message: “No, B is longer than A. You think that A is longer than B? No, we all think that B is longer than A.” And so on. 

Now, there can’t be anything simpler than two sticks, one longer than the other. Yet under social pressure, about one-third of normal adults were willing to go along with this absurdity again and again, affirming that the long stick was shorter than the short stick; and three-quarters went along at least once!

There have of course been many criticisms of Asch’s work; but there’s a very reasonable, parsimonious interpretation of this research. Just like the monkey who, shorn of his painted backside fur, had to grow back normal fur to be accepted, we are deeply desperate to be accepted by our groups; and we’re willing to avow palpable absurdities to that end.

And these tendencies increase dramatically if our group, or cult, has a leader.

Enter Stanley Milgram.

If the most important scholar of cognitive dissonance was Leon Festinger, and the most important scholar of conformity Solomon Asch, then surely the most important scholar of obedience was Stanley Milgram. 

Again, we’ll skip over methodological details, but basically, Milgram would invite you to his lab and essentially ask you to give huge and increasing electrical shocks to some poor victim in the next room who got verbal learning items wrong. Now, nobody was ever actually shocked; but the experimental subjects thought they were shocking people, through ascending levels of perceived pain and stress, to a final point of silence, a point at which a reasonable person might assume the victim to be unconscious or dead. 

The upshot: depending on how Milgram set up the task (Milgram, 1974), 63 percent of people on average were apparently willing to shock total strangers (potentially to death?) on the orders of a total stranger. Admittedly, a mildly authoritative stranger, some kind of laboratory director in a white coat and a tie; but apparently, for most people, Milgram became their leader and they obeyed his "orders." Criticisms and disagreements have of course been leveled at these studies, but it’s hard to argue, at least parsimoniously, with the general implications of this research.

Now, the cults we considered earlier (Sharps, 2020) all had charismatic leaders. Stan Milgram was just a university professor in a lab coat; he never had the cultic power of, for example, Charles Manson, who claimed to be a Messiah.

Yet people obeyed Milgram anyway. In view of that fact, what might they do for the Messiah Manson?

Or Jones? Or Koresh? Or Applewhite? Or, for that matter, Hitler? We might not pull each other’s fur out, but we might very well commit suicide for our own beliefs, or kill others simply to conform and to obey our charismatic leaders. This has happened many times in history.

We are a conformist species with a tendency to obey, traits we probably inherited from our most ancient evolutionary ancestors. But even so, there is hope. We are also a species with extraordinary cognitive powers, which can be used to resist the impulses that drive cult behavior. The fact is that not everybody goes along with a cultic crowd, with Heaven’s Gate or the Manson family or the Nazi Party. There are individual differences.

We’ll deal with some of those individual differences, from a cognitive perspective, in our next Forensic View.

References

Asch, S.E. (1955). Opinions and Social Pressure. Scientific American, 193, 31-35.

Milgram, S. (1974). Obedience to Authority. New York: Harper & Row.

Sharps, M.J. (2020, October 2). Cults and Cognition: Programming the True Believer. Psychology Today, The Forensic View. https://www.psychologytoday.com/us/blog/the-forensic-view/202010/cults-and-cognition-programming-the-true-believer