
The Science of Antiscientific Thinking

How science can inform our understanding of those who reject it.

Over the last year or two, the headlines have been a bit depressing for those who believe that public policy and education should be guided by scientific findings rather than by short-sighted self-interest or blind faith in ill-informed demagogues. For example, one New York Times headline decried “President Trump’s War on Science.” That article discussed Trump’s public statements that global warming is a hoax, despite the fact that the joint National Aeronautics and Space Administration/National Oceanic and Atmospheric Administration (NASA/NOAA) website dedicated to reviewing the scientific evidence on this topic states: "The planet’s average surface temperature has risen about 2.0 degrees Fahrenheit (1.1 degrees Celsius) since the late 19th century, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere." A recent Gallup poll suggests that this denial is rampant within the Republican Party, where only 32 percent of members accept the scientific findings about global warming (the percentage agreeing with the scientists used to be higher, showing the power of oft-repeated “alternative facts”).

[Image: Bishop Samuel Wilberforce, who asked Darwin's defender Thomas Huxley on which side of his family he claimed descent from an ape. Source: Wikimedia Commons, public domain.]

Another place where scientists and the lay public part company is the evidence for evolution by natural selection. Among scientists in the American Association for the Advancement of Science (AAAS), 98 percent believe humans evolved. But 34 percent of American citizens reject evolution entirely, and another 25 percent believe evolution was guided by a supreme being (which is not, in case you were wondering, what natural selection means). Among some groups, the denial of evolution is even stronger: only 11 percent of Evangelical Christians and 6 percent of Jehovah's Witnesses believe in evolution by natural processes, and the majority of the population in Afghanistan, Iraq, and Indonesia rejects evolution. Lest all y’all New York Times–reading liberals get too self-righteous, most of the people who fail to vaccinate their own children (because of a disproved rumor of a link with autism) identify as liberals.

When scientists hear these findings, their standard response is to state the facts more loudly and clearly. But as my colleagues and I argued in an article in this month’s Scientific American, a wealth of research on social cognition suggests that this approach can sometimes backfire.

How can people so wrong be so confident they are right?

For the article on antiscientific thinking, Adam Cohen (who studies the psychology of religion), Steve Neuberg (who studies cognitive biases in social judgments), Robert Cialdini (who studies the tactics persuasion professionals use to trick us into oversimplified thinking), and I (who study how evolved cognitive biases can warp our social and economic decisions) pooled our interests to try to understand how antiscientific thinking can be so resistant to change in the face of evidence.

In the article, we focused on three sets of psychological obstacles to scientific thinking:

[Figure: Overview of three sets of psychological obstacles to processing scientific evidence. Source: original by Douglas T. Kenrick, used here with permission.]

1. The cognitive efficiency hurdle. Our brains have efficient mechanisms for dealing with information overload: when we are overwhelmed by too many arguments, or when we don’t have enough time to fully consider a question, we rely on simplifying heuristics, such as “go with the local consensus” or “trust an expert.” Cialdini’s book Influence covers a great deal of evidence on how this plays out. One of the most shocking findings was a study by Hofling and colleagues, in which a male researcher called nurses on a hospital ward and identified himself as a doctor whose patient had just been admitted. The unknown "doctor" directed the nurse who answered to check the medicine cabinet for a drug she had likely never heard of (because it was fictional). There was in fact a bottle of pills by that name, placed there at the end of the previous shift, with explicit instructions on the label never to exceed a certain dosage. The "doctor" asked the nurse to give the patient a dose well above that maximum, and to do so immediately because he was busy and could not follow the usual hospital protocol of providing a written prescription. How many nurses were willing to go against the instructions on the medicine and known hospital protocol just because the caller had identified himself as a doctor? Almost all of them (95 percent).

2. The confirmation bias hurdle. Sometimes we have the time, and perhaps even the interest, to move beyond simple rules of thumb—to actually consider available evidence. But even then, we often process that information in a manner less like an impartial judge and more like a lawyer working for the mob. The human mind is predisposed to pay selective attention to some findings over others, and to reinterpret mixed evidence to fit with pre-existing beliefs.

My favorite example of this is a study by Charlie Lord and colleagues in which Stanford students were presented with mixed scientific evidence on the deterrent effectiveness of the death penalty: one study that supported it, followed by counterarguments from other scientists and a rebuttal by the original researchers. Next they read a study showing results in the opposite direction, again followed by counterarguments and rebuttals. If the evidence on a topic is so mixed, you would think that rational, open-minded students who started with opinions leaning one way or the other would move toward the center and come away less certain of their initial positions. But that is not what happened at all. Instead, students who were originally in favor of capital punishment became even more favorable, citing the strength of the findings on their side and the weaknesses of the opposing evidence. More evidence of conservatives being antiscientific? Not quite: those who opposed capital punishment did exactly the opposite. They ended up more opposed than ever, citing the strengths of the evidence on their side and the “obvious” weaknesses of the evidence on the other.

3. The social motivation gauntlet. Even if we surmount the first two (cognitive) obstacles, powerful social motives can still interfere with an objective analysis of the facts at hand. For example, our bias toward one scientific conclusion over another can be influenced by desires to fit into a particular social network, to win status, or to attract a mate. Vlad Griskevicius and several members of our research team did a study of conformity to group judgments in which people were confronted with evidence that the majority of other subjects disagreed with their judgments (about the interest value of an ambiguous geometric figure). Did they go along with the group’s opinion or stick to their guns? The answer depended on their motivational state. People who had been made fearful by watching a scary movie clip were more likely to conform to the group’s opinion. People in a mating frame of mind (who had watched a romantic film clip) reacted differently depending on their sex: women in a romantic frame of mind conformed more than women in a neutral control condition, but men did just the opposite, actually going against the group’s opinion. Presumably, mating motives activate a desire in men to stand out and demonstrate their leadership potential, whereas the same motives lead women to demonstrate their cooperativeness. In our book The Rational Animal: How Evolution Made Us Smarter Than We Think, Vlad and I discuss many examples of how fundamental motives can hijack our thinking processes.

So there are many natural cognitive and motivational obstacles to thinking objectively. We should not have been surprised, then, when our article came out in Scientific American and my colleagues and I received several emails from people claiming to have scientific evidence not only against global warming but also against evolution by natural selection.

But should we simply resign ourselves to a world in which people’s scientific views never progress? No. Although many members of the Anglican clergy, like Bishop Wilberforce, rejected Darwin’s theory of evolution when it was first proposed, the church's director of public affairs wrote a belated apology on the occasion of Darwin’s 200th birthday, and the majority of British citizens now accept the evidence for natural selection (so there is hope that America will someday follow suit).

Scientifically minded people should, however, take an empirical approach to the problem itself: if we want to convince someone to take another look at the evidence, we first need to understand which of these obstacles is at work. For example, scaring people with visions of a future dystopia of climate disaster may only lead them to cling more strongly to their group’s beliefs (that global warming is a hoax, for example). On the other hand, finding a member of the target group who has changed his or her opinion in the face of new evidence (a former climate-change skeptic, for example) is likely to be a much more effective strategy.

References

Cialdini, R.B. (2009). Influence: The Psychology of Persuasion. New York: Harper Business.

Griskevicius, V., Goldstein, N., Mortensen, C., Cialdini, R.B., & Kenrick, D.T. (2006). Going along versus going alone: When fundamental motives facilitate strategic (non)conformity. Journal of Personality and Social Psychology, 91, 281-294.

Hofling, C.K., Brotzman, E., Dalrymple, S., Graves, N., & Pierce, C.M. (1966). An experimental study in nurse-physician relationships. Journal of Nervous and Mental Disease, 143, 171-180.

Kenrick, D.T., Cohen, A.B., Cialdini, R.B., & Neuberg, S.L. (2018). The science of antiscience thinking. Scientific American, 319, 36-41.

Kenrick, D.T., & Griskevicius, V. (2013). The Rational Animal: How Evolution Made Us Smarter Than We Think. New York: Basic Books.

Lord, C.G., Ross, L., & Lepper, M.R. (1979). Biased assimilation and attitude polarization. Journal of Personality and Social Psychology, 37, 2098-2109.
