Self-Correction: The Methodological Revolution in Psychology

When does evidence change scientists' and laypeople's minds?

Posted Mar 13, 2018

Andy Vonasch is currently a postdoctoral research fellow at the University of North Carolina at Chapel Hill, and will soon join the faculty at the University of Canterbury in Christchurch, New Zealand. He studies morality and rationality. This blog post was inspired by the work of the self-correction committee at SIPS (the Society for the Improvement of Psychological Science) that Lee mentioned in his previous posts. This post is, we hope, the first of many steps toward recognizing and lauding the self-correction already under way in our field.

---------------------------------------------

In a previous post, Lee wrote about how important self-correction is in science, as well as some of the barriers that impede self-correction in practice. Part of what makes science so valuable is that it corrects its own errors—but it’s often a slow process that’s further slowed by scholars’ attachments to their own ideas. It’s difficult for anyone to acknowledge that they were wrong, and it’s especially difficult when the idea you were wrong about is a part of your identity as a scientist and a scholar. There is very little recognition in the field for this kind of intellectual courage.

The methodological revolution currently under way in psychology has focused mainly on criticizing the negative—failed replications, inadequate sample sizes, p-hacking, and so forth. I think it would be valuable to focus more on the positive, because there is a lot of excellent research going on. Ultimately, it would be great to give awards for self-correction, as our committee suggested—change behavior through carrots, not sticks—but for now let’s start with some well-deserved public recognition. In this post, I will highlight two scientists who have responded to new evidence by updating their beliefs about a phenomenon, even though doing so changes the meaning of some of their own previous work. The scientists are Brendan Nyhan and Jason Reifler, and they have responded admirably to recent work suggesting that the backfire effect they are known for may occur less frequently than previously thought.

The Backfire Effect: One of Many Examples of Self-Correction in Science Going Right

Why do people believe things that aren’t true? One reason is lack of evidence. Ancient Greek philosophers didn’t know about subatomic particles, so it was reasonable to speculate that the world was made of elements like water, fire, earth, and air. Similarly, medieval astronomers lacked telescopes that could reveal distant galaxies, so it was reasonable to believe our planet was one of just a few celestial bodies—even though scientists now estimate that figure to be off by a factor of about a trillion trillion!

It’s not unusual to be wrong because you just don’t know. What’s more troubling is the persistence of false beliefs despite widely available information that could correct them. Twenty percent of Americans believe vaccines cause autism—they don’t. Eleven percent believe the Bush administration purposefully allowed 9/11 to happen—it didn’t. Sixteen percent of Republicans firmly believe Barack Obama was born in Kenya—he wasn’t. These beliefs are not only wrong; they have consequences. Believing that one’s political opponent was illegally elected or purposefully caused harm to America undermines trust in our elected officials and democratic institutions, and presumably deepens division and stymies cooperation between our political parties.

We need to understand why misperceptions persist in the presence of disconfirming facts.

That’s why Brendan Nyhan and Jason Reifler’s research is so interesting. They found that in some cases learning corrective facts may actually backfire—strengthening rather than reducing false beliefs. Their 2010 paper on the backfire effect has been very influential in the field. Its logic has been used to explain a host of phenomena, including why JFK conspiracy theories persist, why messages about vaccine safety and disease prevention fail to persuade parents they should vaccinate their children, and why Sarah Palin supporters believed that Obamacare would create death panels. It’s a simple but powerful explanation for why facts don’t persuade—telling people facts often backfires. But new research suggests that telling people facts rarely backfires, meaning other explanations are needed for the persistence of false beliefs.

The 2010 paper reported four studies examining how conservatives and liberals responded to news articles reporting that no weapons of mass destruction (WMDs) were found in Iraq after the US invasion. As one would expect, conservatives were initially more supportive of the war started by Republican president George W. Bush. Also unsurprisingly, liberals who read articles about the absent WMDs (compared to those who read unrelated articles that didn’t discuss WMDs) were then more likely to declare that there had been none before we invaded. What was surprising was that conservatives who read the articles about no WMDs being found were more likely to believe that WMDs had been in Iraq before we invaded. Fact-checking backfires.

Or does it? New research by Thomas Wood and Ethan Porter examined 52 polarizing issues across more than 10,000 participants and found no evidence of backfire for any of them. A second look at the original paper shows that fact-checking backfired in some of Nyhan and Reifler’s studies but not in others: it backfired only for conservatives reading about the Iraq war and about tax cuts. There was no backfire effect for liberals, but the original studies asked about only one issue hypothesized to backfire for liberals: stem cell research bans. One definite strength of the new article is that it thoroughly tests for bias in both liberals and conservatives—a welcome corrective for a field that often looks for conservative irrationality but sometimes neglects to ask whether liberals might show similar biases.

Good self-correction depends on how you respond to new evidence—do you change your mind, or dig in your heels? Nyhan and Reifler could have tried to block publication of the new paper, or to undercut its importance, but they are taking the high road. The research addresses important issues, and they want to get the answer right. Reifler tweeted, “I know I speak for Brendan Nyhan when I say this is normatively important work and we welcome anyone interested into this subfield. Even (perhaps especially!) if they find something different than us.” Elsewhere on Twitter, Nyhan encouraged the publication of additional new studies suggesting that backfire effects are rare, even if they contradict his own earlier findings. Being open to contradictory evidence is the hallmark of great self-correction.

Moving forward, the important thing will be to figure out why the studies produced different results, and there are some key differences between them. In the original studies, participants read long news articles about the issues, which might have allowed more time to reflect and to develop counterarguments against the inconvenient facts. In some of the newer studies, participants instead read much shorter statements of fact, which could be harder to argue against. Later studies in Wood and Porter’s paper did embed the inconvenient facts in longer articles, and people were then less likely to correct their misperceptions. But still, no backfire effects.

Another difference is the sample size. One lesson from the methodological revolution in psychology is that scientists need larger samples than we previously thought to get accurate measures. This is a change that Nyhan says he has made in his more recent research, along with preregistering the statistical and methodological approach in his experiments. These are the two main changes I’ve made to my research, too—highly recommended!
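
To see why sample size matters so much, here is a minimal power-analysis sketch in Python (using the statsmodels library). The effect size of d = 0.2 is an illustrative assumption, not a figure taken from any of the papers discussed here; the point is simply that small effects require surprisingly large samples to detect reliably.

```python
# Minimal power-analysis sketch. The effect size below (d = 0.2, a "small"
# effect by convention) is an illustrative assumption, not a value from
# the backfire-effect studies discussed in this post.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Per-group sample size needed for a two-sided independent-samples t-test
# with 80% power at alpha = .05.
n_per_group = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.8)
print(f"Participants needed per condition: {n_per_group:.0f}")  # roughly 394

# Power actually achieved with only 50 participants per condition.
power_small_n = analysis.solve_power(effect_size=0.2, nobs1=50, alpha=0.05)
print(f"Power with 50 per condition: {power_small_n:.2f}")  # roughly 0.17
```

With only a few dozen participants per condition, a study of a small effect will miss it far more often than it finds it, which is one reason larger samples and preregistered analyses have become standard advice.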

Another difference is sample characteristics. Most of the early work on the backfire effect was done on college students, who might be more prone to counterarguing. Of course, the newer studies were also conducted more recently, so perhaps people have learned over time to correct their misperceptions. However, given the present “fake news” political climate, this doesn’t seem likely. Another possibility, suggested by a pair of studies that Nyhan, Porter, Reifler, and Wood ran during the 2016 presidential campaign, is that people have learned to support their candidates despite inconvenient facts: Trump supporters confronted with facts about crime and jobs, central issues in Trump’s campaign, did change their minds—about the issues, but not about their favored candidate.

With more research, hopefully we’ll come to a better understanding of why misperceptions persist and how we can correct them. That process promises to be clearer and faster given the researchers’ openness to the possibility that they could be wrong. Rather than beginning an expensive and protracted effort to figure out whether the original findings hold up, researchers can refocus on the bigger questions at stake: When do people maintain false beliefs, and when do they assimilate relevant information and change their views?