Fact-Checking Is Ineffective Where It Counts
Why psychology beats authority.
Posted May 3, 2019
“I take it on faith that doing more of it will have a positive effect on democracy. I would love some evidence that it’s true. If you have any that it’s not, I’m not going to pay any attention to it.”
—Brooks Jackson (co-founder of FactCheck.org)
Fact-checking has become an article of faith in the era of dueling facts. The problem is that many psychological mechanisms work against it. And the available evidence indicates that it does not change perceptions. If fact-checkers insist that the rest of us follow the evidence, why won’t they admit that fact-checking is not working?
Studies indicate that citizens resist fact-checking messages that oppose their prior beliefs, especially if the appeals clash with their partisan leanings. Perhaps even more interesting are findings suggesting that attempts at correction may actually induce greater confidence in the original misperception (the backfire effect). And even apparent corrections may mask the persistence and re-emergence of prior beliefs, or what have been called “belief echoes.” While many advocates and scholars maintain stubborn support for the fact-checking industry, their views of its efficacy are more hopeful than empirical.
Fact-checkers are fighting against two dominant problems in how ordinary people react to their reporting. The first is the broad set of psychological mechanisms that support resistance to opposing information (which are especially strong among more educated citizens). If voters were rational consumers of information, fact-checking might work. But we know that non-rational and psychological forces abound. Perhaps the lead actors are selective cognition, group conformity, and motivated reasoning. Humans process information in a highly selective way. We can’t give equal weight to the millions of possible inputs, so we only pay attention to some, accept some, and remember some. The information we select tends to be the pieces that reinforce our prior views and positive perceptions of ourselves; the rest we tend to overlook.
Conformity to the beliefs of our group is another core mechanism that warps our perceptions. Social proof is the term for relying on the judgment of others when we are in doubt. And the others we trust the most are the ones who are like us. But the real heart of conformity is the price paid for disagreement. As Somerset Maugham phrased it in The Moon and Sixpence, “The desire for approbation is perhaps the most deeply seated instinct of civilized man.” The most famous psychology study of conformity (Solomon Asch’s line-judgment experiments) was not about dress or language or politics, but about conformity of perceptions.
Motivated reasoning—the tendency to shift perceptions toward ones that fit our political identities—combines these cognitive and social effects. Mental errors are common and social conformity is rampant, but they are even more powerful if they have a political motive behind them.
Even if a skilled individual overcame all of these tendencies, they still might reject fact-checking for a different reason. The second problem that fact-checkers face is the impression that they employ illegitimate methods or are ideological partisans (which is also especially strong among the more educated). A Rasmussen poll from September 2016 illustrates the prevalence of distrust: “Do you trust media fact-checking of candidates’ comments, or do you think news organizations skew the facts to help candidates they support?” Only 29 percent expressed trust in fact-checking.
Why don’t voters trust fact-checkers? Perhaps because they shouldn’t. One of the first studies of fact-checking that we conducted was on whether the different major fact-checkers—PolitiFact, Factcheck.org, and The Fact Checker of The Washington Post—actually agreed with each other. It is hard to test whether the fact-checkers are providing us with the accurate facts, but at the very least they should agree with each other. Otherwise one of them is clearly wrong. What we found was that often they do not agree on the prevailing facts. A more recent study by Chloe Lim in Research & Politics comes to similar conclusions about a wider range of fact-checks. The inherent subjectivity of the enterprise, combined with a lack of established standards for identifying and weighing evidence, means that fact-checkers will continue to disagree even among themselves.
This doesn’t mean that fact-checking has no positive effects. Recent empirical studies have demonstrated that fact-checking efforts (when voters read them) may actually alter perceptions of a candidate’s truthfulness. Attention to fact-checks can also influence whether citizens believe negative claims about politicians. And perhaps most importantly, fact-checking seems to be able to discourage politicians from making unsubstantiated claims. What fact-checking does not do is alter perceptions of prevailing facts when they are in dispute.
The evidence says that fact-checking is not effective in altering public perceptions of facts, yet many journalists and scholars (and especially journalism scholars) remain devout believers. Perhaps the clearest reason that fact-checking fails is that there is a low cost to continuing to believe as one likes. On the other hand, there can be a high cost to our self-perception if we admit that our prior beliefs are wrong (and that the values that support them are questionable). Perhaps the highest cost is in social status if we disagree openly with the perceptions of our peer group or the social network we rely on for aid and comfort. Of course the same applies to journalists and scholars: There is a low cost to insisting that fact-checking works. And there is a high cost to expressing the truth to other professionals who want to believe in the gospel. Saying that fact-checking does not work (or that education may not work) does not earn applause from other academics. So many choose to continue cheerleading even though they know the evidence does not support it. The dueling facts about fact-checking are as psychologically driven as other perceptions.