
How Fact-Checking Is Flawed

The perils of uneven standards.

My inbox often includes a “PolitiFact Truth Rundown.” I have been critical of the fact-checking industry as unsound, but I still look at its offerings. A recent rundown on Dan Crenshaw, a member of Congress from Texas, illustrates a major flaw in fact-checking, one that generates a great deal of distrust.

Crenshaw lost an eye serving as a Navy SEAL in Afghanistan. He drew widespread attention for his good humor when he appeared on Saturday Night Live to accept an apology from Pete Davidson for a disrespectful joke. More recently he appeared on The View and made this statement about asylum seekers coming across the Mexican border:

“As it turns out, about 80 to 90 percent of those don’t have a valid asylum claim, once we actually get their documentation.” PolitiFact rated this statement False.

The interviewer immediately interjected, “That fact, what you are saying, actually has been debunked.” Crenshaw responded, “It has not been debunked. I’m sorry, no it has not. I deal with the Department of Homeland Security and these are the numbers that are coming out… Explain why those are false then. You’re just saying they’re false, you’re not providing evidence they are.”

Why does PolitiFact endorse The View and dispute Crenshaw? The PolitiFact report admits that he was citing the figures from Homeland Security accurately:

“The truth is, about 20 to 30 percent of asylum requests have been granted annually since 2009” and “asylum was granted in 16% of cases that originated from a credible fear claim.”

But here is the catch: “Experts said that does not mean that the remaining 70 to 80 percent of cases are invalid.”

According to PolitiFact’s experts—an analyst at the Migration Policy Institute and two law professors—the remaining claims were not necessarily invalid, but could have been dismissed for other reasons even though there was truth to them. So “we rate this claim False.”

In other words, Crenshaw’s experts were the administrators and judges who rejected the asylum requests. His definition of “valid” is what an immigration judge said passed the standard. PolitiFact's experts offer a different definition of “valid” and think many of the other cases might have merit as well.

Who is right? Whom do you trust? More importantly, what standard should PolitiFact apply?

One of the core flaws of fact-checking is that the answer depends entirely on which frame the fact-checkers choose. They could ask two distinct questions:

  • Can this statement be considered to be true?
  • Can this statement be considered to be false?

If they asked the first question about Crenshaw’s statement, it clearly could be considered to be true. He is relying on government data and the government’s definition of validity: a judge accepts the claim “once we actually get their documentation.”

If they asked the second question, Crenshaw’s statement could be challenged by experts who prefer a different interpretation of validity and who therefore question the government’s data.

And if we pay attention to fact-checkers over time, we see them sometimes ask whether something could be considered true, demonstrate that it could be, and report that the statement is honest. Other times we see them ask whether something could be considered false, demonstrate that it could be, and report that the speaker is lying.

The problem is that they could have reversed the standards and thereby reversed the conclusions. Fact-checkers apply no consistent question or standard. Which approach they take seems to be dictated by their initial perception: a statement that appears at first blush to be true gets the question “could it be true?”, while one that appears fishy gets “could it be false?” In this way the process is untrustworthy. (For a very good discussion of this problem, see Ben Rowen in Pacific Standard.)

The psychology behind this flaw in fact-checking centers on confirmation bias: the tendency to seek and accept only the evidence that confirms our first inclination. Belsky and Gilovich describe it: “Once you develop a feeling about a subject—no matter how unconscious that preference may be—it becomes hard to overcome your bias” (page 134). They phrase the problem identified above as the difference between asking “can I believe this?” and “must I believe this?” When people want to believe something, they ask the first question; when they don’t, they ask the second, and they reach different conclusions from the same evidence.

It is important to remember that fact-checkers are not just asking themselves a question; they are posing questions to experts, and it matters very much which question an expert is asked. “Is there evidence this statement is true?” and “Is there reason to think this statement is false?” will draw different responses from the same expert.

This is especially true for fact-checkers who publish only negative findings (like FactCheck.org). If fact-checkers at FactCheck.org believe a statement to be false but discover in the course of research that it is actually true, the process stops and they move on; all of that effort is wasted. So they begin only from the perspective of thinking (hoping) a statement is false. This working condition is highly likely to bring all of the problems of confirmation bias into play. Professional fact-checkers respond that they are trained professionals, not subject to the psychological mechanisms of ordinary people. But that is nonsense of the kind fact-checkers should reject.

How could fact-checkers address this problem? They could be honest about the questions they ask—to themselves and to experts. And honest about the questions they do not ask. And why they pick one over the other. Or they could present both questions and see where that leads their analysis.

If they did, trust in fact-checking would grow.

Note: Inconvenient Facts will be moving to a fortnightly format, appearing every other Friday beginning 21 June. Thanks for reading.
