- A 2006 meta-analysis estimated that people's average accuracy in identifying lies was about 54 percent, slightly better than chance.
- A subsequent 2014 meta-analysis found that factors like the liar's motivation did not affect detection accuracy.
- Some studies suggest that people can detect lies more accurately by adopting appropriate methods, but the gains are modest.
Sterling: "You can just tell."
Charlie: "Just that something is off. That's the best way to describe it, I can just tell."
Sterling: "—when anyone is lying, 100 percent of the time?"
In the first episode of Peacock’s murder-mystery-of-the-week drama "Poker Face," the show’s main sleuth (and friend to an astonishing number of future murder victims) Charlie Cale (played by Natasha Lyonne) explains to her boss, Sterling Frost, Jr. (Adrien Brody), how her signature gift of detecting when anyone is telling a lie works. It’s not a trick; it’s like an intuitive feeling.
Charlie’s gift is basically just a plot device that adds a fun layer of dramatic tension to every conversation she has with a suspect. Some of them know about her ability and have to carefully choose their words, dancing around the truth to avoid triggering her lie detector; others inadvertently expose themselves as liars but for reasons that Charlie doesn’t understand at first.
To be clear: The show never makes any serious psychological claims about how Charlie does what she does. But it’s still fun to ask: Is what Charlie does really possible? What has psychology research learned about the accuracy of people’s lie-detection abilities?
People Are Pretty Poor Lie Detectors
Across numerous studies, people have shown themselves to be lousy lie detectors. For instance, a 2006 meta-analysis of 206 studies conducted by Charles Bond, Jr. and Bella DePaulo, encompassing a total of about 24,000 truth or lie judgments, found that the mean accuracy across studies was around 54 percent.
To put this figure in context, most lab studies of lie detection test people in a situation where half the statements they hear are lies, so chance performance is 50 percent. In other words, people are better than chance, but only slightly. When the researchers analyzed judgments of true and false statements separately, they found that people were more accurate at identifying truths than lies.
People Aren't Better Lie Detectors in More Realistic Settings
One criticism you could level at much past research on lie detection is its artificiality: studies are frequently conducted in lab settings, often with students, asking people to lie about things they have no stake in, or to catch other people lying about things that carry no real consequences.
This is partly what motivated a later 2014 meta-analysis of 125 studies by Maria Hartwig and Charles Bond, Jr. For this analysis, they focused only on studies in which judges had access to more than one cue to deception, like tone of voice, speech patterns, or gestures. And they tested whether factors like the participant population (student or non-student) or the liar's motivation affected accuracy. For example, in some studies, participants watched videos of actual criminal investigations in which people lied about their involvement to avoid arrest.
Surprisingly, they found that none of these factors seemed to affect accuracy. People were just as poor at detecting lies told by highly motivated criminals as they were at detecting lies told by apathetic students getting paid to do a university lab study.
Some Lie Detection Methods Work Better Than Others, But Not Intuitive Ones
There is some evidence that people can be trained to become better at detecting lies, but not in the intuitive way that Charlie Cale does in "Poker Face."
A 2020 meta-analysis by Erik Mac Giolla and Timothy Luke analyzed 23 past studies of the cognitive approach to lie detection. The idea behind the cognitive approach is that telling a lie is more cognitively demanding than telling the truth, so an interrogator can root out a liar by asking probing questions or asking someone to recount a story backward. (For a more detailed summary of the analysis, see this review by Arash Emamzadeh on Psychology Today.)
They found that the mean accuracy across studies of cognitive lie detection was about 60 percent, a modest improvement over the 54 percent figure found earlier.
The important point here is that the cognitive approach isn’t about getting a “feeling” from how someone is acting. It’s about exploiting a cognitive limitation that burdens liars more than truth tellers and then looking for its effects. Unfortunately, these effects are not very reliable or easy to detect.
The researchers found that the accuracy boost was larger when participants were trained in what to look for than when they were novices. But in the subset of studies that directly compared the cognitive approach against a control group that didn’t use it, the approach offered no accuracy benefit.
Why We Usually Don't Need to Be Charlie Cale
Charlie Cale is a fantasy. It seems fair to estimate that people’s accuracy at judging truth and lies is around 50 to 60 percent. But how can we walk around having productive interactions with people if we can judge the truth of what they say with only 60 percent accuracy?
The answer has to do with base rates. As I noted earlier, nearly all studies of lie detection assume half of all statements are lies. That might be true in Charlie’s world, but it’s probably not true in yours. As Bond, Jr. and DePaulo point out, the fewer lies you encounter, the more accurate your judgments are going to be, especially because people are generally better at identifying truths than lies.
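The base-rate point can be made concrete with a little arithmetic. This sketch assumes illustrative per-category accuracies (truths judged correctly more often than lies, consistent with the truth bias described above; the specific 61/47 percent split is an assumption for the example, not a figure from the article):

```python
def overall_accuracy(lie_rate, truth_acc=0.61, lie_acc=0.47):
    """Expected overall accuracy when a fraction `lie_rate` of statements are lies.

    truth_acc / lie_acc are illustrative per-category hit rates: how often
    truths and lies, respectively, are classified correctly.
    """
    return lie_rate * lie_acc + (1 - lie_rate) * truth_acc

# Typical lab setup: half the statements are lies.
print(round(overall_accuracy(0.5), 3))   # 0.54  (the meta-analytic ballpark)

# Everyday life: suppose only 1 in 10 statements is a lie.
print(round(overall_accuracy(0.1), 3))   # 0.596 (accuracy drifts toward 60%)
```

With the same underlying skill, simply encountering fewer lies pushes overall accuracy up, because the judgments people are better at (spotting truths) come up more often.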
In other words, lie detection is hard, but it’s a lot easier when nobody is lying.