
How We Believe Lies Despite the Obvious Truth

Research shows how expectation and motivation shape acceptance of lies.

The psychology of disinformation is more important now than ever. Our fates literally depend on how we navigate the information landscape. Because of social media and information technology, and how tightly glued we are to our screens and scrolls, the innate capacity of the human mind to fool itself is amplified manifold.

Deception and illusion serve important evolutionary purposes, allowing us to maintain a strong sense of self and imagine optimistic outcomes despite the presence of daily existential threats and, sometimes, aspects of ourselves we'd prefer to approach with caution. However adaptive our capacity to envision the future is, it also makes us susceptible to lies and manipulation, both deliberate and inadvertent.

For individuals, self-deception that supports a positive sense of self needs to be balanced with frank self-appraisal for optimal living. In culture wars and in politics, whoever controls the conversation (Guastello, 2007) most effectively emerges as leader, shaping the indeterminate future according not only to present truths but perhaps more so to the future envisioned, a future that turns out to be plastic and subject to all kinds of unconscious influence. Imagine a world in which we have more control over the information context to ensure accuracy.

Leveraging the "-ish" of Truth

How we come to believe and justify lies is elucidated in complex research published in the Journal of Personality and Social Psychology: Attitudes and Social Cognition (Helgason and Effron, 2022).

The authors devised an ingenious series of six studies designed to test factors contributing to how people can be primed with information about possible outcomes that makes lies believable, morally acceptable, and more likely to be liked and shared on social media. Across the studies, there were over 3,600 participants from 59 countries. A few concepts are key to understanding how the authors probed dishonesty.

First is the distinction between “verbatim” truth and “gist” truth. Politicians often say things that are not exactly true, for example citing statistics that exaggerate the numbers, but that still point in the right direction, such as CEO salaries being much higher than employees'. They may not be 500 times higher (per the example from the original paper), but 265 times higher is heading in that direction. The gist is true, or "true-ish," close enough to capture the sense of it. This works because many of our decisions are based on probability (experience-informed guesses), in which close enough is usually good enough. For evolutionary purposes, erring on the side of belief can be life-saving: There isn't always a saber-toothed tiger hiding in the bushes, but there might be.

Next, the concept of "prefactuals"—information before the facts are known—is key. Prefactuals are ideas that influence how we interpret the truth value of what we hear depending on expectations. In other words, if we think something will be true later on, that "prefactual" perception may change what we believe today. If we believe that CEO salaries are likely to rise, that prefactual may influence whether we believe the verbatim lie via changing the perceived gist of what it means.

Why do prefactuals shape whether we believe lies, and whether we excuse them or condemn them? Psychologically, two capacities are key: mental simulation, our ability to imagine different possibilities, and moral flexibility, the extent to which we hold on to values or allow them to bend depending on circumstances and our own gain. In addition, confirmation bias plays an important role: We tend to look for information that supports what we suspect and discount what contradicts what we want to be true.

Research Design

In each of the studies, participants were presented with scenarios of false information and primed with suggestions about what might become true in the future (as contrasted with neutral suggestions). Level of belief in the falsehood was measured along with the extent to which lies were deemed unethical and the impact of the gist on moral perception of lying.

Each subsequent study added a layer of detail: introducing Republican vs. Democrat prefactuals to see whether fit with personal motivations would influence acceptance of dishonesty, testing whether the vividness of future simulation influenced belief in lies and ratings of unethicality, and measuring how likely participants would be to like and share false statements on social media.

There were several key findings.

1. Basic prefactual effect. When people expected that something might become true in the future, they were more likely to excuse a falsehood in the present. Across the studies, this effect held for lies about commercial products, falsehoods on résumés in job applications, and claims about divisive political issues.

2. Ethical determination. Believing that lies would become true blunted moral condemnation. Participants who were primed to believe a lie was likely to become true were less likely to hold others accountable for spreading lies on social media and more likely to share disinformation themselves. The more strongly the gist was felt to be true, the stronger the prefactual effect.

3. Plausibility and vividness. When participants imagined prefactuals more vividly and believed there was a good chance of the facts changing, they were less likely to judge lies as unethical, because the gist of the statement was experienced as true even if the facts weren’t quite right. The more plausible and vivid the prefactuals, the stronger the truth-distortion effect.

4. Motivated reasoning. The effect was even more powerful when the prefactual fit well with individual beliefs and motivations, both because of the greater reward if the lies were to come true (motivated reasoning) and because the more vividly people can imagine things, the more they think they are true (regardless of whether they actually are). Although the authors did not mention it in their discussion, the “illusory truth” effect may also be important: People are more likely to believe something is true merely through frequent repetition of a lie.

5. Counterfactuals. The above effects were present even when people were presented with information that directly contradicted the verbatim and gist-based falsehoods. The results help explain why, even when people do receive correct information, they continue to believe what is false.

Momentous Implications

Pulling together the findings, the research showed that when people’s beliefs (e.g., political) were strong, they could more vividly imagine lies becoming true in the future (prefactuals), which in turn led them to rate the gist as truer, and therefore to see the falsehood as less unethical, finally leading to differences in behavior, including a greater tendency to give known lies a pass on social media. Sound familiar? It is more or less exactly what we’ve been witnessing over the last several years, an amplification of more conventional propaganda.

Helgason and Effron's findings weave a cautionary tale about susceptibility to bias and overt deception, about how easily we can be convinced not only that lies are more or less true but also that potentially harmful lies deserve a moral pass. Whether we see the other person as malicious or ignorant in our political partisanship, our preconceived notions influence what we believe is real.

The work also suggests ways in which we might safeguard ourselves, for example identifying prefactual thinking in messages we hear, attending to whether we are looking at the facts or merely the slippery gist, and frankly recognizing the way our own preferences and biases allow us to believe what fits with our preconceived notions.

The research is important, as it shows how relatively easily we can be manipulated by rhetoric. We are profoundly unfamiliar with the information environment in which we live and the ways our own psychology plays a role in what we decide is real and true. By becoming aware of the mechanisms of distortion, we can safeguard against unwanted influence and hold ourselves more accountable.

References

Guastello, S. J. (2007). Non-linear dynamics and leadership emergence. The Leadership Quarterly, 18. https://doi.org/10.1016/j.leaqua.2007.04.005

Helgason, B. A., & Effron, D. A. (2022). It might become true: How prefactual thinking licenses dishonesty. Journal of Personality and Social Psychology. Advance online publication. https://doi.org/10.1037/pspa0000308

Note: An ExperiMentations Blog Post ("Our Blog Post") is not intended to be a substitute for professional advice. We will not be liable for any loss or damage caused by your reliance on information obtained through Our Blog Post. Please seek the advice of professionals, as appropriate, regarding the evaluation of any specific information, opinion, advice, or other content. We are not responsible and will not be held liable for third-party comments on Our Blog Post. Any user comment on Our Blog Post that in our sole discretion restricts or inhibits any other user from using or enjoying Our Blog Post is prohibited and may be reported to Sussex Publishers/Psychology Today. Grant H. Brenner. All rights reserved.
