Differing Ideas on What “Misinformation” Actually Is

Research shows that “misinformation” means different things to different people.

Key points

  • People are very concerned about misinformation.
  • When researchers ask people what "misinformation" is, they get diverse responses.
  • The lack of consensus about what misinformation is will make it even more difficult to address.
Source: Gerd Altmann/Pixabay

The vast majority of Americans have become increasingly concerned about “misinformation.” This concern is linked to broader worries about social media users and the companies that run those platforms.

But what exactly is “misinformation”? Is it any kind of statement that’s objectively false? Does it have to be political (e.g., about an election), or can it be about anything, like the existence of Bigfoot or aliens? Could it include factually true information presented out of proper context, or a biased, one-sided presentation of facts? And what happens when things initially believed to be false are later revealed to be true (or inconclusive)? Who decides what gets classified as “misinformation”?

These are all important questions, and it may be helpful to look at some of the psychological research on this topic. After all, no matter how adept we are in distinguishing truth from fiction, that won’t matter if we have different ideas about what “misinformation” really is.

Intentions

According to a recent analysis based on data from representative samples in the United States, United Kingdom, Russia, and Turkey, most people believe that when it comes to true/false information, intention matters. That is, lots of statements or claims can be false, but about 70 percent of participants surveyed believe that for something to be considered “misinformation,” there needs to be an intention to mislead or deceive.

This is consistent with other studies on how people judge moral wrongness and their own experience of pain and suffering. Intentions matter a lot. In a laboratory experiment by Kurt Gray and Dan Wegner, people received mild electric shocks while being told that this was being done to them either intentionally by someone in an adjacent room or accidentally. Participants reported feeling greater pain when they believed the shocks were being given intentionally.

But this view of misinformation may not be very useful in helping us identify it in the world, since very rarely (if ever) do people who say false things also admit that they are deliberately trying to mislead or deceive others. If people are being dishonest about misinformation, then we’ll probably never know their true intentions.

Interestingly, some respondents believed that unintentionally (accidentally) misleading statements could still count as “misinformation,” and still others believed that misinformation could be created without caring whether it’s misleading or not. Still, only about 30 percent of respondents agreed with either of those ideas, which means that most people would not characterize misinformation in those ways.

Sources of Information

Most respondents (60 percent) agreed that something is “misinformation” if scientific evidence indicates that it’s incorrect. Next, about half (50 percent) of respondents said “expert groups” could reliably indicate whether something is misinformation, although it wasn’t clear how the researchers or participants defined what an “expert group” was. Less than half of respondents (30–40 percent) trusted their own instincts/beliefs, but, overall, people still trusted their own instincts more than mainstream media reports (30 percent) and the views of other people they know personally (20 percent).

Context of Information

In the United States and the United Kingdom, nearly 60 percent of people agreed that something would be misinformation if it “exaggerates” the facts. But only 40 percent in Russia and Turkey agreed with this idea. Almost 50 percent of people agreed that something could be misinformation if it doesn’t represent the “full picture.” About 40 percent agreed something is misinformation if an opinion or rumor is being presented as a fact, and just under 40 percent said they would suspect something is misinformation if it’s on a topic for which misinformation is known to be a problem.

Overall, respondents were not particularly swayed by contextual factors when judging misinformation. The study authors noted that none of these factors scored higher than 50 percent agreement, although a substantial minority of respondents did agree that something would seem fishy if a statement based on facts was exaggerated or didn’t capture the full picture. Thus, in the eyes of some people, misinformation includes not just totally false ideas, but also true ideas that are not presented with an appropriate level of nuance.

It is heartening to see that most people trust the validity of scientific evidence to prove whether something is true or false. However, the authors pointed out that there are some instances in which a clear consensus among scientists does not exist yet (as is the case with the origin of severe acute respiratory syndrome coronavirus 2 [SARS-CoV-2]), and yet misinformation could still emerge for this topic. In addition, what is considered “misinformation” would likely shift over time as more facts are revealed and a consensus is built. It may have been premature for some to label the “lab leak hypothesis” as misinformation in the spring of 2020.

One thing is clear. There’s a lot of wiggle room in the minds of average people around the world about what constitutes “misinformation.” This is a big problem for our shared epistemic reality. If we can’t agree on how to identify and deal with false information, it will be easier for bad-faith actors to wreak havoc.

But a greater problem may lie within the masses of people who will use their own personal definitions of misinformation to stifle or suppress ideas they find distasteful. We already have other evidence that it is common for people to try to censor others’ viewpoints, even if those viewpoints are trivial or inoffensive. If people attempt to suppress ostensibly “harmful” speech under the guise of fighting “misinformation,” it will be even more difficult for us to have important conversations about a variety of topics.

Two Suggestions About Misinformation

Moving forward, I have two suggestions:

  1. We should all work together to develop a common, shared definition of what qualifies as “misinformation.” This is not only a job for behavioral scientists, but also for journalists, historians, and others.
  2. Until we accomplish this, it may be useful to keep reminding ourselves that the idea of misinformation is very slippery. It means different things to different people. So if you come across something (especially in the news or on social media) that triggers a “spidey sense” for misinformation, it may be helpful to pause and try to cultivate a spirit of humility and compassion.

References

Gray, K., & Wegner, D. M. (2008). The sting of intentional pain. Psychological Science, 19(12), 1260-1262.

Osman, M., Adams, Z., Meder, B., Bechlivanidis, C., Verduga, O., & Strong, C. (2022). People’s understanding of the concept of misinformation. Journal of Risk Research, 1-20. https://doi.org/10.1080/13669877.2022.2049623

Dylan Selterman, Ph.D.