Can We Stop the Spread of Misinformation?
Misinformation may infect social media, but maybe we can inoculate ourselves.
Posted July 29, 2019
Fake news, misleading memes, and conspiracy theories in social media. Is there any way to stop people from spreading misinformation and disinformation?
Social media should be a lovely place to watch cat videos, see pictures of your friends visiting interesting places, and catch up on news of the world. But I worry about the spread of misinformation. Some of what we see on social media (and on news sites) is false. Why is there so much misinformation on social media? We all probably contribute. When people encounter misleading information in their feeds or in the news, they may believe it and decide to share it. Their friends see the misinformation, and they share it too.
Misinformation develops a life of its own, infecting social media. But if misinformation is like a virus infecting social media, are there ways to inoculate ourselves? Are there Internet antivirals to eliminate the contagion of misinformation?
Stopping the spread of misinformation will be a serious challenge. We must first understand the nature of the problem: Some people intentionally spread misinformation. For some people, spreading misinformation is a choice, and often a profitable one. For others, spreading misinformation is a political job—in these cases, you should think of it as disinformation. The goal of political bad actors is often to discourage participation by overwhelming people with misleading information.
But on their own, these bad actors should be hidden in the hard-to-find corners of the Internet. Of course, in reality, there are no hard-to-find corners of the Internet. Google will find it for you. YouTube will cue up misleading information and conspiracy theories as the next suggested video you should watch. All of us will encounter some false information. And much of that false information will have been intentionally planted.
The real problem is the spread of misinformation. Yes, there are people intentionally planting and promoting lies. But each of us may be what Kate Starbird (2019) calls an unwitting agent. Each time we decide to share a piece of misinformation, we contribute to the spread. And people are constantly sharing misinformation. More than likely, you will see the same false news several times in your social media, as many of your friends decide to share that information. The repetition of false information will make that information feel truer (something called the illusory truth effect). If the misinformation starts to feel true, you may decide to share.
Thus you become an unwitting agent of the people trying to spread misinformation and disinformation. You may help the Russians undermine our democracy; you may support the people trying to stop us from acting on climate change; you may lead parents not to vaccinate their children.
We all think we are careful critical thinkers. Everyone believes that other people are vulnerable to fake news. We are confident that we wouldn’t do this. Unfortunately, none of us are immune to this virus. We can all spread misinformation.
How do we stop this?
I am going to make two suggestions. The first is somewhat obvious. Social media and other Internet companies need to be more careful. For example, YouTube should not cue up videos with information that is known to be false, and Google really should help move that information into the hard-to-find corners of the Internet. Journalists also play a role. They should be careful about what information they promote and not simply repeat information. Lots of cognitive psychologists make these sorts of arguments (see Lewandowsky et al., 2017; Hyman & Jalbert, 2017). But this won’t be enough. We’ll still encounter some misinformation. We’ll still run the risk of spreading that misinformation and becoming unwitting agents.
The second suggestion is something for all of us. Let’s face it. We generally aren’t careful about what we choose to share from social media and news sources. We aren’t careful because we aren’t in critical thinking mode when looking through our social media feeds. We are laughing at the cat and dog videos. We are happy to see our friends and family enjoying life. I hit the like button for most photos of friends with their children. I share information about interesting local events. When I’m doing this, I don’t always turn on my critical academic self.
But we can be more careful and limit the effects of misinformation. Ayanna Thomas (2019) has found that people can limit the use of misinformation in some classic eyewitness memory studies. The simple intervention is to not require answers. Instead, encourage people to withhold an answer if they aren’t sure. When answers are optional, people are more likely to evaluate information, and when they evaluate, they are less likely to report misinformation they’ve encountered. This seems easily applicable to social media. We need systems that encourage people to be more evaluative before sharing news on social media. Part of that may include changes in social media platforms.
Essentially, we need to slow people down. When people slow down, they do a better job of distinguishing fake and true information (Bago, Rand, & Pennycook, 2019). If social media requires a few extra steps, if people are asked to evaluate the information they are about to share, then they may be less likely to share misinformation.
I have certainly tried to be evaluative before sharing news on social media. I read articles rather than sharing based on a headline. If something seems too good, or too in line with my existing views, then I check its veracity before choosing to share.
Keep sharing the cat videos. Like and share pictures of life events from friends and family. But we have to be thoughtful before sharing news that we stumble across. Some of that news will be misinformation. We have to stop the spread of this virus.
Hyman, I. E., Jr., & Jalbert, M. C. (2017). Misinformation and worldviews in the post-truth information age: Commentary on Lewandowsky, Ecker, and Cook. Journal of Applied Research in Memory and Cognition, 6, 377-381. doi:10.1016/j.jarmac.2017.09.009
Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6, 353-369.
Starbird, K. (2019, July 24). Disinformation’s spread: Bots, trolls, and all of us. Nature.
Thomas, A. (2019, May). Avoiding memory inaccuracies by exercising control: The impact of metamemorial processes. Psychological Science Agenda.