- Concerns about misinformation may be alarmist.
- Studies suggest people see misinformation as a threat largely because they assume other people are gullible.
- The "third-person effect" demonstrates how people overestimate others' gullibility.
- Scientists should study how audiences influence leaders, rather than only the other way around.
In my last post, we looked at what people believe misinformation is. But we haven’t yet explored an important question: why exactly is misinformation a problem? Concern about it is rising, and some scientists are working hard to find methods for reducing its spread. But we shouldn’t take for granted the idea that misinformation is something to be distressed about. Some researchers, including myself, are actively questioning whether misinformation is actually causing widespread harm, or whether we’re overly concerned about a minor nuisance.
Before reading further, pause and ask yourself: do you think we should be concerned about the existence of misinformation? Why or why not?
The Third-Person Effect
In a preprint entitled Misinformation Is a Threat Because (Other) People are Gullible, researchers Sacha Altay and Alberto Acerbi argue that concerns about misinformation are alarmist, and they find evidence that this overblown panic stems from a psychological bias called the third-person effect. That is, individuals are relatively confident in their own ability to identify and resist falsehoods, but at the same time tend to overestimate the gullibility of other people. Indeed, the third-person effect was the strongest predictor of whether people believed misinformation was a big problem, more so than the belief that misinformation is generally difficult to deal with, or general anxieties that the world is a dangerous place. This pattern emerged for participants both in America and in the UK.
Altay and Acerbi also found that people with stronger concerns about misinformation were more likely to share articles on social media (e.g., Facebook) highlighting the dangers of misinformation. This is unsurprising. But they go on to describe how sharing exaggerated concerns about misinformation ironically contributes to the same problem. For example, when people are told about the dangers of “deepfake” videos, which appear realistic but are wholly fabricated, they become more skeptical about the veracity of all videos they see, even authentic ones!
Overestimating Persuasive Messaging
A related reason people panic about misinformation is that we tend to overestimate the power of propaganda. We (falsely) believe that mainstream advertising, political messaging, and conspiratorial ideas have an overriding effect on our minds. People talk about misinformation as if it’s a pandemic, like the actual pandemic (COVID-19) we’re currently experiencing. This way of thinking is very revealing because it implies that simply being “exposed” to falsehoods will “infect” us, just like a virus would infect our bodies. This Tweet is a good example — the author uses medicalized terms such as “prophylactic & therapeutic interventions.” But this is not an accurate way of thinking about misinformation given what we know about how our minds work. Misinformation is where psychology diverges from biology.
Imagine if you could be immune to germs based on what you believe about those germs. That would be amazing, right? Unfortunately, it doesn’t work that way. Even those who believed that COVID-19 was a hoax were just as likely as the rest of us to get sick. But when the human mind is exposed to falsehoods, its preexisting beliefs matter a lot in determining whether those falsehoods are accepted. Cognitive scientists like Hugo Mercier argue that humans have built up strong cognitive defenses against dangerous beliefs, and that most people are not very gullible. In a review of studies in this area, he claims that “communication is much less influential than often believed—that religious proselytizing, propaganda, advertising, and so forth are generally not very effective at changing people’s minds.” We tend to forget that it’s extremely difficult to persuade others of anything, let alone weighty topics in politics or religion.
A good example of this misperception occurred recently with the very popular Joe Rogan Experience podcast. Rogan has a huge audience, with tens of millions of listeners, and he was heavily criticized for hosting conspiracy theorists like Robert Malone who made outrageously false claims about COVID vaccines. Musicians (and the U.S. Government) put pressure on Spotify to rein in this misinformation, ostensibly because Rogan and his guests would influence listeners to avoid getting vaccinated. But polling data revealed that a majority of devoted Rogan listeners had received the COVID vaccine (to say nothing of casual listeners, among whom vaccination rates are likely even higher).
Let’s Study “Audience Capture”
While people tend to believe that dominant leaders (e.g., celebrities, politicians) influence their audiences in a one-directional manner, it may be that audiences actually exert more influence on leaders than the reverse. Audience capture is a non-scientific term used by some public figures to describe the pressure they face to deliver more extreme content to their audience, which may increase their fame or profits. Essentially, there’s a perverse incentive structure in the flow of communication between leaders and followers. Popular demand for inflammatory statements or falsehoods may cause public figures with large platforms to adopt increasingly fringe positions. These figures then attract a smaller but more passionate set of followers with similarly fanatical or anti-establishment views. I suggest scientists take this phenomenon more seriously and investigate under what circumstances public figures may be influenced by their followers, rather than just assuming that masses of sheeple will mindlessly follow charismatic pied pipers.
References

Altay, S., & Acerbi, A. (2022). Misinformation is a threat because (other) people are gullible. Preprint. DOI: 10.31234/osf.io/n4qrj

Bak-Coleman, J. B., Kennedy, I., Wack, M., Beers, A., Schafer, J. S., Spiro, E. S., ... & West, J. D. (2022). Combining interventions to reduce the spread of viral misinformation. Nature Human Behaviour, 1-9.

Mercier, H. (2017). How gullible are we? A review of the evidence from psychology and social science. Review of General Psychology, 21(2), 103-122.

Ternovski, J., Kalla, J., & Aronow, P. (2022). The negative consequences of informing voters about deepfakes: Evidence from two survey experiments. Journal of Online Trust and Safety, 1(2).