Cognition
Misinformation, Social Media, and the Method of Assertion
What are the crucial factors contributing to the spread of misinformation?
Posted July 8, 2024 | Reviewed by Monica Vilhauer, Ph.D.
Key points
- Much misinformation involves nothing more than making stuff up.
- Asserting something, no matter how effectively, does not make it true.
- Humans’ conformity bias makes them complacent about the transmission of misinformation via social media.
When advance copies of her book, No Going Back, were circulated, the greatest uproar surrounded South Dakota Governor Kristi Noem’s account of shooting one of her dogs, which she included to illustrate her ability to make difficult decisions. What that uproar says about the media is unclear, though, given that Noem also stated that she met with North Korean dictator Kim Jong Un during the years when she served on the House Armed Services Committee. Foreign affairs experts quickly pointed out that this was not true. Noem’s spokesperson, Ian Fury, described this as one of a few “small errors” that arose because Kim Jong Un’s name was mistakenly included in a list of leaders with whom Noem had interacted. Fury stressed that Noem’s ghostwriter had been misinformed.
Fury was, in effect, characterizing this error as what Paul Thagard describes as misinformation in his insightful new book, Falsehoods Fly: Why Misinformation Spreads and How to Stop It. Thagard adopts the U.S. Surgeon General’s account of misinformation as “information that is false, inaccurate, or misleading according to the best available evidence at the time.” Misinformation should be distinguished from disinformation, which involves “misinformation that is spread deliberately by people who know it is false.” In that advance copy of her book, Noem remarked that “I remember when I met with North Korean dictator Kim Jong Un. I’m sure he underestimated me, having no clue about my experience staring down little tyrants (I’d been a children’s pastor after all).” Since, presumably, Noem did not intend to propagate disinformation (because, among other things, it would have been sure to elicit the sort of reaction that the advance copy, in fact, managed to elicit anyway), it appears her ghostwriter had considerable leeway. Thagard supplies a refreshingly straightforward category for the sort of misinformation that Noem’s ghostwriter produced. Thagard calls it “making stuff up.”
The Method of Assertion
Making stuff up often works because, as Thagard notes, people tend to believe what they are told unless something is clearly untoward either about the statement itself or about the person making it. Since, currently, so many get so much of their information from social media, people typically haven’t a clue about who originally advanced most of the claims that they encounter. Consequently, just as with hearsay, unless the stuff that people make up is obviously faulty, recipients are unlikely to bring their critical faculties to bear. Thagard observes that one of the less attractive features of the new AI systems is precisely their ability to make things up (known as AI “hallucinations”).
Making stuff up is a species of what the father of cognitive psychology, Ulric Neisser, targeted (in a personal communication) when he coined “the method of assertion.” People assert things. Sometimes they assert them confidently. Other times they assert them amusingly, or calmly, or firmly, or sincerely (or insincerely!), but the crucial point is that saying something does not make it so. Neisser’s ironic point was that no method is deployed in just proclaiming things. There is no method of assertion. The import of Neisser’s coinage was that too often people state things without supplying any reasons for thinking that what they have stated is true. He was poking fun at figures who are long on declarations but bereft of arguments supporting those declarations. Thagard stresses that claiming something is not the same thing as providing a reason for believing it. To assume that it is the same thing is to commit the classical fallacy of begging the question.
The Power of Repetition
Thagard argues that people are no less mistaken when assuming that asserting something in a way that attracts attention renders it more plausible. People might, for example, proclaim something loudly, or cleverly, or passionately. Most notably in our time, people might simply declare something repeatedly. Thagard suggests that this is one of the keys to the promulgation of baseless conspiracy theories.
Constant repetition of a claim that goes unchallenged furnishes it with a currency that makes people ever less likely to challenge it going forward. This taps into what cultural evolutionists describe as humans’ conformity bias (which facilitates the transmission of culture). They hold that conformity constitutes a good guess about what is adaptive: when you don’t know much about some situation, a safe way of managing is simply to do what everyone around you is doing. Thagard comments that conspiracy theorists can be strikingly forthright about this combination of repetition and conformity. He quotes (p. 174) former President Trump, who said in one of his speeches that “If you say it enough and keep saying it, they’ll start to believe you.”
This is an important reason why social media have attracted so much criticism. These systems are incredibly effective force multipliers for misinformation. Not only can they transmit, i.e., repeat, misinformation endlessly, but much of what they transmit is contextless and unattributed, which, according to Thagard, torpedoes one of the two most prominent grounds for skepticism, namely, knowing something untoward about the person making the claim.
References
Gilbert, D. (1991). How mental systems believe. American Psychologist, 46(2), 107–119.
Mesoudi, A. (2011). Cultural Evolution: How Darwinian Theory Can Explain Human Culture and Synthesize the Social Sciences. Chicago: University of Chicago Press.
Thagard, P. (2024). Falsehoods Fly: Why Misinformation Spreads and How to Stop It. New York: Columbia University Press.