Vaccine Hesitancy: From Misinformation to Conspiracy Theory

The anti-vaxx movement is a product of mistrust and misinformation.

Posted Oct 01, 2019

The Sick Child, J. Bond Francisco (1863–1931)
Source: Public domain

I recently "sat down" over email with Julie Saetre to talk about my previous blog post, "Antivaxxers and the Plague of Science Denial," and more broadly about vaccine hesitancy and conspiracy theories for her article "A Dangerous Debate," which appeared in the September 2019 issue of Kiwanis Magazine. Here's the full transcript of our interview:

It seems conspiracy theorists are thriving in the 21st century, with bizarre explanations for everything from 9/11 to the Sandy Hook shooting to "chemtrails" and, now, vaccines.

Are people today more susceptible to the anti-vaccination message and other conspiracy theories than in the past? If so, why do you think that is?

We don’t have a lot of information about whether people today are more or less susceptible to conspiracies, but we do know that conspiracy theories have been around for a long, long time. We also know that belief in conspiracy theories is very common — around 50% of the US population believes in at least one, a prevalence that has held steady for at least several decades.

Political scientists Joseph Uscinski and Joseph Parent conducted a study of conspiracy theories over the past century, based on a survey of letters to the editor sent to the New York Times and the Chicago Tribune going back to 1890. They found that conspiracy theories elaborated in those letters occurred with ebbs and flows in an overall steady stream, with variations in what kinds of evil forces were implicated within the conspiracy beliefs. Typically, the evil forces claimed by conspiracy theorists have been US political parties when in power or foreign governments during times of war.

That said, it’s very tempting to suggest that the ubiquity of information shared over the internet has created a climate ripe for the growth of conspiracy beliefs. Certainly, it’s much easier now to gather online “evidence” for virtually any belief imaginable and to find like-minded souls who share your belief at the click of a button.

I am continually puzzled by the fact that anti-vaxxers will not believe many well-researched, well-performed studies involving large numbers of people, yet fight to defend two small studies proven not only to be false, but to have been manipulated to financially benefit both a law firm and the man who conducted them.

You mention the Dunning-Kruger Effect. Can you expand on the role this effect plays in the mindsets of anti-vaxxers?

The Dunning-Kruger effect is a finding from psychology research that almost everyone overestimates their level of knowledge on any given subject. This mismatch between self-rated knowledge and actual knowledge tends to be largest for people with the lowest levels of actual knowledge, whereas it reverses at the highest levels, where true experts underestimate themselves, providing evidence for the so-called "impostor syndrome." This effect was demonstrated in a recent study of anti-vaccine beliefs: those with the lowest levels of knowledge about vaccines rated their knowledge as on par with that of doctors and scientists.

But it would be a mistake to use this finding as evidence that “anti-vaxxers” are merely uneducated or that modifying their beliefs is a simple matter of educating them about research that shows that vaccines don’t cause autism. On the contrary, conspiracy theorists believe that they know as much as doctors and scientists because they believe that it’s the doctors and scientists who are uneducated, brainwashed, hoodwinked, or “in on it” with ties to the pharmaceutical industry. And so, they reject scientific studies, no matter how many times something has been replicated and no matter the degree of scientific consensus. This is a defining feature of conspiracy theorists — they reject the authoritative account of things through a kind of denialism in favor of their own “alternative facts.” I like to think of conspiracy theories as emerging from mistrust in institutions of authority, with the resulting “epistemological vacuum” vulnerable to being filled with misinformation.

I would mention, however, that not all "anti-vaxxers" are necessarily conspiracy theorists. Believing that vaccines cause autism, or other ill effects, isn't a conspiracy theory in itself. The conspiracy belief is that the government, the Centers for Disease Control and Prevention (CDC), and physicians are in cahoots with the pharmaceutical industry to keep that information from the public. Many so-called "anti-vaxxers" are parents with concerns about the health effects of vaccines who don't necessarily believe there's a conspiracy. In healthcare research, this kind of concern is called "vaccine hesitancy" as a way to distinguish it from the more extreme conspiracy beliefs and the pejorative connotations of terms like "anti-vaxxer" and "conspiracy theorist."

I often read comments following reports of studies disproving the MMR vaccine/autism connection that refer to the information as “fake news.” Does the emergence of the “fake news” concept add to a type of “perfect storm” that is causing the anti-vax movement?

What role does social media play in the anti-vaccination movement?

Explain the concept of “weaponized” information being spread by content polluters, bots, etc. and its impact on the anti-vax movement.

There’s good evidence that we are living at a time of significant mistrust in institutions of authority, whether we’re talking about the government, the media, scientists, or physicians. This isn’t to say that mistrust of those institutions has never been higher; only that we’re riding a wave of populism in this country that frowns upon “elites” and discounts the authority of experts. Tom Nichols has described this as “the death of expertise.” Philosophically, this could be likened to a rekindling of Post-Modernism, a movement that rejected the Age of Enlightenment’s embrace of rationality and the scientific method. These days, it’s often suggested that we’re living in a “post-truth” world.

From a psychological perspective, one of the most important cognitive biases associated with belief formation is confirmation bias. Confirmation bias is a general tendency we all have to favor sources of information that confirm our pre-existing intuitions and beliefs and to reject sources of information that contradict them. With “echo chambers” and “filter bubbles” that are, in a sense, pre-programmed into online search engines and the social media experience, I like to say that this has created a kind of “confirmation bias on steroids.” Simply put, the internet has made it very, very easy to find “evidence” to support whatever we want to believe.

The concern is that the concept of truth has been lost in the process. The internet is rife with misinformation, with subjective opinion and experience conflated with objective fact. To make matters worse, some misinformation has been deliberately added to the mix, whether by "conspiracy theory entrepreneurs" who profit from it or by trolls who are stirring the pot for amusement or, in the case of Russia, to sow discord among the American populace. It is now well known that many vaccine-related internet memes, including both anti-vaccine and pro-vaccine posts, have been generated from such sources. Unfortunately, research has also found that fake news travels faster and more widely than real news. To top it all off, few of us have ever had any formal education in how to distinguish reliable information from misinformation on the internet.

Your article mentions the back-fire effect and how trying to educate anti-vaxxers through accurate information and stories from parents whose children contracted measles can actually increase anti-vaccination beliefs. Can you expand on why this happens?

Information science is a relatively new field that seeks, among other things, to understand how information propagates and how people come to hold beliefs as a result of exposure to it. The "backfire effect" is a finding from information science suggesting that efforts to correct misinformation sometimes have the opposite effect. A related finding is the "illusory truth effect," whereby people are more likely to believe information, including misinformation, after repeated exposure. So one of the problems with correcting misinformation is that the correction often repeats the misinformation itself, with the unintended consequence of reinforcing belief in it.

These effects have important implications for things we read online and in print. For example, a recent headline quoted Andrew Wakefield, the physician who published fraudulent data in support of his bogus claim that vaccines cause a form of autism, as saying that vaccines make the measles virus stronger. On social media, I saw physicians and other people retweeting the headline, enraged that something like this was published in the mainstream press. But despite good intentions, the headline itself (even if the article went on to refute the claim) and the retweets on social media might only have served to reinforce the perception that Wakefield's claim is true. This "boomerang effect" has also been demonstrated for headlines that pose questions like, "Do Vaccines Cause Autism?" or "Was Obama Born in the US?"

I recently interviewed a conspiracy theorist as part of a legal case and asked him whether he believed in "The Illuminati." He replied that "there was so much out there about it" that he figured it must be true. And so, sometimes information quantity trumps information quality. Internet trolls and bots, and some politicians and governments well versed in the psychology of propaganda, know this all too well.

You mention Dr. Larson’s support of listening to and engaging anti-vaxxers to counter their beliefs. But people who try to do so are often shouted down at best, harassed and threatened at worst. One mother in Australia who lost her son when he contracted whooping cough as an infant began a campaign to promote maternal vaccinations and was accused on social media of killing her own child and being secretly involved with Big Pharma. Doctors and legislators promoting vaccination have received death threats. Why are some anti-vaxxers so violently vitriolic?

Some doctors have reported death threats not only to themselves but to their children. Is such vitriol making some people reluctant to speak out in favor of vaccines?

The decision to speak out or to remain silent is an individual one, with advantages and disadvantages. Unfortunately, when people do speak out, whether on social media or a larger public stage, they can become targets of considerable harassment, including threats of violence. Certainly, that has a silencing effect for some.

Conversely, there’s evidence that the anonymity of the internet facilitates people speaking out in a way that they might not otherwise, kind of like “road rage.” Many of us yell things at other drivers in the privacy of our cars that we would never say face-to-face or when other people are looking. Online discourse is often the same way, which is to say that social media is often a hostile environment.

When people ask me how to engage anti-vaxxers and other conspiracy theorists, my first answer is that I don't particularly recommend it if we're talking about online interaction. If we're talking about face-to-face interactions, then my answer is that the best strategy depends on your goal. If you're trying to change "hearts and minds," that has to start with empathic listening in an effort to truly understand where someone's coming from and why they believe what they do. Once you have developed some rapport, you may then be able to introduce different information for them to consider. But their receptivity will likely depend on whether they're looking for answers or trying to resolve ambiguity — so-called "fence-sitters" who might be genuinely willing to learn — or whether they're just looking for a fight and on guard against being attacked. For the issue of vaccine hesitancy, these conversations are best had one-to-one between patients and their doctors and other healthcare providers, where trust is earned through open communication.

Part of the reason that belief systems can be so resistant to change is that they're often entwined with our identities. Changing our beliefs can therefore feel like we're renouncing ourselves or even losing some important existential battle. That keeps us entrenched and sometimes willing to defend our beliefs at all costs, as if our lives depend on it. In contrast, being more flexible about our beliefs, admitting we don't know something (a.k.a. "intellectual humility"), and seeing ourselves as beings of constant change are worthwhile goals that don't come naturally to most of us.

You refer to Andrew Wakefield as “the Messiah of the anti-vaxxer movement.” I found that interesting, because I see similarities between the belief systems and actions of anti-vaxxers and those of cult members: the formation of insular communities (in the case of anti-vaxxers, online forums and websites); the refusal to accept solid evidence that disproves their beliefs; the tendency to shun and/or harass and threaten those who challenge their beliefs; the difficulty in “deprogramming” or trying to disengage someone from the belief system.

Do you believe the two groups share some of the same mindsets and personality traits?

The hallmark of so-called cults has traditionally been the various methods of "brainwashing" their members and keeping them isolated from external influence, which I don't necessarily think applies to the anti-vaxxer movement beyond the echo chambers of the internet. Cults can, however, also provide a kind of safe haven for like-minded individuals and a sense of group belonging and identity that may have been previously absent in an individual's life. That aspect of cults and other belief-based organizations, like political and religious groups, does have some parallels to the anti-vaxxer movement that dovetail with research finding that conspiracy beliefs are associated with psychological needs for control, certainty, or closure.

If your child has autism, it can be comforting to find the support of other parents who share the opinion that autism isn't so much caused by genetics — that is, inherited from you — as it is caused by vaccines and an organized effort to keep this information from the public. And dedicating yourself to "researching" this link and the details of a conspiracy theory can also give someone a life mission that, in some cases, can even become a career as a spokesperson with a degree of fame and fortune. And if you've achieved that sense of purpose along with a financial stream linked to an identity, it's highly unlikely you'll change your tune.

For more on vaccine hesitancy and conspiracy theories:

Antivaxxers and the Plague of Science Denial

What Makes People Believe in Conspiracy Theories?

Delusions, Conspiracy Theories, and the Internet