Delusions, Conspiracy Theories, and the Internet
When "epistemic mistrust" leads us down the misinformation rabbit hole.
Posted Sep 03, 2019
"When you're young, you look at television and think, there's a conspiracy. The networks have conspired to dumb us down. But when you get a little older, you realize that's not true. The networks are in business to give people exactly what they want."
— Steve Jobs (Wired magazine interview, 1996)
Conspiracy theories are a hot topic these days, with increasing relevance in our post-truth world. The recovering political reporter turned freelance writer Jake Flanagin reached out for an interview on the topic for an article for the New Republic. Here's the full transcript of our interview:
Is it fair to say that the Internet, and specifically social media, can indulge and/or exacerbate an inclination toward delusion?
Concerning beliefs, what the internet does in an unprecedented way is provide users with nearly inexhaustible sources of “evidence” and agreement for even the most fringe beliefs that you or I can imagine. A few decades ago, if you revealed an unusual belief or conspiracy theory over beers at the local bar, your friends would probably laugh you out of the place. Now, you can hole up in the privacy of your room in front of a computer as an anonymous user, and find like-minded individuals on the other side of the world.
The problem, of course, is that all too often, the “evidence” that you find online consists of misinformation, lies, and trolling that’s mistaken for fact. Whereas we previously respected information published in books, academic journals, or print newspapers, the internet has created a kind of false equivalence whereby all sources of information are treated on equal footing and opinions are regarded as just as valid as facts (e.g. “alternative facts”). This has been described as a “democratization of knowledge” that has led to the “death of expertise.” While many people regard this as a kind of triumph, it has left us vulnerable to believing strongly in things that aren’t true.
Would you classify a fixation on bizarre political conspiracy theories as a delusion (e.g., the notion that the world is run by a shadowy cabal of extraterrestrial lizard people)? If not, what would you call it?
In psychiatry, the definition of a delusion has shifted over time, but the core concept has been a “fixed, false belief.” But applying that definition to individual beliefs is often challenging because many of our beliefs aren’t falsifiable. As a result, psychiatry has taken care to exclude shared beliefs sanctioned by cultures or subcultures, like religion or political beliefs, from the definition of delusion.
Delusions are most obvious when they aren’t or can’t be shared by other people, which often means there’s a self-referential component. For example, it’s not a delusion to hold the shared belief that the son of God was a human being who was crucified by the Romans, but most of us would agree that believing that you are the second coming of Christ is delusional.
Of course, there have been examples of charismatic individuals who have made claims like that and gained a following. In the past, psychiatry applied the term “shared psychotic disorder” to explain this phenomenon, but that diagnostic category was eliminated in DSM-5. Whether or not a delusion is shared continues to be an important consideration in diagnosing a delusion, but as I wrote in a recent academic paper called "Integrating Non-Psychiatric Models of Delusion-Like Beliefs into Forensic Psychiatric Assessment" that was published in The Journal of the American Academy of Psychiatry and the Law, “the internet now makes sharing beliefs possible in a way that the authors of the DSM could never have anticipated.”
As for your example, the notion that the world could be run by a shadowy cabal probably has mainstream acceptance these days. But once we start getting further “out there” in terms of the details of just what that cabal consists of — Jews, the Deep State, the Illuminati, extraterrestrial lizard people — the belief becomes harder and harder to share.
Religious and political beliefs that are shared within subcultures but are extreme and can result in violent behavior pose problems for society and present a real challenge for how to categorize them, in psychiatric terms or otherwise. Beliefs that seem delusional based on their content, but are shared, are often referred to by a kind of wastebasket term: “delusion-like beliefs.” In my paper, I discuss delusion-like beliefs and a different way of thinking about them that involves moving away from categorical terms and definitions and toward understanding the psychological processes that allow such beliefs to develop and persist.
Is there a certain psychological profile that is especially vulnerable to believing and fixating on bizarre conspiracy theories?
There’s been a lot of work in psychology in recent years that has attempted to determine why some people are particularly drawn to conspiracy theories. Much of that work has focused on a search for associations between belief in conspiracy theories and specific psychological traits or cognitive quirks.
For example, research has found that people who believe in conspiracy theories tend to have a greater need for “cognitive closure” (the desire to have an explanation when explanations are lacking) and the desire to be unique, and are more likely to have a cognitive bias called “hypersensitive agency detection” or “teleologic thinking” (whereby events are overattributed to hidden forces, purposes, and motives).
Some research has also found that conspiracy beliefs are associated with lower levels of education and analytic thinking. But studies have also revealed that half of the US population believes in at least one political or medical conspiracy theory, so belief in conspiracy theories is far more “normal” than many of us might think.
Are there certain traumas or triggers that can spark this fixation?
One aspect of conspiracy theories that I’m particularly interested in, but has been relatively neglected in research, is the issue of “epistemic mistrust” or the lack of faith in knowledge and in institutions like “science” or governments that have traditionally supplied that knowledge. This, in my view, is the starting point for many “truthers” who end up down the rabbit hole and find an answer in a conspiracy theory.
The loss of faith in those institutions can come from a variety of triggers, including real-life instances of bias, corruption, fraud, and failure, and even examples of conspiracy theories that turned out to be true.
For example, the anti-vaxxer movement was born out of fraudulent research conducted by a physician that was published in a respected medical journal. Although the study has since been retracted and the physician was stripped of his medical license, faith in the institution of medical research has suffered. Ironically, the physician who published the fraudulent research continues to have a following, while anti-vaxxers often believe that the medical establishment is “in bed” with the pharmaceutical companies that make vaccines and therefore can’t be trusted [for more on this, see my blog post, “Antivaxxers and the Plague of Science Denial”].
Other national traumas, like the assassination of JFK or the death of Princess Diana, have resulted in a lot of people trying to achieve “cognitive closure” through searches for alternative explanations even to this day. In a previous blog post titled “Never Forget: The Lasting Psychological Impact of 9/11,” I speculated about how 9/11 was a collective trauma that might have paved a path to conspiracy theories through a loss of faith in government:
“If the historically low ratings of our current presidential candidates is any reflection of the State of the Union, it would appear that we live in a time of maximal pessimism about government. Perhaps that was an inevitable outcome for a country that lived through the deadliest attack on homeland soil in the history of our existence. If our leaders were unable to keep us safe then, is it any wonder that some took their skepticism to the point of conspiracy theory, coming up with 9/11 denialism and the so-called Truth Movement? Is it any wonder that some level of skepticism has taken root in the mainstream, reflected in the backing of political outsiders like Bernie Sanders or Donald Trump who we hope might take our country in a different direction?”
For individuals with certain cognitive traits, such events may well have been the pivotal nidus that started them on a path toward believing in conspiracy theories.
What would treatment look like for a person with a demonstrable, life-affecting fixation on conspiracy theories (e.g., spends all of their time online, alienating friends/family, potentially engaging in illegal/violent acts inspired by the fixation, etc.)?
There isn’t a lot of evidence that conspiracy theories, any more than religious beliefs, can be or even should be “treated.” But I imagine that unplugging from the internet and engaging in life through work, relationships, exercise, and good sleep is probably a good start.
The problem, of course, is that some people live in conditions that can’t be so easily modified (e.g. “incels” who are socially isolated and have adopted a perspective and an online life that keeps them that way, or disenfranchised young people living in poverty who become seduced by the lure of terrorist martyrdom). And then there’s the psychiatry lightbulb joke — “How many psychiatrists does it take to change a light bulb? Just one, but the bulb has to really want to change.” People with delusions and delusion-like beliefs are by definition unable to see different perspectives and aren’t trying to modify their beliefs. On the contrary.
Do you think large-platform tech companies that facilitate the spread of potentially delusion-triggering information have a responsibility to self-regulate? If so, would you recommend any measures in particular?
When I watched Carole Cadwalladr, the journalist who broke the real-life conspiracy theory of Cambridge Analytica [see my blog post “The Internet, Psychological Warfare, and Mass Conspiracy”], give a TED Talk about how Facebook broke democracy, I found her argument pretty compelling:
"Liberal democracy is broken, and you [the Gods of Silicon Valley] broke it. This is not democracy — spreading lies in darkness, paid for by illegal cash from God knows where. ...And it's not about left or right or leave or remain or Trump or not. It's about whether it's actually possible to ever have a free and fair election ever again. Because as it stands, I don't think it is... Is this what we want? To let them get away with it and to sit back and play with our phones as this darkness falls?"
I support efforts like YouTube de-prioritizing anti-vaxxer videos within their search algorithms. But I don’t know what we can depend on from the social media giants, who have their vested interests in potential conflict with the greater good. For the rest of us, the best answer is probably to unplug, but as Cadwalladr suggests, putting down our mobile phones can be hard if not impossible to do.
There’s another ambitious, but maybe less drastic, approach that’s worth implementing — we can try to become more informed consumers of online information [see my blog post “Fake News, Echo Chambers & Filter Bubbles: A Survival Guide”]. I’ve recently stumbled onto a project called “Calling Bullsh*t,” created by two University of Washington professors, that aims to “help students navigate the bullsh*t-rich environment by identifying bullsh*t, seeing through it, and combating it with effective analysis and argument.” They define bullshit as “language, statistical figures, data graphics, and other forms of presentation intended to persuade by impressing and overwhelming a reader or listener with a blatant disregard for truth and logical coherence.” Having looked at their college syllabus, I think something like it should be incorporated into the core curriculum of every American student by high school, if not much earlier.
To read more about conspiracy theories and the internet, check out my previous blog posts: