Can the Online Proliferation of QAnon Be Stopped?

Should social media providers be held accountable for conspiracy theories?

Posted Sep 12, 2020

Source: TheDigitalArtist/Pixabay

QAnon continues to receive considerable media attention as we head into the November election season. The following is an interview I did with Erik Hawkins for his recent article on QAnon for Heavy:

Have you engaged with any QAnon followers or treated them?

I’ve had lots of conversations with people who agree with QAnon dogma at the metaphorical level—that America is in danger from a liberal “Deep State” and that President Trump is leading the charge in a climactic battle between good and evil. That’s arguably the GOP platform at this point. I’ve had far fewer conversations with QAnon “true believers” and none in any professional capacity. 

In any case, I would avoid the word “treat” in the context of conspiracy theories since we’re not talking about mental illness. Belief in conspiracy theories is arguably more of a “symptom” of societal dysfunction than of individual dysfunction. In that sense, it’s a sign of the times more so than a kind of psychopathology.

Is it completely useless to use outside sources that would ordinarily be considered reliable when someone is so deep into these beliefs and rejecting anything that comes outside the QAnon world?

Beyond “fence-sitters” who are genuinely looking for answers and still open to facts, presenting counterfactual evidence to “true believers” doesn’t usually work. Since conspiracy beliefs are rooted in mistrust of authoritative sources of information, there’s often a core disagreement over what counts as evidence and facts that makes meaningful debate almost impossible. Changing people’s minds isn’t likely to happen if we can’t agree that facts exist or how to decide on what’s real or not. 

Another problem is that all too often factual debates amount to a fight over who’s right rather than a meaningful dialogue aimed at understanding different perspectives. That’s almost doomed to fail, especially on social media where so many of these so-called debates occur. And when beliefs become deeply enmeshed with identity, people are usually very, very resistant to giving them up.

As far as the '5 Stages' model by Bradley Franks that you mentioned in your post on QAnon, where do you think the majority of QAnon followers (active ones online) generally fall?

The short answer is I really don’t know, and I don’t think anyone else does either. We do know that about half the U.S. population endorses belief in at least one conspiracy theory. The question from a “clinical” standpoint is just how much they believe it and how much time they spend thinking about it or “researching” it to the exclusion of other activities.

Conspiracy theory researcher Rob Brotherton put this well in Adrienne Lafrance’s excellent Atlantic article on QAnon:

“It’s tempting, I think, to simplify and just say, you know, 4 percent of people think the United States is run by lizard people,” he continued. “But really do they? Or is there a percentage of people in there who are just fucking around with the survey, or who are saying this ’cause they think it’s funny or because they think all the people in charge are bad? They’re not necessarily, literally lizards. But you know, I’m going to say that I think this is true in a metaphorical sense. There are a lot of reasons for people entertaining conspiracy theories, not all of them because they literally believe it to be true,” Brotherton said. “One of the ideas is that it could be just signaling something like your broader worldview.”

Speaking more generally, I do think there’s reasonable evidence that “epistemic mistrust”—that is, mistrust in authoritative accounts of information—is quite prevalent these days. Polls indicate that trust in government, trust in the media, and trust in experts are at near-historic lows going back 50 years. This mistrust is a key aspect of the populist movement that elected Donald Trump and other populist/nationalist leaders around the world in recent years.

I know it's outside your realm of expertise, but as far as you know, do social media companies have a responsibility to do anything more about the proliferation of this content on their platforms? Would that actually make the situation worse, or could it help?

Yes. This gets beyond psychology and psychiatry into the realm of economic philosophy and politics, but it’s really the most intriguing question. Is capitalism’s bottom line only growth and profits, or is there room to incentivize products and services that offer a social good? What do we mean when we talk about “responsibility?” Is moral responsibility relevant to capitalistic business models? These questions lie at the core of political debates about capitalism and “democratic socialism” that are ongoing today. 

We live in a country where freedom of speech is a First Amendment right. The hope of the internet and its “democratization of knowledge” was that the truth would rise to the top within a kind of free market of ideas, but that hasn’t happened. Instead, the internet has contributed to a fracturing of the very idea of truth, leaving us in a so-called “post-truth” world.

The challenge right now is that misinformation is big business. We know that false information travels faster and farther than accurate information online. And we can probably agree that the click-based revenue model of the internet has meant that “click-bait” headlines and provocative op-ed “hot takes” have largely come to replace factual reporting—and how many people who share information on social media even read the entire article?

In terms of restricting misinformation, redactions and retractions can contribute to a backfire effect when it comes to conspiracy theories, at least for those who are drawn to conspiracy theories in the first place. The faulty logic of conspiratorial thinking holds that if information is being suppressed, it must be true, rather than recognizing the removal of misinformation as a necessary social good.

We expect and demand that product labeling—say on a food product or a medication—is accurate. Is that an unreasonable expectation for companies whose product is information itself? Unfortunately, incentivizing social media companies to remove misinformation when it’s profitable is a “tough sell.”

I think one thing we might expect in the future, however, is greater liability for individuals and companies that profit from misinformation and deliberate disinformation. The lawsuits against Alex Jones are an early sign that society may be coming around to the idea that peddlers of disinformation should be held accountable when that disinformation causes harm. In that sense, disincentivizing misinformation may come to be more impactful than incentivizing accurate information online.

For more about how to talk to loved ones who have become obsessed with QAnon: