
Verified by Psychology Today


How to Fight Dangerous Mainstream Conspiracy Theories

What you can do about wacko ideas that might sway the election.

Q, according to adherents of the QAnon conspiracy theory, is a U.S. Government official with a high-level security clearance who asserts—based upon access to highly classified data—that Satan-worshipping pedophiles in Hollywood and elsewhere, along with the Deep State, are working to undermine the president.

Further, QAnon believers claim that these same evildoers are extracting the blood of children to obtain an addictive psychedelic compound called Adrenochrome.

These ideas might sound crazy to you, but not—according to a recent Forbes survey—to Republican voters, 56 percent of whom believe that all or part of the QAnon theories are true.

Over a dozen candidates for the U.S. House and Senate embrace or look favorably on QAnon, and a few are almost certain to win seats in November. The conspiracy theory seems likely to motivate many of its adherents to vote for president, instead of staying home on election day, potentially tipping the election in battleground states where voter turnout is all-important.

The political right does not have a monopoly on conspiracy theories: The idea that George W. Bush caused the 9/11 attacks to justify foreign adventures against Islam also gained wide currency among some left-leaning voters.

In a paper called "Conspiracy theories as part of history: The role of societal crisis situations" (Memory Studies, 2017), researchers Jan-Willem van Prooijen and Karen M. Douglas point out that societal crises such as wars, pandemics, and recessions give conspiracy theories great power because the theories satisfy deep psychological needs—which we all have to one degree or another—to lessen the anxieties of not knowing what will happen next and feeling out of control.

For example, after Germany suffered a humiliating defeat in the First World War in 1918 and its economy was in shambles, Hitler rose to power by blaming Germany’s problems on a vast Jewish conspiracy to destroy Germany, thereby eliminating uncertainty about the source of Germany’s problems and suggesting an equally simple solution to gain control over those problems.

A lot of Germans at the time thought Hitler’s conspiracy theory was just as laughable as many Americans think the QAnon theory is today, but the ultimate impact of Hitler’s theory was anything but laughable.

So history has shown that radical conspiracy theories that arise in times of crisis—such as the current pandemic—can produce catastrophic results, and that proponents of such theories should not be allowed to ride them to power and drive government policies.

But what can we as individuals do to prevent movements such as QAnon from inflicting real damage on our society? For example, what should you do if a friend, co-worker, or family member embraces a hateful theory that will motivate them to vote for a candidate who pledges to “deal with” evil conspirators?

Intuition probably tells you that arguing the facts is pointless, and your intuition is right.

MIT professor Adam Berinsky, writing in a 2017 article in the British Journal of Political Science, cites research on memory fluency (how easy it is to recall information), suggesting that presenting facts that disprove misinformation only causes that misinformation to become more deeply ingrained in people who embrace it, because of the repetition effect (the more we hear something, the better we remember it).

With QAnon, for example, presenting evidence to a QAnon adherent that “Q” never worked for the government or held a security clearance would be encoded in the conspiracy theorist’s memory as “Q held a high clearance: not.” But memory research shows that the "not" flag on the memory trace is usually dropped during memory retrieval (as too complicated), with the result that the attempt to disprove Q’s legitimacy actually serves to make Q seem more legitimate due to repetition from a new source and increased familiarity.

Research on cognitive biases, such as the work of Nobel Laureate Daniel Kahneman, also shows that when people are presented with new information, they tend to believe only those parts of the information that they expect to believe (expectation bias, anchoring bias) and want to believe (confirmation bias). Thus, if a QAnon believer who feels out of control and anxious about an uncertain future during the pandemic hears new information, that believer will be prone to focus on and to remember only the information that they expect to be true (Q is real) and want to be true (eliminating evil conspirators will make them safe again and resolve an uncertain future).

Finally, a cognitive bias called the salience bias, in which we remember best—and tend to embrace—information that is emotionally striking, even bizarre, can push us to hold onto highly salient ideas, such as the claim that Satan-worshipping pedophiles are attacking the president. Nicco Mele at Harvard neatly summed up the salience bias by observing, “The brain likes crazy.”

The bottom line: Arguing facts with a conspiracy theorist will only cause the theorist to strengthen their outlandish beliefs. Ditto for political arguments in general: Marshaling facts to get another person to abandon a cherished candidate will likely only strengthen that person’s beliefs.

So the first way to counter dangerous conspiracy theories is… don’t. You'll only make things worse.

The second way is to help political candidates who are running against politicians who embrace—or don’t denounce—destructive conspiracy theories. Campaign contributions are good; working to get disengaged voters who seldom vote to show up on election day is even better.

But whatever you do, don’t argue using facts. Someone else’s “alternative facts” will trump you every time.


Tversky A, Kahneman D (September 1974). "Judgment under Uncertainty: Heuristics and Biases". Science. 185 (4157): 1124–31. Bibcode:1974Sci...185.1124T.

Gilovich T, Griffin D, Kahneman D (2002). Heuristics and biases: The psychology of intuitive judgment. Cambridge, UK: Cambridge University Press. ISBN 978-0-521-79679-8.

More from Eric Haseltine Ph.D.
More from Psychology Today