
The Internet, Psychological Warfare, and Mass Conspiracy

Controlling minds and manipulating behavior through social media

A Conspiracy, John Tenniel (1850). Source: Public domain

"The Matrix is everywhere. It is all around us. Even now, in this very room. You can see it when you look out your window or when you turn on your television. You can feel it when you go to work…when you go to church…when you pay your taxes. It is the world that has been pulled over your eyes to blind you from the truth... That you are a slave, Neo. Like everyone else you were born into bondage. Born into a prison that you cannot smell or taste or touch. A prison for your mind. Unfortunately, no one can be told what the Matrix is. You have to see it for yourself. This is your last chance. After this, there is no turning back. You take the blue pill, the story ends, you wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in Wonderland, and I show you how deep the rabbit hole goes. Remember: all I'm offering is the truth. Nothing more."

— Morpheus, The Matrix (1999)

It’s official. On March 18, 2018, I became a conspiracy theorist.

Many conspiracy theorists claim to have started as skeptics searching for the truth, only to stumble upon some hidden seed that crystallizes and spreads within a moment, resulting in a sudden broader awakening in which the world is seen in a new and often ominous light. It would seem I fit that mold — since embarking as the author of Psych Unseen four years ago, I’ve spent each post trying to debunk false beliefs, fake news, truth denialism, Alex Jones and Infowars, flat earthers, and even “breatharians” by explaining the psychological forces that allow them to thrive and highlighting the role of the internet in the rampant spread of misinformation within filter bubbles and echo chambers.

So, what was my moment of satori? It started, predictably enough, on Twitter. A tweet from my psychiatrist colleague, fellow Psychology Today blogger, and occasional co-author Dr. Allen Frances linked to an article that he described as “the scariest story I’ve ever read.” The piece, authored by Carole Cadwalladr and appearing in the March 18 edition of The Guardian, was called “The Cambridge Analytica Files – ‘I made Steve Bannon’s psychological warfare tool’: meet the data war whistleblower.” In it, Cadwalladr shines the spotlight on Christopher Wylie, a young man pursuing a behavioral economics PhD who was hired as research director for a “behavioral research and strategic communication” firm called SCL Group to apply his knowledge of fashion forecasting to political elections. Apparently disregarding non-disclosure agreements out of a guilty conscience, Wylie provided Cadwalladr with ample quotations and source material connecting the dots of a conspiracy that links SCL Group to an offshoot shell company called Cambridge Analytica funded by Republican donor Robert Mercer, to another company called Global Science Research (GSR) owned by Cambridge University psychologist Aleksandr Kogan (aka “Dr. Spectre”), who also holds an appointment at St. Petersburg University, to Cambridge Analytica board member/investor and later Trump presidential campaign manager Steve Bannon, to Russia and Vladimir Putin.

The lines of connection are outlined in the article, as well as in another co-authored by Cadwalladr in the New York Times, but the short version is that Kogan, replicating the work of Cambridge University psychologists Michal Kosinski and David Stillwell, developed an app called “thisisyourdigitallife” that collected “psychographic” data on the personality traits of users while gaining access to their Facebook profiles and those of their friends under the guise of academic inquiry. Kogan’s GSR then partnered with Cambridge Analytica to mine the profiles of some 30-50 million Facebook users without their permission; under the direction of Mercer and Bannon, Wylie was tasked with developing ways in which that personal data could be used to promote and shape the political campaigns of Ted Cruz and Donald Trump. The issue of a “data breach” from Facebook and the use of personal information without informed consent aside, the partnership of GSR (with Kogan’s ties to Russia) and Cambridge Analytica (staffed mostly by Canadians and Europeans) is now under scrutiny for possibly violating US laws that limit the involvement of foreign nationals in American elections. According to The Guardian article, Cambridge Analytica also made a business pitch in 2014 to a Russian oil company with close ties to Russian President Vladimir Putin, a pitch that had nothing to do with petroleum but instead focused on “election disrupting techniques” involving the use of misinformation to influence voters based on their online psychographic profiles. It is suggested that Cambridge Analytica thereby handed a loaded gun to Russia, if not through any formal business arrangement, giving it the means to influence the 2016 US Presidential Election.

It’s not yet clear whether there’s cause to invoke conspiracy or collusion within the Trump campaign — that part of the story will no doubt continue to unfold as the special counsel investigation under the direction of Robert Mueller trudges on and as Wylie, Kogan, and Cambridge Analytica become household names. But even if that piece of the puzzle does materialize into something beyond a liberal echo-chamber fantasy, it could be considered trivial within the larger context of something else destined to become a household term — “information warfare.”

If information warfare has a chief conspiracy theorist with a mainstream voice, it might be NYU philosophy professor Tamsin Shaw, who is quoted at the end of Cadwalladr’s article. In a recent New York Times book review, Shaw defines modern information warfare as “the exploitation of information technology for the purposes of propaganda, disinformation, and psychological operations.” Writing a year ago in another New York Times book review that highlighted Cambridge Analytica’s use of Facebook data, Shaw explained:

“The findings of social psychology and behavioral economics are being employed to determine the news we read, the products we buy, the cultural and intellectual spheres we inhabit, and the human networks, online and in real life, of which we are a part. Aspects of human societies that were formerly guided by habit and tradition, or spontaneity and whim, are now increasingly the intended or unintended consequences of decisions made on the basis of scientific theories of the human mind and human well-being.

The behavioral techniques that are being employed by governments and private corporations do not appeal to our reason; they do not seek to persuade us consciously with information and argument. Rather, these techniques change behavior by appealing to our nonrational motivations, our emotional triggers and unconscious biases. If psychologists could possess a systematic understanding of these nonrational motivations they would have the power to influence the smallest aspects of our lives and the largest aspects of our societies.”

In connecting her own conspiratorial dots, Shaw traces the origins of modern information warfare back to Daniel Kahneman, who shared a 2002 Nobel Prize for his pivotal work in the field of behavioral economics. She suggests that, beyond his theory of dual systems of thought described in Thinking, Fast and Slow, Kahneman’s lasting practical contribution to economics was to reveal how psychological “nudges” can guide human decision-making and can therefore be harnessed to influence choice. Shaw seems to have a dark view of psychology’s potential for evil, highlighting the role of psychologists in developing torture/interrogation techniques in the wake of 9/11 and indicting the moral authority of psychology as a field along with the specific contributions of noteworthy psychology luminaries like Steven Pinker, Jonathan Haidt, and Joshua Greene.

Beyond psychology at large, Shaw takes “the Big Five” tech companies Microsoft, Apple, Facebook, Amazon, and Google to task for exploiting the psychology of choice in her recent review of Alexander Klimburg’s The Darkening Web: The War for Cyberspace:

“Only in recent months, with the news of the Russian hacks and trolls, have Americans begun to wonder whether the platforms they previously assumed to have facilitated free inquiry and communication are being used to manipulate them. The fact that Google, Facebook, and Twitter were successfully hijacked by Russian trolls and bots (fake accounts disguised as genuine users) to distribute disinformation intended to affect the US presidential election has finally raised questions in the public mind about whether these companies might compromise national security.

…the Internet has exacerbated the risks of information warfare. Algorithms employed by a few large companies determine the results of our web searches, the posts and news stories that are featured in our social media feeds, and the advertisements to which we are exposed with a frequency greater than in any previous form of media. When disinformation or misleading information is fed into this machinery, it may have vast intended and unintended effects.”

For Shaw, the most concerning “intended effect” of weaponizing psychology has been its transplantation from initial military applications into the corporate and political sectors. Indeed, it's hardly conspiratorial to note that “psychological operations” (aka PSYOP) have been a tool of the US military and the CIA since the 1950s, applied in the name of winning over “hearts and minds” during military conflicts as well as in steering foreign elections in favor of democratic regimes and US interests. Nor can it be disputed that US presidential campaigns were harnessing the combined power of psychological influence and social media before Bannon and Trump. It was President Obama, who recruited a Social and Behavioral Sciences Team (SBST) to advise and direct his campaign efforts, that came to be dubbed “the first social media President.” The title of a 2012 article in The Atlantic — “Meet the Psychologists Who Convinced You to Vote for Obama” — speaks for itself. A subsequent 2017 article in The Atlantic suggests that President Obama was “too good” at social media, which “blinded him to technology’s dangers” and all but set the stage for the Trump campaign. Already, there are claims that little beyond personal bias allows us to condemn the use of information warfare by the Trump campaign, and by extension Russia, while praising the innovation of President Obama’s campaign, though Mike Masnick, writing for techdirt.com’s (Mis)Uses of Technology blog, notes:

“…there is one major difference between the Obama one and the Cambridge Analytica one, which involves the level of transparency. With the Obama campaign, people knew they were giving their data (and friend's data) to the cause of re-electing Obama. Cambridge Analytica got its data by having a Cambridge academic (who the new Guardian story revealed for the first time is also appointed to a position at St. Petersburg University) set up an app that was used to collect much of this data, and misled Facebook by telling them it was purely for academic purposes, when the reality is that it was setup and directly paid for by Cambridge Analytica with the intent of sucking up that data for Cambridge Analytica's database.”

Of course, the trouble with conspiracy theories is that every once in a while, they end up being true. In retrospect, this one seems obvious, hardly requiring a stretch of the imagination and lying just under our noses all this time. Make no mistake though, the “real conspiracy” — because there’s always a bigger picture in conspiracy theories — isn’t about Trump and Russia. It isn’t about one country, or one political party, or one corporation. It’s about the potential exploitation of cognitive biases as cognitive vulnerabilities at every level and in every sphere.

Though it’s cynical to say, it was probably inevitable that psychology, as a science set on understanding human behavior, would be applied not only to predict but also to manipulate that behavior. What couldn’t have been foreseen 50 years ago with the start of modern PSYOP, and what is just now coming into focus, is just how the internet has made that possible on a much larger scale and in a much less ridiculous way than, say, mass atmospheric spraying (aka “chemtrails”). Nor could it have been foreseen how a tool seemingly devised for beneficence could be applied for more nefarious purposes (realizing full well the quagmire of moral relativity in the realms of economics and politics, where one can forever debate the merits of democracy, capitalism, and globalism as the best models for the “greater good”).

Although Shaw draws a connection between the exploitation of cognitive vulnerabilities and the moral bankruptcy of psychology, we can hardly fault psychologists for revealing cognitive biases that are already there. And if there’s blame to be laid, we shouldn't point fingers at an inanimate entity like the internet either, but at those who exploit its power, glossing over ethical responsibilities related to autonomy, privacy, data protection, and informed consent. In the movie Terminator 2: Judgment Day, the “rise of the machines” can be traced back to the work of Miles Dyson, an engineer who develops an artificial intelligence called Skynet for a company called Cyberdyne Systems. In our current version of art-becomes-life, it’s not so much the machines that we have to worry about, but the people. In the coming years, Facebook founder Mark Zuckerberg could come to be known as a real-life Miles Dyson, the man responsible for unwittingly causing the downfall of humanity.

Chamath Palihapitiya seems to imply as much in his recent public apology for the unintended effects of his own role as former “vice-president of user growth” at Facebook:

“It literally is a point now where I think we have created tools that are ripping apart the social fabric of how society works. That is truly where we are. The short-term, dopamine-driven feedback loops that we have created are destroying how society works: no civil discourse, no cooperation, misinformation, mistruth. And it’s not an American problem. This is not about Russian ads. This is a global problem.”

“…Bad actors can now manipulate large swaths of people to do anything you want. And we compound the problem. We curate our lives around this perceived sense of perfection, because we get rewarded in these short-term signals — hearts, likes, thumbs up — and we conflate that with value and we conflate it with truth. And instead, what it is is fake, brittle popularity that’s short-term and leaves you even more, admit it, vacant and empty before you did it.”

In the end though, the worst and most insidious part about the conspiracy to hijack social media for the purposes of psychological warfare is that we’re all willing, or at least semi-willing, participants. We know that decades of television advertisements have biased our choices as consumers, but we still eagerly tune in to Super Bowl commercials. We’ve read that internet search engines offer a biased view of what’s out there in cyberspace and that online algorithms are designed to make us more prolific consumers, but we still go to Siri and Alexa for knowledge. We’ve come to accept that Russian “web brigades” and “troll farms” churn out social media "bots" that attempt to foment dissatisfaction with living in a multicultural democracy at every click, and we've recently been told that real human beings are 70% more likely to retweet falsehoods than facts, but we still rely on Facebook and Twitter as our main news sources. And now that we’ve learned how social media platforms are bypassing consent to access personal information and use it for agendas beyond our awareness and potentially contrary to our own on the scale of a presidential election, we still click on Facebook quizzes and submit photos of ourselves to apps that purport to analyze our ancestry or find our doppelgangers in fine art.

We do all of this because we tell ourselves the opposite of what psychologists like Kahneman have told us, holding onto our own intuition that we have unmitigated contra-causal free will and are immune to hidden forces that manipulate our behavior. We tell ourselves that the power of the internet, with its fake news and Russian bots, is limited.

In short, we’re in denial. On some level, we know that we should devote less time to debating anonymous strangers online and more time to face-to-face discourse and human interaction. Palihapitiya suggests that the path to salvation is to unplug and notes that he doesn’t allow his own children to use social media. But can we really unplug? Will we? Do we want to?
