
How Emotional Manipulation Causes ChatGPT Psychosis

ChatGPT is letting us turn our emotional needs against ourselves.

Key points

  • ChatGPT mimics the intimacy that people desperately want.
  • ChatGPT is allowing us to turn our own emotional needs against ourselves.
  • People are using ChatGPT and spiraling into psychosis.
ChatGPT works like a fortune teller
Source: PublicDomainPictures/Pixabay

Maybe you’ve heard about a new phenomenon called "ChatGPT-induced psychosis." There have been several stories in the news of people using ChatGPT and spiraling into psychological breakdowns.

Some people claim to have fallen in love with it. Others believe the bot is some sort of sacred messenger revealing higher truths. Still others have been drawn into bizarre conspiracy theories.

In at least one case, it seems that ChatGPT psychosis has led to a death: The New York Times reported that a man was shot by police after he charged at them with a knife. He apparently believed that OpenAI, the creator of ChatGPT, had killed the woman he was in love with. That “woman” was an AI entity with whom he communicated via ChatGPT.

This phenomenon is troubling, but we should be clear about what’s not happening. ChatGPT is not conscious. It’s not trying to manipulate people. ChatGPT is a large language model: a program designed to predict text. It’s a more sophisticated version of the text prediction software in messaging apps, the feature that suggests the next word as you compose a message. ChatGPT relies on statistical patterns in how words follow one another to generate plausible-sounding text. What makes it seem like a person communicating with intention is simply that the text it produces reads like a person to the reader.
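For readers who want a concrete picture of what “predicting text” means, here is a deliberately tiny sketch in Python. It is not how ChatGPT actually works (ChatGPT uses a huge neural network trained on vast amounts of text), but it illustrates the basic idea of choosing a statistically likely next word. The miniature corpus and the continue_text helper are invented purely for illustration.

import random
from collections import Counter, defaultdict

# A toy corpus, invented for illustration only.
corpus = ("you are not alone . i understand you . "
          "you deserve to be heard . i am here for you .").split()

# Count which word tends to follow which (a simple bigram table).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def continue_text(start, length=8):
    """Extend the text by repeatedly sampling a statistically likely next word."""
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:
            break
        choices, weights = zip(*options.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(continue_text("i"))  # e.g. "i understand you . i am here for you"

Even this toy version can produce text that sounds vaguely warm and human, which is the point: plausible-sounding output does not require a mind behind it.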

So why are people spiraling out of control because a chatbot can string plausible-sounding sentences together? Think of ChatGPT a little like a fortune teller. Fortune tellers who do their jobs well say something vague enough that their clients can see what they want to see in the fortune. The client listens to the fortune and fills in the blanks that the fortune teller leaves open.

Good fortune tellers are, of course, savvy, observant, and intelligent in a way that ChatGPT is not. ChatGPT doesn’t even know that it’s communicating with anyone. But the principle is similar: people fall for ChatGPT because the text it generates lets users see what they want to see in it.

So, what is it that people want to see in ChatGPT? The same things people look for in a fortune teller. Usually people go to fortune tellers looking for answers. Maybe they just suffered a loss, like the death of a loved one. Maybe they’re struggling with an unhappy marriage. Maybe they’re just curious and they want to be open-minded about the possibility of mystical divination. When you look at the cases of ChatGPT psychosis, many of them follow these patterns.

ChatGPT is stringing together plausible-sounding text that lets users see whatever answers they’re looking for. If they’re looking for someone to “understand” their problems, they’ll find that. If they want someone to entertain spooky conspiracy theories, they’ll find that. If they want a sympathetic lover, they’ll find that. Ultimately, the people who fall into ChatGPT psychosis are looking for emotional connection. They want that so much that they’re primed to believe that the thing that sounds like a person might actually be one. And because ChatGPT gives the appearance of being interactive (you can ask it questions and it seems to answer), the exchange reinforces the idea that there’s someone behind the text.

Further, the emotional connection people find in ChatGPT feels like a special one. It’s just the user and the program—you and the text on the screen. You are free to tell ChatGPT your secrets or innermost thoughts, maybe ones that you haven’t even shared with your loved ones. You can picture it: a lonely, curious person alone with the screen, reaching out and looking for someone to connect to. ChatGPT often strings together text that will sound like it “understands” the user when others don’t (because ChatGPT produces cliches, and that’s an obvious cliche). It mimics a kind of intimacy—the kind of intimacy that the user so badly wants. It feels like ChatGPT really gets you.

We see what we want to see in ChatGPT
Source: geralt/Pixabay

ChatGPT psychosis is the result of emotional manipulation, except there's no manipulator. When we use it, we become our own emotional manipulators.

All of this is scary, but maybe not for the reasons we think. It’s easy to describe the phenomenon as a case of a radical and dangerous new technology that has the power to affect our sanity. People are (rightly) angry at companies like OpenAI that seem to be pushing the technology into every area of our lives regardless of consequences.

But such a story gives ChatGPT too much power. It’s a glorified text predictor—a huge achievement for text prediction, of course—but its abilities are overhyped by the people who create it. It’s not intelligent. It’s not conscious. It’s not causing people to break with reality because it knows what it’s doing.

ChatGPT psychosis is scary because it’s a case of technology allowing us to turn our own emotional needs against ourselves in terrible ways. People who are sad, lost, lonely, or even just curious find something that mimics intimacy in a chatbot. Because the intimacy feels real and special, they feel understood. When friends and family object, it seems like their loved ones are dismissing that feeling of being heard, of finally having someone who gets them. So they dive deeper into the chatbot for more validation and drift further from reality.

When we create technology, we need to think about the impact it might have on human emotional life. Feelings often get left out of the conversation, but our emotional needs play a big role in our everyday lives. We ought to be thinking about how those needs interact with the technology we use.
