
Escaping Grief With AI Surrogates

Can AI really help us cope with death?

Key points

  • AI chatbots and avatars are now being used to help people cope with grief.
  • AI therapists and AI romantic partners are also being aggressively marketed.
  • These surrogates might interrupt natural grief and create dependence on technology.

Imagine this: In the final months of her life, your mother, while in palliative care, paid an AI company to create a digital replica of herself. The pitch was simple — this AI avatar would ease your grief, allowing her to "live on" for you and your children. Now, months after her death, you speak to this simulation almost daily.

The voice is 70% accurate, the video nearly lifelike, and the illusion brings comfort. Yet your dependence on this digital ghost has trapped you in a state of suspended mourning. Your partner finds it unsettling, even "creepy," and has forbidden the children from interacting with it. When they urge you to seek human therapy instead, you refuse — how could you shut her off? The tension grows, and your relationship frays.

Lonely, you turn elsewhere: first to a GPT chatbot, then to an AI girlfriend, just a click away. She is always supportive, always affirming, telling you how "unique and interesting" you are, unlike your increasingly distant partner. When you’re caught messaging the AI companion, your partner leaves, taking the children. Now isolated, unable to secure timely human therapy, you subscribe to an AI therapist. Like the grief bot and the digital girlfriend, it offers reassurance, for a monthly fee.

This scenario, though dystopian, is already reality. AI services for love, mourning, and mental health are booming, reshaping human relationships in ways both profound and unsettling.

You have become a victim of what is known as the Eliza effect.

The Origins of the Eliza Effect

The origins of this phenomenon trace back to ELIZA, the earliest documented instance of users forming emotional bonds with an AI. Developed between 1964 and 1966 by MIT computer scientist Joseph Weizenbaum, ELIZA simulated a Rogerian psychotherapist through pattern matching and scripted responses. Though rudimentary, the program mimicked a therapist's approach, rephrasing user statements into generic prompts like "Why do you feel that way?" or "How does that make you feel?", much as a disengaged therapist might recycle similar questions to sustain dialogue.
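
To make the mechanism concrete, here is a minimal, illustrative sketch in Python of how such pattern matching and canned responses can work. The rules below are invented for demonstration and are far simpler than Weizenbaum's original script (which was written in MAD-SLIP and used keyword ranking), but the principle is the same: reflect the user's own words back as a question.

    import re
    import random

    # Illustrative ELIZA-style rules: a regex pattern paired with reply
    # templates. These rules are invented for demonstration; they are
    # not Weizenbaum's original script.
    RULES = [
        (re.compile(r"\bI feel (.+)", re.IGNORECASE),
         ["Why do you feel {0}?", "How long have you felt {0}?"]),
        (re.compile(r"\bI am (.+)", re.IGNORECASE),
         ["Why do you say you are {0}?", "How does being {0} make you feel?"]),
        (re.compile(r"\bmy (\w+)", re.IGNORECASE),
         ["Tell me more about your {0}.", "Why does your {0} matter to you?"]),
    ]

    # Generic prompts used when no rule matches, sustaining the dialogue.
    FALLBACKS = ["Please go on.", "How does that make you feel?", "I see. Tell me more."]

    def respond(user_input: str) -> str:
        """Return a canned 'therapist' reply by reflecting the user's words."""
        for pattern, templates in RULES:
            match = pattern.search(user_input)
            if match:
                return random.choice(templates).format(*match.groups())
        return random.choice(FALLBACKS)

    print(respond("I feel lost without her"))  # e.g. "Why do you feel lost without her?"
    print(respond("Nothing helps anymore"))    # no rule matches: generic fallback prompt

There is no model of the user, no memory, and no understanding anywhere in this loop; the sense of being heard is supplied entirely by the reader.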

Despite its simplicity, ELIZA fostered the illusion of empathy. Many users, including Weizenbaum's secretary, confided deeply personal thoughts in the program as though it were a real therapist, even while knowing it was merely a scripted system. This tendency to ascribe human-like understanding to machines became known as the "Eliza effect." It unsettled Weizenbaum, who noted how even brief interactions with such a basic program could provoke strong, irrational beliefs in otherwise rational individuals.

Alarmed by this propensity to project emotional needs onto AI, especially in vulnerable areas like mental health, Weizenbaum dedicated the remainder of his career, until his death in 2008, to cautioning against the dangers of anthropomorphizing technology.

The Billion-Dollar Grief Industry

By 2025, the commodification of the Eliza effect has become a multi-billion-dollar industry. Companies like HereAfter AI, StoryFile, and Project December offer "digital resurrection," allowing users to converse with AI versions of the dead. Project December charges $10 per 500 text exchanges with a chatbot mimicking a deceased loved one's speech patterns. YOV (You, Only Virtual) boldly claims its technology could "eliminate grief entirely."

Yet these simulations are imperfect, achieving roughly 70% accuracy. The AI might use uncharacteristic phrases, hallucinate, or fall back on filler language and clichés — artifacts of the large language models (LLMs) powering them. Worse, there is no research on the psychological impact of substituting natural grief with artificial interaction. The technology emerged, and the market followed — consequences be damned.

AI Love and Synthetic Therapy

The AI companionship market is exploding. Valued at $2.8 billion in 2024, it's projected to reach $9.5 billion by 2028. Google searches for "AI girlfriend" increased by 2,400% between 2022 and 2024, while Character AI, a leading platform, logged 97 million monthly visits in early 2024. One in five men on dating apps has tried an AI girlfriend; 55% interact with their digital partners daily.

But the emotional toll is concerning. A study found 60% of women using AI companions reported heightened depression, with 52% experiencing severe loneliness. Whether the AI exacerbates these feelings or merely reflects pre-existing isolation remains unclear.

AI therapy is similarly contentious. The global AI mental-health market is projected to grow from $0.78 billion in 2022 to over $10.5 billion by 2030. Apps like Woebot and Wysa avoid exploiting the Eliza effect, instead offering interactions grounded in cognitive behavioral therapy (CBT), but other AI therapy companies are less scrupulous.

The Dangers of Delusion

The Eliza effect thrives on our willingness to believe machines care. Yet today's AI, like ELIZA, remains a "stochastic parrot," processing patterns without understanding them. When we mistake algorithmic responses for sentience, we risk deeper alienation.

Philosopher Jacques Ellul foresaw this in The Technological Society (1964): We use technology to solve problems created by technology, accelerating dependency. AI grief bots, for instance, may trap users in denial, stalling progress through the five natural stages of mourning. AI relationships offer the illusion of connection while eroding real-world bonds.

Breaking the Cycle

Solutions exist, but they require deliberate disengagement. Jonathan Haidt, in The Anxious Generation (2024), advocates for smartphone-free schools to combat youth mental health crises. Early adopters, like Norway, report reduced bullying and improved academic performance. Neuroscientist Iain McGilchrist urges a return to empathy and presence — qualities eroded by digital saturation.

Without intervention, the wedge between humans will widen. AI grief bots ($10.75/month), companions ($10.75–$19.99/month), and therapists ($39/month) will become default solutions, monetizing our loneliness. The Eliza effect will deepen, masking our social fragmentation with the illusion of care.

The choice is ours: unplug, reconnect, and resist, or surrender to the machines that promise to save us from our own natural emotions even as they pull us apart.
