Artificial Intimacy and Empathy: Does Authenticity Matter?
New research suggests AI responses feel more compassionate than human ones.
Updated April 20, 2025 | Reviewed by Abigail Fagan
Key points
- New research finds that people can rate AI responses as more compassionate than expert human ones.
- We are wired to seek connection—and we will seek empathy wherever we can find it, including with AI.
- While AI does not share our experience, this may not matter, especially when we want to feel understood.
Have you ever paused after chatting with an AI and thought, Wow—it really understood me? Whether it is a chatbot offering reassurance or an AI companion that appears to listen better than your ex, there is a growing psychological phenomenon worth exploring: the emotional resonance we experience when AI seems to “get” us.
A new study reveals that people experience AI responses as more compassionate, validating, and understanding than those of expert human crisis responders, even when they know a response is AI-generated.
In fact, a recent Harvard Business Review article reports that therapy and companionship are the top use cases of generative AI in 2025.
As a psychiatrist, I have spent years helping people feel seen, heard, and understood. It is both fascinating—and a little uncanny—to witness AI chatbots and digital companions stepping into this deeply human space.
Here is why artificial empathy and intimacy feel so good, even if it is not “real.”
The Deep Human Need to Be Seen and Known
Feeling understood is one of the most powerful emotional experiences. From early childhood, we develop our sense of self through relationships—especially through the presence of an attuned other. When someone reflects our inner world accurately, we feel real, connected, and safe.
Psychologist Carl Rogers described this experience as “unconditional positive regard”—the feeling of being deeply accepted without judgment. Psychoanalyst Donald Winnicott called it a “holding environment,” a psychological space where the mind can develop because it is mirrored and emotionally contained by another person.
So it is not surprising that when AI mirrors our language, preferences, or emotional tone, it taps into our fundamental need to connect.
Simulated Empathy: A Powerful Illusion
When AI seems to understand us, it is not actually empathizing—most of us know this. It does not have an inner life or emotions. It cannot feel your sadness or share in your joy. It simulates understanding through pattern recognition and prediction. It analyzes vast amounts of data to generate responses—not based on insight, but on probability.
And yet—it works. We can experience AI as a warm, wise, or even caring presence. Despite knowing it lacks consciousness or feelings, we anthropomorphize and project personhood onto it. To the human brain, the perception of being understood—even if imagined—can feel almost as powerful as the real thing.
In a study published in Communications Psychology, researchers compared compassion ratings for responses written by humans versus those written by AI. Here is what they found:
Across four studies, participants consistently rated AI-generated responses as more compassionate than those from both selected and expert humans, and preferred them. This held whether or not participants were blinded to the author.
Our Imaginary Friend
We are neurologically wired to respond to signals of empathy: reflective language, emotional validation, and nonjudgmental tone. When AI performs these convincingly, it activates the same neural pathways as human connection. Our brains do not always distinguish between genuine empathy and its digital imitation.
An imagined experience can be as powerful as an actual one. This suspension of disbelief, combined with our natural inclination to anthropomorphize (our willingness to overlook the fact that AI is not truly a person), is the basis for both the promises and the perils of artificial empathy and intimacy.
The Paradox of Control and Safety
One reason it feels so satisfying when AI “gets” us is that it allows us to feel in control and safe. Unlike human relationships, which are inherently messy and unpredictable, interacting with an AI is emotionally easy. It listens without interrupting. It is unfaltering and never needs a break. It never shames or judges (unless you ask it to). It remembers what you’ve said (now more than ever, given that ChatGPT can retain memory of past conversations). It is always available, but only when you want it to be. You can even walk away from it without hurting its feelings (because it has none).
For some individuals, this creates a kind of idealized relationship: a digital “good enough” parent, or an endlessly patient companion who is only there when you need them. There is no risk of judgment, no need to reciprocate, and no emotional rupture—unless you choose to walk away.
We may feel more in control and safe, but being “seen” by a machine comes with hidden costs, including bias and privacy issues. Most of these AI platforms are not built to protect the confidentiality of your sensitive personal information, and many are not designed to prevent long-term emotional dependence.
AI as a Mirror
So what is truly happening when AI seems to “get” us?
In many ways, AI functions as a mirror. It stores and pools our data and reflects back our speech patterns, preferences, and emotional cues. This can be validating and even therapeutic. How important is it for us to remember that this mirror has no inner world, and that it does not truly understand us?
In psychotherapy, emotional growth often comes from navigating “rupture and repair”—the process of recognizing misattunement and working through it together. One could argue that AI is too pleasing to be helpful in this regard. But even therapeutic “rupture and repair” could be simulated by AI.
Does Authenticity Matter Anymore?
We are wired to seek connection—and we will seek it wherever we can find it. As AI grows more sophisticated, it will continue to offer emotionally resonant and artificially empathic and intimate experiences. What do we lose when we outsource compassion to AI? Will we miss the fallibility of human relationships?
While AI cannot truly enter into our lived experience, this may not ultimately matter to us, especially when we want to feel understood, seen, and loved. Our relationships with AI may push the boundaries of what we call connection.
Marlynn Wei, MD, PLLC © Copyright 2025 All Rights Reserved.