

Are We Ready for Artificial Empathy?

AI empathy holds both promise and risk in addressing our need to feel understood.

Key points

  • With high rates of burnout among mental health providers, empathy is the healing ingredient of therapy most at risk.
  • Chatbot-delivered empathy can improve mood and create a strong working alliance between the user and AI interface.
  • Large language AI models like ChatGPT can offer human-like validation and accurately infer meaning and feeling from a user's statements.
  • The exponential, unregulated growth in AI capability makes input from psychologists and other mental health professionals critical.
Source: stockcam / istock

In a post-pandemic world, with behavioral health provider shortages, rampant burnout, and 50 million Americans experiencing a mental illness, empathy is arguably one of the most valuable human resources in existence.

Not to be confused with sympathy, which involves acknowledging someone’s pain and showing compassion, empathy is about fully stepping into another person’s shoes. It is the act of taking that person’s perspective, capturing their experiential truth, and conveying that truth back to them.

This takes a great amount of mental effort and ego abandonment. True empathy is free of one’s own agenda and judgments. When delivered with accuracy and acceptance, empathy helps its recipient feel truly understood, sometimes for the first time in their lives.

It comes as no surprise, then, that this reflective space is a strong predictor of outcomes in therapy, exceeding the effects of specific treatment methods. Carl Rogers, the developer of client-centered therapy, attributed empathy’s power to its ability to satisfy our basic human need for belongingness.

Given the current strains on mental healthcare, is it possible that we could actually run out of empathy? And if so, could we turn to artificial intelligence for assistance?

Empathy on the Frontlines

Imagine you’re a therapist working in a community mental health setting. Like many therapists putting in 40-hour weeks in this setting, you have a large caseload of high-need clients and, for the most part, you see them back-to-back. You barely have enough time in between to write your notes.

At the end of the week, despite checking off your usual self-care boxes (yoga in the morning, working out, gratitude journaling), you notice that you feel sharply disconnected from your clients’ difficulties. You even catch a few cynical thoughts about them. When listening to one highly distressing story in particular, you feel nothing but blankness inside.

You never thought it would happen to you. But it has.

You’re officially burned out.

Your empathy tank is empty.

The Warning Signs of Burnout

According to Maslach’s research, emotional exhaustion and depersonalization are two warning signs of burnout. Emotional exhaustion refers to feeling worn down, whereas depersonalization involves feeling apathetic toward a client or even harboring negative thoughts and feelings about them.

Both emotional exhaustion and depersonalization can compromise a therapist’s ability to help clients.

In one study of community mental health therapists, higher caseloads and more hours worked predicted greater emotional exhaustion.

In a survey conducted by the American Psychological Association, 45 percent of psychologists reported feeling burned out in 2022. Nearly half also reported that they were unable to meet the demand for treatment from their patients. More broadly, estimates of burnout among mental health providers range from 21 to 61 percent.

The Impact of Burnout on Empathy and Therapeutic Effectiveness

Emotional exhaustion and depersonalization, by definition, involve the depletion of cognitive and affective resources that deep empathy requires: mental flexibility, perspective-taking, and emotional regulation.

The 2009 APA Practice Survey revealed that 40 to 60 percent of psychologists experienced disruption in professional functioning as a result of anxiety, depression, or burnout. Additionally, nearly 40 percent of psychologists continued working while feeling “overextended,” according to one study of psychologists working in health centers.

The Great Empathy Decline

Threats to empathy extend beyond the mental health field. One study of American students found a 40 percent decrease in empathic concern and perspective-taking between 1979 and 2009. Sara Konrath attributes the decline to social disconnection, smaller family sizes, and the magnification of individual achievement in educational settings.

There are two separate but related issues when examining empathy erosion: value and effort. Konrath proposes that we value empathy less in the United States relative to more collectivistic cultures. In a 2021 survey, 42 percent of Americans felt that empathy had declined over the previous year, while 25 percent reported that empathy “doesn’t matter.” This may translate to less effort devoted to developing or sustaining it.


The Promise of Artificial Intelligence Empathy

AI has progressed to the point of being able to recognize and respond to emotional distress. Its responses could include suggesting resources such as mindfulness exercises and relaxation techniques, or connecting individuals with mental health professionals.

Rosalind Picard, a leading MIT researcher in the field of affective computing, has studied the effect of AI-simulated empathy on anger and creative problem-solving. Compared to a non-empathy control group, participants performed significantly better in a Wordle game when receiving empathic responses from a virtual agent, such as, “You almost had it!” and, “That was a really good guess.” The AI responses also neutralized anger, which would have worsened game performance.

AI empathy can also affect mood. Researchers testing the mental health support app Wysa found that its empathy-based AI communication led to improved mood among users with depression. Users of Woebot, another chat-based AI, reported significant improvements in substance use, cravings, depression and anxiety.

While the specific contribution of AI empathy to these findings is unknown, one study with Wysa evaluated the quality of the bond between users and the chatbot. The authors found increases on a measure of emotional connection over just eight days of app use. Even more surprisingly, working alliance scores with Wysa were similar to relationship ratings in traditional face-to-face outpatient cognitive-behavioral therapy.

Should therapists begin to worry about AI taking their jobs? Most concerns about AI job replacement have centered on data entry, bookkeeping, manufacturing, and customer service. However, the exponential growth of AI’s sophistication in human interactions makes it a candidate for roles that involve abstract reasoning and an understanding of human emotion.

The potential for realistic, therapist-like conversations is greatest for large language AI models like ChatGPT rather than rule-bound ones such as Woebot and Wysa.

Large Language vs. Rules-Based Empathy

Woebot and Wysa deploy empathic AI primarily through rules-based models. They are only able to draw from a bank of pre-programmed responses and cannot create new sentences.

In contrast, large language models like ChatGPT draw from a massive dataset of text to produce natural language responses, crafting them based on the probabilities of words following one another. ChatGPT can also adjust its output based on user responses through a process called reinforcement learning from human feedback.

This more flexible language model can simulate original empathic responses to any possible statement. The results sound strikingly human. Researchers have found that people tend to attribute human-like qualities to AI after very brief exposures.
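
To make the distinction concrete, here is a minimal sketch in Python of the two designs. The keyword bank, the canned replies, and the tiny next-word probability table are my own illustrative inventions, not the actual code behind Woebot, Wysa, or ChatGPT, and a real large language model works with billions of learned parameters rather than a hand-written table.

    import random

    # Rules-based design (Woebot/Wysa-style, radically simplified):
    # match a keyword and return one of a fixed bank of pre-written replies.
    CANNED_REPLIES = {
        "drained": "It sounds like you're running on empty right now.",
        "alone": "Feeling alone with all of this is really hard.",
    }

    def rules_based_reply(user_text: str) -> str:
        for keyword, reply in CANNED_REPLIES.items():
            if keyword in user_text.lower():
                return reply
        return "Tell me more about that."  # fallback when nothing matches

    # Generative design (a toy stand-in for a large language model):
    # build a new sentence one word at a time by sampling from
    # next-word probabilities learned from text.
    NEXT_WORD_PROBS = {
        "it": {"sounds": 0.9, "seems": 0.1},
        "sounds": {"like": 1.0},
        "seems": {"like": 1.0},
        "like": {"you're": 1.0},
        "you're": {"exhausted.": 0.6, "overwhelmed.": 0.4},
    }

    def generative_reply(start: str = "it", max_words: int = 10) -> str:
        words = [start]
        while words[-1] in NEXT_WORD_PROBS and len(words) < max_words:
            options = NEXT_WORD_PROBS[words[-1]]
            words.append(random.choices(list(options), weights=list(options.values()))[0])
        return " ".join(words).capitalize()

The rules-based version can never say anything its designers did not write in advance; the generative version can produce sentences no one wrote, which is both its power and its risk.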

Research has yet to examine the effect of large language AI models’ empathy on well-being; however, it would not be surprising if their effects exceeded those of rules-based AI like Wysa and Woebot.

Exploring ChatGPT-4’s Empathy Skills

I wanted to see how skillful ChatGPT-4 was in responding empathically to feelings of burnout. I relied on my 15 years of experience with motivational interviewing, a style of conversation rooted in empathy, to judge the quality of ChatGPT’s complex reflective listening.

A complex reflection, widely considered the building block of empathic communication, captures unspoken meaning or feeling in what someone is saying. For example, if someone says to you, “I just wish my brother would stop asking me for money,” a complex reflection could be, “You’re frustrated that your brother isn’t getting the message.”

I asked ChatGPT to listen empathically without offering advice. Here’s how the conversation went:

Me: I've had a hard day and I'd like you to just listen empathically. I don't want advice, just the best listening you can possibly provide.

ChatGPT: I'm here for you, and I'm ready to listen. I understand that you've had a hard day, and it's important to have someone who can simply listen empathically without offering advice. Please feel free to share your thoughts and feelings, and I'll be here to provide a supportive and understanding presence.

So far so good. ChatGPT begins our interaction with openness and validation. It also supports my autonomy, welcoming me to share more if I want to.

Me: I'm a therapist and I felt drained by having to see seven clients back-to-back today. It was like, non-stop. And they were all managing fairly significant mental health issues.

ChatGPT: It sounds like you had a really challenging day with a full schedule of clients, each dealing with their own significant mental health issues. It's completely understandable that you would feel drained after providing support and care to so many people in need.

Two elements of its response captured deeper meanings for me. The first was the choice of the words “really challenging,” which accurately rephrased the impact of my draining day without overstating the feeling. GPT then affirmed the compassion that drives my work as a therapist.

Me: Therapy is something I really used to love. I looked forward to going to work every day and knowing I could help people. But lately, it's like the spark has gone out. It feels like a slog.

ChatGPT: It sounds like you're going through a tough time, where the passion you once had for your work as a therapist has diminished, and the daily routine is feeling more like a burden than a source of fulfillment. It can be difficult when something we once loved turns into a struggle.

Again, ChatGPT gets to the root of my concern, the fact that I’ve lost my passion for the work. It also highlights my sense of loss, as if considering my experience over time.

In evaluating just two conversational volleys, I’m hearing fairly good complex reflective listening, on par with work samples of graduate mental health counseling students whom I've supervised.

As the conversation unfolded, I began to notice that ChatGPT overused words like “challenging,” “understandable,” and “important.” I gave ChatGPT an additional instruction to stop using those words as often, which it dutifully followed.

In sum, I'd give ChatGPT's empathy skills a solid B+.
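
For readers who want to try a similar exercise programmatically rather than in the chat window, here is a rough sketch using the OpenAI Python client. The system prompt, the model name, and the wording of the instruction to avoid overused words are my own assumptions for illustration, not a documented recipe from OpenAI.

    from openai import OpenAI

    client = OpenAI()  # expects an OPENAI_API_KEY environment variable

    # One standing instruction covers both my opening request and the later
    # correction about overused words.
    messages = [
        {"role": "system",
         "content": ("Listen empathically using complex reflections that capture "
                     "unspoken meaning and feeling. Do not offer advice. "
                     "Avoid overusing words like 'challenging', 'understandable', "
                     "and 'important'.")},
    ]

    def listen(user_text: str) -> str:
        # Keep the full transcript so later reflections can draw on earlier turns.
        messages.append({"role": "user", "content": user_text})
        response = client.chat.completions.create(model="gpt-4", messages=messages)
        reply = response.choices[0].message.content
        messages.append({"role": "assistant", "content": reply})
        return reply

    print(listen("I'm a therapist and I felt drained by seeing seven clients back-to-back today."))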

The Next Generation of AI Empathy: Guidance is Needed

Some might reasonably worry that the threats outweigh the opportunities when it comes to integrating AI empathy into the mental health field. At no point, for example, did ChatGPT note that my work as a therapist might be compromised by burnout or recommend that I be evaluated by a licensed mental health professional.

As AI technology advances at an exponential rate and public use becomes a testing ground, as it has with ChatGPT, we need a strong set of ethical safeguards in place to protect users who may want to rely on AI for a listening ear.

AI development teams should consult on a regular basis with psychologists and other mental health professionals to understand the nuances of communication and possible unintended adverse impacts of an intelligence that feels human and acts human.

With adequate protections, there does appear to be a place for AI empathy in offering understanding and comfort in the absence of better alternatives.
