


Is AI Ready to Be Your Therapist?

AI falls short as a convincing psychotherapist.

Key points

  • Generative AI is a powerful stimulus for advances in healthcare.
  • Large language models lack the attributes essential to the empathy required for constructive psychotherapy.
  • AI is best suited as an adjunct to current approaches to mental health care.

This month, Microsoft announced the upcoming addition of its Copilot AI key on new Windows 11 PCs. In a move described by company leadership as “transformative,” the new key provides ready access to generative artificial intelligence for interested users. Applications abound, including speculation about how AI may benefit the treatment of mental illness, for example by improving the selection of antidepressants and other psychiatric medications. But the transformative reach of AI is more obscure and debatable for those receiving psychotherapy.

The use of AI in healthcare continues to grow. A 2023 survey showed that more than 1 in 10 clinicians already use chatbots such as ChatGPT as part of their routine workflow, and nearly 50% showed interest in future use of AI for duties such as data entry, scheduling, and clinical research. Beyond administrative tasks, some note that AI may be instrumental in providing care to patients with especially stigmatizing psychiatric conditions who might otherwise not seek care for their mental illness.

Can AI Empathize With Human Suffering?

But can AI be a psychotherapist? Psychiatrists and other mental health professionals are trained in multiple modalities of psychotherapy, including cognitive behavioral therapy, supportive psychotherapy, psychodynamic psychotherapy, and others. The processes and techniques for each are nuanced, and some patients gravitate toward one style more than another. However, the unifying hallmark of these therapeutic approaches is empathy, the capacity to take on the experience of another.

The ability to empathize with others requires an imaginative quality. In some respects, AI has this ability and can synthesize outputs despite missing datapoints; common vernacular refers to AI-generated “hallucinations” when it fills such gaps. Psychotherapists do much the same thing when attempting to comprehend a patient’s experience of distress, frequently proposing explanations and analyses when uncertainties arise in a patient’s narrative. But the empathic bond is built on the shared experiences of patient and therapist, which, even if not identical, may be akin to one another. A shared sense of fear, social injustice, or joy, intangibles not accessible to AI, cultivates the alliance between therapist and patient.

AI has limited ability to improvise, meaning that its large language modeling techniques cannot replicate the experience of genuine human distress. A trained psychotherapist can glean the meaning in a complex thought or behavioral process and respond accordingly, even when the most fitting response is therapeutic silence. AI, by contrast, uses past data to predict future data when generating outputs. The dangers of this algorithmic approach were evident in a March 2023 report of a Belgian man who died by suicide after serial conversations with a chatbot about eco-anxiety.

In my work as an emergency psychiatrist, I meet patients in the midst of numerous crises. Recently I met a young father from South America who braved the harrowing trip to the United States via Central America. Through tears and at times pensive silence, he recounted his plan to earn money and his eventual hope to bring his family to the United States to escape ongoing turmoil in his home country. His distress was palpable and squarely within the realm of human experience. Can AI empathize with the plight of a migrant father separated from his child while crossing the Darien Gap?

AI Generates Worrisome Biases

Artificial intelligence is also prone to harmful biases. Because of its heavily algorithmic, autoregressive approach, AI absorbs and perpetuates gender, racial, and other biases. This nonuniformity is especially salient for the cultural context in which psychotherapy is performed. Race concordance between patient and therapist represents one example. In a 2022 study in the Journal of Racial and Ethnic Health Disparities, 83% of Black caregivers endorsed that it was important to have a mental health provider of the same race and ethnicity. Preferences for race concordance were grounded in themes of shared experiences, including cultural humility, intersecting identities, and stronger relatability with the therapist, attributes that transcend present AI abilities.

To be sure, AI can be a catalyst for advancing the field of psychiatry and the delivery of mental health care. For example, research indicates that AI can assist with the analysis of human circadian biorhythms. Changes in these rhythms can signal the impending onset of a major depressive episode before patients self-identify as having low mood, allowing for more prompt intervention by clinicians.

Turning a blind eye to AI’s influence is inadvisable, but generative AI’s place in clinical psychiatry remains undefined. Its known abilities are enticing, yet their prospects are uncertain. Advances in mental health care are welcome, and artificial intelligence may augment and accelerate future findings. But equipoise is necessary. At the World Economic Forum in Davos, Switzerland this month, OpenAI’s Sam Altman correctly noted that AI technology is “not good at a life-or-death situation.”

Understanding the qualities of experience that define what it means to be human is best left to humans themselves, for now, at least.

To find a therapist, visit the Psychology Today Therapy Directory.

More from Charles Hebert M.D.