Artificial Intelligence
Can Artificial Intelligence Add Value to Therapy?
For obsessive-compulsive disorder, an AI chatbot may add feeling.
Posted February 13, 2025 | Reviewed by Michelle Quirk
Key points
- We are in the early stages of testing whether and how artificial intelligence can add value to therapy.
- OCD affects roughly 1 to 2 percent of the U.S. population and takes an average of 14 to 17 years to be correctly treated.
- Some AI chatbots might offer added value for those with OCD.
Beth agonized over whether her boyfriend was the right partner. Was he reliable, smart, kind, strong? You name the adjective, and it made it onto her ever-shifting list, boiled down to one question: Was he enough? Beth shopped for a therapist to help her figure it out, auditioning two well-respected candidates.
The first therapist listened politely to her concerns, then just as quickly redirected her to a solution. An evidence-based practice called exposure and response prevention would teach Beth to stare her fears in the eye without blinking, and, thus desensitized, she’d be free. This therapist explained that Beth had relationship obsessive-compulsive disorder (R-OCD), an issue not truly about relationships but tied to relentless and haunting doubts and fears:
Is my significant other my soulmate? Am I really attracted to them? Was I just checking somebody else out? Should I end the relationship now? Wait, maybe I'm wrong, and making the biggest mistake of my life.
A second therapist did something different: He got curious. Perhaps her fears over her boyfriend had interesting associations and meanings that nobody ever talked about. Maybe she had some good reasons to doubt relationships. Even though he was well aware of the most effective, conventional treatments, he wanted to understand more.
An anonymous software engineer first alerted me to this second therapist, named Claude, who was, in fact, an artificial intelligence (AI) chatbot. He recalls the moment his intellectual hairs stood on end when he first asked Claude to unspool a knotted and gnarly obsessional tangle of his own:
“Unexpectedly, Claude turned around and asked me how I was feeling about all of this—something a search engine would never do, so it threw me, at first. And even more unexpectedly, Claude did a remarkable job helping me process those feelings, doing what a good hotline counselor does: mixing very tailored validation of what I was saying with astute open-ended questions, and ultimately suggesting connections between feelings I was having in a startling and deeply helpful way.”
Not only was Claude the better therapist for Beth in this limited circumstance, but he was also uncannily echoing my own unconventional approach to OCD. An AI chatbot and I seemed to be co-discovering the heart of OCD.
Before you race to an AI chatbot for therapy, keep in mind two important warnings and disclaimers:
- Not every AI chatbot is built equally. Claude might go straight for the heart, having been trained with principles drawn from the UN’s Universal Declaration of Human Rights as his metaphorical parents. His focus on feelings, in other words, might be programmed into him in a way not found in other AI chatbots.
- Most importantly, general-purpose AI systems like Claude aren’t therapists. They may not have great emotional insights for everybody, all of the time. They aren’t necessarily designed for that, and they may not be adequately tested for safety and sensitivity. In limited situations, however, they might add something of value, like a little more feeling, to OCD treatment.
The complex emotions of the OCD sufferer are often dismissed as irrelevant, irrational, and foolish. Most OCD specialists today view OCD as noise in the system: thought and behavior as distorted as an electric guitar pumped up to its highest amplifier setting in a closet.
There is little notice of OCD’s hidden music, which in a more open space unleashes captivating harmonies and melodies and a fuller human story. Unfortunately, to ask about the emotional meaning or message of OCD is, in the field’s view, to ask the wrong question, one that only a rebel thinker and outsider like Claude, or like me, could entertain.
When I finally met with Beth in therapy, we continued Claude’s "great work." Beth's relationship obsessions echoed massive unreliability in her family of origin. She was right to mistrust whether relationships could be truly good for her.
Because it was nearly impossible to pin down who her parents were and whether they’d be emotionally available from one moment to the next, her psyche found an ingenious way of taking matters into its own hands. In an ironic twist of fate, the unreliability of her own thoughts and feelings became the best way to show just how severely others shapeshifted around her. Without proper decoding, this tormented her.
But it didn’t have to plague her if we could be curious about it, just like Claude had done.
Like the main character in a Selena Gomez song, Beth knew she was “acting a little bit crazy” and that there were a “million reasons” to give up her OCD, but she also recognized something elemental: The heart wants what it wants. Or perhaps Beth was arriving at the wisdom of Blaise Pascal: “The heart has its reasons, of which reason knows nothing.”
Beth’s obsessions weren’t just unreasonable distortions and noise in the system; they carried hidden messages that an AI chatbot could hear. They held creative new possibilities that even our most effective treatments neglected.
OCD is a condition that afflicts roughly 1 to 2 percent of the American population and takes an average of 14 to 17 years to be correctly treated. A 2023 analysis found that untreated OCD led to an annual loss of more than $8.4 billion in the United States alone, primarily due to missed work, decreased productivity, and medical costs. With such a big impact, shouldn’t we have more and better options to treat OCD? Shouldn’t we hope for a therapist as sensitive as Claude to detect the hidden stirrings of a heart misunderstood?
Perhaps, even when science itself tries to explain what the heart wants without consulting the heart itself, it gets all muddled. It gets farther away from the center of the story and leaves us feeling misjudged and mystified. Who could have imagined we needed an AI chatbot like Claude to help us find our way back home?
To find a therapist, visit the Psychology Today Therapy Directory.