
AI Could Make Fools of Us All

Chatbots might not be helping us; instead they could be rotting our brains.

Key points

  • Studies suggest that AI negatively affects critical thinking.
  • AI may also affect confidence and skill development.
  • AI therapy is on the rise with younger people.
Source: Pixabay / Scholaris

In the past month alone, I’ve had one person self-diagnose their symptoms via ChatGPT before we even had our first session – they then read that diagnosis out to me (it was pretty spot on, to be fair) – whilst two others presented therapeutic assignments that were clearly completed via AI rather than by thinking things through for themselves. The consensus seemed to be, “I use it at work, so I thought I could use it here.”

However, according to a new study, rather than supporting our endeavours (at work, in therapy or, for that matter, anywhere), AI appears to be having a negative effect on our critical thinking skills and practices. Not only that, but it can also knock our confidence and mess with our memories.

The research authors surveyed 319 ‘knowledge workers’ to investigate their use of GenAI and the effects it had on them in several key areas.

They found that higher confidence in GenAI was associated with less critical thinking, and that higher self-confidence was associated with more critical thinking. The authors stated that “qualitatively, GenAI shifts the nature of critical thinking toward information verification, response integration, and task stewardship.”

They also found that whilst AI tools (including Copilot and ChatGPT) can boost writing productivity by assisting with tasks such as content generation, idea creation, and stylistic editing, an overreliance could impair long-term skills development by bypassing critical writing processes such as constructing logical arguments and understanding subject matter.

A friend of mine who runs her own strategic PR company hates AI because it has actively increased her workload, mainly because writers no longer think for themselves. “It only takes a few hours to fact-check copy written by a human,” she said. “But when someone submits copy written by AI, it can take days to unravel.”

Because of the way it harvests data, AI can get a lot wrong with specialist, niche, and complex subjects. Sadly, more and more writers are submitting copy this way, which means more and more unravelling needs to be done before publication.

Not only that, but in an effect termed ‘digital amnesia’, AI could also negatively affect your memory and harm your ability to learn and remember.

I am a rational emotive behaviour therapist (REBT). It’s a form of cognitive behaviour therapy (CBT) that actively sets out to teach people essential critical thinking skills. We direct people to both challenge their more dysfunctional thought patterns and reinforce their more functional ones through such objective questions as, “Is this belief true?”, “Does this belief make sense?” and “Does this belief help me achieve my goals?”, as well as, “And where is it all going to end up if you keep thinking like this then, eh?”

The first question is an evidence-based one: whether you claim ‘true’ or ‘false’, you will need to back your answer up with evidence. “Does it make sense?” is the logical, philosophical, or commonsense question, and requires quite a lot of thought before answering. Hopefully, the third question needs no introduction at all, as its merits speak for themselves, whilst the final one can instigate a deep dive into all sorts of strange but very personal territory.

Teaching people these techniques raises further questions that will need answering. More importantly, they also promote insights, which is how people gain mastery over their problems. But if you hand your psychotherapy over to a chatbot, what will you even be learning or developing? It all seems rather laissez-faire.

According to a Fortune article published only last week, people (especially Gen Z) are turning more and more to ChatGPT for their therapy and coaching needs, mainly because it is both affordable and on-demand.

Few and far between are the therapists who want to be woken at midnight to hear your thoughts and, if any are willing, I imagine they would charge a significant amount for the service.

Several people have claimed that talking to a chatbot daily has been more helpful than months and even years of therapy. There go all my years of study and practice. Thank you, AI psychotherapy! Licensed therapists, meanwhile, advise caution.

AI is a general-purpose technology, for a start. Your therapist, however, will have spent years studying and honing their craft. Plus, despite how it feels, your AI isn’t displaying genuine empathy, whereas your therapist is. Or are they? Maybe not.

One rather unnerving study found that people perceived AI as more compassionate than human beings. The study didn’t focus on psychotherapy per se; instead, it compared empathic responses from people with empathic responses from AI to a series of positive and negative statements shared by participants. And AI won the empathy response game.

Plus, I’m guessing your chatbot won’t suffer from compassion fatigue or need a holiday every couple of months to replenish the well.

Whether a chatbot can match human compassion in the face of such things as trauma and abuse remains to be seen, but I personally doubt it.

What happens when you need additional support? What if you are a danger to yourself or to others? Would ChatGPT know what to do or whom to contact in those situations? A trained therapist knows how and when to break confidentiality, and exactly who to contact. It is always done with the patient or client’s highest good in mind. It requires not only empathy, but also understanding. What about those highly sensitive safeguarding issues, or the ethical conundrums surrounding disclosure of illegal activities?

“I’ll be back,” says your AI therapist, as it scuttles off to call the police.

AI can be useful when used alongside traditional therapy, but despite its low cost and always-on convenience, and no matter how empathic it seems, it really should not be thought of as a standalone treatment at this time.

Not if it could reduce your critical thinking skills, confidence, skill development, and memory, it’s not.

Before I was a therapist, I was a journalist working in mainstream print media and contract publishing. AI has taken over those industries. If it takes over this one too, then that’s me out of a job twice over. I am going to want universal basic income, a beachside apartment, a hammock, a stack of books, and a banana daiquiri, thank you very much.

References

1. https://dl.acm.org/doi/10.1145/3706598.3713778

2. https://fortune.com/2025/06/01/ai-therapy-chatgpt-characterai-psycholog…

3. https://www.nature.com/articles/s44271-024-00182-6
