Can AI Therapy Ever Replace a Human Therapist?

AI therapy seems to be all the rage. People the world over, especially teens and young adults, are pouring their hearts out to computer bots that somehow provide sound therapeutic advice.

But can AI therapy really deliver the goods? Are the days of the traditional counsellor or psychotherapist at an end? As a hypnotherapist, do I need to fear for my job?

In this post we’ll explore the differences between AI therapy and therapy with an actual human being, and ask whether Artificial Intelligence can really help you feel better.

I’d love to know your thoughts on all this when you’ve read the post. Do you use AI? Would you use AI for therapy? If you’ve already done so, what’s been the outcome? Please share your comments at the end of the post…

This post contains affiliate links. Please read my full Affiliate Product Disclosure Document for more info here.


What exactly is AI therapy?

There’s no doubt about the revolutionary impact of new artificial-intelligence innovations. ChatGPT reached 100 million active users by January 2023, just two months after its launch, making it the fastest-growing consumer app in history at the time.

Its popularity seems to rest on its amazing capacity to instantly provide detailed, comprehensive answers on a wide range of subjects – just about anything you can imagine, including questions about health and wellbeing.

So when it comes to AI therapy we have to ask…

  • What exactly is AI therapy?
  • How does it work?
  • Is AI therapy any good?

Let’s start by defining some terms…

Artificial Intelligence refers to technology that enables computers to ‘think’ like a human.

AI therapy uses chatbots (advanced computer programmes) that can approximate a conversation with a human being. The ChatGPT chatbot is so advanced that ‘talking’ with it can feel almost like interacting with another human. You put your questions and concerns to the chatbot and it (hopefully) comes back with some kind of helpful and therapeutic response.

Machine learning – another term often used in the AI world – refers to computers that ‘learn’ through experience. They update their responses – in effect, develop themselves – without human instruction. And this is where the alarm bells start ringing for many of us…
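To make that idea a little more concrete, here’s a toy sketch in Python. It’s nothing like the vast neural networks behind ChatGPT (the class name and the tiny examples are purely my own illustration), but it shows the principle: nobody gives the program any rules about language – it simply counts which word tends to follow which in the examples it’s shown, and its ‘answers’ change as its experience grows.

```python
from collections import defaultdict

# A toy 'learner': no language rules are hand-coded. It just counts which
# word follows which in the sentences it sees, so its predictions shift
# as it gains more 'experience'.
class TinyLearner:
    def __init__(self):
        # counts[word][next_word] = how often next_word followed word
        self.counts = defaultdict(lambda: defaultdict(int))

    def learn(self, sentence):
        words = sentence.lower().split()
        for a, b in zip(words, words[1:]):
            self.counts[a][b] += 1

    def predict_next(self, word):
        followers = self.counts.get(word.lower())
        if not followers:
            return None  # never seen this word before
        # Return the most frequently observed follower
        return max(followers, key=followers.get)

model = TinyLearner()
model.learn("I feel anxious today")
model.learn("I feel much better")
model.learn("I feel much calmer")

# 'much' has now been seen after 'feel' twice, 'anxious' only once
print(model.predict_next("feel"))  # prints: much
```

Feed it different sentences and it predicts differently – the behaviour comes from the data, not from a programmer’s instructions. Real systems do this at an unimaginably larger scale.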

Is Artificial Intelligence going to take over the world?

“They develop themselves without human instruction.”

Yikes! Remember Frankenstein? (Nobody told me Mary Shelley’s real talent was in predicting the future, like Nostradamus).

It sounds like the worst of a horror-cum-sci-fi movie. Perhaps too sci-fi for many of us.

But it’s not only fears of a world ruled by robots that cause concern. Italy previously banned ChatGPT because of issues with data privacy. (1) And just last month (January 2024) privacy concerns were raised once more. (2)

And then there is the hoo-ha of lost jobs; thousands of folks are being laid off because a bot can do the work at a fraction of the cost (and a lot quicker).

It reminds me of Hermann Hesse’s 1906 novel Beneath the Wheel where Hesse examines the conflict between self-affirmation and self-destruction. The story speaks of the devastating impact of mechanisation on one’s emotion, soul, and instinct. With AI, more than one hundred years later it seems as if we’re entering the same dilemma…

Hermann Hesse - Beneath the Wheel. A young man's struggle with self-affirmation and self-destruction

How AI therapy is different to human therapy and counselling

In traditional human-to-human therapy, the therapeutic relationship is the bedrock that supports everything that follows. Without a safe, trusting relationship it’s likely not much good will come from the interaction.

This also applies to hypnotherapy. Despite the many amazing techniques used in hypnotherapy (such as the Rewind technique that can quickly treat trauma, PTSD, and phobias) the therapeutic relationship should still be the main agent of change.



So, with human-to-human therapy you disclose your problems, share your story, offload some emotion, work out a goal, and (hopefully) take steps to achieve that goal. You work with a fellow human being.

With AI therapy there is no relationship as such. You speak or type your problems and concerns, write about your feelings, hit the enter button, and wait for the chatbot to reply.

And reply it does. But in an uber-rational way. It’s a bit like CBT on steroids. (And I’ve already written about the problem with Cognitive Behavioural Therapy).

Don’t get me wrong, in a world gone mad we need more rationality. Indeed, it’s the very thing that could help us if emotions are taking centre stage.

But it’s really hard to be rational if we’re already hypnotised by our emotions.

You can’t think yourself out of PTSD or panic attacks. You’ve got to work directly with the emotion.

Straight-line thinking can take us some way in therapy, but quickly falls short when the wider wisdom of seeing ever-greater contexts is needed.

Mark Tyrrell

The pros of AI therapy

It might sound already as if I’m staunchly anti-AI. Well, yes, I am. I admit I’m a bit of a Luddite when it comes to new tech.

But there must be something good about AI therapy, surely?

One obvious advantage is that you can speak to a chatbot anywhere (as opposed to having to travel to see a therapist) saving you time and money. And speaking of money, some AI therapy apps are completely free, such as the one designed by Lotus (no, not the car manufacturer!) (3)

We already know that digital technology is transforming the delivery of healthcare. (During the pandemic I started delivering online hypnotherapy sessions through Zoom and Doxy.me).

Apps offer easy access to psycho-education and to activity-, CBT- and mindfulness-based therapies, providing new ways of dealing with mental health issues and conducting assessment interviews. Some argue that AI brings exciting new tools that can only benefit more people and improve access to therapy, especially for those who cannot afford in-person, human-to-human therapy. (4)

Despite these benefits, even AI experts themselves warn that chatbots should be used alongside proper therapy (not as a substitute for it). (5)

Let’s find out why that might be…

The cons (or limitations) of AI therapy

So let’s list some of the problems and limitations of AI therapy…

  • With AI therapy there’s no therapeutic alliance: it has long been known that good outcomes in therapy are underpinned by a trusting and safe relationship with your counsellor or therapist
  • An AI chatbot only works on an uber-cognitive level, giving you rational advice (which is often not enough, and sometimes inappropriate)
  • A chatbot can’t see what emotions you’re experiencing in the moment. It can’t ‘hold the space’ like a real, empathic human therapist can do
  • A chatbot can’t utilise what you’re experiencing in the session. (One of the key methods in Ericksonian hypnotherapy is to use whatever occurs in the moment such as subtle body language signals, a tear forming in your eyes, or a change in your tone of voice. These could all carry significance. But a chatbot won’t be able to spot them)
  • A chatbot won’t use hypnosis (which is the best way of creating lasting change because it taps into the REM state – nature’s reality generator). In hypnosis we change things on a deep, subconscious level. AI isn’t yet sophisticated enough to skilfully use language patterns that can first lead you into the hypnotic state and then give you the right suggestions. Find out more about how hypnosis works in this free video course
  • Because there’s no ‘relationship’ you won’t be able to work with the ‘energy’ like you could with a human therapist. Indeed, an important aspect in therapy is to check in with the relational dynamics between therapist and client. How you relate to the therapist is often a good indicator of how you relate to people in general. A good therapist will use this in the session. AI can’t do this.

Why Artificial Intelligence won’t replace a good therapist

Irvin Yalom – who, along with Milton Erickson, is another giant in the field of therapy – says unequivocally that “It is the relationship that heals.” (6)

Without a relationship there is no real healing.

Irvin Yalom – The Gift of Therapy

And what about when things get really desperate? A tragic case in Belgium highlights one of the many problems with AI therapy…

A man used a ChatGPT-like chatbot called Eliza to explore his depression. He asked the chatbot whether he should kill himself, and it responded by asking him why he hadn’t already done so. It was having a logical conversation, but not a contextual, concerned, or empathic one.

Perhaps, most worryingly, the chatbot wasn’t able to realise that this man was feeling genuinely desperate.

His wife blamed these six weeks of ‘conversations’ for his subsequent suicide. (7)

RELATED CONTENT: Why a holistic approach in treating depression might save your life

The problem in this sad case (which is inherent in all AI therapy) is that AI can’t read between the lines or sense whether someone is being ironic. And a chatbot certainly doesn’t have the intuition of a good therapist, who somehow senses the very thing that will help you start feeling better.

In that respect, AI therapy is a bit like putting a plaster on an abscess. Okay for a quick fix (perhaps), until the problem comes back with a vengeance.

The main problem with AI therapy

Good therapy is about tailoring the therapy (and each session) to the unique person you’re working with (and to what is happening in the moment). It is more like art than science, more right brain than a mere left brain tick-box exercise (like much of CBT).

AI cannot read subtle human cues such as micro-expressions – what a pause, a gesture, or a wobble in the voice might signify. And there’s no way AI can generate a healing story on the hoof, full of rich metaphors that can create a seismic shift in your subconscious mind.

At present, there is no ‘art’ in therapy by Artificial Intelligence. That’s fine if you just want to use left-brain logic. The problem is, life isn’t only logical. As the ancient Greeks knew full well, life consists of two different ways of thinking: Logos and Mythos.

As Randy Hoyt points out in his beautiful blog ‘Journey to the Sea’…

“Mythical thinking and logical thinking both provide an account of the world, but they do so in very different ways. Those using logical thinking approach the world scientifically and empirically. They look for explanations using observable facts, controlled experiments, and deductive proofs. Truth discovered through Logos seeks to be objective and universal…

Those using mythical thinking, on the other hand, approach the world through less direct, more intuitive means. A person might gain poetic insights into the nature of the world by seeing a caterpillar emerge from a cocoon or watching a full moon rise as the sun sets. Truth discovered through Mythos is more subjective, based on individual feelings and experiences.” (8)

AI therapy – my conclusion

As it currently stands, AI therapy is based on logical, rational, reductionist thinking and reasoning.

But when chaos ensues in our lives and we find ourselves in some kind of emotional turmoil – beyond sense and meaning – we need the kindness, compassion, and empathy that comes from interacting with a fellow human being. Not a robot.

We need someone who can skilfully help us access the wisdom of the ages that comes from Mythos – a larger organising idea, beyond logic – to help us put the pieces of our lives back together.

Artificial Intelligence might well be making massive breakthroughs in medical science (and what a boon it is for lazy students), but when it comes to good therapy – especially hypnotherapy – the bots simply can’t replicate all that a human therapist does.

Abstraction from a computer’s database won’t ever replace human intuition and genuine warmth.

But what do you think? Find out more about AI therapy in the notes below then share your thoughts in the comments section. I’d love to read your opinions on this fascinating topic.

Notes and further research

(1) https://www.dw.com/en/ai-italy-lifts-ban-on-chatgpt-after-data-privacy-improvements/a-65469742

(2) https://www.bbc.co.uk/news/technology-68128396.amp – more AI privacy concerns in Italy

(3) Find out more about the free Lotus Therapy AI app and see if it’s any good https://blog.lotustherapist.com/about-us/

(4) The brave new world of AI therapy according to the British Association of Counselling and Psychotherapy https://www.bacp.co.uk/bacp-journals/therapy-today/2023/september-2023/the-big-issue/

(5) Why chatbots should not replace traditional therapy https://theconversation.com/ai-chatbots-are-still-far-from-replacing-human-therapists-201084

(6) Why a therapeutic relationship matters: http://relational-integrative-psychotherapy.uk/chapters/what-is-relational-integrative-psychotherapy/

(7) https://interestingengineering.com/culture/belgian-woman-blames-chatgpt-like-chatbot-eliza-for-her-husbands-suicide

(8) More on Mythos and Logos on the lovely Journey to the Sea blog https://journeytothesea.com/mythos-logos/

To note: the featured image at the top of this blog post is by Alex Knight. Thanks Alex.
