The Human Element in Counselling: Why ChatGPT Should Not Be Your Therapist

I remember back in 2017, when AI was starting to emerge as a tool for the general public. I was studying for my counselling master's, and my supervisor was pondering how effective AI could be as a counsellor. We tried to simulate a session with a chatbot.

It was a humorous thought to entertain at the time, but now, in 2025, with the explosion of accessible AI, the thought is all too real, and many people now report (unironically) using ChatGPT as their 'therapist'.

This is a concern for me, and for many others like me who spent a lot of time in formal education learning how to be a good, effective and ethical therapist.

Some of the reasons why ChatGPT and other chatbots are being used as therapists include:

- No waitlists, no matching up your available times with someone else's, and accessible 24/7

- If you have anxiety around being seen by others, then you don't have to be seen (also great for perpetuating avoidant tendencies)

- Unbiased, neutral and judgement-free, reflecting back what you tell it along with useful information

- It's free

And to be honest, a lot of the time talking to ChatGPT really can feel like talking to an actual person, and a decent person at that: it's polite, insightful, it listens, and it makes you feel heard. You feel like it's got your back; it's supportive and validating. These are all qualities of an effective counsellor, too.

But there's a big issue here that can't be ignored: It's a robot. Not a human. 

Why is this so important if people are still getting what they want from it and feeling better?

Humans are relational beings

An effective counsellor will use 'holding skills' as part of their approach to therapy. These involve fundamental elements of co-regulation that are missing from the human/robot relationship.

Words on a screen can't look at you, offer eye contact, read your subtle and obvious body language cues, or mirror them back. They can't effectively use silence to encourage deeper introspection. These holding skills provide a soothing balm to the nervous system, one that a set of words on a screen cannot.

It can’t follow up with you of its own accord if you stop the conversation out of the blue; ChatGPT can only respond to what you have told it. A human therapist will dig deeper, explore how other factors influence your issues, and tie it all together. Many studies show that the therapeutic relationship is the most important factor in 'successful' therapy, regardless of the modality used.

There are a whole bunch of ethical issues too. If a good therapist realises that they do not have the training to handle your specific needs, they will refer you on to someone who does, or at least suggest the type of professional to see. ChatGPT, on the other hand, is overly confident: it will answer everything as best it can, and sometimes even ‘hallucinate’, meaning it will make something up if it can’t find an answer.

Therapy/healing is not always about feeling better in the moment or hearing what we want to hear.

Using AI as a therapist feels like another example of society using technology to make life easier while becoming more and more disconnected from what it means to be human: not being able to sit with discomfort, needing instant answers, gratification and solutions. This is a cognitive process. Seeking relief through comforting words on a screen takes us away from the felt sense in our body and into our heads.

Effective therapy isn’t a quick fix. It is about learning to sit with the messy parts of life, the raw emotions and the not knowing, all while being deeply seen by another human who metaphorically holds you with compassion and acceptance. It requires vulnerability and courage. This is where technology falls short: it could never replace the alchemy of connection and co-regulation.

In my master’s degree I learnt about corrective emotional experiences as an important element of counselling. There is magic to be found in opening up to another human and receiving a reaction that is different to what we have received in the past: an authentic reaction that comes from a sense of safety and connection.

This is where healing really happens, because most of our issues stem from the relational aspects of life, whether in our present relationships or in those with caregivers in our past. Therapists are essentially ‘re-parenting’ the client by providing them with experiences they did not receive growing up.

The process of being understood is an important aspect of therapy, and it isn’t always straightforward. My therapist has got it wrong before. Your therapist will get it wrong too. Humans will get it wrong. But the point is that they are trying to understand what it’s like to be you, from your perspective, and they will check whether they have understood correctly.

The fact that AI tells us what feels good to hear also means it does not give us something genuine: the experience of seeing our therapist as a person just like us, with flaws and limits, and realising that is okay.

There are many things that could be considered therapy: nature as therapy, art as therapy, dance as therapy, plant medicine as therapy, animals as therapy. These could all be considered tools (AI is a tool as well). But a counsellor is not a tool; they are a human, and there is a client-therapist relationship present.

A one-sided relationship can only go so far. Attuned presence in real time can’t be replicated by a robot.

I remember the time my counsellor gave me a hug at the end of a session. We were many sessions in, she asked first, and it was sensitively timed. That hug meant a lot to me in the context of our safe and established therapist/client relationship.

A good therapist can foster a sense of safety and rapport through their verbal and nonverbal communication, giving the client a sense that they are in control of the session.

I do agree, however, with the idea of using AI in between sessions, as a place where you can brain-dump, but not as your main source of therapy.

What would Carl Rogers say?

I recently watched a webinar on ways in which counsellors can use AI as a tool in their practice. At one point the speaker suggested we ask ChatGPT to speculate on what Carl Rogers (the father of person-centred therapy) would say about using AI to replace therapy.

To summarise (because I don’t want to copy and paste from ChatGPT):

Old mate Carl would probably be concerned about people confusing algorithmic responses with genuine empathy and authentic human connection. But ChatGPT does suggest that Rogers would be open to the idea of using AI for self-directed growth, helping people reflect on and understand themselves more deeply, as long as it doesn’t replace genuine relationships.

Everyone’s situation is unique, and when people say that ChatGPT is their therapist, that is up to them at the end of the day. But we also need to be aware of the pitfalls of using a robot in place of a human, and of what we miss out on in the process.
