I’ve been sitting with some discomfort lately. It started after listening to Jonathan Passmore and Marion Neubronner discuss the rapid uptake of AI in coaching and therapy during recent professional development webinars. It was strangely echoed by the latest season of Black Mirror (how creepy, but how good is it?!) and further reinforced after reading a recent ABC News article by Julia Baird.
Is AI going to replace coaches and psychologists?
AI is everywhere. Its reach is expanding fast, and coaching and therapy, two of the most human-centred professions, are no exception. We’re seeing it show up in tools that simulate real-time conversations and therapeutic dialogue, predictive models that tailor interventions based on mood patterns, and coaching platforms that adapt to a user’s behaviour and progress in real time.
A lot of it is clever and genuinely helpful, offering valuable support to people who might not otherwise be able to access or afford these services. It’s easy to see the appeal. AI doesn’t judge, it’s always available, and it responds quickly. For some, it may feel safer to type complex or overwhelming feelings into a chat box than to share them out loud with a stranger, especially at the start. Given how compelling these tools are, are they on track to replace real human connection?
Don’t get me wrong: used as a starting point, these tools can be great. They can help someone name what they’re feeling, get clarity, and feel seen, especially when waitlists are long, access is limited, or symptoms are acute. With 24/7 availability, they can be a helpful early line of support. In that way, AI can be a useful and sometimes vital resource.
But it’s not the same as being truly met by another person.
Real coaching and therapy are relational and authentic. They rely on presence, tone, timing, silence. On someone noticing that you said one thing, but your body said something else. On sitting with you in confusion and being curious without rushing to fix it.
A human coach or therapist senses “shifts in the room”. They know when to pause, when to challenge you gently or bring in humour to soften an edge, and when to simply wait because you’re not quite ready to take that next step. They help you tune into the intelligence of your heart and gut, not just your head. They also adhere to codes of ethics, undergo regular supervision, and work within clear boundaries.
My understanding of AI (and I’m happy to be challenged) is that its strength lies in spotting patterns and responding to them. But people are more complex than patterns. Without consciousness, I don’t think AI can be programmed to mimic the kind of care and connection you build with a real coach or therapist… yet.
That said, I do believe AI has a role as an adjunct, not the main act. I’d recommend it for tracking moods, guiding breathing, prompting reflection, or helping someone ground themselves in a moment of panic. I also see clear value where real coaching and therapy aren’t accessible.
However, my concern is that AI might become a shortcut. That people may start turning away from others and from themselves in favour of quick fixes. Or that organisations might promote AI as a full replacement for human support. When big emotions like grief, shame, anger or fear surface, people need to be heard fully, understood slowly, and helped to heal properly. Some of that can be supported by code. But not replaced.
If you’ve been using AI for support, I’d love to know: was it for convenience? Because access to human support wasn’t an option? Or because it feels easier than being vulnerable with another person? (No judgement if so.)
Some of the coaching bots I’ve come across: Rocky.ai, CoachVici and AIMY. Some of the AI therapy tools: Happi.AI, Woebot, Wysa, Youper and Tess.