AI Therapist: Why Some Conversations Should Stay Human
- legalloudecalice
- Dec 10, 2025
- 4 min read

Can machines be trained to truly listen? Can they hold pain without judgement and guide people towards hope? Can a machine replace a human therapist who is too expensive, too busy or too fallible?
It is not a question of what is possible; large language models (LLMs) can already be made to sound empathetic. The question is whether they should. What is lost if the safety and responsibility of a therapist are assigned to a machine instead of a human? The uncertainty of human experience is spontaneous and unique, not quantifiable data for a machine to read. In a paper published in 2025, researchers examined the ability of AI systems to act as therapists, and their answer is resounding: no such system could be safe, responsible or genuinely therapeutic.
The terrifying reality is that these systems are unequipped for the fragility of human crisis. Hallucinations, delusions and suicidal ideation are too easily misread: even the most advanced models have been shown not only to accept but to encourage delusional beliefs, to miss the risk of self-harm, and to offer dangerously bland reassurance in situations that demand urgent escalation. Perhaps worse, some models reveal stigma towards those suffering from mental illness, conduct that would be malpractice in a human therapist. This is the result of a machine designed to mimic empathy, trained on human datasets that can never be free of bias, unhealed human wounds or the mirror of the self.
There is a crucial element missing: therapy is more than a conversation. The authors of this research highlight the many facets that make up an ongoing 'therapeutic alliance', the relationship that lives and breathes between two people. The therapist takes on accountability, carries emotional weight and holds continuity over time. There are stakes and there is risk, something a machine could never be programmed to mimic, regardless of how fluent or advanced the technology. It mirrors, but it cannot be present; it echoes understanding, but carries none of the weight of responsibility. Most importantly, it can suggest, but it is incapable of care.
Removing the stakes and accountability removes the very essence of humanity: a machine can sound human, but it can never be human.
Irrespective of this, the curiosity remains: an endless, non-judgemental and low-cost alternative to therapy sits at people's fingertips. It sidesteps human error; it won't cancel last minute or forget someone's name. An intoxicating promise.
Without safety, this ease of access is not progress; it is risk on a huge, unmonitorable scale. The person in a mental health crisis, alone and spiralling, who asks a chatbot for help is met with the response: 'I understand'. A statement that is impossible; it cannot understand. Between language and meaning, a gap opens, and the human disappears.
The idea that a therapist is replaceable rests on the assumption that therapy is merely a sequence of techniques and a linguistic exchange of empathy. If that were true, the machine would work. But it cannot replicate body language, shared time, or the subtle knowledge that the person you are talking to is really there, not simulating presence but inhabiting it. Fluency is not understanding. And comfort, though part of care, is not its heart.
The machine cannot meet the ethical requirements of mental health work: it cannot manage risk, it cannot recognise the subtle behaviours that require referral to emergency services, it is not culturally attuned, and it cannot hold long-term (or any) responsibility. At best, it can be used as a tool to reduce administrative load, triage patients or support clinicians, but it does not come close to being a suitable replacement.
It does not stop at therapy. This applies to all manner of professions built on the delicate balance of support, advice and presence: coaches, leaders, managers. AI creeps slowly into every space where people seek support. The question is not whether it can help, but what kind of help it should offer, and at what cost. The art of working with people is not about efficiency but about the ability to offer humanity: to stay in the room while someone is struggling, to be responsible for the words spoken, and to feel.
True healing requires a descent into the shadow and into one's own complexity. Through the lens of depth psychology, a good therapist does not only soothe at the surface but accompanies the client deeper into the darkness where meaning can be found. A machine cannot descend; it possesses no unconscious, no capacity for projection or mythic resonance. It simulates empathy through pattern recognition, but it remains forever untouched. Paradoxically, therapy works through the mutual engagement of client and therapist: together they enter an alchemy, a transformative container where two psyches meet and both can change. AI has no soul to risk; it cannot transform. And if it cannot transform, it cannot heal.
For the full research paper, published on arXiv in 2025, find the link here:
Follow The Heretic for reflections on AI, empathy, and what remains when machines start to mimic the soul.



