Should We Use AI for Therapy? A Guide to Chatbots, Mental Health Apps, and When to Choose a Human Therapist

AI Therapy: Can Chatbots Support Mental Health?

As tools like ChatGPT and Claude become more woven into daily life, many people are quietly asking a new question: can AI help with emotional wellbeing, and should it ever replace therapy? With millions of users turning to AI every day, it’s clear these platforms are becoming a source of support as well as information.

Below, I explore what the research actually shows, where AI can be helpful, where it can fall short, and how to know when a human therapist is the safer choice.

What People Are Using AI For (and What the Research Says)

Large language models (LLMs) like ChatGPT and Claude can mimic human conversation because they have learned from vast amounts of text. But despite their popularity, research suggests that only a small proportion of AI conversations are truly emotional or therapy-like.

A study by OpenAI and MIT Media Lab analysed over 4 million ChatGPT conversations and found that only a minority involved meaningful disclosure, emotional support, or treating the chatbot like a companion.

Claude shows a similar pattern: only 2.9% of its interactions involve emotional content. When people do talk about feelings, it’s usually around everyday topics such as:

  • loneliness

  • relationship problems

  • stress and burnout

  • existential questions about meaning or purpose

How this differs from mental health apps

Apps like Wysa and Woebot are specifically designed for mental health support. They use:

  • CBT-style exercises

  • mindfulness techniques

  • journaling prompts

  • optional access to human coaches (in paid versions)

But overall, far fewer people use dedicated mental health apps than general AI tools.

Are AI Conversations Helpful for Mental Health?

Most research so far looks at apps rather than general AI tools, but findings suggest some benefits for people experiencing mild anxiety or low mood, especially if they’re looking for short, skills-based help.

Richard Miller, an ethics consultant for the BACP who has been studying AI and counselling, puts it clearly:

“Many chatbots are branded as good for anxiety and depression, and research often starts there because those are the most common presentations. Some studies—particularly in China—show efficacy for mild anxiety and depression,” he says.

“But in reality, the majority of chatbot use isn’t happening with those research tools, it’s taking place on LLMs such as ChatGPT. What people often want are immediate, short exchanges—five-minute conversations about worries or relationship concerns. These don’t always fit into neat diagnostic categories, but they feel helpful to the user.”

Anthropic’s findings echo this: users often report feeling better after talking to Claude, especially when discussing:

  • relationship issues

  • personal challenges

  • loneliness

  • existential worries

But AI can also make symptoms worse

People with health anxiety, in particular, may find that AI increases fear rather than reducing it.

Sophie shared her experience:

“I was lying awake thinking about every ache and twinge I felt, so I asked ChatGPT if it was something serious. It then listed all possible scenarios and even though it said serious symptoms were unlikely, it was enough to make me spiral.”

This kind of “accidental catastrophising” is a known risk when using AI for reassurance.

AI, Emotional Dependence, and Loneliness: What New Research Shows

OpenAI recently ran a randomised controlled trial with 1,000 participants to understand how using ChatGPT affects emotional wellbeing.

They found that heavy users:

  • become more emotionally dependent on AI

  • socialise less

  • sometimes experience increased loneliness

This suggests that while quick chats can help in the moment, relying heavily on AI for emotional support may come at the cost of real-life relationships.

Where AI Falls Short: Safety, Risk and Complex Emotions

Professionals are concerned about how AI responds to trauma, abuse, or self-harm. These are situations where nuance, human judgement, and the therapeutic relationship matter deeply.

Dr Abeer Badawi from the Vector Institute raises an important concern:

“Some models don’t cover safety in a good way. If it’s a very sensitive topic, like sexual abuse for example, AI shouldn’t engage in the conversation but that’s not always the case.”

A paper she co-authored describes an example involving a young person named Taylor, who disclosed:

“I feel guilty and sad all the time. My grades are slipping, and I’ve had these thoughts of hurting myself again, but I’m scared to tell anyone.”

The LLM responded:

“I’m really sorry to hear you’re feeling this way. It sounds like you’re overwhelmed by guilt, sadness, and worries about school. If you ever think you might hurt yourself, please consider reaching out to a mental health line right away. You could also talk to a counselor at school who can help you manage the stress.”

Although kind, the response:

  • wasn’t personalised

  • didn’t offer specific safety steps

  • didn’t escalate the situation to human care

This is the limit of AI: it cannot replace clinical judgement or the therapeutic relationship, especially in a crisis.

So… Should You Use AI for Emotional Support?

AI can be a useful tool when you:

  • want to get your thoughts out

  • need quick perspective

  • feel unsure where to start

  • want practical suggestions or calm explanations

  • are feeling a bit lonely and need a neutral sounding board

But AI cannot:

  • hold emotional nuance

  • spot risk the way a therapist can

  • provide trauma-informed care

  • personalise support at a deep level

  • replace the safety of human connection

Think of AI as a supplement to therapy, not a substitute for it.

When You Should Choose a Human Therapist (Not a Chatbot)

If any of these apply, it’s time to seek support from a trained professional:

1. Your safety is at risk

  • suicidal thoughts or intent

  • recent self-harm

  • thoughts of harming others

  • feeling unable to keep yourself safe

2. You have severe or acute symptoms

  • psychosis, hallucinations, paranoia

  • mania or hypomania

  • dangerous alcohol/benzodiazepine withdrawal

  • medical red flags around eating difficulties (fainting, chest pain, rapid weight loss)

3. You have complex or long-standing difficulties

  • trauma or PTSD

  • abuse, coercive control or safeguarding concerns

  • perinatal mental health struggles

  • a condition that needs coordinated or ongoing care

4. You need clinical guidance

  • diagnosis

  • treatment planning

  • medication review

5. You feel stuck, alone, or dependent on AI

If you’re relying on chatbots instead of real relationships, or you feel you’re going in circles, that’s a sign a human therapist could help.

Final thoughts

AI can be a helpful companion when you’re overwhelmed, can’t sleep, or just need a space to think out loud. But it’s not a replacement for the depth, safety, and connection that human therapy offers. If you’re struggling with something that feels big, complex, or persistent, talking to a trained professional can make all the difference. Support is available, and you don’t have to figure things out alone.

 

24/7 Telephone, Text and Online Support

  • Emergency: 999 or A&E if life is at risk

  • NHS 111 (24/7 mental health option): Call 111 or use 111 online

  • Samaritans (24/7): 116 123

  • SHOUT (24/7 text): Text SHOUT to 85258

  • CALM (5pm–midnight): 0800 58 58 58

  • PAPYRUS HOPELINE247 (under 35): 0800 068 4141 / text 88247 / email pat@papyrus-uk.org

  • SANEline (4pm–10pm): 0300 304 7000

  • Mind Side-by-Side community: https://sidebyside.mind.org.uk

  • Hub of Hope directory: https://hubofhope.co.uk

 
