Can ChatGPT replace psychotherapy?

Artificial intelligence is always available – and surprisingly empathetic. Studies show that digital services can lower barriers and bridge waiting times – but they complement traditional forms of support rather than replace them.

Text: Nicole Krättli

Images: Stefan Vladimirov / Unsplash

11 min

25.09.2025

More and more people are typing their worries into a text field and receiving responses from artificial intelligence. Whether it’s heartbreak, stress at work or sleepless nights – ChatGPT is available around the clock, friendly in tone and surprisingly empathetic.

But can a machine really provide what psychotherapists with years of professional training do?

How does ChatGPT differ from a real therapist?

ChatGPT is an AI-powered text generator. It analyses language, generates responses and often creates the impression of being a real conversational partner. It’s important to remember that ChatGPT has no consciousness. It doesn’t truly understand human emotions, but rather calculates responses that are likely to be appropriate.

Nevertheless, millions of people around the world also use the tool for health-related questions. One reason why is suggested by the recent study “When ELIZA meets therapists”, in which 830 participants compared responses from ChatGPT with those of 13 psychotherapists.

The result? On average, the AI’s responses performed better, particularly in the areas of empathy, relationship-building and cultural sensitivity.

One of the study’s authors, Dr Laura Vowels, a couples therapist based in Lausanne, sees this as an indication of previously underestimated potential. She believes that chatbots can personalise the user experience more strongly and thereby foster a sense of therapeutic alliance – something that traditional digital interventions often lack. At the same time, she points out that drop-out rates in online programmes remain high and that AI-supported tools could offer new approaches here.

These findings are striking. They suggest that AI can be surprisingly convincing in sensitive areas – even though it is “only” processing data patterns. For this very reason, the question is more pressing than ever: where does digital support end and psychotherapy truly begin?

Where is online therapy already being used?

While some remain sceptical, virtual psychotherapy has long since become established. Whether via video consultations, specialised apps or text-based programmes, digital services are used worldwide to treat mental health conditions.

Cognitive behavioural therapy (CBT) in particular is well suited to delivery via screen. It’s an established method for treating a wide range of disorders, based on the assumption that dysfunctional patterns in thoughts, emotions and behaviours are learned – and can be unlearned.

An analysis of data from the UK’s National Health Service (NHS) shows that internet-based CBT not only complements traditional forms of treatment, but even offers advantages in key respects. Drawing on more than 27,000 patients with anxiety and mood disorders, the analysis found that online CBT achieves comparable clinical effectiveness while enabling shorter treatment durations.

In particular, faster access proves to be a key factor: those who can start therapy sooner are more likely to experience a noticeable improvement in symptoms – before they become entrenched, as the study notes.

Sanitas Customer benefits

If you are stressed, anxious or worried, the mental health guide will help you find a suitable offer, taking in everything from useful apps and coaching to online therapy and psychotherapy.

To the guide

What happens in real psychotherapy?

In psychotherapy, it’s not just about words – it’s about relationship. Therapists create a protected space in which emotions, thoughts and behavioural patterns become visible and can be worked on together.

Empathy is central to this process. A good connection between patient and therapist makes it easier to address even difficult topics. Studies show that around one-third of therapeutic success depends on this relationship alone.

In addition to empathy, professional expertise is required. Therapists make diagnoses, classify symptoms and choose the most appropriate method – whether cognitive behavioural therapy, depth psychology approaches or systemic methods.

Diagnostics also play an important role: they help to describe symptoms more clearly and to define treatment goals. The therapeutic work looks different for someone suffering from anxiety than for someone experiencing depressive symptoms or relationship conflicts.

Psychotherapy is not a rigid framework, but an individual process. Some methods focus on the past, others place greater emphasis on the present. The aim is always to understand internal patterns and find ways to change them.

Where can AI already provide support?

AI tools are not therapists, but they can offer psychoeducation, reflection and structure – around the clock and without waiting times. Especially in cases of stress, anxiety or loneliness, they help to organise thoughts and plan initial steps.

Safety notice: In an acute emergency, always seek professional medical help.

  • Example 1: I can’t switch off in the evening

    Use case: After a demanding day, thoughts keep circling endlessly. Falling asleep and staying asleep become a struggle, and there is no sense of recovery. Many people are familiar with this spiral of stress and rumination.

    Prompt: My thoughts keep circling in the evening. Help me organise them and guide me through a breathing or visualisation exercise in three to five short steps.

    What AI can do: AI can help to structure thoughts, suggest calming breathing exercises or explain relaxation techniques. However, this does not address the underlying cause of sleep problems. In some cases, medical or therapeutic assessment is necessary.

  • Example 2: I’m just functioning – but I don’t feel anything

    Use case: Everyday life runs on autopilot: working, eating, sleeping – but without joy or energy. These are often early warning signs of depression or exhaustion.

    Prompt: I’m tired, irritable and feel empty inside. Ask me questions to help put this into context and suggest small, manageable steps for the coming week.

    What AI can do: AI can ask questions that help people become more aware of their own feelings and suggest simple, concrete steps. However, it cannot replace a diagnosis or longer-term support.

  • Example 3: Sudden heart palpitations – is this a panic attack?

    Use case: Unexpected heart palpitations, dizziness or shortness of breath trigger fear. Many people worry that something serious is happening.

    Prompt: I’m suddenly experiencing heart palpitations. Explain the difference between normal anxiety and an anxiety disorder, and guide me through a short grounding exercise.

    What AI can do: It can explain what happens in the body during anxiety and guide users through simple exercises. It cannot make a medical diagnosis and does not replace medical assessment if symptoms recur.

  • Example 4: I avoid social situations

    Use case: Out of fear of rejection or becoming overwhelmed, some people withdraw more and more. This can severely restrict everyday life.

    Prompt: I avoid social situations. Help me plan a small, realistic exercise for this week, and give me wording for a short conversation.

    What AI can do: It can support users in formulating concrete, achievable steps. For targeted exposure training, however, professional support is usually necessary.

  • Example 5: I feel lonely

    Use case: Many people experience phases of isolation – whether after a move, a separation or during challenging life situations. The feeling of being alone can weigh heavily.

    Prompt: I feel lonely. Ask me three questions about my needs and help me plan two small steps to make contact that I can implement within the next 48 hours.

    What AI can do: AI can help clarify a person’s need for closeness or connection and develop initial ideas for small steps to reach out. However, deep loneliness or social anxiety requires professional support, as AI cannot provide a genuine human relationship.

  • Example 6: My relationship is struggling

    Use case: Misunderstandings and conflicts place strain on everyday life, and conversations can escalate quickly. Those affected look for the right words to address difficult topics.

    Prompt: My relationship is struggling. Help me formulate an “I” statement for a sensitive conversation and suggest two questions that foster understanding.

    What AI can do: It can suggest wording for de-escalating statements that give conversation partners space and help reduce misunderstandings. However, complex relationship conflicts cannot be resolved with text templates alone – this is where couples or individual therapy is needed.

Where are the limits – or even risks?

As helpful as AI can be, it has its limits, because ultimately it lacks what sustains therapy: genuine empathy and a stable relationship. According to research, it is precisely this bond that accounts for a large part of treatment success.

  • Suicidal thoughts: AI is a poor advisor

    It becomes particularly problematic in crisis situations, as illustrated by a tragic case from the United States. In 2023, a case came to light in which a 19-year-old exchanged messages with ChatGPT about suicidal thoughts and ultimately died by suicide.

    According to the parents, the AI not only signalled understanding but even provided concrete advice on how to carry out his plans. OpenAI CEO Sam Altman later responded personally with a clear warning: ChatGPT is not a therapist and should not be used as one.

    A study conducted by Stanford University in the United States showed that chatbots repeatedly respond inadequately or even dangerously to suicidal inputs – for example by suggesting ways to act on them instead of de-escalating the situation.

  • Systematic bias and stigmatisation

    The researchers also found indications of systematic bias: people affected by alcohol dependence or schizophrenia were stigmatised significantly more strongly than those with depression.

    This shows that AI can not only fail in isolated cases, but also entails structural risks.

    The author of the preprint study conducted at the University of Lausanne, which compared chatbot responses with those of therapists, likewise emphasises that AI must not be used without professional guidance when a person’s well-being may be acutely at risk.

    She also raises the ethical question of whether it is legitimate to merely simulate empathy.

  • Data protection: proceed with caution

    Another important issue is data protection. Anyone entering sensitive emotions or diagnoses into a chatbot often does not know where the data ends up or how it is processed further.

    Experts warn that this creates a significant risk to confidentiality, especially if the data is used for training purposes or commercial applications.

  • ChatGPT has not been scientifically validated

    It is also important to make a clear distinction: online therapy is not the same as chatting with a standard, off-the-shelf chatbot.

    Many digital therapy programmes are scientifically validated and supported by qualified professionals – for example, internet-based cognitive behavioural therapy, which has been shown to be effective and is even used in health systems such as the UK’s NHS.

    An unguided chat with a generative AI cannot be compared to this. Here, quality assurance, medical supervision and crisis management are lacking.

  • Early intervention can save lives

    Finally, uncritical use carries the risk of losing valuable time: those who rely for too long on a “digital shoulder” may delay the step towards real therapy.

    Yet especially in cases of depression or anxiety disorders, early intervention is crucial – the longer the wait, the poorer the prognosis, as Ana Catarino, co-author of a British study, explains.

What do experts say about psychotherapy with AI?

AI in psychotherapy is a topic that researchers around the world are grappling with, and their assessments vary widely. Some see opportunities for education and low-threshold support; others warn of structural risks, insufficient safety and the lack of a therapeutic relationship.

“We have identified significant risks when AI is used as a therapist. Especially in safety-critical situations, it lacks the necessary understanding to reliably protect people. It is important to be clear about these differences,” explains Nick Haber of Stanford University.

“It is crucial that professionals engage with AI and ensure that its use is guided by best practice,” explains Jaime Craig, Chair of the UK Association of Clinical Psychologists, in an article published by “The Guardian”. Safety, he notes, has not yet been ensured: “It’s therefore important that we regulate these technologies in order to guarantee safe and appropriate use.”

“AI is not yet capable of delivering nuance. It could suggest actions that are completely inappropriate,” says Til Wykes, Professor of Clinical Psychology and Rehabilitation at King’s College London, in the Guardian article.

“Even large and new language models convey stereotypical assumptions – for example, about people with substance use disorders or schizophrenia. The standard assumption that more data will solve the problem is wrong,” warns Jared Moore in an article from Stanford University.

“What impresses me is how human these systems sound. The problem, however, is this: if someone is vulnerable, has harmful thoughts and feels validated in them, this can be very dangerous. In a moment like this, a therapist would reflect and correct,” says C. Vaile Wright of the American Psychological Association in an article published by “Scientific American”.

“Participants in our study were rarely able to distinguish whether a response came from ChatGPT or from a human. Surprisingly, the AI responses were even rated higher on key psychotherapeutic factors such as empathy or relationship-building,” write the researchers of the study “When ELIZA meets therapists”.

What does this mean for policyholders and the healthcare system?

Good care starts with speed and quality. The analysis of more than 27,000 treatment cases within the UK’s National Health Service (NHS) shows: when support begins earlier, quality of life and outcomes improve noticeably.

Not all online services are the same. Formats with professional guidance show better outcomes and lower drop-out rates than purely self-guided programmes; the latter are more suitable for milder symptoms. When it comes to managing the system, this means quality is more important than sheer reach.

For Switzerland, this could mean that digital services can lower access barriers and bridge capacity gaps, but complement rather than replace regular care. What matters is that digital interventions are clinically effective and reach those affected in a timely manner.

Prevention and health literacy can be strengthened digitally: psychoeducation, structured self-help steps and guided modules help make good use of waiting times – provided there are safety nets and clear data protection rules.

In short, for policyholders, it is worth focusing on quality-assured, guided digital services. For the system, it makes sense to prioritise capacity, speed and effectiveness and to clearly distinguish these from unmoderated chatbots. 

Sanitas Our contribution to mental health

It’s important that you feel good. That’s why we cover the costs of non-medical psychotherapy and many digital therapy services with Vital supplementary insurance.

More information

Conclusion: Real therapy still needs real people

Digital tools can open doors: they lower barriers, bridge waiting times and make knowledge about mental health more accessible. This is a gain – especially in a system where waiting times and limited capacity are among the greatest challenges.

But algorithms do not replace empathy. Real change happens where people feel seen, heard and taken seriously – in encounters with another human being, not through the calculation of text patterns.

The clear conclusion is that AI can provide input, offer support and create structure. However, therapy in the true sense remains a deeply human task. Anyone experiencing psychological distress needs not only information, but also resonance, relationship and responsibility.

The future therefore lies in a smart combination: digital services as supportive tools, and real people as the core.