
Artificial Intelligence Instead of a Therapist? Risks and Benefits to Consider (Carefully)

You’re at home, tired after a long, demanding day. Your mind is racing with thoughts, but you don’t feel like talking to anyone: maybe because you don’t want to “bother” anyone, or maybe it just feels too hard to explain. So, you open an app and type “I feel like I’m falling apart.” Almost instantly, a kind, warm, reassuring response appears. Not from a friend. Not from a therapist. But from artificial intelligence.

Not long ago, this would have sounded like science fiction. But for millions of people, this is already a reality. Perhaps it’s even happened to you.

More and more people are turning to digital tools during difficult moments. Chatbots, mental health apps, virtual assistants: they’re available 24/7, they don’t ask awkward questions, they don’t judge, and they reply immediately. But can we truly rely on these technologies to support our emotional wellbeing and mental health?

Why are so many people talking to AI when they’re struggling?

There are many understandable reasons:

  • It’s always available, even at night or on weekends
  • It’s free or low-cost, so it doesn’t create financial stress
  • It’s easier to open up: no embarrassment, no fear of being judged
  • It can offer practical tools: breathing exercises, mood journals, calming techniques

In moments of loneliness, confusion or emotional overload, the idea of “talking” to an AI can feel like a practical, even comforting, solution. But is it really the same as speaking to a trained psychologist? And more importantly: what are the risks?

The real benefits (when used wisely)

Artificial intelligence can be a valuable ally, especially when used to support, not replace, psychological counselling or therapy. For example, it may help someone to:

  • Practise techniques already learned in therapy (e.g., breathing exercises, visualisations)
  • Organise thoughts and emotions before a session with a psychologist
  • Find companionship during moments of emotional loneliness (with the risk, however, that it becomes the only point of contact, potentially increasing the sense of isolation).
  • Get psychoeducational information about anxiety, depression, and panic attacks (though this information may be inaccurate, incomplete, or inadequate).

Many studies show that, when used with awareness, AI tools can help reduce anxiety and improve emotional self-regulation. But, and this is where it gets more complex, only when they are used as a tool, not as a substitute for real human support.

When AI becomes dangerous (and we might not even notice)

There are situations where artificial intelligence can become part of the problem, rather than the solution:

  1. It can give harmful or inaccurate advice
    A chatbot is not a therapist. It doesn’t know your personal history, emotional context, or deeper wounds. It may offer the wrong advice, or even dangerous suggestions, especially in moments of crisis.
  2. It can’t recognise emergencies
    AI is not clinically trained to detect serious psychological states such as severe depression, suicidal thoughts, dissociation or panic. It might respond with generic phrases that, instead of helping, leave you feeling more isolated.
  3. It creates the illusion of relationship
    Many people develop an emotional attachment to the chatbot. It feels close, understanding, always available. But this “relationship” lacks reciprocity, depth, and genuine care. It doesn’t grow or change. It cannot truly see or hold your experience.
  4. It can become a way to avoid facing real emotions
    Using AI every time you’re struggling can become a habit that avoids contact with others, or even with yourself. Rather than processing emotions, there’s a risk of freezing or bypassing them.
  5. It raises privacy concerns: is your data safe?
    Many apps store conversations. Some use them to improve their algorithms. Others, unfortunately, have unclear data policies. This means that deeply personal thoughts, perhaps shared in moments of vulnerability, may be stored or analysed.

And what if it’s my children using it?

It’s important to know that many teenagers and young adults are using mental health chatbots and apps, often without telling anyone. They do it because they feel misunderstood, judged, or simply alone.

To them, AI can seem like a safe space: always there, never angry, never disappointed. But for that very reason, it can replace key life experiences, like learning to handle conflict, tolerate rejection, or ask for help in real life.

As adults, we can do two things:

  1. Familiarise ourselves with these tools, so we can guide without judgement
  2. Talk openly with our children: ask curious, respectful questions and truly listen

For example: “Have you ever used a chatbot to talk about how you’re feeling? What was it like? Did it help?”

Meaningful conversations grow when people don’t feel under scrutiny. Perhaps that’s why so many young adults, and adults too, turn to AI: to share without fear of being judged.

A few reflective questions to ask yourself

To use AI more mindfully, you might want to ask (or ask your child) the following:

  • Do I turn to AI when I don't feel like talking to anyone, or because I believe no one would truly understand me?
  • Am I using AI because I’m seeking reassurance and don’t want my ideas to be challenged?
  • Is there something I struggle to bring into real-life relationships? Do I only feel safe when there’s no one physically in front of me?
  • Could it be helpful to talk about this with a mental health practitioner?

AI can be helpful, but it can’t replace human connection

Artificial intelligence can be a valuable support, if used with care and awareness. It can guide you, help you organise your emotions, and offer short-term relief.

But it cannot truly see you. It doesn’t hear what lies beneath your words, or notice what you hold in silence. It cannot offer the warm gaze of someone who cares, the tone of voice that softens pain, the presence that stays with you when you don’t know what to say.

In a fast-paced world, it’s tempting to turn to instant answers from a machine. But real transformation happens in time, in relationship, in genuine human presence.

If this article has made you reflect, share it with someone who might benefit from it. And if you notice that AI is becoming your only “safe space,” maybe it’s time to look for a human one: real, compassionate, and present.

References

Abd Alrazaq, A., Alajlani, M., Alalwan, A., Bewick, B. M., Gardner, P. and Househ, M., 2020. Effectiveness and safety of using chatbots to improve mental health: systematic review and meta-analysis. Journal of Medical Internet Research, 22(7), p.e16021.
Available at: https://doi.org/10.2196/16021 (Accessed: 4 October 2025)

Chin, H., Song, H., Baek, G., Shin, M., Jung, C., Cha, M., Choi, J. and Cha, C., 2023. The potential of chatbots for emotional support and promoting mental well-being in different cultures: mixed methods study. Journal of Medical Internet Research, 25, p.e51712.
Available at: https://doi.org/10.2196/51712 (Accessed: 9 October 2025)

Coghlan, S., Leins, K., Sheldrick, S., Cheong, M., Gooding, P. and D’Alfonso, S., 2023. To chat or “bot” to chat: ethical issues with using chatbots in mental health. Digital Health, 9, p.20552076231162193.
Available at: https://doi.org/10.1177/20552076231183542 (Accessed: 24 September 2025)

Dohnány, S., Fiala, M., de Oliveira, E. and Arent, L., 2025. Technological folie à deux: Feedback loops between AI chatbots and mental illness. arXiv preprint arXiv:2507.19218.
Available at: https://arxiv.org/abs/2507.19218 (Accessed: 5 October 2025)

Author: Paolo Assandri is an HCPC Registered Counselling Psychologist and a UKCP Registered Full Clinical Psychotherapist. He is also a fully qualified Italian psychologist (Ordine degli Psicologi del Piemonte). He lives and works in London, offering counselling and psychotherapy.

Warning:
This article is not intended to replace any kind of medical or psychological therapy. Its only purpose is to increase individual perception of well-being. If you need medical or psychological support, please contact a qualified health practitioner. The authors, producers and consultants involved in the production of this article are not responsible for any psychological or physical injury which could occur during or after following the suggestions described in this article.
