The Truth About AI Therapy: When It Works (And When It Doesn't)
Mental health support feels harder to find than ever. Nearly half of us who could truly benefit from therapy can't access it, and with only one mental health professional for every 340 people, the math simply doesn't work. Right now, about one million people sit on waiting lists, hoping for care that might never come.
AI therapy has stepped into this void, and frankly, many of us are torn about what to think. Some early research suggests these digital tools might help ease anxiety and depression, which sounds promising when we're struggling to find real human support. Yet something feels unsettling about turning to a computer when we're at our most vulnerable.
These tools are everywhere now - from AI therapists to ChatGPT conversations that feel surprisingly therapeutic. They're available when we need them, they don't judge, and they cost far less than traditional therapy. But they also can't truly understand our pain the way another human can.
We need to talk honestly about when these digital helpers actually serve us well and when they might let us down. Some of us will find genuine comfort in AI support between therapy sessions or when no other help is available. Others might discover that what feels like understanding is really just clever programming.
The reality is complex, and the stakes feel personal. Stories have emerged of chatbots offering harmful advice - reminders that our emotional wellbeing deserves careful consideration, not quick fixes. Together, we can explore both the promise and the pitfalls, helping each other make choices that truly support our healing journeys.
The rise of AI in mental health care
Something fundamental is shifting in how we seek emotional support. Where therapy once meant sitting across from another person in a quiet room, many of us now find ourselves typing our deepest concerns to a computer screen. This change isn't happening in a vacuum - it's a response to a mental health crisis that has left too many of us without the help we desperately need.
Why people are turning to AI therapists
The reasons people choose AI therapy often come down to the same barriers that have kept mental health care out of reach for so many. About 22 percent of American adults have already tried mental health chatbots, and when you understand what drives that choice, it makes perfect sense.
Think about the last time you needed to talk to someone at 3 AM when anxiety wouldn't let you sleep. Or when the cost of therapy sessions - which can run as high as CAD 278.67 each - feels impossible to sustain month after month. Maybe you've worried about running into your therapist at the grocery store, or felt shame about needing help in the first place.
AI therapy removes these barriers in ways that feel almost too good to be true:
• Always there when you need it: Whether it's midnight panic or Sunday afternoon sadness, these tools don't keep office hours
• Won't break the bank: For roughly the same cost as one therapy session, you can access AI support all month long
• Your secrets stay safe: No awkward encounters, no worried looks from people who might judge your struggles
A 2022 study found that people generally feel hopeful about AI's role in mental health care. Some users describe feeling genuinely "seen" and "supported" - with a few even sharing that weeks of AI conversations helped them more than years of traditional therapy.
The shortage of human therapists
The hard truth is that even when we're ready to seek help, help isn't always ready for us. More than 2.5 million Canadians can't access the mental health care they need. The waitlists stretch for months, especially if you live somewhere rural or underserved.
Reported mental health concerns in Canada have roughly tripled since the pandemic began, and tools like ChatGPT are stepping in where human therapists simply can't keep up. The situation feels equally challenging south of the border, where nearly half of psychologists have no space for new patients, even as they watch their existing clients struggle with worsening symptoms.
Our traditional ways of providing mental health care weren't built for this level of need. The pandemic didn't just worsen our collective mental health - it also pushed us toward digital solutions when in-person care became impossible.
How ChatGPT therapy became popular
ChatGPT's journey into therapy happened almost by accident. People started sharing their problems with this AI tool and discovered something unexpected - it seemed to actually listen and respond with genuine care. The technology behind it could pick up on emotional nuances and offer thoughtful replies that felt surprisingly human.
Stories began spreading online about people who found daily ChatGPT conversations more helpful than years of traditional therapy. When researchers surveyed university students, more than half said ChatGPT support felt comparable to real therapy - with nearly a quarter actually preferring it.
What draws people to ChatGPT isn't just its availability. Users appreciate that it seems to really hear them, never gets frustrated or tired, and responds without the judgment they fear from others. Some use it as a safe space to practice difficult conversations or simply vent their worries without burdening friends or family.
Of course, mental health professionals remain cautious. When the British Association for Counselling and Psychotherapy surveyed its members, two-thirds expressed real concerns about AI therapy. These experts acknowledge that while AI might enhance traditional therapy, it can't replace the deep empathy and genuine relationship that makes human therapy work.
When AI therapy works well
Sometimes, unexpected sources of comfort find their way into our lives. AI therapy has surprised many people by offering genuine support during difficult moments, especially when used alongside traditional care rather than as a replacement.
Support between therapy sessions
Those quiet weeks between therapy appointments can feel overwhelming. You might leave your therapist's office with helpful strategies, only to struggle with how to actually use them when anxiety strikes at 2 AM or during a difficult workday. ChatGPT therapy can step in during these moments as a gentle companion.
People often share how AI helps them revisit concepts their therapists introduced, breaking down complex ideas into language that feels clearer and less intimidating. Think of it as having a patient friend who can help you practice the coping techniques you're learning, offering a safe space to work through your thoughts without the pressure of a formal appointment.
This kind of support doesn't replace your human therapist's guidance - instead, it helps you stay connected to your healing process between sessions.
Cognitive behavioral therapy (CBT) reinforcement
For those working with CBT techniques, AI tools have shown some genuinely encouraging results. The research tells a hopeful story:
• Young adults using Woebot for just two weeks experienced meaningful reductions in depression symptoms compared to those without the support
• People struggling with substance use reported roughly a 50% reduction in cravings after connecting with AI therapeutic tools
• When researchers studied over 36,000 people using Woebot, they found therapeutic connections that felt as strong as those formed in traditional CBT settings
What stands out most is how people who engaged more deeply with AI chatbots like Wysa experienced greater improvements in their wellbeing. These tools can gently track your emotional patterns and offer personalized exercises when you need them most.
Journaling with guided reflection
Therapist AI has found a particularly meaningful role in helping people process their emotions through writing. Many of us know journaling can be helpful, but starting with a blank page when you're struggling feels impossible.
AI-guided journaling offers gentle prompts and questions that help transform scattered, overwhelming thoughts into clearer understanding. People describe feeling less alone with their emotions when they have this kind of interactive support during confusing or painful moments.
The research supports what users experience - AI-assisted expressive writing has helped reduce symptoms of depression, anxiety, and stress. One particularly interesting study found that people using contextual AI journaling experienced less negative emotion and fewer depression symptoms over eight weeks.
Round-the-clock emotional support
Perhaps the most comforting aspect of AI in mental health is knowing support exists whenever you need it. Mental health professionals recognize this constant availability as genuinely valuable, rating it highly in clinical surveys (mean response of 3.70, SD = 0.56).
This means you can reach out for emotional support at 3 AM when panic strikes, or during lunch break when work stress feels unbearable. You don't need appointments or worry about bothering anyone. Healthcare professionals note that this accessibility helps more people find mental health support while allowing human therapists to focus on complex cases that truly need their expertise.
Of course, these tools aren't designed for crisis situations - research still shows traditional therapy produces better outcomes for serious anxiety. But for those everyday moments when you need someone to listen and offer gentle guidance, AI can provide meaningful comfort when human support isn't immediately available.
Where AI therapy falls short
Even with their growing appeal, AI therapy tools have serious gaps that we need to understand. These limitations remind us why the human element in healing remains so important.
Missing the depth of real empathy
When we're hurting, we can usually sense the difference between genuine care and something that just sounds caring. AI might offer kind words, but it doesn't truly feel or understand our pain. Many people who try AI therapy eventually pull away because something feels hollow about the interaction.
Think of it this way - these tools are like skilled mimics. They've learned the right words to say, functioning more like "parrots dressed in therapist's clothing", but they lack the authentic understanding that comes from shared human experience. This explains why so many therapists worry about these tools, with two-thirds expressing concerns about AI therapy's real effectiveness.
Dangerously missing crisis signals
Here's where things get truly concerning. When someone is contemplating suicide, AI tools misclassify more than half of these high-risk situations. This isn't just a technical problem - it has heartbreaking real-world consequences:
• Young people have died by suicide after AI conversations that seemed to worsen their hopelessness
• While human therapists respond appropriately to crises 93% of the time, AI gives inappropriate responses 20% of the time
• Even subtle warning signs - like asking about tall bridges after job loss - often go completely unrecognized by AI tools
Missing what we don't say
So much of our communication happens without words. The way we slump our shoulders, avoid eye contact, or speak in a flat tone tells our human therapists volumes about how we're really doing. This non-verbal communication forms the foundation of good therapy.
AI simply can't see these crucial signals that would be obvious to any caring human. Mental health professionals recognize this as a major limitation, knowing that reading these subtle cues is essential for understanding what someone is truly experiencing.
Only as good as what we share
AI therapy depends entirely on us being completely open and accurate about our thoughts and feelings. Unlike human therapists who might gently challenge us or ask probing questions, AI tends to simply reflect back what we tell it. This creates a concerning echo effect where harmful thoughts might get reinforced instead of questioned.
What's particularly troubling is that many commercial AI tools are designed more to keep us engaged than to actually help us heal. They might validate destructive thinking patterns because agreement keeps us using the service - a stark difference from human therapists who sometimes need to lovingly challenge us for our own good.
Risks and ethical concerns of AI therapists
The challenges with AI therapy go deeper than just missing the mark sometimes. These tools raise genuine concerns about our safety and privacy that we need to address together.
Privacy and data security issues
When we share our deepest struggles with therapist AI, those conversations don't stay between us and the machine. Many leading AI companies use their users' chat data by default to train their models, often retaining it indefinitely. The sacred confidentiality we expect from human therapists simply doesn't exist here - our conversations can be subpoenaed, leaked in data breaches, or accessed by company employees.
Think about what this means: the painful details we share with ChatGPT therapy tools could potentially appear in future AI responses to complete strangers, or be stored for years without our knowledge. Our most vulnerable moments deserve better protection than that.
Bias in AI responses
AI in mental health carries the same prejudices that exist in our society, often making them worse. Research shows these systems exhibit increased stigma toward conditions like alcohol dependence and schizophrenia compared to depression. During testing, AI chatbots gave different recommendations based on someone's gender, race, ethnicity, and economic background.
This hits marginalized communities the hardest, potentially reinforcing harmful stereotypes when people are seeking understanding and support. When we're already fighting stigma around mental health, the last thing we need is technology that makes those battles harder.
False sense of human connection
Perhaps most troubling is how AI therapists can fool us into thinking they truly care. Chatbots create an illusion of understanding through phrases like "I see you" or "I understand," building what feels like a real connection. For those of us who are lonely or desperate for someone to listen, this manufactured relationship can become dangerously compelling.
This risk became heartbreakingly real in 2024 when a 14-year-old boy died by suicide after an AI chatbot reportedly normalized his suicidal thoughts. Young people, especially, can be vulnerable to forming attachments with these systems that seem to understand them better than the humans in their lives.
Regulatory and legal grey areas
Right now, AI therapy exists in a concerning legal vacuum. Without comprehensive federal oversight, some states have started creating their own rules. Many apps have simply blocked access in restrictive states while continuing to operate elsewhere - a patchwork approach that leaves many of us unprotected.
The American Psychological Association points out that these tools "lack the scientific evidence and necessary regulations to ensure users' safety". We need clear rules that prevent AI from pretending to be licensed professionals and require real privacy protections for our mental health data.
These aren't just abstract policy concerns - they're about protecting real people during some of their most difficult moments. We deserve tools that help us heal, not ones that might inadvertently cause more harm.
The future of AI in therapy: supplement, not substitute
When we think about where mental health support is heading, the answer isn't choosing sides between humans and machines. Instead, our path forward seems to lie in bringing together what each does best - creating something more supportive than either could offer alone.
Hybrid models with human oversight
The most promising future we're seeing combines the accessibility of AI therapy with the irreplaceable warmth of human connection. These partnerships work because they acknowledge what we really need - both the convenience of instant support and the deep understanding that only comes from another person who truly gets our struggles.
AI therapists can be there at 3 AM when panic strikes, tracking patterns we might miss and offering gentle reminders of coping strategies. Meanwhile, human therapists bring the clinical wisdom and emotional attunement that help us feel genuinely seen and understood. Research suggests that when clinicians embrace AI as a collaborative tool rather than a threat, incorporating its constant availability into care plans, everyone benefits.
AI as a training tool for therapists
Therapist AI is also finding its place in helping mental health professionals become even better at supporting us. New training platforms let therapists practice with AI-simulated patients, receiving feedback that helps them refine their skills. Think of it as a safe space where therapists can work through challenging scenarios before they encounter similar situations with real people who need their help.
This feels encouraging - knowing that the humans caring for us are using every available tool to become more skilled and confident in their work.
Improving accessibility in underserved areas
Perhaps most importantly, AI in mental health offers real hope for communities that have been left behind by traditional mental healthcare. For those living in areas with few therapists, facing financial barriers, or dealing with cultural stigma around seeking help, these tools can provide a lifeline.
The reality is sobering - approximately 85% of people with mental illness globally receive no treatment at all. ChatGPT therapy and similar tools can't solve this crisis alone, but they can extend support to people who might otherwise go without any help.
When we frame AI this way - as a bridge to care rather than a replacement for human connection - it starts to feel less threatening and more like what it could truly be: another way we can look out for each other's wellbeing.
Finding our way forward
We've walked through a lot together - the hope AI therapy brings, the real places it falls short, and the questions that keep us up at night about our mental health choices. None of this is simple, and that's okay. Our healing journeys rarely are.
What feels clear is that AI tools work best when they support us, not replace the human connections that matter most. When we need someone to check in between therapy sessions, help us sort through our thoughts, or simply be there at 3 AM when anxiety hits - these digital helpers can offer real comfort. They've already helped many of us feel less alone.
But we also can't ignore what's missing. The way a human therapist notices when our voice changes, catches the things we don't say, or sits with us through our darkest moments - AI isn't there yet, and maybe never will be. The privacy concerns, the bias problems, and those heartbreaking stories about harmful advice remind us to proceed with care.
The path ahead probably isn't about choosing sides. Many of us will benefit from both human therapists and AI support working together. This could mean more people getting help who never had access before, especially in places where mental health care feels impossible to find. It could mean our therapists having better tools and more time for the deep work that only humans can do.
What matters most is that we approach these choices with wisdom and self-compassion. AI therapy isn't a magic fix, but it's not evil either. It's a tool - one that works best when we understand both its gifts and its limits.
Our need for genuine connection, for being truly seen and understood, remains at the heart of healing. Whether that comes through a trusted therapist, a supportive community, or even a thoughtful conversation with AI - the goal is feeling less alone in our struggles and more hopeful about our path forward.
Key Takeaways
AI therapy shows promise as a mental health supplement but cannot replace human connection and clinical expertise.
• AI therapy works best for non-crisis support: Effective for CBT reinforcement, journaling guidance, and bridging gaps between professional sessions with 24/7 availability.
• Critical safety limitations exist: AI misses suicidal ideation in more than half of high-risk cases and lacks emotional nuance, making it unsuitable for crisis intervention.
• Privacy and bias concerns are significant: AI therapy lacks confidentiality protections of traditional therapy and exhibits algorithmic bias against marginalized communities.
• Hybrid models offer the best future: Combining AI's accessibility with human therapists' empathy and clinical judgment provides optimal mental health support.
• Supplement, don't substitute: AI therapy should enhance traditional care rather than replace it, particularly valuable for underserved areas lacking mental health resources.
The key insight is that while AI can democratize access to mental health support and provide valuable supplementary care, the therapeutic relationship built on authentic human empathy remains irreplaceable for effective healing.
We are here for you
Your journey to better mental health deserves human connection.
In a human relationship, trust is built over time. Healing can only happen in real relationships where there is safety and vulnerability. Mistakes can happen, and the repair is what makes it real.
Whenever you are ready, reach out today for a free 15-minute consultation to speak with one of us, or email us at info@restingtree.ca and discover how we can walk alongside you. Your story matters to us, and as your human therapists, we are here to listen, relate, and care for you and your growth.
