
Why Are So Many People Using AI Chatbots for Therapy in 2026?

May 11, 2026


More than 1 in 5 Americans have now turned to an AI chatbot for mental health support. Not as a last resort, but as a first one.

That number would have seemed far-fetched just a few years ago. But in 2026, opening an app and typing out your anxiety feels less strange than making a phone call to schedule a therapy appointment six weeks out. For millions of people, AI has quietly become their first point of contact for emotional support, and the reasons why say a lot about where mental healthcare in America actually stands right now.

The Therapy Gap Nobody Wants to Talk About

The demand for mental health support has never been higher. Anxiety and depression rates have risen by nearly 10% since 2025 alone. More than half of U.S. adults report feeling lonely and socially disconnected. And yet, access to actual care has gotten harder, not easier — with over half of Americans reporting they've never tried talk therapy or psychiatry despite knowing they probably should.

The barriers are familiar: long waitlists, high costs, insurance headaches, and the lingering (if fading) stigma of saying out loud that you need help.

Into that gap stepped the chatbot.

What's Actually Drawing People to AI for Mental Health?

When researchers asked people why they turned to AI for emotional support, the answers weren't what you might expect. It wasn't because they thought the bot was better than a therapist. It was about something simpler: safety and access.

The top reasons people give include:

  • No judgment, no waiting. You can talk to an AI at 2 a.m. on a Tuesday when your anxiety is spiking and no one else is available.
  • Lower stakes. Saying hard things to a machine feels less exposing than saying them to a person, especially for a first-time conversation.
  • It bridges the gaps. Many people use AI chatbots between therapy sessions — to process thoughts, track moods, or work through something before their next appointment.
  • It's free (or close to it). When financial stress is one of the fastest-growing barriers to mental healthcare, cost matters enormously.

As one user described it to WBUR this month: AI helps her piece together "fractured" thoughts between her regular therapy sessions, especially during moments of emotional upheaval when her therapist isn't available.

That's a telling use case. For many people, AI isn't replacing therapy. It's filling the space around it.

Does It Actually Work?

This is the fair question, and the honest answer is: somewhat, for some things, for some people.

A major systematic review and meta-analysis published in late 2025 examined the effectiveness of AI chatbots across randomized controlled trials. The results showed meaningful improvements in depression, anxiety, and stress symptoms among users, particularly young adults aged 15 to 39. A landmark randomized trial from Dartmouth testing an AI therapy chatbot called Therabot found promising results for people with major depressive disorder, generalized anxiety disorder, and eating disorder risk; the researchers described it as "the first RCT demonstrating the effectiveness of a fully generative AI therapy chatbot for treating clinical-level mental health symptoms."

Those are real findings. But researchers are careful to note the limitations: most studies involve short timeframes, relatively small samples, and people who may not represent the full range of mental health needs. The science is promising — but still early.

What AI chatbots tend to do well:

  • Psychoeducation (helping you understand what anxiety or depression actually is)
  • Mood tracking and journaling prompts
  • Cognitive-behavioral therapy (CBT) exercises and coping strategies
  • Offering a non-judgmental space to articulate thoughts

What they don't do well:

  • Crisis intervention (this is a hard limit — if you're in crisis, a human must be involved)
  • Diagnosing mental health conditions
  • Replacing the relational depth of ongoing therapy
  • Navigating complex trauma

The Real Risks Worth Knowing

AI chatbots for mental health aren't without genuine concerns, and ignoring them doesn't help anyone.

Over-reliance is a real risk. If someone uses a chatbot instead of seeking real care when real care is what they need, that's a problem, not a solution.

Privacy is murky. Not all mental health apps have the same data protections as licensed healthcare providers. Your conversations with a chatbot may not be protected by HIPAA. Reading the fine print matters.

The empathy isn't real. AI can mimic warmth and understanding extremely convincingly, but it doesn't actually understand your experience. For some people, that distinction doesn't matter much. For others, it matters enormously. Knowing which camp you fall into is worth reflecting on.

Vulnerable users need extra caution. Teens, people in active crisis, and those dealing with serious psychiatric conditions should not rely on chatbots as a primary mental health resource.

So Should You Try One?

If you're curious, struggling to access traditional care, or just want a low-stakes way to start working through something, trying an AI chatbot isn't a bad idea. Think of it the way you might think of a mental health journaling app: a useful tool, not a treatment plan.

The best approach: use it as a bridge, not a destination. Let it help you articulate what you're feeling, explore coping strategies, or get through a hard night while continuing to pursue human support when you can.

And if you're in crisis, or having thoughts of self-harm or suicide, please reach out to the 988 Suicide & Crisis Lifeline by calling or texting 988. That's a moment where human connection isn't optional.

The Bottom Line

The surge in AI chatbot use for mental health isn't a trend born from tech enthusiasm. It's born from a system that isn't meeting people where they are. When therapy waitlists stretch for months and costs run hundreds of dollars an hour, people find other ways to cope, and right now, many of them are finding their way to a chatbot.

That's worth taking seriously: both the need it reveals, and the tool it's produced.

AI won't replace therapists. But for millions of people navigating life's hard moments without access to one, it's becoming something they didn't have before: a starting point.

If you found this article helpful, consider sharing it during Mental Health Awareness Month. And if you or someone you know is struggling, the 988 Suicide & Crisis Lifeline is available 24/7 by calling or texting 988.
