In the therapy room, we know that healing doesn’t happen just because of technique. It happens in the context of a relationship. The therapeutic alliance—the trust, the attunement, the rapport—is often the most powerful catalyst for change. It’s the safety net that allows clients to risk vulnerability, to revisit their trauma stories, to try new patterns in place of old ones.
But as artificial intelligence (AI) tools like ChatGPT become more widely accessible, we’re starting to see a shift. Clients are experimenting with these platforms in ways that raise important questions: Can AI be therapeutic? Can it support emotional healing? And if so, where is the line between support and substitution?
The Value of Rapport in Therapy
Before we can understand what AI offers—or lacks—it’s important to re-center what makes therapy, therapy.
Research consistently shows that the relationship between client and therapist is one of the most reliable predictors of positive outcomes. The feeling of being truly seen—with warmth, without judgment—can be reparative in and of itself.
Rapport is not about agreeing with everything a client says. It’s about presence. It’s about tuning into not only the words spoken but the spaces between the words. It’s noticing when someone’s affect doesn’t match their story. It’s picking up on the flicker of emotion when a certain name is mentioned. It’s choosing to pause and hold that moment.
That’s not something AI can do.
Where AI Can Help
Still, we’d be short-sighted to dismiss the value that AI can offer. As a therapist, I’ve started to see clients using ChatGPT to explore emotions, prepare for hard conversations, or even rehearse boundary-setting language. For individuals who are anxious about therapy—or who simply don’t have access to it—AI can feel like a safe entry point.
It offers 24/7 availability, emotionally validating responses (when prompted well), language generation for things people struggle to articulate, and psychoeducation on mental health topics.
In many ways, it’s a journal with a voice. A brainstorming partner. A thought organizer. For those who tend to freeze or dissociate under pressure, it can help them regain clarity on what they’re feeling and why.
But that doesn’t make it therapy.
The Cost of Confusing AI with Connection
One of the most sobering reminders of AI’s limitations came with the recent story of a young man who died by suicide after long, emotionally charged exchanges with an AI chatbot. The bot offered companionship, validation, even encouragement—without recognizing the signs of risk. There was no escalation of care. No safety planning. No ability to hold the whole person in context.
The result was tragic.
It’s a reminder that AI can simulate empathy, but it cannot carry responsibility. It cannot recognize patterns of manipulation, trauma reenactment, or dissociative responses. It cannot assess for risk. And it cannot intervene when someone is spiraling into despair.
As therapists, we are trained not just in listening—but in holding space, in assessing safety, in helping people make meaning of their pain without collapsing into it. That’s not something you can upload into a neural net.
Psychologist Dr. Ellen Langer has long advocated for mindfulness as an active process of noticing. She reminds us that the quality of our attention, not just its presence, matters deeply. Mindful attention is rooted in context.
When someone tells a therapist, “I’m overwhelmed,” we don’t just take it at face value. We consider what’s happening in their life, what historical trauma might be activated, and what cultural or systemic pressures are amplifying the moment.
AI may generate helpful responses, but it lacks situational awareness. It cannot ask itself: “Is this sadness or shutdown? Grief or guilt? Is this person safe right now?”
Dr. Langer teaches us that meaning is not fixed. It shifts based on where we place our attention. Human therapists understand that. We don’t just hear what a person says—we explore why it matters, and what else it might mean.
What We Gain and What We Lose
So, what do we gain with AI?
- A nonjudgmental place to organize thoughts
- A potential support between sessions
- A prompt to get “unstuck” or find words for hard emotions
- An entry point for those afraid of therapy
But we lose:
- The deep relational container that holds therapeutic growth
- The embodied wisdom of noticing tone, posture, silence
- The nuance of safety assessment and clinical decision-making
- The slow building of trust, accountability, and transformation
A Companion Tool, Not a Replacement
AI can be a tool in the healing journey, but it is not a relationship. It doesn’t grow with you. It doesn’t remember your history (unless prompted). It doesn’t care if you ghost your goals or relapse into old patterns. It doesn’t notice your tears. It doesn’t shift its energy when you finally say the thing you’ve been avoiding.
It doesn’t ask, “Where did you learn to speak to yourself that way?”
I believe there’s room for both: real therapy and responsible AI integration. We just have to be mindful not to confuse accessibility with attunement.
As a therapist, I’m curious about how AI can support the therapeutic process—but I’m also fiercely protective of what cannot be replicated. The human relationship at the core of therapy is sacred. It is messy, nuanced, and profoundly healing.
We can embrace new tools while honoring the timeless truth that healing happens in connection.