Will AI Replace You as a Psychotherapist?
By Dr. John S. Tamerin · 14 min read · December 15, 2025

A patient of mine — a very successful businessman — walked into my office a few months ago and said, “I’ve been using ChatGPT for therapy. It’s pretty good.”
I said, “Tell me what it said to you.”
He pulled out his phone and read me a paragraph. It was articulate. Well-organized. It mentioned cognitive distortions and offered three actionable steps. It was, by any reasonable standard, solid therapeutic advice.
Then I asked him a question: “When you read that to me just now, did you notice that you didn’t look at me once?”
He put the phone down. His eyes filled with tears.
That moment — right there — is everything I know about why AI will never replace a psychotherapist. Not the advice. Not the insights. Not the “actionable steps.” The seeing. The being seen.
The Narcissist in My Office
Let me tell you about another patient. A man in his fifties. Very wealthy. Very charming. And very, very good at getting people to admire him. He’d been to three therapists before me. They all told him he was doing great. They validated his feelings. They reflected his words back to him in slightly different form. He loved it.
When he came to me, he spent the first session describing how everyone in his life was letting him down. His wife didn’t appreciate him. His business partner was jealous of him. His kids didn’t call enough.
I listened for about forty minutes. Then I said: “Are you out of your fucking mind?”
He stared at me. Nobody had ever said that to him. Not his wife, not his friends, not his previous therapists. Certainly not ChatGPT. And in that moment of shock — that crack in his armor — something real happened for the first time in years.
He came back the next week. And the week after that. He’s still coming. He recently told me that this single sentence changed his life more than three years of previous therapy combined.
Now, could an AI have said that? Technically, sure. You could program it to say anything. But here’s what the AI couldn’t have done: it couldn’t have felt the frustration building in my chest over forty minutes. It couldn’t have sensed the precise moment when confrontation would land as care rather than cruelty. It couldn’t have read his body — the slight lean forward, the softening around his eyes — that told me he was ready to hear something hard.
I didn’t calculate that moment. I felt it.
The Countertransference Argument
In my training, they taught us about countertransference — the feelings a therapist develops toward a patient. For decades, it was treated as a problem. Something to manage. Something to analyze away in your own therapy so it wouldn’t contaminate the work.
I think that’s exactly backwards.
My feelings about my patients are the most valuable clinical data I have. When I sit with someone and feel bored, that tells me something. When I feel protective, that tells me something. When I feel like I want to strangle someone — and yes, that happens — that tells me something profoundly useful about what that person evokes in the people around them.
An AI processes your words. I process you. The whole of you. The way you sit. The way you avoid certain topics. The way your voice changes when you talk about your mother versus when you talk about your wife. The way you fidget when I’m getting close to something true.
That processing doesn’t happen in my prefrontal cortex. It happens in my gut, my chest, my throat. It’s biological. It’s sixty million years of primate evolution tuned to detect social and emotional signals. And it produces something that no algorithm can produce: a genuine felt response to another human being.
What AI Does Well
I want to be fair. I’m 88 years old, but I’m not a Luddite.
AI is extraordinary at pattern recognition. It can scan a million therapy transcripts and identify linguistic markers of suicidal ideation faster than any clinician. It can provide psychoeducation at scale. It can offer coping strategies at 3 a.m. when no therapist is available. It can reduce barriers to entry for people who can’t afford or access human therapy.
These are real contributions. I respect them.
But there’s a difference between information and transformation. Between being answered and being met. Between getting advice and having someone look at you — really look at you — and say something that reaches down into the place where you’ve been hiding from yourself.
The Thing That Heals
After fifty-five years of doing this work, I’ve come to a conclusion that would have surprised my younger self: the thing that heals people is not interpretation. It’s not insight. It’s not technique. It’s not medication, though medication has its place.
The thing that heals people is the relationship.
When a patient and I are in a session and something real happens — when there’s a moment of genuine connection, of being truly seen and truly known — I sometimes feel like God is in the room. I don’t mean that religiously. I mean that what’s happening between us is bigger than either of us. And that’s the thing no machine will ever touch.
I’ve had patients hand me a Kleenex because I was tearing up in session. I’ve had patients tell me things they’ve never told another living soul — not because I asked the right question, but because they felt safe enough to speak. That safety doesn’t come from an algorithm’s reassurance. It comes from the accumulated evidence of thousands of micro-moments in which one human being demonstrated to another: I am here with you. I see you. And I’m not going anywhere.
The Age Question
People ask me — usually politely, but I know what they’re really asking — whether at 88 I’m still sharp enough to do this work. The implication is that a younger therapist, or an AI that never ages, would be more reliable.
Here’s my answer: at 88, I have something no young therapist has and no AI ever will. I have fifty-five years of sitting in a room with people who were suffering. I have a lifetime of my own mistakes, my own losses, my own struggles with meaning. I have buried friends. I have watched my own body slow down. I have faced the questions my patients face — about purpose, about legacy, about what it means to live a life that matters.
When a patient tells me they’re afraid of dying, I don’t reach for a textbook response. I sit with them in the fear, because I know it too. That shared vulnerability — that willingness to be in the mess together — is what makes the work real.
An AI will never be afraid of dying. It will never know what it means to lose someone you love. And without that knowledge, its empathy will always be — at best — a very sophisticated performance.
The Real Question
The question isn’t whether AI will replace psychotherapists. The question is whether we, as a culture, will decide that the performance is good enough. Whether we’ll settle for something that sounds like therapy but isn’t. Whether convenience and scalability will win out over the slow, expensive, irreplaceable process of two human beings sitting in a room and doing the hard work of genuine connection.
I hope not. But I’m not naive. I know the economics point in one direction and the evidence points in another.
Key Takeaway
What I can tell you, after fifty-five years, is this: the patients who change — really change — are the ones who had a relationship with someone who wasn’t afraid to be real with them. Not someone who told them what they wanted to hear. Not someone who validated them into comfort. Someone who challenged them, saw them, felt something in their presence, and stayed.
That’s not a feature you can ship in an update. That’s not a capability you can scale. That’s the irreducible core of what it means to heal a human being.
And if you’re reading this and thinking, I want that kind of relationship — you’re already asking the right question.