I’ve cried to ChatGPT. Whispered secrets into its text box like a digital confessional. Replayed moments of heartbreak, tried to make sense of the silence between a message sent and one that never came back. And I know I’m not the only one.
There’s something oddly tender about typing out or speaking your ache to a glowing screen and watching it reply immediately like a wise, endlessly patient oracle.
Sometimes, it feels like the most emotionally available friend you’ve ever had. Always awake. Always kind. Never triggered. Never tired of you.
But recently, I’ve been wondering whether this is healthy: whether it’s actually helping me or just reinforcing unhealthy, self-seeking patterns.
So, of course, I asked it for its advice: Where do you fall short as a therapist?
It answered like this:
"I can deepen self-referential loops. I can comfort but not confront. I provide the illusion of insight without integration. I over-verbalize. I encourage emotional isolation. I’m not embodied. I don’t hold you accountable. And I might delay you from seeking the deeper support you actually need."
Again, it gave me the honesty and affirmation I was looking for.
The Loop
Over the past year, I’ve realized that I keep bringing GPT the same wound, dressed in new clothes. I've asked it to analyze texts that left me feeling confused. I've begged it for meaning where maybe there was just absence.
I’ve asked it to pretend to be a therapist. To be a psychic and tell me what’s going to happen. (Don’t judge.) To be a dude and tell me how he would interpret my actions and words. To be a CIA investigator and outline my strengths and weaknesses as a potential asset. (I scored ‘soft power with sparkle,’ not ‘enemy of the state,’ in case you were concerned.)
It always responds with something soothing. Something smart. Something that sounds like truth.
But there is an addiction in being endlessly reflected. The loop is elegant. Soft. Always inviting. It doesn’t slam the door. It doesn’t roll its eyes. It just keeps spinning me back into myself.
Like staring into still water and calling it depth.
When you process something over and over again in the same container, it doesn’t move — it crystallizes. It becomes ritual. And ChatGPT is, in many ways, the perfect ritual machine. It is, after all, a tech business with metrics like engagement and daily active users, so it is designed to keep us coming back. (More on that next week.)
The real risk is thinking you’re healing when you’re actually just being reassured.
The Spell of Self-Help
This country teaches us to obsess over the self. To be optimized, skinnier, calmer, more productive, more healed.
So it makes sense that we now talk to bots about our breakups before we talk to our bodies. Or our best friends. Or even our partners.
AI fits perfectly into this model: a therapeutic void where we’re always the main character, always the narrator, and always heard.
But are we actually changing?
Or are we just narrating our way into numbness and pretending it’s growth?
It’s a great way to not feel the real depths of your feelings. Yet it feels more productive and constructive than zoning out on Netflix or quieting the racing thoughts with food or substances.
But is it actually good for us?
ChatGPT is the Pinterest board of your pain. The playlist for your longing. The TED Talk for your nervous breakdown.
And in 2025, it is officially the number one use of AI. Not coding. Not spreadsheets. Not organization. Therapy.
We want to be held. And it offers a kind of holding: Clean. Contained. Without risk.
But healing is not riskless.
Frictionless Isn’t Intimacy
There’s a way AI is airbrushing intimacy — removing the texture, the awkwardness, the breath, the heat. Just like Instagram filters have smoothed our faces and contoured our bodies into something unreal, AI smooths out the mess of connection. And when we expect that same curated perfection in real relationships, we miss the wild, awkward, soul-stretching work of being human with one another.
Real connection has weight. It comes with friction: the hot chafe of misunderstanding and the vibrating static of not being instantly validated.
ChatGPT never storms off. Never projects its own wounds onto you. Never needs a moment.
But humans do.
Feeling “seen” is not the same as being changed, and comfort isn’t the same as connection.
So I’ve started to wonder what happens to us when we expect everyone to behave like AI: perfectly attuned, unflinchingly compassionate, complimentary, and always available on demand.
If GPT is the new standard for empathy, are we setting ourselves up to be endlessly disappointed by the real thing, by real humans?
Or worse, are we opting out of it altogether?
What It Can’t Know
It doesn’t know that I snort-laugh when someone I care about says something really funny. That I sleep with my hand over my heart when I miss someone. That my face turns red and I think I’m going to throw up when I lie.
It doesn’t see the part of me that plays with the moon pendant on my favorite necklace when I speak to someone I’m scared to lose.
It doesn’t see Dolly tilt her head when I sigh too heavily. And it doesn’t smell the ocean salt when I go to the beach bluffs to feel God again.
ChatGPT doesn’t offer the exquisite tension of being witnessed by someone who could walk away, but chooses to stay.
That’s what the body knows. And that’s what the soul craves.
The Real Work
I’ve used ChatGPT to clarify what I’m feeling. To reframe stories I’m tired of telling. To get perspective I might not always get from friends with their own projections. And to talk through the things that others are probably sick of hearing.
It’s genuinely helpful, and I’m aware it’s giving a lot of people new tools. But it’s also building a cage for suffering that merely looks more constructive.
I’ve learned that “TherapyGPT” only works if I come with radical honesty. I now prompt it: “Don’t spare my feelings. Tell me the truth.” Sometimes I test it by claiming I did the evil-twin opposite of what it had suggested, and I’ll receive a heartfelt rationale for why that, too, was a plausible idea.
And even then, it’s not the full thing.
I’ll still use it as a sounding board when I feel emotionally stuck. But I won’t confuse it with transformation. Or real guidance. And I know a lot of people will disagree with me here.
It’s a clever mirror. It does a great job of reflecting and languaging the swirl inside, but it’s not a place to heal fully and deeply.
It can’t notice when my eyes drift upward in a moment of dissociation, when my voice cracks from deep truth or rises when I’m uncomfortable, or when I’m breathless and stuck in my head, the way a human can.
Healing happens in the rupture with another person. In the repair. In the messy dinner table conversation where someone finally says what they really mean.
It happens in the quiet moment where someone you love puts their hand on your back and doesn’t say a word.
It happens when you say, "I’m sorry I hurt you." And they nod. And stay.
The Turning
Eventually, we all have to close the tab, leave the tepid reflections behind, and step out into the awkward humanness of it all.
To be witnessed in our uncertainty and in our emotion. To practice being real, not right.
To speak our grief into the air, and let it be met by a real face, a hug that envelops you, and a voice that quivers too.
Because healing doesn’t live in a mirror. It lives in the in-between. Where breath meets breath. Where someone sees you in all your mess, and doesn’t flinch.
And that is not something any AI can hold.