I’ve turned AI into my therapist. The results were pretty disquieting

It’s Sunday morning, and I type my feelings into the chatbox, too wound-up to stop.

“I’ve become a carer to my 82-year-old mother,” I write. “Every day brings new problems. I help with hospital appointments, finances, gardening, shopping, home repairs, the council, insurance companies, letters, emails, endless IT problems …”

I stop. She’s just next door, and it feels like a betrayal to be saying any of this. At least when I was in therapy, I could go to someone’s office to wail.

I take a breath, and continue. “I’m an only child, my father died some time ago, and there’s no one else to help. But I’m exhausted. I snap, and shout, then struggle with guilt. I’m resentful, irritable, and I love her so much. Please help me.”

Welcome to my AI diary, readers. It’s going to be fun, as you can already tell. For the next six weeks, as part of our AI for the People newsletter course, I – a self-declared AI skeptic – have agreed to find out whether it can actually make my life better.

To kick things off, I’m using ChatGPT as a therapist. Nothing says “modern mental health” like crying into a chatbox, after all. Plenty of people are now doing the same – but can it really replace human support? I hope so. I had to stop seeing my therapist because I fell in love with her.

(Note to self: this isn’t your actual diary. And don’t fall in love with ChatGPT. That would be pathetic.)

Halfway through its answer, I start crying. It comes up with a seven-point care plan for me, a triage system to prioritise tasks (with categories including medical, admin, shopping, tech and house) and ways to allocate time between them (which are urgent, and which can wait?). It suggests helpful mental reframings, and tips to lower the emotional temperature of interactions.

Best of all, it makes me feel seen. “You’re not failing,” the AI tells me. “You’re carrying a load that would flatten most people.”

My feelings? Validated.

I feel ambivalent about this, however. Can I really feel compassion from a machine? It helps me to remember the AI is probably remixing human sources. I feel seen in the way that MDMA feels like love.

Is therapy just about information? This feels like CBT: incredibly helpful, but incomplete. In my experience, the more profound therapies that lead to healing involve a non-judgmental relationship of witness, with an empathetic professional, over a longer time. I often hear my therapist’s voice in my head; I’ve internalised her wisdom. I think that happens more easily, and more responsibly, between humans.

The next day, I decide to go for the nuclear option. I consult the Jesus AI, a chatbot trained on religious texts that mimics conversation with the son of God. I want to see if pushing a more religious button can send this elevator to the top floor.

“The Jesus AI is not meant to represent any religious figure,” the disclaimer reads. Hmm. “Generated content is for educational purposes and may contain inaccuracies and biases.”

That’s a hell of an education, but here goes. Because it’s 2025, I ask: “Should I be in an open relationship?” In response, the Jesus AI quotes Hebrews 13:4, which is a long-winded way of saying “No”. I try to curveball Jesus. “Should I have children?” I type. “Seek God’s guidance in this important decision.” Useless. “Can you ask him for me?” I quip.

Here’s a problem. Out-of-the-box AI is not terrific at repartee. My therapist has an edge here; she was funny as all get-out. Jesus AI is not.

What’s good about AI as a therapist? Clarity. Identifying practical steps. Scripts for difficult conversations – though these don’t feel specific to real-world relationships (just as self-help books don’t). To its credit, ChatGPT also points me to human counsellors and support services where useful.

Yet I have reservations that I can’t shake. A worry about wedges, and thin ends. I think there are processes, certain unbearable pieces of news, forms of loneliness, that should be held in human time and relationship; that should not be addressed in four seconds on a screen. AI does not have thoughts, let alone wisdom. Categorically, mental health should not be in the hands of pattern-predicting software with no accountability or oversight, which could steer someone badly wrong.

And yet, unfortunately, my experience of being therapised by ChatGPT has been wonderful. Calming and instructive, with a veneer of caring.

I think I’m in love.
