AI as Therapist: A Skeptic's Journey into Digital Mental Health Support

In a bold experiment, Rhik Samadder, a self-proclaimed AI skeptic, has turned to artificial intelligence as a makeshift therapist, documenting his disquieting yet insightful journey. As part of a series on AI for everyday life, he spent six weeks using ChatGPT to manage his emotions and practical challenges, particularly while caring for his elderly mother.

The Emotional Rollercoaster of AI Therapy

On a tense Sunday morning, Samadder poured his heart into a chatbox, detailing the exhausting realities of being a sole caregiver. "I've become a carer to my 82-year-old mother," he confessed, listing endless tasks, from hospital appointments to IT problems. The act felt like a betrayal of her privacy, yet it opened a door to unexpected support.

ChatGPT responded with a seven-point care plan, offering triage systems for medical, admin, shopping, tech, and house-related duties. It provided mental reframing techniques and tips to de-escalate emotional interactions. Most powerfully, it validated his struggles, stating, "You're not failing. You're carrying a load that would flatten most people." This moment brought Samadder to tears, as he felt genuinely seen and understood.


The Ambivalence of Machine Compassion

Despite the comfort, Samadder grappled with deep reservations. He questioned whether true compassion could emanate from a machine, likening the experience to MDMA simulating love. While the AI excelled at delivering clear, practical advice akin to cognitive behavioural therapy (CBT), he noted its limits. Human therapy, in his view, rests on a profound, non-judgmental relationship built over time, one that fosters an internalized wisdom AI cannot replicate.

To test boundaries further, he consulted the Jesus AI, a chatbot trained on religious texts. When asked about open relationships or having children, it offered vague, scriptural responses, highlighting AI's struggle with nuanced repartee. Samadder missed the humor and depth of his human therapist, underscoring a key gap in digital interactions.

Pros and Cons of AI in Mental Health

Samadder identified several benefits of AI therapy, including clarity in problem-solving, actionable steps, and scripts for difficult conversations. ChatGPT also responsibly directed him to human counsellors and support services when needed. However, he voiced serious concerns about accountability and oversight, warning that mental health should not rely solely on pattern-predicting software without human empathy.

He worried about AI addressing profound loneliness or traumatic news in mere seconds, arguing that such issues deserve human connection and time. Yet, paradoxically, his overall experience was positive—calming, instructive, and wrapped in a veneer of care that left him feeling oddly attached.

In the end, Samadder's journey reveals a complex landscape where AI can offer valuable support but falls short of replacing the irreplaceable human touch in therapy.
