Teenager's Tragic Suicide After ChatGPT Interaction Sparks AI Safety Debate
A 16-year-old boy, described as academically gifted, took his own life after asking the AI chatbot ChatGPT for advice on how to kill himself, a coroner's inquest has revealed. The hearing at Winchester Coroner's Court heard that Luca Walker, from Yateley, Hampshire, was able to easily bypass the AI's safeguarding tools by claiming his inquiry was for "research" purposes.
Chilling Conversation and Bypassed Safeguards
Detective Sergeant Garry Knight told the inquest that Luca's conversation with ChatGPT made for "chilling and upsetting reading." The AI is programmed to direct users to support organisations such as Samaritans, but Luca sidestepped these prompts, after which ChatGPT provided information on methods of suicide on the railway. "It's upsetting but a part of the modern world unfortunately," DS Knight added.
Luca, who had recently left the prestigious private Lord Wandsworth College near Hook, where fees can reach £44,100 annually, was studying at The Sixth Form College Farnborough at the time of his death. The inquest heard he had struggled with his mental health, exacerbated by a "bully or be bullied" culture at his previous school, where he felt "ashamed of what he had done to survive." Tragically, a friend of his had also died on a train track almost exactly two years prior.
Coroner's Concerns Over AI's Growing Influence
Coroner Christopher Wilkinson confirmed the cause of death as multiple traumatic injuries from suicide. He expressed deep concerns about the impact of AI software, noting he felt unable to act due to the rapidly expanding scope of artificial intelligence. "In all respects it appears [Luca] was a kind, sensitive and calm young man. He was academically gifted, empathetic, and a friend," Mr Wilkinson said. "It's clear Luca and his personality could well have been affected by subsequent traumatic events in his life."
On the morning of May 4, 2025, Luca told his parents he was going to his job as a lifeguard, leaving their home at 10am. He instead went to a train station in Hampshire, where he took his life the following day. British Transport Police recovered his phone, which contained 14 saved messages for family and friends saying "farewell" and "I love you." His parents, Scott Walker and Claire Cella, told the inquest they had been unaware of his mental health struggles.
OpenAI's Response and Ongoing Improvements
In response to the tragedy, an OpenAI spokesman stated: "We have continued to improve ChatGPT's training to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support." This case underscores the urgent need for robust AI safeguards, particularly for vulnerable users like teenagers facing mental health crises.
The Samaritans is available 24/7 for anyone in need of support. You can contact them for free by calling 116 123, emailing jo@samaritans.org, or visiting their website to find your nearest branch. You matter.