Deepfake Pornography Crisis in UK Schools: 1 in 10 Teachers Report Incidents

The casual creation of sexually explicit deepfake images by pupils is becoming a pervasive and devastating problem in British schools, with new data revealing the scale of the crisis. An exclusive poll for the Guardian has found that approximately one in ten secondary school teachers in England were aware of students creating deepfake, sexually explicit videos during the last academic year.

The Chilling New Norm in the Classroom

Gone are the days when 'sexting' was the primary digital threat in schools. Advances in artificial intelligence have placed powerful 'nudify' apps into the hands of children, making it disturbingly simple to strip clothes from images, animate pictures suggestively, or graft a peer's face onto pornographic content. A headteacher recounted a stark example: a teenage boy, on a school bus, openly used his phone to select a girl's social media picture and digitally undress her using an app. "He obviously wasn't hiding it... That's what was quite shocking," the head said.

The fallout from such acts is profound and traumatic. In one case from Australia, a student was so horrified by fake explicit images of herself that she vomited. In the UK, the impact is similarly severe. "The fallout was quite significant. Students were really upset. They felt very betrayed and violated. It's a form of abuse," explained Dolly Padalia, CEO of the School of Sexuality Education. The charity was called into a school after a student created deepfakes of multiple peers, leading to police involvement and the perpetrator's removal.

An Epidemic of Image-Based Abuse

The Guardian-commissioned Teacher Tapp survey of 4,300 teachers paints a worrying picture of how widespread this issue has become. Alarmingly, three-quarters of reported incidents involved children aged 14 or younger, with one in ten concerning 11-year-olds. A separate Girlguiding survey found one in four girls aged 13-18 had seen a sexually explicit deepfake of a celebrity, friend, teacher, or themselves.

Professor Tanya Horeck of Anglia Ruskin University, who has been investigating the issue, confirmed its prevalence. "All of them [the headteachers she spoke to] had incidents of deepfakes in their schools... Almost all the examples were boys making deepfakes of girls." She also highlighted a troubling inconsistency in how schools respond: in the absence of clear national guidance, responses swing between immediate expulsion and restorative justice.

The problem is global, with high-profile cases in Spain, Australia, and the US. In the UK, police have investigated private schools over allegations of deepfake image sharing. The children's commissioner for England, Dame Rachel de Souza, has called for apps like ClothOff to be banned, stating children are "frightened by the very idea of this technology."

A Culture of Undermined Consent

Experts warn that the trivialisation of these tools on platforms like TikTok and Snapchat, through filters that "change your friend into your boyfriend," is fostering a dangerous culture. The charity Everyone's Invited states this "reflects and reinforces a culture where consent and respect for personal boundaries are undermined." Against a backdrop of widespread misogyny in schools, teachers are also increasingly becoming targets.

Safeguarding lead Seth James describes the challenge: "'More education'... on its own is like trying to hold back a forest fire with a water pistol." He urges society to confront the normalisation of technology that can instantly create pornographic material featuring real people known to children.

Professor Jessica Ringrose of UCL stresses the urgent need for a joined-up approach. While welcoming updated government guidance on relationships and sex education that recognises misogyny, she argues: "They need to join up a concern with gender and sexual-based violence with technology... We need proactive, preventive education." A Department for Education spokesperson said its new guidance will ensure young people understand the dangers of content like deepfakes.

There is a glimmer of hope in education. The boy on the bus was only caught because a girl who witnessed his actions had recently had an online safety lesson. She recognised the abuse and reported it. This case underscores that while the technology presents a formidable challenge, awareness and curriculum-based intervention can be a powerful first line of defence.