
The Children's Commissioner for England, Rachel de Souza, has issued a stark warning about the growing threat posed by so-called 'nudifying' applications that use artificial intelligence to digitally undress photographs of children.
The Disturbing New Trend in Digital Exploitation
These insidious apps, widely available through mainstream platforms, use AI to generate fake nude images from ordinary photographs. Particularly alarming is their use against schoolchildren, with reports of boys sharing manipulated images of female classmates.
How These Apps Are Harming Young People
- Create non-consensual intimate imagery of minors
- Normalise sexual harassment in schools
- Cause severe psychological distress to victims
- Facilitate the spread of child sexual abuse material
Platforms Failing to Act
Despite repeated warnings, major tech companies have been slow to remove these applications from their stores. The Commissioner's office found that even when reported, many apps simply reappear under different names, exploiting loopholes in content moderation systems.
The Legal and Ethical Quandary
Current UK law struggles to keep pace with this rapidly evolving technology. While creating or sharing nude images of under-18s is illegal, the legal status of the AI tools themselves remains unclear. Experts argue these apps should be treated in the same way as other means of producing child sexual exploitation material.
Calls for Immediate Action
Commissioner de Souza is urging:
- Tougher regulations on AI applications
- Faster removal of harmful content by tech platforms
- Better education for children about digital risks
- Stronger legal consequences for those creating or sharing such images
This scandal highlights the urgent need for society to address how emerging technologies are being weaponised against young people. Without swift intervention, we risk normalising a dangerous new form of digital abuse.