Child Abuse Image Crimes Surge Nearly 10% in UK, Sparking Tech Firm Criticism
Child sex abuse image offences recorded by UK police forces have increased by almost 10% over the past year, according to alarming new data. This rise has prompted urgent calls for technology companies to implement stronger measures to prevent the capture and distribution of nude images on children's devices.
NSPCC Warns of Persistent Online Dangers for Young People
The National Society for the Prevention of Cruelty to Children (NSPCC) has issued a stark warning that children continue to face significant risks online, including grooming, extortion, abuse, and the non-consensual sharing of intimate images. The charity's latest research highlights the ongoing threat, with 36,829 offences involving indecent and prohibited images of children recorded between April 1, 2024, and March 31, 2025.
This figure represents a notable increase from the 33,886 offences documented in the previous year. The data was gathered through Freedom of Information requests submitted to 45 UK police forces, of which 42 responded, including the Police Service of Northern Ireland and Police Scotland. Gloucestershire, Hampshire, and Thames Valley were the forces that did not respond.
Tech Platforms Identified as Major Venues for Abuse
According to the NSPCC's analysis, Snapchat was the platform most frequently associated with these crimes. Of the 10,811 offences where police recorded the platform used by perpetrators, 43% (4,615 crimes) occurred on Snapchat. Meta platforms collectively accounted for nearly a quarter of these recorded offences, with Instagram at 8%, WhatsApp at 7%, Facebook at 5%, and Messenger at 4%.
The charity emphasized that the true scale of online abuse against children remains "hidden" due to the widespread use of end-to-end encryption on many messaging services.
Government Strategy and Calls for Mandatory Protections
In December, the UK government published its strategy to tackle Violence Against Women and Girls (VAWG), which included an aim to "make it impossible for children in the UK to take, share or view a nude image." The strategy stated that officials were "working constructively with companies to make this a reality."
However, the NSPCC argues that this approach must be made mandatory. The charity is urging the government to take decisive action against technology companies if they fail to embed existing protective technologies on children's phones. These device-level protections would automatically block nude images from being created, shared, or viewed on children's devices, with adult users having the option to opt out through a verification process.
Technology Exists to Prevent Abuse at Source
The NSPCC explains that such technology can intercept nude images at the point of capture, transmission, or reception on a device. Because the image is never fully created or sent, there is nothing to encrypt, effectively stopping abuse at its source before encrypted messaging services come into play.
NSPCC chief executive Chris Sherwood stated: "Children across the UK are being completely failed by tech companies that should be protecting them online. We cannot keep letting them off the hook when they can do more to prevent this from happening in the first place."
Sherwood added: "Technology already exists that could be deployed today to stop children from taking, sharing or receiving nude images. So, the real question is: what's stopping them? If they continue to drag their feet, government must show their might by stepping in and compelling them to act."
Regulatory Pressure Mounts on Tech Giants
The data emerges as regulatory bodies increase pressure on major technology companies. Communications regulator Ofcom recently wrote to platforms including Facebook, Instagram, and Snapchat, giving them until the end of April to detail their actions regarding age verification and grooming protections. Simultaneously, the Information Commissioner's Office (ICO) contacted the same companies, requesting explanations of how their age assurance policies protect children.
Kerry Smith, chief executive of the Internet Watch Foundation, described the NSPCC's findings as "yet another wake-up call," adding: "Mandatory introduction of on-device protections will protect children from unsolicited nude imagery, and from being coerced into sending sexually explicit material. We must see these measures applied across the board."
Government Response and Legislative Actions
Safeguarding minister Jess Phillips called the data "nothing short of deeply shocking." She affirmed: "Predators cannot continue like this – unstopped and unchecked. We plan to stop them. We have committed to making it impossible for children in the UK to take, share or view nude images, and have already announced a ban on so-called 'nudification' apps to stop abusive images being created and spread in the first place. We will not hesitate to go further until our children are safe from sexual abuse online."
Earlier this year, the government announced that nudification apps would be criminalized as part of the Crime and Policing Bill, which is currently progressing through Parliament. This legislative move aims to prevent the creation and distribution of artificially generated abusive imagery.
The persistent rise in child abuse image crimes underscores the critical need for coordinated action between government, regulators, and technology companies to implement effective safeguards for young people in the digital age.
