AI-Generated Child Sex Abuse Images Used in School Blackmail Attempts
Blackmailers are exploiting photographs of pupils featured on school websites to generate sexual abuse imagery through artificial intelligence, according to warnings from child safety experts and the UK's National Crime Agency (NCA).

Criminal Use of AI to Manipulate Children's Photos

The NCA and child safety organisations have reported that criminals are employing AI technology to alter images of children, subsequently demanding substantial sums of money under threat of publication. They are advising educational establishments to eliminate identifiable pictures of children from their websites and social media profiles.

The Internet Watch Foundation (IWF) disclosed that a secondary school in the United Kingdom became the target of a blackmail scheme after offenders converted student photographs into child sexual abuse images. The blackmailers transmitted the doctored images to the school, threatening to release them unless a monetary payment was made.

The IWF indicated that 150 images involved in the extortion attempt could be categorised as Child Sexual Abuse Material (CSAM) under UK legislation. This incident, which occurred last year, is not an isolated UK case involving the manipulation of photos taken from school websites or social media accounts, the watchdog noted. The identities of the school and the police force contacted to prevent the images' distribution have not been disclosed.

Government and Expert Response

Jess Phillips, the minister for safeguarding and violence against women and girls, described the situation as a 'deeply worrying emerging threat'. She stated, 'We will not hesitate to go further if necessary and make sure our laws stay up to date with the latest threats,' in comments to The Guardian.

The Early Warning Working Group (EWWG), a UK advisory body focused on tackling online harms, has issued guidance for schools on protecting pupils from blackmailers. Its recommendations include removing images that show a pupil's face directly and avoiding publishing pupils' names alongside their photographs. The EWWG has also compiled a checklist of actions for schools, including regularly renewing image-consent agreements and conducting regular audits of children's images on websites, social media accounts, and promotional materials.

In the event of an incident, the group—which comprises the NSPCC charity, the IWF, the Welsh government, Education Scotland, the Safeguarding Board for Northern Ireland, and the NCA—advises schools to immediately contact the police, remove the original images, and retain any criminal images.

Sextortion: A Growing Threat

Leora Cruddas, chief executive of the Confederation of School Trusts, remarked, 'As educators we instinctively want to celebrate children's achievements and that includes sharing photos and videos of all the good things that go on in our schools – it is deeply depressing that in doing so we potentially have to contend with threats from abusers and scammers.'

This form of blackmail, known as sextortion, has become more prevalent with the spread of AI tools. Sextortion typically involves manipulating a child or adult into sending intimate images of themselves, then threatening to share those images with loved ones or publish them online unless a ransom is paid; AI now allows offenders to fabricate such images from ordinary photographs instead.

Some schools have already taken proactive measures against this escalating threat. Last year, the Loughborough Schools Foundation removed recognisable images of pupils from the websites of the three private schools it represents.

Pickt after-article banner — collaborative shopping lists app with family illustration