Campaigners Demand Stronger Deepfake Protections as New Law Takes Effect

Campaigners from Stop Image-Based Abuse have delivered a petition to Downing Street with more than 73,000 signatures, calling for greater protection against deepfake image abuse as new legislation criminalising non-consensual AI-generated explicit images takes effect. While welcoming the legal changes, victims argue the law does not go far enough to provide comprehensive safeguards.

Petition Demands Civil Routes to Justice

The petition urges the government to introduce civil routes to justice, including takedown orders for abusive imagery on platforms and devices. Campaigners are also calling for improved relationships and sex education, alongside adequate funding for specialist services like the Revenge Porn Helpline that support intimate image abuse victims.

"Today's a really momentous day," said Jodie, a victim of deepfake abuse who uses a pseudonym. "We're really pleased the government has put these amendments into law that will definitely protect more women and girls. They were hard-fought victories by campaigners, particularly the consent-based element of it."

Victim Testimonies Highlight Legal Gaps

Jodie, who is in her 20s, discovered in 2021 that images of her were being used in deepfake pornography. She and 15 other women testified against the perpetrator, 26-year-old Alex Woolf, who had taken images of women from social media and posted them to pornographic websites. He was convicted and sentenced to 20 weeks in prison.

"I had a really difficult route to getting justice because there simply wasn't a law that really covered what I felt had been done to me," said Jodie.

The offence of creating explicit deepfake images was introduced as an amendment to the Data (Use and Access) Act 2025. Although the act received royal assent last July, the offence did not come into force until Friday.

Frustration Over Implementation Delays

Many campaigners, including Jodie, expressed frustration about delays to the law coming into effect. "We had these amendments ready to go with royal assent before Christmas," said Jodie. "They should have brought them in immediately. The delay has caused millions more women to become victims, and they won't be able to get the justice they desperately want."

In January, Leicestershire Police opened an investigation into a case involving sexually explicit deepfake images created using Grok AI, highlighting the ongoing nature of the problem.

Concerns About Protection for Sex Workers

Madelaine Thomas, a sex worker and founder of tech forensics company Image Angel who has waived her right to anonymity, described it as "a very emotional day" for her and other victims. However, she noted the law falls short of protecting sex workers from intimate image abuse.

"When commercial sexual images are misused, they're only seen as a copyright breach. I respect that," Thomas said. "However, the proportion of available responses doesn't match the harm that occurs when you experience it. By discounting commercialised intimate image abuse, you are not giving people who are going through absolute hell the opportunity to get the help they need."

Intimate images of Thomas have been shared without her consent almost every day for the past seven years. "When I first found out that my intimate images were shared, I felt suicidal, frankly, and it took a long time to recover from that."

Statistics Reveal Widespread Problem

According to the domestic abuse organisation Refuge, one in three women in the UK has experienced online abuse, underscoring the scale of the problem campaigners are seeking to address.

Stop Image-Based Abuse is a movement composed of the End Violence Against Women Coalition, the victim campaign group #NotYourPorn, Glamour UK and Clare McGlynn, a professor of law at Durham University.

Government Response and Future Measures

A Ministry of Justice spokesperson stated: "Weaponising technology to target and exploit people is completely abhorrent. It's already illegal to share intimate deepfakes – and as of yesterday, creating them is a criminal offence too."

"But we're not stopping there. We're going after the companies behind these 'nudification' apps, banning them outright so we can stop this abuse at source."

"The technology secretary has also confirmed that creating non-consensual sexual deepfakes will be made a priority offence under the Online Safety Act, placing extra duties on platforms to proactively prevent this content from appearing."