UK Government Imposes Strict 48-Hour Deadline for Revenge Porn Removal
Technology firms operating in the United Kingdom will be legally required to remove "revenge porn" and other non-consensual intimate images within 48 hours of being notified, or face severe financial penalties and potential service blocks. This mandate comes through a significant amendment to the Crime and Policing Bill, announced by government ministers as part of a broader crackdown on online violence against women and girls.
Substantial Penalties for Non-Compliance
Companies that fail to comply with the new regulations risk substantial fines of up to 10 percent of their global annual revenue. In extreme cases, persistent offenders could see their services completely blocked within the UK market. Prime Minister Sir Keir Starmer has explicitly placed technology companies "on notice" regarding these measures, which could directly impact major platforms like X (formerly Twitter).
Sir Keir emphasized the urgency of this action, stating: "The online world is the front line of the 21st century battle against violence against women and girls. That's why my government is taking urgent action against chatbots and 'nudification' tools. Today we are going further, putting companies on notice so that any non-consensual image is taken down in under 48 hours. Violence against women and girls has no place in our society, and I will not rest until it is rooted out."
Streamlined Reporting and Automated Removal Systems
The legislation introduces several key improvements to protect victims:
- Victims will need to report an offending image only once, rather than filing separate reports with each platform
- Once an image has been reported, any attempt to re-upload it elsewhere will trigger automatic removal
- The government plans to implement hash matching technology similar to systems used for detecting terrorist content and child sexual abuse material
Victims' Minister Alex Davies-Jones explained the technical approach: "We'll be using something called hash matching, which is similar to terrorist content or child sexual abuse content so that it can be taken down on every platform, so you as a victim don't have to report it to every single platform time and again to try and get that taken down."
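For readers curious how "report once, protected everywhere" works mechanically, the sketch below illustrates the general idea of hash matching; it is not the government's specification. Everything here (the HashRegistry class and its report/is_blocked methods) is hypothetical, and a plain cryptographic hash (SHA-256) is used for simplicity. Production systems of this kind are generally assumed to rely on perceptual hashes, which can still match an image after resizing or re-encoding, so this example covers only the exact-copy case.

```python
import hashlib

class HashRegistry:
    """A minimal stand-in for a cross-platform database of reported images.

    In a real deployment this registry would be shared among platforms,
    so a single report propagates everywhere.
    """

    def __init__(self):
        self._known_hashes: set[str] = set()

    def report(self, image_bytes: bytes) -> str:
        """A victim reports an image once; its hash joins the registry."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        self._known_hashes.add(digest)
        return digest

    def is_blocked(self, image_bytes: bytes) -> bool:
        """Any participating platform checks an upload against the registry."""
        return hashlib.sha256(image_bytes).hexdigest() in self._known_hashes


# Usage: report once, and every platform consulting the shared
# registry can refuse subsequent re-uploads of the same file.
registry = HashRegistry()
reported = b"...bytes of the reported image..."
registry.report(reported)
assert registry.is_blocked(reported)            # re-upload attempt blocked
assert not registry.is_blocked(b"other image")  # unrelated content unaffected
```

Because only the hash is stored and compared, platforms never need to retain or exchange the image itself, which is one reason this approach is favoured for sensitive material.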
Regulatory Expansion and Industry Accountability
Regulator Ofcom is considering classifying non-consensual intimate images alongside child sexual abuse and terrorism content, which would allow such images to be digitally fingerprinted and removed automatically across platforms. This represents a significant escalation in how such content is treated under UK law.
Technology Secretary Liz Kendall delivered a clear message to the industry: "No woman should have to chase platform after platform, waiting days for an image to come down. Under this government, you report once and you're protected everywhere. The internet must be a space where women and girls feel safe, respected, and able to thrive." She added that "the days of tech firms having a free pass are over."
Context and Recent Developments
These measures follow recent controversies involving artificial intelligence tools, particularly X's Grok AI, which raised concerns after being used to generate sexualized images of women and children. Prime Minister Starmer previously indicated that platforms facilitating such "unlawful" and "disgusting" content could face blocking in the UK.
The government's new Telecoms Consumer Charter has already seen multiple providers commit to these strengthened protections, signaling a coordinated approach between regulators, lawmakers, and industry to address online safety concerns more effectively.