AI 'Stripping' Apps Remain on UK App Stores Despite Deepfake Ban

Applications that let users generate AI-powered "nude" images of real people without their consent remain available to download from both the Apple App Store and the Google Play Store in the United Kingdom. They persist despite recent legislation that explicitly criminalises the creation of sexually explicit deepfake content in the country.

Platform Policies Versus App Availability

The ongoing availability of these applications highlights a significant gap between official platform policies and the reality of content enforcement. A recent investigation by the Tech Transparency Project (TTP) identified 55 such apps on the US Google Play Store and 47 on the US Apple App Store, each digitally removing clothing from images of women to depict them as partially or completely naked.

A subsequent search by The Independent confirmed that a number of similar applications, including some named in the TTP report, remain accessible on the UK versions of these dominant digital marketplaces.

Contradictory Content Guidelines

This situation exists in direct contradiction to the publicly stated content policies of both technology giants. The Google Play Store explicitly prohibits apps containing or promoting sexual content, pornography, or services intended for sexual gratification. Its guidelines further forbid content associated with sexually predatory behaviour or the distribution of non-consensual sexual material.

Similarly, Apple's App Store Review Guidelines state that applications "should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy." The policy specifically bans "overtly sexual or pornographic material."

Specific Apps and Developer Responses

One application highlighted in the TTP investigation, which can generate, from a single uploaded photograph, a video of a woman appearing to remove her top and dance, was still available for download on both platforms as of Friday afternoon. This app has reportedly been installed more than five million times.

Another app advertised on the Google Play Store promotes the ability to digitally "try on" clothing, yet showcases images of women being placed into bikinis. In response to the TTP findings, Apple stated it had removed 28 of the 47 flagged applications and contacted other developers to address guideline violations. Google also appears to have taken down some identified apps.

Campaigners Demand Tighter Regulation

The controversy follows significant public outcry over Elon Musk's Grok AI system, which faced criticism for allowing users to create non-consensual "nude" images of women. Leading women's rights organisations, including Refuge, Women's Aid, and Womankind Worldwide, have condemned the "disturbing" rise in AI-facilitated intimate image abuse.

Emma Pickering, head of technology-facilitated abuse at the charity Refuge, emphasised the urgent need for action: "As technology evolves, women and girls' safety depends on tighter regulation around image-based abuse, whether real or deepfake, as well as specialist training for prosecutors and police. Women have the right to use technology without fear of abuse, and when that right is violated, survivors must be able to access swift justice and robust protections."

The Independent has contacted both Google and the UK government for further comment on the enforcement of deepfake legislation and app store content moderation.