UK Charity's Facebook Page Reinstated After AI Mistook Name for Heroin

A UK photography charity is celebrating after its Facebook group was finally reinstated, following a battle of more than a month against an automated takedown. The Gloucestershire-based organisation, Hundred Heroines, had its page removed after Meta's artificial intelligence tools incorrectly identified its name as a reference to the class-A drug heroin.

A Costly Automated Error

Hundred Heroines, a charity celebrating female photographers, has now had its Facebook group taken down twice in 2025 for alleged breaches of community standards related to drug promotion. The most recent removal occurred in September. After a second appeal within twelve months, the 'Hundred Heroines: Women in Photography Today' page was quietly restored last week, without explanation or apology from the tech giant.

The charity's founder, Dr Del Barrett, the former president of the Royal Photographic Society, described the impact as devastating. She explained that the organisation relies heavily on Facebook to attract visitors to its physical space in Nailsworth, near Stroud. "AI technology picks up the word heroin without an 'e', so we get banned for breaching community guidelines," Dr Barrett stated. "Then no matter what you do, you can't get hold of anyone and it really affects us."

The Human Cost of AI Moderation

Founded in 2020, the charity holds approximately 8,000 items in its collection, focusing on the work of female photographers throughout history. Dr Barrett estimates that 75% of the charity's visitors discover it through Facebook, underlining the severe operational impact of the erroneous ban.

This incident sheds light on a broader issue with automated content moderation. In 2024, Meta increased its vigilance concerning drug-related groups, largely in response to the opioid crisis in the United States, where 80,000 overdose deaths occurred last year. The company states that buying and selling drugs is strictly prohibited on its platforms and claims to employ "robust measures" to detect and remove such content.

Meta's approach relies heavily on AI, which it says is "central to [its] content review process." The technology is designed to detect and remove violating content before it is even reported. In some cases, it flags content for human review teams, but Dr Barrett confirmed that during their appeals process, Hundred Heroines had no interaction with a human being at Meta.

Fighting for a Brand's Identity

The situation left the charity questioning whether it should change its established name. "We thought, 'should we change our name?' But why should we?" Dr Barrett remarked. "Why have we got to mess with our brand just because of Facebook?"

She characterised the experience as both frightening and absurd. "It sort of verges on scary and laughable," she said. "You think these bots are running the world and they can't tell the difference between a woman and an opioid. Heaven help us."

This is not an isolated incident. Earlier this year, Meta faced significant criticism over the mass banning and suspension of accounts on Facebook and Instagram. While users blamed the company's AI moderation tools, Meta acknowledged only a "technical error" affecting Facebook Groups and denied any wider increase in incorrect enforcement across its platforms. The company said it was addressing an issue that emerged in the summer, which also saw groups sharing memes about bugs erroneously accused of violating standards on "dangerous organisations or individuals".