Meta's Multimillion-Dollar Legal Defeat Over Child Exploitation on Its Platforms
In a stark courtroom moment, Mark Zuckerberg walked past a photograph of a child exploitation victim during a Senate committee hearing last year, symbolising the grave allegations against his company. This image foreshadowed a major legal reckoning for Meta, which recently lost a multimillion-dollar battle over failures to prevent children being sold on Facebook and Instagram. The case stemmed from a Guardian investigation that meticulously uncovered evidence of child sex trafficking proliferating across these social media giants.
The Tip-Off That Sparked a Groundbreaking Investigation
It all began with a confidential tip-off in 2021. While reporting on migrant worker exploitation in the Gulf, a trusted source of more than a decade revealed a disturbing surge in child sex trafficking within the United States. As the Covid pandemic drove predators online, Facebook and Instagram became tools for buying and selling children. This led to a collaborative investigation with human rights journalist Mei-Ling McNamara into the company then still known as Facebook, before its rebrand as Meta.
Experts from anti-trafficking nonprofits and American law enforcement officials detailed the crimes emerging on these platforms. Trafficking often occurred in non-public areas like Facebook Messenger and private Instagram accounts, where predators searched for, groomed, and advertised teens to sex buyers. Under international law, children cannot consent to sex acts, making anyone profiting from or paying for such acts—including via exploitative photographs—a human trafficker.
Uncovering Shocking Evidence Through Federal Records
One key investigative tool was Pacer, the federal courts records database, though its lack of text search and sealed records for child exploitation cases posed challenges. Hours were spent trawling Department of Justice press releases and Pacer documents for trafficking cases involving social media. The findings were alarming: transcripts revealed sale negotiations for teen girls on Facebook Messenger, while exhibit documents showed trafficking victims advertised in Instagram Stories, with money and logistics discussed. Notably, none of these crimes had been detected or flagged by Meta's systems.
Insights From Moderators and Survivors Highlight Systemic Failures
Former contract moderators for Facebook and Instagram shared harrowing accounts of reviewing traumatic content daily. They reported that efforts to flag potential child trafficking often went unheeded, with harmful content rarely removed. Many felt helpless, criticising Meta's narrow criteria for escalating crimes to law enforcement. In July 2022, a visit to Courtney's House, a Washington DC safe house for teen girls of colour who have survived trafficking, provided deeper insight. Run by survivor Tina Frundt, Courtney's House showed firsthand how Instagram Stories were used to advertise girls for sex.
Frundt described targeting methods for girls and LGBTQ+ youth, sometimes involving complicit family members. She recounted the tragic story of a 15-year-old girl, given the alias Maya, who died after meeting a sex buyer via Instagram who provided fentanyl-laced drugs. This case underscored the lethal consequences of online exploitation.
Law Enforcement and Prosecutors Confirm Escalating Crisis
During a reporting trip to Massachusetts, assistant district attorneys noted a 30% annual increase in child trafficking crimes on social media, exacerbated by the pandemic as children spent more time online and away from protective adults. Traffickers easily identified vulnerable targets based on online activity. Prosecutors highlighted the lucrative nature of these crimes, with appointments and payments handled seamlessly through digital platforms. Interviews with incarcerated traffickers confirmed Instagram as a preferred tool for these offences.
The Investigation's Impact and Meta's Legal Reckoning
Published in April 2023 as "How Facebook and Instagram became marketplaces for child sex trafficking," the investigation initially faced uncertainty over its impact, given the Section 230 protections shielding platforms from liability. However, it was later cited in a Supreme Court amicus brief and in a lawsuit by New Mexico's attorney general, which accused Meta of allowing its platforms to become marketplaces for predators. In March this year, Meta lost its first jury trial and was ordered to pay $375m in civil penalties for violating consumer protection laws, though it plans to appeal.
Ongoing Revelations and Controversies Over Encryption
Since the initial report, the Guardian has continued exposing exploitation on Meta's platforms, including the use of Messenger and Meta Pay to exchange child sexual abuse material. Cases like that of Kristen Galvan, a Texas teenager groomed and sold via Instagram before being murdered, highlight the persistent dangers. Child safety experts have criticised Meta's December 2023 encryption of Facebook Messenger, arguing it hinders content scanning and law enforcement access. In testimony, Instagram head Adam Mosseri contradicted Meta's stance, admitting that self-reporting tools are less effective than detection technology.
Broader Legal Challenges and Future Trials
Meta's troubles extend beyond trafficking: it lost another trial in Los Angeles over product features that harm children's mental health, a verdict it also plans to appeal. Upcoming cases include a lawsuit by a coalition of 33 attorneys general alleging Meta designed addictive features targeting youth. These legal battles underscore growing scrutiny of tech giants' responsibilities in protecting vulnerable users online.