Meta Confronts Landmark Jury Trial Over Child Safety Failures
The social media giant Meta is facing a pivotal jury trial in Santa Fe, New Mexico, with opening statements scheduled for 9 February. The proceedings, expected to last approximately seven weeks, represent a significant legal challenge for the company as it defends against allegations of enabling child exploitation across its platforms.
State Accusations of Systemic Failures
New Mexico Attorney General Raúl Torrez has launched a comprehensive legal action against Meta, accusing the company of knowingly creating dangerous environments for children on Facebook and Instagram. The lawsuit alleges that Meta's design choices and profit incentives prioritised user engagement over child safety, resulting in inadequate safeguards against sexual exploitation, solicitation, sextortion, and human trafficking.
The state's legal team contends that Meta allowed unmoderated groups dedicated to commercial sex to operate on its platforms and facilitated the distribution of child sexual abuse material. According to court filings, internal Meta documents estimate that approximately 100,000 children on Facebook and Instagram experience online sexual harassment daily.
Meta's Defence and Historical Context
In response to the allegations, a Meta spokesperson stated: "While the New Mexico attorney general makes sensationalist, irrelevant and distracting arguments by cherry-picking select documents, we're focused on demonstrating our longstanding commitment to supporting young people." The company highlighted its introduction of Teen Accounts with built-in protections and parental control tools as evidence of its safety efforts.
The lawsuit follows a two-year Guardian investigation published in 2023 that revealed Meta's struggles to prevent child trafficking on its platforms. Attorney General Torrez has described Meta as potentially "the largest marketplace for predators and paedophiles globally" in previous interviews.
Legal Precedents and Parallel Proceedings
This trial represents Meta's second major legal challenge of 2026 concerning alleged harms to children. The company's attempt to invoke Section 230 of the Communications Decency Act, which typically shields platforms from liability for user-generated content, was rejected by a judge in June 2024. The ruling allowed the case to proceed because it focuses on Meta's platform design and internal decisions rather than speech-related issues.
The New Mexico trial begins just one week after another high-profile case commenced in Los Angeles, where hundreds of families and schools allege that Meta, YouTube, TikTok, and Snap have harmed children through addictive platform designs. While Snap and TikTok have reached settlements in that case, Meta and YouTube continue to face trial proceedings involving approximately 1,600 plaintiffs.
Expected Evidence and Key Witnesses
The state's attorneys plan to present evidence from "Operation MetaPhile," an investigation that led to the 2024 arrest of three men charged with sexually preying on children through Meta's platforms. Suspects allegedly found undercover agents posing as minors through Facebook and Instagram features and then solicited them for sex.
Key witnesses for the state are expected to include educators, law enforcement officials, and whistleblowers who may reveal internal company discussions. Notably, teens and families who have experienced harm on the platforms are not expected to take the stand during the trial.
Internal Documents Reveal Concerning Practices
Recent disclosures from the attorney general's office include allegations that Meta may have profited by placing advertisements from major companies alongside content that sexualised children. Internal chat excerpts allegedly show users discussing methods to lure minors into sexual engagement.
Additional filings reveal that Mark Zuckerberg reportedly approved allowing minors to access AI chatbot companions despite safety staff warnings about potential sexual interactions. Internal communications suggest this was a "Mark-level decision" that prevented parents from disabling the chatbots for their children.
Broader Implications for Social Media Regulation
Sacha Haworth, executive director of the Tech Oversight Project, commented: "The fact that these cases are going to trial proves the Section 230 dam is breaking for social media platforms. These are the trials of a generation; just as the world watched courtrooms hold big tobacco and big pharma accountable, we will for the first time see big tech CEOs take the stand."
The outcome of this landmark trial could establish significant precedents for how social media platforms are held accountable for content moderation failures and design choices that allegedly endanger vulnerable users, particularly children and teenagers.