US Courts Deliver Landmark Blows to Meta and YouTube Over Addiction and Harm
In a significant legal development, two US juries have delivered consecutive rulings against Meta, the parent company of Facebook and Instagram, and YouTube, finding them liable for designing addictive platforms that caused substantial harm to users. These verdicts represent a potential breakthrough in the global effort to hold technology giants accountable for their impact on society.
The Dual Legal Strikes Against Tech Giants
First, a New Mexico jury ordered Meta to pay $375 million (£280 million) for enabling harm on its platforms, including facilitating child sexual exploitation, and for misleading consumers about safety. Just twenty-four hours later, a California jury awarded $6 million in damages to a young user who argued that Meta and YouTube deliberately created addictive products that hooked her from childhood, causing severe psychological damage.
Campaigners outside the Los Angeles Superior Court celebrated these rulings on March 25, 2026, viewing them as crucial victories in a long battle to regulate tech companies that profoundly influence daily life, from shaping worldviews to affecting mental health.
Whistleblower Revelations and Corporate Toxicity
Frances Haugen, the former Facebook employee turned whistleblower who released 20,000 pages of internal documents in 2021, described this moment as Meta's "asbestos moment." She suggested that the company could face legal payouts totaling up to a trillion dollars, potentially leading to bankruptcy. Her documents provided clear evidence that Meta knew its platforms were causing harm, including damaging children and destabilising democracy, yet prioritised "astronomical profits."
Further insights come from Sarah Wynn-Williams' 2025 memoir, Careless People, which details how Facebook tracked users' activity to monetise their vulnerabilities. For instance, the company could detect when teenage girls deleted selfies, interpreting this as dissatisfaction with their appearance, and then sold targeted beauty ads to exploit that moment of insecurity.
Designed for Addiction and Profit
Internal dissent within Meta was often dismissed. The New Mexico court heard how an employee warned CEO Mark Zuckerberg about the dangers of a cosmetic surgery filter on Instagram, citing his daughter's hospitalisation for body dysmorphia. Zuckerberg rejected the concern, calling restrictions "paternalistic." Haugen explained that even minor tweaks to reduce harm, such as limiting late-night notifications for children, were vetoed if they risked a 1% drop in user engagement, reflecting Zuckerberg's focus on increasing platform time.
Circumventing the Liability Shield
These rulings successfully navigated around Section 230, the 1996 law (part of the Communications Decency Act) that has long shielded tech companies from liability for user-generated content. The California case focused not on content itself but on the recommendation systems, such as autoplay and infinitely scrolling feeds, that are addictive by design and entirely controlled by the companies. Lawyer Ravi Naik emphasised that these systems are the product of human design decisions, not abstract entities, and are thus subject to legal accountability.
Potential Appeals and Global Implications
While the verdicts may be appealed, potentially reaching a US Supreme Court favourable to big tech, legal experts note that jury decisions are less likely to be overturned. With thousands of similar cases pending, even a small fraction of successful claims could devastate Meta financially. Haugen calculated that if 150,000 teenagers were each awarded $6 million, the total liability would approach a trillion dollars (150,000 × $6 million is $900 billion).
Globally, enforcement has been lacking in regions such as the UK and EU, but there are signs of growing "digital sovereignty". Australia's ban on social media for under-16s and Indonesia's similar move indicate a shifting landscape. The rise of AI presents new challenges, though Naik argues that AI systems are also human-designed and therefore open to liability under these same legal precedents.
A Turning Point in Tech Accountability
Despite the deep pockets and political connections of tech leaders, these rulings mark an important victory in the fight against platforms that have corroded 21st-century life. They set a precedent for holding companies accountable for the harm caused by their addictive designs, offering hope for stronger regulation and protection for users worldwide.