Meta and YouTube Face Legal Reckoning Over Child Safety in US Courts
Mark Zuckerberg, Meta's chief executive, appeared in a Los Angeles courtroom this week for a landmark trial examining whether social media platforms deliberately addict and harm children. The case, together with another in New Mexico, has produced significant legal setbacks for the tech giants and reflects a growing push to hold them accountable for their impact on public health.
Juries Deliver Verdicts on Addictive Design and Child Exploitation
In California, jurors found both Meta and YouTube liable for deliberately engineering addictive products that harmed a child. The verdict is a pivotal victory for campaigners using the US legal system to compel social media companies to change their platforms. Internal documents presented in evidence revealed a striking disregard for young people's safety: one Meta employee's email compared targeting 11-year-olds to the tactics tobacco companies used decades ago, underscoring the cynicism within the industry.
Meanwhile, in New Mexico, Meta was held liable for its role in child sex trafficking facilitated through Facebook and Instagram. A Guardian investigation was cited in the complaint, and the jury ordered Meta to pay $375 million in civil liabilities. The state's attorney general is now pursuing further platform changes and financial penalties. Both verdicts are expected to be appealed, but they signal a shift in public and judicial attitudes towards the tech sector's responsibilities.
The Broader Implications for Regulation and Society
These cases emphasise that how content is delivered, through features such as infinite scroll and game-like rewards, can be as harmful as the content itself. While debate often centres on abusive material, these design features exploit vulnerable young minds and make it difficult for users to put down their devices. Governments worldwide are beginning to respond: Australia has ordered social media companies to stop targeting children, and the UK is considering restrictions on screen time for minors.
However, the rapid pace of digital innovation, particularly the rise of artificial intelligence, leaves regulators struggling to keep up. As Cory Doctorow argues in a recent book, the control exerted by the major platforms is unprecedented, and reducing dependence on them will require a society-wide effort and safeguards for users of all ages. The recent court outcomes are not yet a reckoning on the scale of the tobacco industry's in the 1990s, but they represent a crucial step towards forcing tech companies to prioritise public health over profit.