US Courts Rule Social Media Giants Liable for User Harm in Landmark Cases
In a significant legal development, courts in the United States have delivered rulings that hold social media platforms accountable for harms inflicted on their users. This week, separate cases in New Mexico and Los Angeles ended with findings that companies such as Meta and Google bear legal responsibility for damages caused by their services, challenging long-standing defences and potentially setting a global precedent.
New Mexico Case Exposes Meta's Failures in Child Safety
The New Mexico lawsuit, brought by the state's attorney general, accused Meta of misleading users about safety protocols and enabling child sexual exploitation on platforms such as Instagram and Facebook. Evidence presented in court revealed a lack of basic safety measures, including ineffective age verification systems. Undercover agents posing as children reported receiving sexualised communications from adults, highlighting systemic vulnerabilities.
Internal documents from Meta, which acknowledged risks of exploitation and harm, were submitted to the court, undermining the company's defence. A jury found that Meta had violated consumer protection laws through deceptive and unfair practices, deeming its exploitation of users' ignorance unconscionable. As a result, civil penalties totalling $375 million were imposed, a stern rebuke of the tech giant's operations.
Los Angeles Case Highlights Addictive Design and Mental Health Impacts
In Los Angeles, a lawsuit filed by a young woman against Meta and Google alleged that their platforms were deliberately engineered to be addictive, leading to severe mental health issues, including depression, anxiety, and suicidal thoughts. The plaintiff, who began using social media as a child, reported spending up to 16 hours daily on these services.
The court awarded $3 million in compensation and an additional $3 million in punitive damages, with Meta held liable for 70% of the harm and Google for the remaining 30%. Notably, the jury determined that the platforms were intentionally designed to foster addiction in children, drawing parallels to tactics used by the gambling and tobacco industries to maximise engagement and advertising revenue.
Structural Engineering and Addictive Features Under Scrutiny
Key features such as infinite scroll, algorithmic recommendations, autoplay loops, and time-sensitive content were cited as mechanisms that encourage addictive behaviours. This finding is particularly consequential because it shifts the basis of liability away from user-generated content, the traditional shield for platforms, and towards the inherent design of the services themselves. As one observer put it, "the killer isn't in the building; it is the building", underscoring the fundamental role of platform architecture in causing harm.
With thousands of similar cases pending globally, Meta and Google have initiated appeals, fearing that these verdicts could spark widespread class actions. Legal experts suggest this moment could mirror the Big Tobacco era, where industry practices faced intense scrutiny and regulation.
Implications for Global Policy and Public Health
These rulings come as many countries, including Australia, consider or implement social media bans for minors. Critics who dismissed such measures as "boomer moralising" may now have cause to reconsider, as the court decisions validate concerns about platform safety and addiction. The cases highlight the need for robust public health interventions in the digital age, prompting a reevaluation of how adults, too, interact with these addictive systems.
As the tech industry grapples with these legal setbacks, the dawn of a new era of accountability seems imminent, urging a broader societal reflection on the consequences of our digital dependencies.