Two of the world's largest social media companies, Meta and Snapchat, are confronting significant legal challenges on both sides of the Atlantic. The lawsuits centre on allegations that their platforms have been used to facilitate child sexual abuse and the sale of illicit drugs, amid claims that the firms failed to implement adequate safety measures.
Dual Lawsuits Target Platform Safeguards
In the UK, Meta, the parent company of Facebook and Instagram, is facing a collective action lawsuit in the High Court of England and Wales. The case, filed on behalf of potentially millions of affected children, accuses the tech giant of designing its platforms with addictive features that allegedly left young users vulnerable to sexual exploitation. The claim argues that Meta's algorithms prioritised user engagement over safety, creating an environment in which predatory behaviour could flourish.
Simultaneously, Snap Inc., the firm behind Snapchat, is defending itself against a separate but equally serious lawsuit in the United States. The case, filed by families who lost children to fentanyl poisoning, alleges that drug dealers used Snapchat's disappearing messages feature to sell counterfeit pills laced with the deadly synthetic opioid. The plaintiffs contend that Snapchat was aware of this activity on its platform but did not take sufficient steps to stop it.
The Core Allegations and Company Responses
The legal action against Meta, spearheaded by the firm Hausfeld & Co LLP, draws on evidence from whistleblower Frances Haugen and internal company documents. It claims Meta's own research showed its platforms amplified risks for young people, yet the company allegedly continued to deploy design elements such as infinite scroll and push notifications that increased exposure to harmful content. A Meta spokesperson said the company has developed over 30 tools to support teens and families, denying the characterisation of its services as unsafe.
For Snapchat, the lawsuit highlights the tragic consequences of the US opioid crisis intersecting with social media. The families argue that the platform's ephemeral messaging is favoured by drug dealers precisely because it makes detection difficult. In response, a Snap spokesperson expressed deep sympathy for the families but defended the company's record, stating it has "zero tolerance" for drug dealing on its platform. The company pointed to its proactive detection systems and its cooperation with law enforcement.
Broader Implications for Tech Regulation
These lawsuits represent a pivotal moment in the ongoing scrutiny of Big Tech's responsibility for user safety, particularly concerning minors. They arrive as the UK's Online Safety Act begins to take effect, imposing a new duty of care on platforms. The outcomes could set powerful legal precedents, potentially forcing fundamental changes to how social media products are designed and moderated globally.
Experts suggest the cases underscore a growing impatience with self-regulation. Whether through legislation like the Online Safety Act or through the courts, there is mounting pressure for technology firms to be held accountable for foreseeable harms linked to their services. The results of these high-profile battles will be closely watched by regulators, campaigners, and the tech industry worldwide.