Meta Fined $375M as Jury Finds Platforms Harm Children in Landmark Case

A New Mexico jury has delivered a landmark verdict in the first of several social media child safety trials this year, finding that Meta's platforms are harmful to children's mental health and imposing a $375 million penalty. The decision represents a significant legal setback for the tech giant and signals mounting challenges for social media companies facing similar allegations nationwide.

Jury Finds Meta Violated Consumer Protection Laws

After a nearly seven-week trial, jurors sided with state prosecutors who argued that Meta—which owns Instagram, Facebook, and WhatsApp—prioritized profits over safety. The jury determined that Meta violated New Mexico's Unfair Practices Act by concealing what it knew about dangers related to child sexual exploitation on its platforms and the impacts on children's mental wellbeing.

Jurors agreed with allegations that Meta made false or misleading statements and engaged in "unconscionable" trade practices that unfairly exploited children's vulnerabilities and inexperience. The $375 million penalty reflects thousands of individual violations, each counted separately under state law.


Building the Case Against Meta

The case was built by a team led by New Mexico Attorney General Raúl Torrez, who originally sued Meta in 2023. Investigators posed as children on social media platforms and documented sexual solicitations they received, along with Meta's responses to these incidents.

During opening statements in early February, prosecuting attorney Donald Migliori argued that Meta had misrepresented platform safety while engineering algorithms to keep young users online despite knowing children faced risks of sexual exploitation. Torrez has called for Meta to implement more effective age verification systems and improve efforts to remove bad actors from its platforms.

Meta's Response and Broader Legal Landscape

Meta has stated it disagrees with the verdict and plans to appeal. "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online," the company said in an official statement.

While the $375 million fine represents a tiny fraction of Meta's $201 billion revenue in 2025, the verdict illustrates a growing shift in public perception regarding social media companies' responsibilities toward young users. This year, several state and federal court cases are heading to trial, all seeking to hold companies accountable for what occurs on their platforms.

Multiple Legal Fronts Opening Against Social Media Firms

In Los Angeles, jurors are currently deliberating in another landmark social media case that seeks to hold tech companies responsible for harms to children. This case focuses on whether platform design features from Meta and YouTube were intentionally engineered to be addictive, particularly for young users. TikTok and Snap settled before this trial began.

The Los Angeles case centers on a 20-year-old plaintiff identified by the initials "KGM," and its outcome could determine how thousands of similar lawsuits proceed. These bellwether trials serve as test cases for both plaintiffs and defendants to gauge how their arguments resonate with juries.

School Districts Prepare for Summer Trial

A trial scheduled for this summer will pit school districts against social media companies before U.S. District Judge Yvonne Gonzalez Rogers in Oakland, California. This multidistrict litigation names six public school districts from across the country as bellwethers.

Jayne Conroy, a lawyer on the plaintiffs' trial team who previously worked on cases against pharmaceutical companies over the opioid epidemic, noted similarities between the two legal battles. "With the social media case, we're focused primarily on children and their developing brains and how addiction is such a threat to their well-being," she explained. "The medical science is not really all that different, surprisingly, from an opioid or a heroin addiction. We are all talking about the dopamine reaction."


Scientific Debate and Regulatory Challenges

Social media companies continue to dispute claims that their products are addictive. During questioning in the Los Angeles trial, Meta CEO Mark Zuckerberg maintained that existing scientific research hasn't proven social media causes mental health harms.

Some researchers question whether "addiction" accurately describes heavy social media use, noting that social media addiction isn't recognized as an official disorder in the Diagnostic and Statistical Manual of Mental Disorders. However, companies face increasing pushback from academics, parents, schools, and lawmakers regarding social media's effects on children's mental health.

Emarketer analyst Minda Smiley observed: "While Meta has doubled down in this area to address mounting concerns by rolling out safety features, several recent reports suggest that the company continues to aggressively prioritize teens as a user base and doesn't always adhere to its own rules."

Long Road Ahead for Resolution

With appeals and potential settlement discussions, cases against social media companies could take years to resolve completely. Tech regulation has advanced more rapidly in Europe and Australia, while regulatory movement in the United States remains slow.

These legal challenges could undermine companies' First Amendment protections and Section 230 of the 1996 Communications Decency Act, which has traditionally shielded tech companies from liability for material posted on their platforms. Beyond legal fees and settlements, companies may face operational changes that could cut into user numbers and advertising revenue.

The lawsuits have emerged from diverse sources including school districts, local, state, and federal governments, along with thousands of families. These courtroom confrontations represent the culmination of years of scrutiny regarding platform safety for children and whether deliberate design choices create addictive experiences that contribute to depression, eating disorders, or suicidal ideation among young users.