Juries Deliver Landmark Verdicts Against Meta and YouTube Over Child Harms
For years, parents, teenagers, pediatricians, educators, and whistleblowers have argued that social media platforms pose serious risks to young people's mental health, including addiction, eating disorders, sexual exploitation, and even suicide. This week, for the first time, juries in two separate states sided with these advocates, delivering verdicts that could mark a turning point in efforts to pierce Big Tech's aura of invincibility.
A Watershed Moment for Accountability
On Wednesday, a jury in Los Angeles found both Meta and YouTube liable for harms to children who used their services. In New Mexico, meanwhile, a jury determined that Meta knowingly harmed children's mental health and concealed what it knew about child sexual exploitation on its platforms. Tech watchdog groups, families, and children's advocates celebrated the decisions as a long-overdue reckoning.
"The era of Big Tech invincibility is over," declared Sacha Haworth, executive director of The Tech Oversight Project. "After years of gaslighting from companies like Google and Meta, new evidence and testimony have pulled back the curtain and validated the harms young people and parents have been telling the world about for years."
Shifting Public Perception and Legal Strategies
While it remains too early to predict whether these verdicts will lead to fundamental changes in how social media platforms treat young users, they undeniably signal a changing tide of public opinion. This shift is likely to spur more lawsuits and increased regulatory scrutiny. For years, tech giants have argued that harms to children were unintended byproducts of broader societal issues or the actions of bad actors exploiting platform safeguards. They have often dismissed research linking psychological harms to social media use.
During his testimony in the Los Angeles trial, Meta CEO Mark Zuckerberg exemplified this stance. When questioned about whether addictive design leads to increased platform use, he responded, "I'm not sure what to say to that. I don't think that applies here." The jury verdicts, however, reflect a growing public willingness to hold these companies directly responsible and demand operational changes.
Company Responses and the Path Forward
Both Meta and Google have said they disagree with the verdicts and are weighing their legal options, including appeals; final outcomes could take years to resolve and may end in settlements. Arturo Béjar, a former Meta engineering director who testified about Instagram's harms, noted that jury trials "level the playing field" against trillion-dollar corporations. But he cautioned that lasting change will require genuine regulation.
"One thing that I saw working inside the company that effectively led to behavior change was when an attorney general or the FTC stepped in and required things of the company," Béjar explained. "Both New Mexico and Los Angeles and all the attorneys general that are part of this process have really an extraordinary opportunity and the ability to ask for meaningful change."
Key Differences Between the Two Cases
The lawsuits, while both focusing on harms to children, employed distinct legal strategies:
- New Mexico's Case: Filed by State Attorney General Raúl Torrez in 2023, this case involved state investigators posing as children on social media to document sexual solicitations and Meta's responses. The jury was asked to determine if Meta violated the state's consumer protection law.
- Los Angeles's Case: This case had a single plaintiff, identified by the initials KGM, against Meta, Google's YouTube, TikTok, and Snap. TikTok and Snap settled before trial. The plaintiff argued that Meta and YouTube's platform design features were deliberately addictive for young users.
These cases are part of broader litigation, with KGM and a few other plaintiffs selected for bellwether trials. These test cases, reminiscent of Big Tobacco and opioid lawsuits, aim to gauge how arguments fare before a jury, potentially leading to wider settlements.
Navigating Legal Shields and New Territories
By focusing on deliberate design choices and product liability, these lawsuits sidestepped Section 230 of the Communications Decency Act, which generally shields internet companies from liability for content their users post. Past lawsuits that targeted the distribution of harmful content were often dismissed on those grounds.
"For the first time, courts have held social media platforms accountable for how their product design can harm users," said Nikolas Guggenberger, an assistant professor of law at the University of Houston Law Center. "This is a new legal territory that could reshape an industry long shielded by Section 230. Platforms will have to rethink their focus on engagement at any cost, which has outlived itself."
A Changing Landscape and Future Challenges
Public sentiment is already shifting. A 2025 Pew Research Center poll revealed that 48% of teens believe social media harms people their age, up from 32% in 2022. As social media faces this reckoning, new technological frontiers like artificial intelligence chatbots are emerging as the next battleground for child safety.
"You can ban today's harm, but how do you know what tomorrow is going to bring?" questioned Sarah Kreps, a professor and director of Cornell University's Tech Policy Institute. "Whether it's another social media app, AI or some other new technology, new things will crop up. And people will flock to those because where there's demand you will see a supply come to meet that demand."
The dual jury verdicts represent a significant crack in Big Tech's armor, highlighting a growing consensus that platforms must be held accountable for the real-world impacts of their design choices on young users.