Tech Executives Face Parliamentary Scrutiny Over Social Media Addiction Claims
Senior executives from TikTok and Meta, the parent company of Facebook and Instagram, have firmly rejected assertions that their social media platforms are inherently addictive. The denial comes amid escalating public and political outrage over insufficient measures to protect children online.
Intense Committee Questioning on Platform Safety
Alistair Law, TikTok's director of public policy, and Rebecca Stimson, Meta's head of public policy, were subjected to rigorous interrogation by members of the Education Select Committee. The session focused on the companies' responsibilities toward younger users and the alleged addictive nature of their services.
This parliamentary scrutiny follows a landmark US trial last month, in which a jury found that Meta had intentionally designed addictive social media platforms. Meta is currently appealing the verdict. TikTok, initially a co-defendant in the same case, settled out of court for an undisclosed sum before the trial concluded.
Executives Dismiss Premise of Inherent Addictiveness
When directly questioned by Liberal Democrat MP Caroline Voaden about steps being taken to address addiction, both executives refuted the core allegation.
Rebecca Stimson of Meta stated: "We obviously are appealing that court case. So we don't also accept the premise that our platforms are addictive." She redirected attention to safety features Meta has implemented, including an algorithm reset she claimed reduced user time by 50 million hours, and parental controls allowing 15-minute app caps.
Alistair Law of TikTok echoed this sentiment, telling MPs: "I don't think that we accept the premise that there is an inherent addictiveness." He highlighted TikTok's own protective tools, such as screen time limits for users under 16, family pairing for parental monitoring, and a post-10pm meditation feature designed for teenagers.
Admissions on Age Verification Failures
Despite defending their platforms' design, both executives conceded significant shortcomings in age verification technology. All major platforms mandate users be at least 13 years old to create an account, with additional protections for those under 16 or 18.
Committee Chair Helen Hayes, a Labour MP, challenged the executives, arguing their safety measures "aren't working." She cited research indicating children as young as five are being groomed into live-streaming harmful content on platforms like TikTok.
Mr. Law acknowledged discussing specific cases with police but admitted broadly: "It's absolutely a challenge... whether or not the age verification elements for those risky aspects of your site are sufficient."
Ms. Stimson separately confirmed: "There is a real problem with age assurance... There is a limit to that technology at the moment." She referenced the UK's Online Safety Act, which mandates highly effective age verification, while noting current technological constraints.
Political and Legal Backlash Grows
Following the evidence session, MP Caroline Voaden expressed strong criticism: "It is absolutely galling for social media giants to say their platforms are not addictive. Parents, experts, whistleblowers, even users, are all aware of the dangers... The platforms are the only ones still in denial."
The US court case referenced awarded $6 million (£4.5 million) in damages to a 20-year-old woman, known as Kaley, after a jury found Meta and Google's YouTube had intentionally built addictive platforms that harmed her mental health.
UK Government Considers Regulatory Action
The UK government is currently consulting on potential stricter regulations, including a social media ban for under-16s. This move follows Australia's implementation of a similar ban late last year. Ministers are cautiously evaluating evidence, concerned that an outright ban might drive teenage users toward more dangerous, unregulated corners of the internet.
Other measures under consideration by policymakers include:
- Overnight social media curfews for minors
- Raising the legal age of digital consent
- Mandatory caps on daily app usage
- Restrictions on design features believed to foster addiction, such as notification streaks and infinite scrolling interfaces
The parliamentary hearing underscores a widening chasm between the tech industry's self-assessment and mounting external pressure from legislators, courts, and child safety advocates demanding more robust action to shield young users from potential harm.