EU Votes to Ban Under-16s from Social Media in Landmark Resolution

The European Parliament has taken a decisive stance on protecting children online, voting overwhelmingly in favour of a resolution that could see under-16s banned from social media platforms across the European Union.

A Landmark Vote for Digital Child Protection

In a significant move that could reshape the digital landscape for young people, Members of the European Parliament (MEPs) endorsed a non-binding resolution on Wednesday, 26 November 2025. The resolution calls for raising the digital age of consent to 16 years old, effectively prohibiting younger children from accessing social media services without parental approval.

The vote saw 513 MEPs support the measure, with only 79 against and 42 abstentions, demonstrating strong cross-party consensus on the need for enhanced online protections for minors. The initiative forms part of a broader effort to update the EU's digital rulebook, notably the Digital Services Act, and to strengthen its enforcement.

The Driving Forces Behind the Ban

The resolution highlights growing concerns about the impact of social media on children's mental health and development. Proponents argue that platforms have failed to adequately protect young users from harmful content, addictive design features, and potential exploitation.

Spanish MEP and resolution co-author Susana Solís Pérez emphasised the urgency of the situation, stating that current measures are insufficient to safeguard children in the digital environment. The resolution specifically targets what lawmakers describe as addictive design and toxic content that can negatively affect young minds.

While the resolution itself doesn't immediately change the law, it places significant political pressure on the European Commission to propose legislation that would implement an age verification mandate for social media platforms operating within the EU.

Implementation Challenges and Industry Response

The proposed ban faces substantial practical challenges, particularly regarding age verification technology. Critics question how platforms could reliably verify users' ages without compromising privacy or creating cumbersome access barriers.

Digital rights groups have expressed concern that strict age verification requirements could lead to increased data collection and surveillance of all users, not just children. There are also open questions about how such measures would sit alongside existing EU data protection law, notably the General Data Protection Regulation (GDPR).

The technology industry has responded cautiously, with many platforms pointing to their existing parental control features and content moderation efforts. However, lawmakers argue that self-regulation has proven inadequate and that more robust, legally enforceable standards are necessary.

This resolution marks a significant escalation in the EU's approach to regulating big tech and follows other major digital legislation of recent years, such as the Digital Services Act and the Digital Markets Act. The outcome of this initiative could influence global standards for children's online safety and set precedents that extend well beyond Europe's borders.

As the European Commission considers its next steps, parents, educators, and technology companies alike are watching closely to see how this potential social media ban for under-16s will develop and what it might mean for the digital experiences of future generations.