The European Parliament has taken a significant step towards enhancing child protection online, formally calling for a minimum age of 16 for social media use across the bloc.
This resolution, adopted on 26 November 2025, is not legally binding, but it places considerable pressure on individual member states to harmonise their approach to digital age restrictions. The move highlights a growing global trend, with countries from Australia to Norway actively implementing or debating similar measures to shield minors from potential online harms.
Global Landscape of Social Media Age Restrictions
While the EU debates a unified approach, several nations have already forged ahead with their own legislation. The landscape is a patchwork of different age limits and enforcement mechanisms.
In Australia, a landmark law passed in November 2024 compels tech giants such as Meta and TikTok to prevent users under 16 from holding accounts, with non-compliance risking fines of up to A$49.5 million. Following a trial that began in January 2025, the ban is scheduled to take full effect on 10 December 2025.
Across Europe, the approach varies. France passed a law in 2023 requiring parental consent for users under 15, though reports indicate technical hurdles have delayed its enforcement. Meanwhile, Italy mandates parental consent for children under 14.
In Germany, official policy states that minors aged 13 to 16 can only use social media with parental approval, though campaigners argue these controls are often insufficient. Belgium has had a law since 2018 setting the age for independent account creation at 13.
The UK's Position and Tech's Own Rules
The British government has taken a different path. Its Online Safety Act, passed in 2023 and enforced from 2025, imposes tougher standards on platforms like Facebook and TikTok regarding age-appropriate design. However, it has notably stopped short of setting a clear, universal age limit for social media use by minors, opting instead for a broader duty of care.
Notably, major social media platforms are not waiting for legislation. Companies including TikTok, Facebook, and Snapchat already enforce their own minimum-age policies, requiring users to be at least 13 years old to sign up. Despite this, official data from several European countries reveals a stark reality: a significant number of children under 13 already hold social media accounts, underscoring the difficulty of effective age verification.
Future Directions and Stricter Proposals
The debate is far from over, with some nations considering even more stringent rules. In Norway, the government proposed in October 2024 to raise the age at which children can consent to social media platforms' terms of service from 13 to 15, and it is exploring an absolute legal minimum age of 15.
The urgency for action is underlined by Norwegian government statistics showing that half of the country's nine-year-olds are already using some form of social media. Similarly, a panel commissioned by French President Emmanuel Macron recommended banning smartphones for under-11s and internet-enabled phones for under-13s, signalling that future legislation could extend beyond social media platforms themselves.
As the European Parliament's resolution demonstrates, the international consensus is shifting towards greater protection for young people online, setting the stage for a continued and complex regulatory battle between governments, tech firms, and child safety advocates.