Australia's E-Safety Chief Warns Social Media Giants Over Under-16 Ban Gaps

Australia's eSafety Commissioner, Julie Inman Grant, has issued a stark warning to major social media platforms, including Meta, YouTube, and TikTok, citing "major gaps" in their enforcement of the country's ban on under-16s using their services. The warning comes nearly four months after the legislation took effect, raising concerns about compliance with one of the world's toughest digital restrictions.

Investigation into Potential Non-Compliance Underway

The eSafety Commission is currently investigating potential non-compliance by Facebook, Instagram, Snapchat, TikTok, and YouTube. Inman Grant emphasised that while platforms have taken initial steps, compliance monitoring suggests these efforts fall short of what Australian law requires. "I am concerned through our compliance monitoring that some may not be doing enough to comply," she said in a recent announcement.

The law requires ten of the largest social media networks, including TikTok, Instagram, Snapchat, YouTube, Facebook, and X, to prevent under-16s from accessing their platforms or face fines of up to A$49.5 million (£26.5 million). A report from early March noted that around 5 million accounts had been blocked under the age restrictions, yet significant enforcement shortcomings persist.

Persistent Gaps and Parental Concerns

The commission's report identified critical issues, including platforms allowing children under 16 to repeatedly attempt age verification, potentially enabling unauthorised access. A survey conducted between 19 January and 2 February, covering approximately 900 parents and carers, found that nearly half reported their child had held an account on at least one platform before the ban; that figure fell to about 31% after the law took effect. Many platforms, however, still failed to provide effective reporting pathways for parents to flag age-restricted accounts.

Inman Grant clarified that proving non-compliance requires more than evidence of children still holding accounts; it must be shown that platforms have not taken reasonable steps to prevent this. "The evidence must establish that the platform has not taken reasonable steps to prevent children aged under 16 from having an account," she explained, adding that investigations will take time but could lead to escalating consequences, including global reputational damage.

Industry Response and Ongoing Challenges

In response, a Meta spokesperson acknowledged the industry-wide challenge of accurately determining user age, arguing that robust age verification and parental approval at the app-store level would be the most effective approach. The company committed to continuing to invest in enforcement to detect and remove under-16 accounts, underscoring the broader difficulty social media giants face in balancing compliance with user privacy and operational feasibility.

As the eSafety Commission intensifies its scrutiny, the outcome of these investigations could set a precedent for digital safety regulation worldwide, underscoring the importance of stringent enforcement in protecting minors online.
