Snapchat Blocks Over 415,000 Under-16 Accounts in Australia's Social Media Crackdown
In a significant enforcement of Australia's pioneering social media age restrictions, Snapchat has revealed it has locked or disabled more than 415,000 accounts identified as belonging to users under the age of 16. This action represents a direct response to the country's Social Media Minimum Age (SMMA) law, which came into effect in December 2025, mandating that certain platforms prevent individuals under 16 from holding accounts.
Mass Account Disabling Under New Legislation
The figures, covering the period to the end of January 2026, apply to Australian accounts where users had declared an age below 16 or had been assessed as underage by Snapchat's internal age-detection systems. The company has stated that it continues to lock more accounts daily, underscoring the ongoing nature of this compliance effort. Snapchat was among ten major services initially named as required to comply with the new regulations.
This follows Prime Minister Anthony Albanese's announcement in January that approximately 4.7 million accounts across these platforms had been disabled or removed in the initial days of the ban's implementation. However, the regulator has declined to provide a platform-by-platform breakdown of these removals. It is understood that the total includes not only accounts identified as belonging to under-16s but also historical, inactive, and duplicate accounts, with most companies besides Meta and Snapchat not disclosing their specific deactivation numbers.
Significant Gaps in Age-Verification Technology
While complying with the law, Snapchat has issued a stark warning about weaknesses in how the policy is being applied, particularly the limitations of current age-verification tools. The company pointed to a government-commissioned trial published in 2025, which found that facial age-estimation technology was typically accurate only to within two or three years of a person's actual age.
In practice, this technological shortfall means some young people under 16 may still be able to bypass the protections, potentially leaving them with reduced online safeguards. Conversely, others over 16 may incorrectly lose access to their accounts and social connections. Snapchat has expressed concern that this uneven application could undermine the policy's effectiveness and fairness.
Uneven Enforcement and Regulatory Challenges
The company also raised issues regarding the ban's uneven application across the broader digital ecosystem. Snapchat noted that teenagers could simply shift to other messaging services that fall outside the scope of the law or remain unregulated, thereby circumventing the intended protections. Australia's eSafety Commissioner, Julie Inman Grant, has acknowledged that enforcement is being phased, with regulatory attention initially focused on the original ten services where the majority of young users are concentrated.
Under the legislation, companies face severe penalties for non-compliance, including fines of up to A$49.5 million (£24.5 million) if they fail to take what the law describes as reasonable steps to keep under-16s off their platforms. This financial deterrent is intended to compel rigorous adherence from tech giants.
Snapchat's Opposition and Proposed Alternatives
Despite its actions, Snapchat has made clear its opposition to the blanket ban approach. In a detailed statement, the company said it fundamentally disagrees that Snapchat should be classified as an in-scope age-restricted social media platform, describing itself instead as primarily a messaging app used by young people to stay connected with close friends and family.
The company argued that cutting off these connections does not necessarily make teenagers safer, happier, or otherwise better off. Instead, Snapchat, along with Meta, has called for age verification to be handled at the app-store level rather than by individual platforms, advocating for a more integrated and potentially more effective system-wide solution.
International Implications and Global Scrutiny
Australia's aggressive approach to social media age restrictions is being closely monitored by other nations considering similar measures. The United Kingdom is currently evaluating comparable legislation, with the House of Lords recently backing an amendment to support a ban for under-16s. This global attention underscores the broader debate about balancing online safety with digital access, and the technical challenges of enforcing age-based restrictions online.
As the situation develops, the effectiveness of Australia's policy, the reliability of age-verification technologies, and the responses of other platforms will continue to be critical issues for regulators, parents, and young users worldwide.