Australian Minister Demands Action from Roblox Over Child Safety Concerns

Controversial gaming platform Roblox is facing intense scrutiny from Australian authorities following persistent reports that predators are targeting children with sexually explicit and suicide-related material. Communications Minister Anika Wells has requested an urgent meeting with the platform, just two months after Australia's world-leading social media age restrictions came into effect.

Minister Expresses Deep Alarm Over Ongoing Issues

Minister Wells has voiced significant concern about claims that young Roblox users continue to be exposed to graphic and gratuitous user-generated content. 'Even more disturbing are reports and concerns about children being approached and groomed by predators, who actively seek to exploit their curiosity and innocence,' Wells stated emphatically.

Australia's Social Media Age Restrictions

Australia's social media minimum age restrictions, which took effect on December 10, require digital platforms to verify users' ages and lock accounts for those younger than 16. Ten digital platforms were initially asked to comply with this law, including Google's YouTube, Meta's Facebook, Instagram and Threads, as well as Snapchat, Reddit and TikTok.

Roblox, which is not specifically named under this legislation, has revealed that 60 per cent of its Australian daily active users have undertaken age verification checks. However, this has not alleviated concerns about the platform's safety measures.

Platform Structure and Parental Concerns

Roblox is not a single game but rather a vast ecosystem of user-created 'experiences' hosted on its platform. In the lead-up to the social media ban, parents had already expressed serious concerns about harms on Roblox, including sexually explicit and suicide-related content being shared in public chats.

Wells noted that this problematic content has persisted despite Roblox engaging 'extensively' with eSafety over the past two years. 'This is untenable and these issues are of deep concern to many Australian parents and carers,' she declared.

eSafety Commissioner's Strong Warning

eSafety Commissioner Julie Inman Grant has stated that Roblox must immediately take action to block predators from accessing children following what she described as 'horrendous' reports. Roblox has informed eSafety that it delivered on its commitments under the ban, including switching off features such as direct chats and voice functions for Australian children.

However, Commissioner Inman Grant emphasized that the platform would be thoroughly assessed for its compliance. 'We remain highly concerned by ongoing reports regarding the exploitation of children on the Roblox service, and exposure to harmful material,' she said. 'They can and must do more to protect kids, and when we meet I'll be asking how they propose to do that.'

Potential Consequences and Future Legislation

Platforms that decline to comply with Australia's social media ban face substantial fines of up to $49.5 million. Minister Wells has asked the internet watchdog which of its powers could be strengthened to combat harms on Roblox as the government works toward legislating a digital duty of care.

This proposed legal obligation is separate from the social media ban and would apply to large online platforms, requiring them to take proactive, reasonable steps to prevent foreseeable harms to users. Commissioner Inman Grant noted that codes focused on age-restricted material, including pornography and self-harm content, would come into force on March 9 and apply to Roblox.

The situation highlights growing international concerns about child safety in digital environments, particularly on platforms with significant youth user bases. Australian authorities are taking a firm stance, signaling that voluntary measures may no longer be sufficient to protect vulnerable users from online exploitation.