Roblox Tightens Safety with Mandatory Age Verification for Private Chat
The popular online gaming platform Roblox is implementing a significantly stricter age verification system, specifically targeting users who wish to access private messaging features. The move comes in response to mounting criticism of the platform's child-safety record and to tightening global regulations designed to protect minors online.
The new measures will first be enforced in Australia, New Zealand, and the Netherlands in the first week of December, with a full global rollout scheduled for early January.
How the New Age Verification System Works
At the heart of the new safety protocol is an age estimation tool provided by the identity verification company Persona, a system Roblox first announced in July. To use private messaging, users must now submit a video selfie for age assessment. Roblox has emphasised that this biometric data is deleted immediately after processing.
It is crucial to note that this scan is only mandatory for accessing private messaging and is not required for general use of the Roblox platform. For children under the age of 13, the rules are even more stringent. They can only engage in chat outside of specific games if they have obtained explicit parental permission.
In a notable departure from platforms like WhatsApp, Roblox does not encrypt private conversations. This policy is a deliberate safety feature, allowing the company's moderation systems to monitor chats for harmful content.
Accuracy and Age-Based Grouping
Despite some experts expressing caution over the reliability of facial age estimation technology, Roblox's chief safety officer, Matt Kaufman, has defended the system. He claims it can accurately estimate a user's age within one to two years for individuals aged five to 25.
"But of course, there’s always people who may be well outside of a traditional bell curve," Kaufman stated. "And in those cases, if you disagree with the estimate that comes back, then you can provide an ID or use parental consent in order to correct that."
Once verified, users will be placed into one of several age brackets:
- Under 9
- 9 to 12
- 13 to 15
- 16 to 17
- 18 to 20
- 21 and over
Chat functionality will then be restricted, limiting interactions primarily to a user's own age group or to adjacent brackets.
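The bracket scheme described above can be illustrated with a short sketch. The bracket boundaries follow the list in this article, but the adjacency rule (`max_gap`), the function names, and all other details are illustrative assumptions, not Roblox's actual implementation:

```python
# Hypothetical sketch of age-bracket grouping. Bracket boundaries follow
# the article; the chat rule itself is an assumption for illustration.
BRACKETS = [
    ("under 9", 0, 8),
    ("9 to 12", 9, 12),
    ("13 to 15", 13, 15),
    ("16 to 17", 16, 17),
    ("18 to 20", 18, 20),
    ("21 and over", 21, 150),
]

def bracket_index(age: int) -> int:
    """Return the index of the bracket containing the given age."""
    for i, (_, low, high) in enumerate(BRACKETS):
        if low <= age <= high:
            return i
    raise ValueError(f"age out of range: {age}")

def can_chat(age_a: int, age_b: int, max_gap: int = 1) -> bool:
    """Allow chat only within the same bracket or an adjacent one
    (the exact adjacency policy is assumed, not documented)."""
    return abs(bracket_index(age_a) - bracket_index(age_b)) <= max_gap
```

Under this assumed rule, a 14-year-old could chat with a 16-year-old (adjacent brackets) but not with a 19-year-old (two brackets apart).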
A Broader Push for Online Safety
This initiative by Roblox reflects a wider trend among major tech companies, including Google and Instagram, to implement more robust age verification and user protection systems. These efforts are largely driven by the need to comply with new regulations and address public concern.
The changes also arrive amidst a series of lawsuits against Roblox, which has faced accusations of becoming a "playground for predators." The company has vigorously pushed back against these allegations.
In an official statement, Roblox outlined its existing safety measures: "We have rigorous safety measures in place from advanced AI models to an expertly trained team of thousands moderating our platform 24/7 for inappropriate content." The statement continued, "No system is perfect and our work on safety is never done. We are constantly innovating our safety systems, including adding 100 new safeguards, such as facial age estimation, this year alone."
Roblox, which launched in 2006, has more than 111 million daily active users. The platform has reiterated that it maintains strict safety defaults for its youngest users, preventing those under 13 from direct messaging outside of games and employing rigorous text chat filters to block inappropriate content and the sharing of personal information.