Popular gaming platform Roblox is implementing a significant new age verification system and age-based chat restrictions, responding directly to mounting criticism over child safety and tightening global regulations.
New Age Verification Measures
The platform is introducing an age estimation tool provided by Persona, first announced in July, which requires users to submit a video selfie for age assessment. Roblox confirms these video selfies are deleted immediately after processing. The scan is mandatory for users who want access to private messaging features but optional for general platform use.
Stricter Chat Controls and Monitoring
Under the new system, children under 13 will be restricted from chatting outside games unless they have explicit parental permission. Unlike many other social platforms, Roblox does not encrypt private conversations, which allows it to continuously monitor and moderate discussions to enhance user safety.
Despite concerns from digital safety experts regarding facial age estimation technology, Matt Kaufman, Roblox's chief safety officer, maintains the system's accuracy. "The system accurately estimates age within one to two years for those aged five to 25," Kaufman stated. "But of course, there's always people who may be well outside of a traditional bell curve. And in those cases, if you disagree with the estimate that comes back, then you can provide an ID or use parental consent in order to correct that."
Implementation Timeline and User Categories
Following verification, users will be assigned to specific age brackets: under nine, nine to 12, 13 to 15, 16 to 17, 18 to 20, and 21 and over. Chat functionality will be limited to interactions within a user's assigned age group or adjacent groups.
The new measures will launch in Australia, New Zealand, and the Netherlands during the first week of December, with a global rollout scheduled for early January.
The safety overhaul comes amid increasing legal pressure on the platform, including lawsuits alleging it has become a "playground for predators." Roblox has strongly defended its safety record, pointing to its existing protective measures.
"We have rigorous safety measures in place from advanced AI models to an expertly trained team of thousands moderating our platform 24/7 for inappropriate content," the company stated. "No system is perfect and our work on safety is never done. We are constantly innovating our safety systems, including adding 100 new safeguards, such as facial age estimation, this year alone."
The platform, which launched in 2006 and now boasts more than 100 million daily users, maintains strict default settings for younger users. Those under 13 cannot directly message others on Roblox outside of games or experiences, and cannot message during games unless parental controls specifically permit it.
Roblox employs comprehensive text chat filters designed to block inappropriate content, prevent attempts to direct under-13 users off the platform, and stop the sharing of personal information. The platform prohibits user-to-user image sharing and explicitly bans sexual conversations.