Roblox, the popular global gaming platform aimed primarily at young users, has announced mandatory age checks for anyone who wants to use its chat features, as it faces ongoing scrutiny over child safety. The initiative, first unveiled in November, relies on AI technology from the third-party vendor Persona, which uses on-device cameras to estimate a user's age. Based on the assessment, users are placed into age categories ranging from under 9 to 21+.
As of January 7, all users in locations where chat is available must complete the age estimation process. The move makes Roblox the first major online gaming platform to require facial age verification for all users accessing chat features.
In a blog post, Roblox’s Chief Safety Officer, Matt Kaufman, and Rajiv Bhatia, head of user and discovery product, said the company aims to set a new standard for communication safety in online gaming. Since the feature launched, millions of users have voluntarily completed age checks, and in regions where the checks were already mandatory, such as Australia, New Zealand, and the Netherlands, more than half of users opted into age-verified chat.
Moving forward, Roblox plans to extend mandatory age checks to real-time collaboration features in Roblox Studio and will update its community policies accordingly. The platform, which reported 151 million daily users as of 2025, has a significant presence among young people, reaching nearly half of the U.S. population under the age of 16.
In light of increasing safety concerns, Roblox has been enhancing its chat capabilities and parental controls, introducing tools for activity monitoring, expanded blocking options, and moderated chat environments. Despite these measures, skepticism remains about whether the new age checks will curb child exploitation on the platform. Critics argue the technology may be inaccurate: even by the developer's own account, age estimates can be off by as much as two years, a margin that is especially consequential for younger users.
Roblox’s safety measures come amid mounting legal challenges, including nearly 80 lawsuits filed by victims and their parents. The lawsuits assert that the platform failed to adequately safeguard its users from sexual exploitation or to sufficiently communicate the associated risks. Many cases also implicate other platforms, including Meta, Discord, and Snapchat, where inappropriate interactions allegedly continued after initial contact was made on Roblox. The claims are set to be consolidated before a single district judge in San Francisco, and Roblox disputes the allegations.
As the platform navigates these challenges, its commitment to improved safety protocols and age verification could signal a meaningful shift in addressing child safety concerns in online gaming environments.
