Roblox Rolls Out Facial Age Checks to Safeguard Young Players

Roblox has announced significant changes aimed at preventing children from communicating with adult strangers on its platform. Beginning in December in Australia, New Zealand, and the Netherlands, and rolling out globally in January, the gaming platform will require age checks for any account that uses its chat features. The move follows mounting criticism that the platform has failed to adequately protect children from inappropriate content and contact with adults, concerns underscored by ongoing child-safety lawsuits in multiple U.S. states.

The policy arrives as Australia prepares to introduce a social media ban for users under 16, adding to the pressure on gaming platforms such as Roblox to adopt stricter measures. Roblox CEO Dave Baszucki has previously emphasized the company's commitment to child safety while advising parents to use their own judgment about their children's online activities.

Rani Govender, the NSPCC's policy manager for online child safety, acknowledged that the changes were necessary given the "unacceptable risks" young users have faced on Roblox. The organization welcomed the initiative but urged that it translate into tangible improvements that protect children from exploitation by adult users. Roblox averages more than 80 million daily players, roughly 40% of whom are under 13, making stringent safety measures particularly pressing.

Ofcom, the UK communications regulator, welcomed Roblox's new age verification protocols, which follow the UK's Online Safety Act and its strict requirements on tech companies to protect children from online threats. Anna Lucas, Ofcom's online safety supervision director, said the progress was encouraging, although more work remains to be done.

Roblox is set to become the first major gaming platform to require facial age verification for users who want to access chat features. Matt Kaufman, Roblox's chief safety officer, says the technology can estimate a user's age to within one to two years for users aged 5 to 25, accurately enough to sort players into appropriate age groups. Users will be categorized as under 9, 9 to 12, 13 to 15, 16 to 17, 18 to 20, or 21 and older, and will only be able to chat with peers in similar age brackets unless they add someone as a "trusted connection." In addition, users under 13 will be barred from private messaging and certain chat functions unless a parent approves.

The new measures respond to the ease with which adults could previously contact younger players, as demonstrated in a recent BBC test in which a 27-year-old was able to communicate with a 15-year-old user without restriction, prompting a reevaluation of the platform's safety protocols.

Roblox’s initiatives reflect a growing awareness of and commitment to child safety in online gaming environments, setting a precedent that other platforms may follow.
