Roblox, a leading online gaming platform, is implementing a new age verification system to enhance safety for its young users. Starting soon, players who wish to use the platform’s chat feature will need to verify their age by providing a government-issued ID or by using an artificial intelligence (AI) tool that estimates their age by scanning their face.
This move comes after multiple lawsuits and allegations that Roblox has been a platform for predators to target children. The company aims to prevent minors from engaging with adult strangers by enforcing stricter age verification procedures.
Roblox’s Role in the Gaming World
Roblox is a platform where users can create and play games, as well as communicate with others. Unlike many other online platforms, which require users to be at least 13, Roblox allows younger children to participate. With over 150 million active users, a third of whom are under the age of 13, Roblox has long marketed itself as an educational space for children to learn coding and creativity.
However, the platform has faced significant criticism due to reports of children being groomed, exploited, and, in some cases, even kidnapped by adults they met through the platform. Lawsuits filed by the attorneys general of Kentucky and Louisiana, along with concerns from parents and families, have intensified scrutiny on Roblox’s child safety measures. One tragic case even links a child’s suicide to his interactions with predators on Roblox.
New Age Verification Policy
In response to the mounting concerns, Roblox announced its new policy on Tuesday, which requires all users to verify their age before accessing chat features. If users choose not to upload a government ID, they will be asked to submit a facial scan for AI-based age estimation. The tool, developed in partnership with identity verification firm Persona, uses the user’s front camera to assess their age based on facial movements.
The AI will place users in one of several age groups: under 9, 9-12, 13-15, 16-17, 18-20, or 21+. For instance, a user estimated to be 12 years old can only interact with other users aged 15 or younger, thus limiting inappropriate adult contact. Users over the age of 13 can upload an ID to correct any inaccuracies in the AI’s estimation.
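The grouping logic described above can be sketched in a few lines of code. This is purely an illustrative model, not Roblox's actual implementation: the age-group boundaries come from the article, and the "same or adjacent group" rule is an assumption inferred from the example of a 12-year-old being limited to users aged 15 or younger.

```python
# Hypothetical sketch of age-bucket chat gating (not Roblox's code).
# Age groups as reported: under 9, 9-12, 13-15, 16-17, 18-20, 21+.
AGE_GROUPS = [(0, 8), (9, 12), (13, 15), (16, 17), (18, 20), (21, 200)]

def age_group(age: int) -> int:
    """Return the index of the age group containing `age`."""
    for i, (lo, hi) in enumerate(AGE_GROUPS):
        if lo <= age <= hi:
            return i
    raise ValueError(f"invalid age: {age}")

def can_chat(age_a: int, age_b: int) -> bool:
    """Allow chat only between users in the same or an adjacent age group.

    This adjacency rule is an assumption consistent with the article's
    example: a user estimated at 12 (group 9-12) can reach the under-9,
    9-12, and 13-15 groups, i.e. users aged 15 or younger.
    """
    return abs(age_group(age_a) - age_group(age_b)) <= 1
```

Under this sketch, `can_chat(12, 15)` returns `True` while `can_chat(12, 16)` returns `False`, matching the restriction described in the article.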
Enhanced Privacy and Safety Features
Roblox already offers parental controls, blocks photo and personal information sharing, and employs AI moderation to detect inappropriate content. This new age verification system aims to further restrict contact between minors and adults, improving safety for younger users. According to Matt Kaufman, Roblox’s Chief Safety Officer, the company prioritizes safety and aims to make the platform a positive, age-appropriate space for everyone.
Implementation and Future Rollout
The age verification feature is being introduced on a voluntary basis starting this week. It will become mandatory in Australia, New Zealand, and the Netherlands in December and will be rolled out globally in early 2026. Although Roblox has not disclosed the accuracy rate of its AI technology, Kaufman claims it is typically accurate within one or two years for users aged between 5 and 25.
While some users have found ways to bypass AI-based age estimation on other platforms, Roblox’s system includes robust fraud prevention measures. These include ensuring users follow facial scan instructions and preventing repeated use of the same facial image.
Moving Towards a Safer Digital Environment
Roblox’s commitment to improving safety comes as part of a broader trend across online platforms, such as YouTube and Meta, which are increasingly using AI for age estimation. By implementing this new verification process, Roblox hopes to protect its young audience from potential harm while still providing a fun and creative space for users of all ages.
As the platform continues to evolve, Roblox executives have stressed the importance of maintaining high standards for safety and security, while responding to the expectations of both users and the public.