
Roblox Age Controls will be expanded to cover all users who access communication features on the platform before the end of the year, the company announced. The move follows sustained criticism that the game environment has not done enough to protect children and teens. Matt Kaufman, Roblox's Head of Safety, said the company will combine age estimates with ID checks and parental consent to restrict adults from contacting minors unless the parties have a real-world relationship.
Age Verification
Roblox will use machine learning to estimate user ages and will add options for verified ID checks where required. The company said it will place limits on messages and friend requests between adults and underage users. The system will also nudge parents to set controls and to review activity that may look risky.
Child Safety
Roblox said the changes will include safer defaults for chat and more visible reporting tools for unsafe content. Kaufman stated that the firm intends to roll out the controls widely and to work with child safety experts and industry groups. The company pointed to new labels and moderation updates introduced this year as part of a broader safety programme.
Global Context
The announcement comes as regulatory authorities in various countries push for online age gates. Platforms in the United Kingdom and the European Union must now implement more stringent age restrictions and safeguard minors against unsuitable material. Roblox said that implementation in each market will be shaped by the need to comply with local laws.
Roblox's age controls will be rolled out over time, and the company says it will monitor results and adjust the systems as needed. Parents and guardians are encouraged to review platform settings and to report concerns through Roblox safety channels. New guidance will be published as new tools become available.