
Roblox expands age checks, restricts chats by age group as safety scrutiny intensifies

Roblox game application displayed on mobile device

Roblox is rolling out stricter age-verification requirements and new age-segmented chat systems as it faces mounting legal and regulatory pressure over child safety on the platform. The company confirmed Tuesday that users who want to access private or expanded communication features will soon need to complete an AI-driven age check, with additional limits designed to prevent adults and minors from interacting unless they already know each other offline.

The platform will use facial age-estimation technology provided by Persona, requiring users who opt in to record a brief video selfie. Roblox says the footage is deleted once the age check is completed. Players who decline the scan can still use the platform but will not be able to message others freely. The company notes that children under 13 are already barred from chatting outside of in-game channels without verified parental permission, and private chats remain unencrypted so moderation teams can review communications for safety concerns.

Matt Kaufman, Roblox’s chief safety officer, said the age-estimation tool is generally accurate within one to two years for users between roughly five and 25. He added that those who feel their estimated age is incorrect may verify with a government ID or rely on parental consent. “But of course, there’s always people who may be well outside of a traditional bell curve. And in those cases, if you disagree with the estimate that comes back, then you can provide an ID or use parental consent in order to correct that,” he said.

Once verified, players will be sorted into six age brackets—under 9; 9–12; 13–15; 16–17; 18–20; and 21+—and allowed to chat only with their own or nearby age groups, depending on the feature being used. Roblox plans to begin enforcing these requirements in Australia, New Zealand and the Netherlands in early December, with a global rollout scheduled for January. The company is also launching a new online safety center to guide families through parental controls and upcoming policy changes.

These moves come as Roblox confronts a surge of legal challenges. Dozens of families, along with attorneys general in Kentucky and Louisiana, are suing the company—alongside platforms like Discord—accusing them of failing to prevent predators from contacting children. Florida’s attorney general is conducting a separate investigation into Roblox’s safety practices.

Companies across the tech industry have been accelerating age-verification efforts to comply with new laws and respond to criticism. Google is testing AI-based age checks for YouTube, while Instagram is experimenting with tools to identify users misrepresenting their age.

Roblox previously outlined its intention to broaden age checks for all users seeking access to communication tools, and the new measures represent the most significant expansion of those plans to date.

Editorial credit: mitagalihs / Shutterstock.com
