Roblox’s new age checks aren’t enough to keep kids safe, cybersecurity CEO warns
Roblox is home to tens of millions of daily players under 16, making it one of the largest online spaces for young people. In 2025, the platform introduced expanded age verification as part of a push to protect minors, but a leading child-safety expert says those tools still leave major gaps.
Ron Kerbs, CEO of Kidas, a cybersecurity company focused on young gamers, called Roblox’s new system “a step in the right direction,” but one that doesn’t solve the core problem. According to Kerbs, facial analysis and ID scans can help gate mature content, but they’re easy to work around and don’t address the real-time risks kids face in voice and text chats.
Launched in July 2025, Roblox’s updated “Trusted Connections” approach blends facial age estimation, ID verification, and verified parental consent to more accurately determine a user’s age. The goal is to better separate adults from minors while granting teens aged 13 to 17 access to chat features with moderation still in place. The move comes amid industry-wide pressure, with other social gaming platforms adding age checks as demand for stronger child safety grows.
Kerbs argues that no verification method—AI-driven or otherwise—can guarantee the person behind the screen is who they claim to be. Kidas’ work with more than 400,000 gamers has shown how determined bad actors can bypass these systems with fake IDs or AI-generated images. He pointed to recent examples elsewhere online where UK age checks were defeated using game character imagery from titles like Garry’s Mod and Death Stranding, underscoring how creative and persistent offenders can be.
The bigger issue, Kerbs says, is not simply blocking underage accounts at signup, but spotting dangerous behavior as it happens. “Protecting kids isn’t about proving you’re 13,” he said. “It’s about detecting when a conversation is going dangerously off track and stepping in fast.” He urges platforms to invest in real-time behavioral monitoring capable of identifying grooming patterns, coercion, and other red flags across voice and text chat before harm occurs.
What families can do right now
– Enable parental controls and set communication limits appropriate for your child’s age.
– Keep chat features restricted to known friends where possible, and review friend lists regularly.
– Talk openly with kids about online boundaries, reporting tools, and how to leave uncomfortable situations.
– Report suspicious behavior immediately and document usernames and chat logs.
– Consider additional safety tools that provide alerts for risky interactions while respecting privacy.
As Roblox and other online worlds scale up age verification, experts caution that layered defenses will be essential. Strong identity checks can help reduce exposure, but real-time detection and swift intervention are what ultimately keep kids safe in fast-moving social spaces.