Australia’s push to make the internet safer for kids is intensifying, and Roblox is now firmly in the spotlight.
Last year, Australia rolled out new online safety rules that restrict how children and teens interact with major online services. Under these regulations, young people under 16 can face tighter limits, or outright barriers, when trying to sign up for social media platforms and similar services. The goal is simple: reduce exposure to harmful material and curb online exploitation.
Now, Australian officials are pressing Roblox to explain exactly how it’s enforcing those protections. Communications Minister Anika Wells, working alongside the country’s eSafety Commissioner, has contacted Roblox over concerns tied to “graphic and gratuitous user-generated content.” While gaming platforms weren’t initially the main focus of the policy conversation, regulators are now asking services like Roblox, along with other large digital platforms with major youth audiences, to show what they’re doing to keep child exploitation, self-harm content, and other dangerous material from reaching minors. Authorities are also urging the Australian Classification Board to reconsider Roblox’s PG rating.
The eSafety office has gone a step further, signaling plans to test Roblox’s safety measures directly. Regulators say they want to evaluate the implementation and effectiveness of nine safety commitments Roblox previously made to the regulator, including defaulting accounts for users under 16 to private and limiting voice chat access for users aged 13 to 15. Some of these changes have already rolled out beyond Australia, including stronger age verification requirements designed to control access to chat features.
Despite these moves, Australia’s eSafety Commissioner Julie Inman Grant says ongoing reports continue to raise red flags. The agency remains “highly concerned” about alleged child exploitation and exposure to harmful content on Roblox, and intends to test how well the announced safeguards work in real-world conditions. If regulators determine Roblox has fallen short of its obligations, the company could face penalties of up to AUD $49.5 million (about USD $35 million).
Minister Wells is also seeking what she describes as an urgent meeting with Roblox, citing continued media reports that children can still access explicit content. In her communication, she said she was alarmed by allegations of sexually explicit and suicide-related material appearing on the platform. More troubling still are recent claims that predators have approached children for grooming, seeking to exploit their “curiosity and innocence.” In one example highlighted in coverage, a Queensland man was accused of using Roblox and Fortnite to “groom and coerce” a large number of children.
Australia isn’t alone in turning up the heat. In the United States, officials are also stepping up scrutiny of child safety on gaming platforms. Florida Attorney General James Uthmeier has announced a criminal investigation into Roblox, and Texas Attorney General Ken Paxton has called on the company to do more to protect children from predators.
With Australia, the US, and the European Union pushing stricter online safety expectations, Roblox and similar user-generated platforms are entering a new era of oversight—one where policy promises won’t be enough. Regulators want proof that safety tools work, that age gates hold up, and that harmful content and predatory behavior are being actively prevented before children are put at risk.