Roblox Pledges Safer Gaming for Kids Under New AU Codes

Popular online gaming platform Roblox has committed to introducing a new suite of safety measures following concerns raised by Australia's eSafety Commissioner about child grooming risks on the platform and its compliance with Australia's industry codes and standards.

Roblox has committed to implementing the safety measures in Australia by the end of 2025, as a result of eSafety's engagement with the platform.

The new safety measures include making accounts for users aged under 16 private by default and introducing tools to prevent adult users from contacting under 16s without parental consent.

A number of key features will now be switched off by default for children in Australia, such as direct chat and 'experience chat' within games, until the user has gone through age estimation.

After a child aged under 16 has gone through age estimation and has chat enabled, they will be unable to chat with adults. Parental controls will also be introduced to allow parents to disable chat for 13- to 15-year-old users, on top of existing protections for under 13s.

Voice chat will also not be allowed between adults and 13- to 15-year-olds, in addition to the current prohibition on the use of this feature by under 13s.

To support the delivery of these commitments, Roblox announced last week that it would be expanding age estimation technology across its communication features.

The codes and standards, which are a key requirement under Australia's Online Safety Act, are mandatory and enforceable and require all members of the online industry to take proactive steps to tackle the worst-of-the-worst online content and conduct, including child sexual exploitation and abuse.

eSafety Commissioner Julie Inman Grant said the world-leading codes and standards are designed to raise safety standards across the entire online ecosystem and the new commitments from Roblox are an example of this safety uplift.

"We know that when it comes to platforms that are popular with children, they also become popular with adult predators seeking to prey on them," Ms Inman Grant said. "Roblox is no exception and has become a popular target for paedophiles seeking to groom children.

"We've been engaging with Roblox on this issue for several months to make it clear to the platform that under Australian law they are required to take meaningful action to prioritise the protection of children.

"And we're not just concerned with Roblox's communication features, but also new features the platfomr has introduced or said it is exploring like dating, short-form video feeds and virtual shopping malls. Roblox will similarly need to show me that these features also comply with relevant obligations under the codes and standards.

"I have recently met with senior Roblox executives in person, including their Chief Legal Officer and Chief Safety Officer, to outline our compliance concerns and what we as the regulator expect of them when it comes to tackling harms as serious as grooming, sexual extortion and other forms of child sexual exploitation. I am pleased to see them come to the table with these new safety commitments.

"It also demonstrates the potential of these world-leading codes and standards to have a transformative effect on the online industry when it comes to protecting children from the most egregious harms. We want platforms to view safety as a high ceiling rather than a dirt floor with companies doing more than just the bare minimum."

eSafety will closely monitor the implementation of these commitments and may consider regulatory action if they are not fully delivered, are subject to delays, or if other instances of non-compliance are identified.

Last week, the eSafety Commissioner also registered a second phase of industry codes, focused on age-inappropriate content such as online pornography and content dealing with suicidal ideation, self-harm and disordered eating. These codes will apply to platforms including Roblox and a broad range of other services.

eSafety has a range of enforcement powers under the Act, including the ability to seek civil penalties of up to $49.5 million for non-compliance with the codes and standards.

While not mandated under the industry codes and standards, Roblox's announcement that it plans to expand facial age estimation across its communication features by the end of 2025 will further reduce risks associated with adult-child interactions online.

It also shows what is possible with current age assurance technologies, reflecting the conclusions of the Government-sponsored Age Assurance Tech Trial.

"While I welcome Roblox's commitments to improved safety, I would also urge parents and carers to remain vigilant and actively support children in navigating online environments safely," Ms Inman Grant said.

"The industry codes and standards will work hand-in-hand with the new social media age restrictions, ensuring that there are protections for children from harms online both on social media and, critically, on wider services, including online gaming services.

"As the digital landscape continues to evolve, eSafety will use all available powers to ensure that Roblox-and all regulated services-meet their obligations and prioritise the safety of Australian users."

The Government has also committed to introducing a duty of care for online services, reinforcing the principle that safety must be built into platforms from the ground up.

"The time has come for platforms to take real responsibility for the safety of their users. We will continue to use every tool at our disposal to hold them accountable," Ms Inman Grant said.
