eSafety Registers Codes to Shield Kids from Harmful Content

Australia's eSafety Commissioner has registered three of the nine codes submitted by the online industry, creating safeguards to protect children from exposure to pornography, violent content, and themes of suicide, self-harm and disordered eating.

The three comprise a code covering search engine services, along with codes covering enterprise hosting services and internet carriage services such as telcos.

"These three codes needed to create a high level of protections, especially for kids, to be registered. In particular, the fact the search engine code has achieved this is incredibly important as search engines are often the windows to the internet for all of us."

Ms Inman Grant said she has sought additional safety commitments from industry on the remaining codes, including those dealing with app stores, device manufacturers, social media services, messaging services, and the broader categories of relevant electronic services and designated internet services.

"It's critical to ensure the layered safety approach which also places responsibility and accountability at critical chokepoints in the tech stack including the app stores and at the device level, the physical gateways to the internet where kids sign-up and first declare their ages," Ms Inman Grant said.

Ms Inman Grant said she asked industry to make further changes across some of the codes, including strengthening protections around AI companions and chatbots to ensure they provide vital and robust safeguards.

"We are already receiving anecdotal reports from school nurses, that kids as young as 10 are spending up to five hours a day with AI chatbots, at times engaging in sexualised conversations and being directed by the chatbots to engage in harmful sexual acts or behaviours," she said.

"We need industry to be building in guardrails that prevent their chatbots engaging in this type of behaviour with children.

"Industry indicated last week they would seek to make some of these changes shortly. I will consider these changes, and I aim to make my final determination by the end of next month. If I am not satisfied these industry codes meet appropriate community safeguards I will move to developing mandatory standards.

In July 2024, eSafety tasked the online industry with drafting codes that would protect children from exposure to a range of age-inappropriate content across the online ecosystem.

The codes were originally due to be submitted for registration assessment on 19 December 2024. The Commissioner granted several extensions at the request of industry to enable consideration of domestic and international regulatory developments, including the social media minimum age.

In April 2025, after industry submitted the draft codes for final review in February and March, the Commissioner wrote to industry reinforcing her concerns that none of the codes, as they stood, met the required safety protections, and set a final deadline of 20 May for resubmitting codes that did.

Once in place, the new codes will complement an existing first phase of codes and standards already in force dealing with the highest-harm online material, such as child sexual abuse material and pro-terror content.

The codes and standards are mandatory and enforceable and failure to comply may result in civil penalties of up to $49.5 million per breach.
