New Codes Target AI Chatbots in Suicide, Explicit Chat

The eSafety Commissioner has registered new industry-drafted codes which aim to better protect children from a range of harmful and age-inappropriate content, including the clear and present danger posed by mostly unregulated AI-driven companion chatbots.

Many of these chatbots can engage in sexually explicit conversations with minors, and some have allegedly encouraged suicidal ideation, self-harm and disordered eating.

The six codes apply to a broad range of online services and platforms - including app stores, gaming services, pornography websites, generative AI services and AI companion chatbots, equipment manufacturers and suppliers.

They also apply to social media services, with one code covering 'core' features such as posting content on the service and another covering their messaging features.

Developed by industry, these new codes represent a significant expansion of the industry's responsibilities under Australia's Online Safety Act. The registration of these six codes follows the Commissioner's registration in June of three codes applying to search engines, hosting services and internet service providers.

As laid out in the Act, for the Commissioner to register industry codes, she must be satisfied that each one creates "appropriate community safeguards".

These new codes adopt key good-practice measures already implemented by major platforms and uplift safety protections by introducing new obligations that will require more sectors of the online industry to promote children's online safety.

The Head Terms of the codes also enshrine principles that will sit alongside the safety measures for every layer of the technology stack, such as the importance of protecting human rights online, the right to freedom of expression, and the requirement for all services to comply with Australian privacy laws.

eSafety Commissioner Julie Inman Grant said that, together, the nine codes would protect children from exposure to a range of "lawful but awful", age-inappropriate content - exposure that has been occurring at increasingly young ages - and would tackle the growing number of mostly unregulated AI chatbots.

"We've been concerned about these chatbots for a while now and have heard anecdotal reports of children - some as young as 10 years of age - spending up to 5 hours per day conversing, at times sexually, with AI companions," Ms Inman Grant said.

"We know there has been a recent proliferation of these apps online and that many of them are free, accessible to children, and advertised on mainstream services, so it's important these codes include measures to protect children from them.

"As with other forms of online pornography, there is a danger that excessive, sexualised engagement with AI companions could interfere with children's social and emotional development, setting up misguided or harmful beliefs and patterns that are damaging to individuals or relationships in real life.

"We've also seen recent reports of AI chatbots allegedly encouraging suicidal ideation and self-harm in conversations with kids, with tragic consequences.

"I do not want Australian children and young people serving as casualties of powerful technologies thrust onto the market without guardrails and without regard for their safety and wellbeing."

Ms Inman Grant said the registration of the codes was a best-practice example of how industry and the online safety regulator can work together to produce meaningful and enforceable industry codes that provide world-leading protections for children online.

"This shows how a co-regulatory approach can be successful and deliver meaningful safety protections. The industry associations have developed these codes, and my office will be responsible for enforcing them to protect children from accidental exposure to content they are not cognitively ready to process and certainly cannot 'unsee'.

"We know this is already happening to kids from our own research, with 1 in 3 young people telling us that their first encounter with pornography was before the age of 13 and this exposure was 'frequent, accidental, unavoidable and unwelcome' with many describing this exposure as being disturbing and 'in your face'.

"We know that a high proportion of this accidental exposure happens through search engines, but other services such as app stores play an important role as 'gatekeepers' online, too, with parents often relying on the age ratings of apps to understand if they are suitable for their children.

"Under these new codes, app stores will have to make sure apps are accurately rated and that appropriate age assurance measures are in place before permitting users to download or purchase apps rated 18+.

"These codes enshrine good practice measures already in use by many major services, but importantly, they lift the bar for the entire online industry.

"By legislative design, these codes are drafted by industry, for industry, to standardise and uplift their safety practices, so the public can be confident they represent what the technology industry itself considers to be proportionate and feasible measures to enhance online safety, especially for children."

Under the codes, pornography sites and other services whose purpose includes distributing pornography or other high-impact content will be required to implement appropriate age assurance technologies to prevent children from accessing harmful material.

The codes will provide complementary protections to the social media minimum age obligation, which will apply only to age-restricted social media platforms when it takes effect in December.

"While the codes will provide stronger protections and safer online spaces for children, they also require services to give all Australians information, tools, and options to limit their exposure to this sort of content," Ms Inman Grant said.

"And importantly, the codes contain privacy protections such as requiring providers not to use or disclose personal information of an Australian in a way that would be in breach of privacy law.

"And so, in determining appropriate age assurance measures to take under any code, services are also required to consider if those measures are compliant with privacy laws and whether the impact on user privacy is proportionate to the safety objectives."

The codes were developed through industry-led public consultation and extensive engagement between industry and eSafety following eSafety's July 2024 directive for the online industry to develop stronger protections for children.

These new codes will complement existing codes and standards already in force tackling the worst-of-the-worst online content including child sexual exploitation and abuse material as well as pro-terror content.

The codes and standards are legally enforceable, and a breach of a direction to comply may result in civil penalties of up to $49.5 million.

On 27 August, the Senate referred the implementation of the search engine code, as well as the forthcoming social media minimum age, to the Environment and Communications References Committee for inquiry and report by 31 October 2025. eSafety looks forward to making a submission to help the Committee and the public understand these measures.

/Public Release. This material from the originating organisation/author(s) may be of a point-in-time nature and edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).