Australia's comprehensive online safety framework reaches a critical milestone on Monday with the full commencement of the Age-Restricted Material Codes, which introduce common-sense measures similar to those that have protected children in the physical world for generations.
The codes cover most corners of the online ecosystem, from device manufacturers to app stores, social media and websites, requiring the online industry to put in place meaningful protections that prevent children's exposure to age-inappropriate content.
This includes high-impact violence, pornography, self-harm material, and dangerous content that promotes suicide and disordered eating.
Crucially, the codes will also apply to the AI-powered chatbots and companions that have become increasingly popular with children, preventing these services from engaging minors in conversations that are sexually explicit or that encourage self-harm or suicide.
eSafety Commissioner Julie Inman Grant said that for decades society has agreed there are certain things children are not physically, developmentally or emotionally equipped to deal with, and so we have put age barriers in place to protect them.
"We don't allow children to walk into bars or bottle shops, adult stores or casinos, but when it comes to online spaces where they are spending a lot of their time, there are no such safeguards," Ms Inman Grant said. "But that changes for Australian kids with these codes, which simply bring those same, commonsense protections we all grew up with to the online world of today to ensure children are having age-appropriate experiences and not being exposed to potentially harmful content too early.
"Industry must now apply consistent standards across their services so children are not accidentally exposed when they search or scroll online."
Under the codes, search engines, social media platforms, pornography websites, app stores, gaming providers, and generative AI systems – including companion chatbots – must take meaningful steps to prevent children from being exposed to age-inappropriate content.
The six new Age-Restricted Material Codes join three that are already in force, covering search engines, internet service providers and hosting services.
"Under these codes, if a young person searches the internet for suicide or self-harm content, the first result they see will be a helpline – not a harmful online rabbit hole," Ms Inman Grant said.
"These obligations will help prevent exposure to potentially harmful content and direct at-risk children to real, lifesaving support. Children's emotional and psychological development and wellbeing are at stake, and so I feel very proud of what we've been able to achieve with the industry in Australia.

"These industry-developed codes shift that responsibility back where it belongs – onto the companies designing these digital platforms and profiting from their users – and will give children back a little more of their childhoods."
Under the codes, adults will continue to have full access to legal adult content, but some services will now require proof of age. The forms of age assurance used must be accurate, robust, fair and reliable.
Importantly, any age assurance measures must comply with Australian privacy laws and be managed solely by the service being used – not the Australian Government.
These are some of the main changes under Australia's Age-Restricted Material Codes:
- AI companion chatbots – AI companion chatbots capable of generating sexually explicit material, high-impact violence or self-harm material must confirm someone is 18 or older before allowing access to that material. This may be required either when a person logs onto the service or at the point of access or generation for that material.
- App stores – App Stores must take appropriate steps to prevent users who are under 18 from purchasing or