Almost one-third of Australian children - some as young as 10 years of age - have seen pornographic images or videos online, according to new eSafety research.
eSafety's latest Keeping Kids Safe Online research - a nationally representative survey of 3,454 Australian children aged 10 to 17 - found 32% of kids have been exposed to sexual images or videos online, and 23% had seen sexually explicit content online in the past 12 months.
The research, which also found 12% of children had seen violent sexual content online, has prompted eSafety to issue an Online Safety Advisory today, including advice on how to protect children from seeing too much, too soon.
"Pornography is so pervasive and invasive that children are often stumbling across it by accident," eSafety Commissioner Julie Inman Grant said.
"We want children to understand the pillars of a healthy relationship begin with consent and respect. Pornography can be the antithesis of that. A child cannot unsee a video showing a man aggressively choking a woman during sex, for example.
"Kids' exposure to violent and extreme pornography is a major concern for many parents and carers. Our new Online Safety Advisory provides parents and carers with practical advice, including guidance around those difficult conversations we need to have with our children as well as useful information on parental controls and other safety tools. Ultimately, it's crucial kids know that they won't be in trouble if they come to you for help."
"Parents and carers play an important role - both from a protective and educative standpoint - but the responsibility cannot be entirely on them. We also need industry to play their part by putting in some effective barriers to protect children."
eSafety Commissioner Julie Inman Grant recently registered new codes, drafted and submitted by industry, which require a wide range of technology services to do more to:
• restrict children's access to pornography, high-impact violent material, and material that encourages self-harm, suicide or disordered eating, and
• empower users of all ages to control the content they do not want to see.
These codes cover search engine services, hosting services, internet carriage services such as telcos, app stores, device manufacturers, social media, gaming and messaging services and other apps and websites, including some generative AI services.
"We know kids are naturally curious and they may either stumble upon or search out sexualised content as they enter adolescence and explore their sexuality, which is why it is critical tech companies, up and down the stack, have greater protections in place," Ms Inman Grant said.
"Our holistic and layered safety approach places the onus on tech companies to provide vital and robust protections for all Australians, especially children.
"We've been concerned about AI chatbots for a while now and have heard anecdotal reports of children - some as young as 10 years of age - spending up to 5 hours per day conversing, at times sexually, with AI companions," Ms Inman Grant said.
"There has been a recent proliferation of these apps online and many of them are free, accessible to children, and advertised on mainstream services. Importantly these codes include measures to protect children from chatbots which can generate highly sexualised or pornographic material."
This second phase of codes will complement the existing codes and standards that deal with the highest-harm online material, such as child sexual abuse material and pro-terror content.
The codes and standards are mandatory and enforceable, and failure to comply may result in civil penalties of up to $49.5 million.