The Australian Government must deliver on its promise to introduce digital duty of care legislation, with a new eSafety Commissioner report exposing a seemingly industry-wide failure to meet basic child protection standards.
According to findings released today by the eSafety Commissioner, drawn from periodic transparency notices on child sexual exploitation and abuse issued in August 2025, tech companies are still doing nothing to protect children from sexual abuse that is streamed live on their services, including in video calls.
The failure continues "[d]espite the availability of technology to help detect child sexual exploitation and abuse in livestreams or video calls," according to the eSafety Commissioner's latest report.
"I think this is hugely concerning. This is illegal content. This is literally the rape, torture of children and they're enabling it and turning a blind eye," said Julie Inman-Grant, the eSafety Commissioner.
The Australian Institute of Criminology has identified online video calling services as "a central vector for livestreaming child sexual exploitation and abuse."
International Justice Mission's 2023 study, Scale of Harm, conducted with the University of Nottingham's Rights Lab in the UK, found that nearly half a million children in the Philippines are sexually abused to create new abuse material, especially in livestreams. And no wonder: tech companies continue to allow these live crime scenes on their platforms.
In their submission to the 2024 review of Australia's Online Safety Act, Philippine Survivor Network leaders stated: "We have experienced different forms of online sexual exploitation such as livestreamed sexual abuse and production of child sexual abuse materials in exchange for money received by facilitators from online paying customers."
"We also specifically ask that digital services be held accountable for livestreaming child sexual abuse – that such criminal activity be proactively detected and disrupted on their platforms."
"Every Australian should find it deeply disturbing that the wealthiest, most powerful tech companies in the world are doing nothing to protect children from child sexual abuse streamed live in video calls," said John Tanagho, Executive Director, IJM Center to End Online Sexual Exploitation of Children.
"Companies can use artificial intelligence and machine learning tools, along with a range of signals and red flags, to detect and disrupt in real-time the rape of children through the services and platforms they provide freely to the world. They must be required to do so."
Under transparency notices issued by the eSafety Commissioner, online service providers are required to report on how they are implementing the Basic Online Safety Expectations set out by the Australian Government.
eSafety's latest report also found that many of the tech companies:
- Failed to use tools to proactively detect new child sexual exploitation and abuse images and videos;
- Had deficiencies in the user reporting tools that allow people to flag child sexual exploitation and abuse and the associated illegal images and videos;
- Did not use hash-matching to proactively detect known child sexual exploitation and abuse material (a technique sketched below); and
- Did not use language analysis tools to proactively detect sexual extortion.
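Hash-matching, the second-to-last item above, is conceptually straightforward, which is part of why its absence is so notable. The sketch below is illustrative only and is not drawn from the eSafety report: it uses exact SHA-256 matching against a placeholder hash list, whereas real deployments use vetted hash sets supplied by recognised clearinghouses and perceptual hashes (such as Microsoft's PhotoDNA or Meta's open-source PDQ) so that resized or re-encoded copies still match.

```python
import hashlib
from pathlib import Path

# Placeholder set of known-abuse hashes. Production systems load vetted hash
# sets from recognised clearinghouses rather than compiling their own.
KNOWN_ABUSE_HASHES: set[str] = set()

def file_sha256(path: Path) -> str:
    """Stream a file through SHA-256 so large videos don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block(upload: Path) -> bool:
    """Return True when an upload matches a known hash; the provider would
    then block the content and report it as its legal obligations require."""
    return file_sha256(upload) in KNOWN_ABUSE_HASHES
```

Exact cryptographic hashes break on any byte-level change, which is why perceptual hashing is the industry standard for this task; the point of the sketch is simply that the matching step itself is cheap and well understood.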
IJM Australia CEO David Braga said the findings are a clear signal that stronger online safety laws are needed to protect children.
"These findings are shocking and are a powerful reminder of why digital duty of care legislation is urgently needed," Mr Braga said.
"Tech companies are among the wealthiest and most sophisticated organisations in the world and have the resources to prevent illegal activity and content on their platforms. They need to lift their game and do better to protect children online.
"Many of these companies aren't doing enough to combat child sexual abuse or exploitation material on their platforms, apps, and devices. Self-regulation isn't enough. It needs to be enshrined in law.
"Our laws must be fit-for-purpose in the digital age and cater for existing and emerging technologies."
The Government announced in November 2024 its intention to legislate a digital duty of care, which is a key recommendation of the independent statutory review of the Online Safety Act 2021.
Under the proposed changes, digital platforms would be required to take reasonable steps to prevent foreseeable harms on their platforms and services, with the framework underpinned by risk assessment and risk mitigation and informed by safety-by-design principles.
These platforms would also be obligated to continually identify and mitigate potential risks, as technology and service offerings change and evolve.
The eSafety Commissioner's snapshot report (BOSE-Snapshot-first-regular-report-on-CSEA-sexual-extortion-periodic-notices-August2025.pdf) and the full report, A Baseline for online safety transparency: The first regular report on child sexual exploitation and abuse, and sexual extortion, are both available from eSafety; see pp. 38-40 of the full report for livestreamed child sexual exploitation and abuse.