Global tech giants, including Apple, Google, Meta and Microsoft, continue to leave significant gaps in their efforts to combat sexual crimes against children taking place on their services, three years after eSafety first used its world-leading transparency powers to expose a range of child safety concerns.
Of particular concern, Apple services and Google's YouTube were not tracking the number of user reports of child sexual abuse they received, and could not say how long it took them to respond to those reports. Google and Apple also did not provide numbers of trust and safety staff.
eSafety's latest transparency report shows some of the best-resourced companies in the world have made minimal progress in tackling this urgent issue, despite previous eSafety reports in 2022 and 2023 showing not enough was being done to protect children from sexual exploitation and abuse on their services.
In July last year, eSafety stepped up the pressure on tech companies to reveal whether they were taking more meaningful action by issuing legally enforceable periodic transparency notices under Australia's Online Safety Act to eight providers: Apple, Google, Meta, Microsoft, Discord, WhatsApp, Snap and Skype.
The notices require each company to report to eSafety every six months for a period of two years about how they are tackling child sexual abuse material, livestreamed child abuse, online grooming, sexual extortion and AI-generated "synthetic" child sexual abuse material.
"This latest report shows many of the same safety gaps and shortcomings we uncovered in our 2022 and 2023 reports still exist today without any meaningful or tangible action being taken to prevent the most depraved abuse of children and young people online, eSafety Commissioner Julie Inman Grant said.
"It is clear to me that during the last two to three years since we asked these companies how they are tackling online child sexual abuse, they haven't taken many steps to lift and improve their efforts here, despite the promise of AI to tackle these harms and overwhelming evidence that online child sexual exploitation is on the rise.
"In the case of Apple services and Google's YouTube, they didn't even answer our questions about how many user reports they received about child sexual abuse on their services or details of how many Trust & Safety personnel Apple and Google have on-staff.
"It shows that when left to their own devices, these companies aren't prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services. We need to keep the pressure on the tech industry as a whole to live up to their responsibility to protect society's most vulnerable members from the most egregious forms of harm and that's what these periodic notices are designed to encourage.
"No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises, or services.
"This group of eight companies are required to report to me every six months, and in that time, I hope and expect to see some meaningful progress in making their services safer for children."
The key findings of the report include:
- None of the providers within the scope of the periodic reporting notice used tools to detect child sexual exploitation or abuse (CSEA) livestreaming on all of their services. eSafety previously reported on the lack of tools used for detecting CSEA livestreaming on Apple's FaceTime, Discord's livestreams and voice chats, Microsoft Teams, and Skype.
- Apple, Discord, Google and Microsoft did not use hash matching on all parts of their services to detect known child sexual exploitation and abuse material. Known material is material already identified by child abuse hotlines and law enforcement that continues to circulate on the internet. Hash matching is a long-standing, readily available and privacy-preserving technology: a form of digital 'fingerprinting' that allows copies of previously identified child sexual exploitation and abuse material to be detected with very high accuracy (a minimal illustrative sketch follows this list).
- Apple, Google and WhatsApp did not block URLs linking to known CSEA material on any part of their services. Discord only scanned for links to known CSEA material contained in user reports. eSafety previously reported on the failure of WhatsApp, Discord and Google to act on URLs of known CSEA material, yet these services are still not blocking links to it.
- Apple, Google, Microsoft, Snap and Skype did not proactively use tools to detect new CSEA material. eSafety's previous reports in 2022 and 2023 also showed these companies did not employ measures to detect this new material.
- Apple, Discord, Google, Microsoft, Skype, WhatsApp and Snap did not use tools to detect grooming on all parts of their services. eSafety previously reported in 2022 and 2023 on the lack of tools used for detecting grooming on all of these services.
- Apple, Discord, Google, Microsoft, Skype and Snap did not use tools to detect sexual extortion of both adults and children across all of their services or all parts of those services. eSafety previously reported in 2022 and 2023 on the lack of tools used for detecting sexual extortion on Discord and Google.
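For context on the detection technology referenced above, the snippet below is a minimal, illustrative sketch of how hash matching works, not any provider's actual implementation. The hash list and file contents are hypothetical placeholders, and a plain SHA-256 digest stands in for the perceptual hashes (such as PhotoDNA) that production systems typically use so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical hash list of known CSEA material, as would be supplied by
# child abuse hotlines and law enforcement. Real deployments use perceptual
# hashes (e.g. PhotoDNA) rather than the cryptographic digest used here,
# which only keeps this sketch self-contained.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder entry
}

def fingerprint(data: bytes) -> str:
    """Compute a digital 'fingerprint' of an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def is_known_material(data: bytes) -> bool:
    """Match an upload against previously identified material.

    Only fingerprints are compared, never the content itself, which is
    why hash matching is described as privacy-preserving.
    """
    return fingerprint(data) in KNOWN_HASHES

# Example: screening a (hypothetical) upload at ingest time.
if is_known_material(b"example upload bytes"):
    print("Match found: queue for review and reporting.")
else:
    print("No match against the known-material hash list.")
```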
On a positive note, the following companies have taken steps to improve the safety of their services in relation to CSEA since their previous reports to eSafety in 2022 and 2023:
- Discord, Microsoft and WhatsApp all generally increased their use of hash-matching tools to detect known CSEA.
- Apple, Discord, Snap and WhatsApp all increased the number of sources from which they took hash lists to detect known CSEA.
- Discord and Snap commenced using language analysis tools to detect grooming.
- Discord commenced using language analysis tools to detect sexual extortion.
- Microsoft and Snap commenced using tools to detect new CSEA material on Xbox and Snapchat, respectively. In addition, Discord and Meta commenced using more tools to detect new CSEA material than they did in 2023 and 2022, respectively.

Given the Skype consumer service was retired on 5 May 2025, eSafety has varied Skype's transparency notice. The next publication in this transparency series will be the last to include information relating to Skype.
"While we welcome these improvements, more can and should be done," Ms Inman Grant said. "eSafety will continue to shine a light on this issue, highlighting shortcomings and also improvements, to raise safety standards across the industry and protect the world's children including through mandatory industry Codes and Standards."
The second report in this series will be published in early 2026.