eSafety has given legally enforceable transparency notices to Roblox, Minecraft, Fortnite and Steam amid concerns online games are being used by sexual predators to groom children and by extremist groups to spread violent propaganda and radicalise young people.
The transparency reporting notices require the providers to explain how they are identifying, preventing and responding to these harms, as well as to cyberbullying and online hate. The notices also ask how their systems, staffing and Safety by Design choices align with the Australian Government's Basic Online Safety Expectations.
eSafety Commissioner Julie Inman Grant said that in cases of serious online harms such as grooming, sexual extortion and youth radicalisation, online games, and gaming-adjacent platforms such as encrypted messaging services, could serve as a point of first contact between children and offenders.
"What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services," Ms Inman Grant said.
"Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate. Our own research into children and gaming showed around 9 in 10 children aged 8 to 17 in Australia had played online games.
"Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms.
"We've seen numerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay. This includes Islamic State-inspired games and recreations of mass shootings on Roblox, as well as far right groups recreating fascist imagery in Minecraft.
"Media reports have also pointed to games in Fortnite gamifying the horrific events of the WWII Jasenovac concentration camp and the January 6th US Capitol Building riots, while Steam is reportedly a hub for a number of extreme-right communities.
"These online game and gaming-adjacent platforms are used by millions of children and so it is imperative that they take every possible step to protect them and continue to improve safeguards.
"These companies must take meaningful steps to prevent their services becoming onramps to abuse, extremist violence, radicalisation or lifelong harm."
eSafety publishes reports based on its transparency notices to inform the public, including parents, about the safety risks and mitigations that currently exist, and to strengthen the incentive for technology companies to adopt Safety by Design, engineering out harms before they occur.
Ultimately, the aim is to ensure all users, especially children, can enjoy the benefits these platforms offer without experiencing avoidable harms.
In addition to responding to transparency reporting notices about how they comply with the Expectations, online game platforms must also meet minimum obligations under the Online Safety Codes and Standards.
A breach of a direction to comply with a code or standard can result in penalties of up to $49.5 million per breach.
Additional background:
Australia's Unlawful Material Codes and Standards require online services to implement systems and processes to safeguard Australians from illegal and restricted material, including material that depicts child sexual exploitation, pro-terror material, and crime or violence without justification.
This includes requirements on some services to address risks of grooming, which often leads to the generation of child sexual exploitation material.
Under these Codes and Standards, Roblox committed earlier this year to a number of key changes to protect children, including more stringent age assurance, making accounts belonging to under-16s private by default, and introducing tools to prevent adult users from contacting under-16s without parental consent.
eSafety will also be directly testing the implementation of these commitments to validate their effectiveness.
Additionally, Australia's Age-Restricted Material Codes create new obligations primarily focused on preventing children's access or exposure to age-inappropriate content, including high-impact violence, across the eight sections of the online industry.
Compliance with a transparency reporting notice is mandatory. A range of enforcement options are available to eSafety if companies fail to respond, including seeking financial penalties of up to $825,000 a day.