eSafety Blocks Services Used to 'Nudify' Aussie Students

A provider of three of the world's most widely used 'nudify' services has withdrawn access in Australia following eSafety enforcement action.

eSafety issued the UK-based company with an official warning in September for allowing its services to be used to create artificially generated child sexual exploitation material.

This amounted to non-compliance with Australia's mandatory Codes and Standards which require all members of the online industry to take meaningful steps to tackle the worst-of-the-worst online content, including child sexual abuse material.

The 'nudify' services provided by the company were receiving approximately 100,000 visits a month from Australians and have featured in high-profile cases involving the creation of AI-generated sexual exploitation material of students in Australian schools.

eSafety Commissioner Julie Inman Grant said the development shows Australia's world-leading codes and standards are working to make the online world safer for all Australians, particularly children.

"We know 'nudify' services have been used to devastating effect in Australian schools, and with this major provider blocking their use by Australians, we believe it will have a tangible impact on the number of Australian school children falling victim to AI-generated child sexual exploitation," Ms Inman Grant said.

"We took enforcement action in September because this provider failed to put safeguards in place to prevent its services being used to create child sexual exploitation material, and was even marketing features like undressing 'any girl,' with options for 'schoolgirl' image generation and features such as 'sex mode'."

It comes as global AI model hosting platform Hugging Face has also taken key steps to comply with Australian law after warnings from eSafety that certain generative AI models it hosts are being misused by Australians to create AI-generated child sexual exploitation material.

Hosting platforms like Hugging Face act as 'gatekeepers' for the distribution of these powerful AI models, in much the same way as traditional app stores, so it is equally important to ensure they also have measures in place to protect children.

"There have been instances where models were downloaded from AI model hosting platforms like Hugging Face by Australian users and used to create child sexual exploitation material, including depictions of real children and survivors of sexual abuse. We've also seen them host these so-called 'nudify' models which we've seen used with devastating impacts on Australian school students."

Following engagement from eSafety about compliance concerns, Hugging Face has now changed its terms of service so that all account holders are required to take steps to minimise the risks associated with models they upload, specifically to prevent misuse to generate child sexual exploitation material or pro-terror material.

From now on, if the company becomes aware that its new terms have not been complied with, whether from user reports or its own safety efforts, it is required to enforce those terms.

If Hugging Face fails to take appropriate action over a breach of its terms, eSafety could take enforcement action. eSafety has a range of enforcement mechanisms under the Online Safety Act where a company fails to comply with an industry code or standard, including seeking penalties of up to $49.5 million.

"By targeting the consumer tools, the underlying models that power them, and the platforms that host them, we're tackling harm at multiple levels of the technology stack," Ms Inman Grant said. "We're also working with Government on reforms to restrict access to these 'nudify' tools."

/Public Release. This material from the originating organisation/author(s) may be of a point-in-time nature and edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).