Commenting on legislation introduced today by independent MP Kate Chaney on the use of technology to generate child abuse material, International Justice Mission Australia CEO David Braga said:
"Generative AI child sexual abuse material is abhorrent and needs to be blocked. Much of it includes faces of actual children or was created using known child sexual abuse images and videos.
"Our concern is that this trains the user to see child abuse as acceptable, which it clearly is not. Because AI-generated child sexual abuse images and videos are now almost indistinguishable from those created through the sexual abuse of real children, it is a short step from AI-generated content to real-world content. This material also adds to the already overwhelming volume of child sexual abuse material that law enforcement must review while trying to identify the real children involved.
"The government has committed to, but is yet to legislate, a 'digital duty of care', which would require digital platforms to take reasonable steps to prevent foreseeable harms on their platforms and services, presumably including child abuse material. It is common sense that this must include AI-generated material.
"Now is the time for the Australian government to strengthen the Online Safety Act to require companies across the tech stack, including operating system providers and device manufacturers, to detect and disrupt child sexual abuse material in all its forms on their platforms."
David Braga is available for interview.