Removal Notices Target Illegal Online Material

During September 2025, eSafety received multiple complaints from Australians about extremely graphic footage appearing online.

The footage clearly showed recent attacks in the United States, including the horrific killings of Iryna Zarutska and Charlie Kirk, and the beheading of Chandra Mouli Nagamallaiah.

Australia's Classification Board reviewed the material and assessed it as Refused Classification (RC), which means it cannot be legally hosted, shared, distributed, sold or accessed in Australia, and is subject to eSafety removal notices.

eSafety recognises the importance of news reporting and public commentary on current events, especially ones as tragic and disturbing as these, and has no role in regulating opinion, commentary or political speech.

eSafety does, however, have a role in enforcing Australia's Online Safety Act to keep Australians, especially children, safe from online harm. That includes preventing accidental, inadvertent or unnecessary exposure to harmful violent online imagery of real killings where that material has been assessed RC.

eSafety understands many Australians were deeply concerned that their children had been served this material directly to their feeds without warning or any protective filters.

eSafety engaged with major platforms, informing them of the Board's assessment, sharing URLs found on their platforms and reminding them of their obligation to remove such material under the Act, including under Australia's online industry codes and standards.

While continuing to monitor the virality of the material, eSafety initially gave platforms an opportunity to remove it voluntarily, before issuing removal notices to both X and Meta. In its covering letters, eSafety confirmed that geo-blocking would be sufficient to comply with the notices.

To be clear, eSafety's notices applied only to the graphic footage assessed as RC, not to any reporting, opinions or political commentary posted alongside it.

These three incidents resulted in the significant proliferation of extremely violent material over a short period of time, material that was widely accessible to Australian users, including children.

eSafety was disappointed to see how inconsistently and slowly the major platforms acted to implement their own policies and apply sensitive content labels (known as interstitials) to this violative material.

eSafety will not hesitate to discharge its responsibility under the Act to protect Australians online, especially children.
