AFP Commissioner Kershaw Addresses National Press Club

I acknowledge the Traditional Owners of the land on which we meet today and pay my respects to Elders past and present.

I acknowledge the Honourable Attorney-General Mark Dreyfus KC, Secretary of the Attorney-General's Department Katherine Jones, Australian Criminal Intelligence Commission CEO Heather Cook, Cyber Security Cooperative Research Centre CEO Rachael Falk, my dedicated AFP members here today, and invited guests.

Many of us grew up at a time when our parents knew we were safe if the front door was locked.

But now, because of technology, a locked front door is no longer a barrier for criminals.

Virtual hands can reach through almost any device, and the consequences of those actions can be as damaging as if they happened in the real world.

Technology has dramatically changed the way we police.

Federal crime is increasing.

And that is because criminals are always quick to adopt technology.

The majority of federal crime is tech-enabled, such as online child exploitation, cybercrime, fraud, illicit drug trafficking, terrorism and foreign interference.

I want to be clear about why I am here today.

I want to inform the public how they can keep safe online.

I want to warn criminals that despite tech advancements, we can still identify them and bring them to justice.

And finally, I want to encourage tech companies to continue working with law enforcement and other partners to ensure privacy and protection can co-exist.

Freedom of speech does not extend to sharing online material of children being sexually abused, nor does it have anything to do with sharing a video of someone being attacked in a church.

We used to plan the future of policing through the lens of the years to come. But now, because of constant advances in technology, the years to come are almost every 24 hours.

Partly, this is because of the Fourth Industrial Revolution - think connectivity, automation, machine learning and artificial intelligence.

An Australia New Zealand Policing Advisory Agency research paper states that by 2030, almost everyone on the planet will have access to the internet.

It is projected there will be 3.5 billion new internet users between 2020 and 2030.

This brings benefits but also serious challenges, including more perpetrators and more victims.

I want to particularly thank Australian governments for providing the AFP with the legislation, tools and support needed to help keep Australians safe.

But we cannot do this by ourselves.

If it used to take a village to raise a child, constant advances in technology now mean it takes a country, global law enforcement and the private sector to help keep them safe.

Some of our children and other vulnerable people are being bewitched online by a cauldron of extremist poison on the open and dark web.

That's one serious problem. The other is that the very nature of social media allows that extremist poison to spray across the globe almost instantaneously.

We can look at it another way. Social media companies are refusing to snuff out the social combustion on their platforms.

Instead of putting out the embers that start on their platforms, their indifference and defiance is pouring accelerant on the flames.

If we consider the disinformation and misinformation from two shocking incidents in Sydney this month, and how that social combustion was propagated throughout the world, we see the consequences of that indifference and defiance.

Our respected leaders of faith tell us how the interpretation of religion is being purposely distorted on social media.

Because of this, their communities and religious beliefs are tarnished and blamed for violent acts carried out by those who have been radicalised.

We used to warn our children about stranger danger, but now we need to teach our kids about the digital-world deceivers.

We need to constantly reinforce that people are not always who they claim to be online; and that also applies to images and information.

Another example is how criminals, pretending to be someone else, use social media to trick youth into sending intimate images of themselves, and then blackmail them for money.

Fearing their images will be sent to loved ones, young people have taken their lives.

Imagine the anxiety these victims are feeling.

Right now, I know there are silent victims feeling like their world is closing in.

Please, if this is your situation, do not be embarrassed. Tell your parents, someone you trust and law enforcement. We will protect you and make it stop.

We work regularly with overseas partners to take action against sextortion gangs.

To do this effectively, evidence such as screen grabs of text demands, profile names, URLs and accounts is required.

We also encourage reporting to the AFP-led ACCCE, which stands for the Australian Centre to Counter Child Exploitation. This can be done at www.accce.gov.au/report.

Can I now please ask you to look at the images on your tables.

There are four images of children. They look like the photographs in our family albums or stored in our phones.

Now, I want you to pick the image generated by artificial intelligence.

The answer is they have all been generated by AI.

Now, I want you to think about a diabolical scenario facing law enforcement.

The AFP identifies online child sexual abuse and we immediately triage to determine if this is a new victim, if we believe they are in Australia and if their life is at risk.

It is determined this is a priority case, but after weeks, months or maybe years, investigators determine there is no child to save because the perpetrator used AI to create the sexual abuse image.

It is an offence to create, possess or share this material - and it is a serious crime - but the reality for investigators is they could have been using capability, resources and time to find real victims.

Please reflect on that.

In my experience, offenders who seek out and view child sexual abuse material can transition to contact offending.

No one should ever have to see what our members see, and it's a damning indictment on humanity that we need so many investigators to combat this crime.

Last year, the Internet Watch Foundation released a report titled "How AI is being used to create child sexual abuse imagery".

It stated, and I quote:

"When our analysts saw the first renderings of AI-generated child sexual abuse material in spring of…2023, there were clear tells that this material was artificially generated. Half a year on, we're now in a position where the imagery is so life-like, that it's presenting real difficulties for even our highly-trained analysts to distinguish."

It said: "The testimony of perpetrators themselves in dark web forums also tells you what you want to know; there's jubilation that fantasies can be made to order. All you need is the language to tell the software what you want to see. We're seeing AI (child abuse material) images using the faces of known, real, victims. We're seeing how technology is nudifying children whose clothed images have been uploaded online for perfectly legitimate reasons."

Together, we must find solutions to these diabolical challenges but we also must be proactive about our own safety.

A simple measure is locking down settings on social media accounts to make it harder for others to access images and then use AI to create abuse material.

Think about it like this. You probably wouldn't give a stranger a photo album of your kids. That's essentially what can happen if privacy settings are not locked down.

We continue to talk to social media companies about how to help law enforcement identify a tsunami of AI-generated child abuse material we know is coming.

Solutions could include safeguards on tools that create AI-generated imagery, such as digital watermarks and prompt restrictions.

While this can happen now, it is no silver bullet and offenders are always looking at how they can beat technological countermeasures.

Numerous law enforcement agencies, including the AFP, have appealed to social media companies and other electronic service providers to work with us to keep our kids safe.

That includes not transitioning to end-to-end encryption until they can ensure their technology protects against online crime rather than enabling it.

We recognise the role that technologies like end-to-end encryption play in protecting personal data, privacy and cyber security, but there is no absolute right to privacy.

People have the right to privacy just like they have the right not to be harmed.

People expect to have their privacy protected just like they expect police to do their job once a crime has been committed against them, or a loved one.

That expectation includes being able to respond and bring offenders before the justice system.

In the 2018-2019 financial year, the ACCCE received more than 14,000 reports of online child sexual exploitation. In 2022-2023, we received more than 40,000 reports.

Nearing the end of this financial year, we have already exceeded the past financial year's figures.

I want to thank tech companies for the help they have provided to date.

This month, the AFP assumed the chair of the Virtual Global Taskforce, an international alliance of 15 law enforcement agencies working together to tackle child sexual abuse.

This underscores our commitment to keep kids safe, not just in Australia but around the world.

Police commissioners around the globe stand shoulder-to-shoulder with us, but there is a growing sense of dissatisfaction with social media companies.

On Sunday, a joint declaration was agreed to by 32 European police chiefs.

It stated:

"Two key capabilities are crucial to supporting online safety.

First, the ability of technology companies to provide to law enforcement investigations - in response to a lawful authority with strong safeguards and oversight - the data of suspected criminals on their service. This is known as lawful access.

Second, the ability of technology companies proactively to identify illegal and harmful activity on their platforms.

The companies currently have the ability to alert the proper authorities - with the result that many thousands of children have been safeguarded, and perpetrators arrested and brought to justice.

These are quite different capabilities, but together they help us save many lives and protect the vulnerable in all our countries on a daily basis from the most heinous of crimes, including but not limited to terrorism, child sexual abuse, human trafficking, drugs smuggling, murder and economic crime.

We are, therefore, deeply concerned that end-to-end encryption is being rolled out in a way that will undermine both of these capabilities. Companies will not be able to respond effectively to a lawful authority. Nor will they be able to identify or report illegal activity on their platforms."

The AFP echoes concerns raised in this declaration but I want to forewarn criminals that we will still identify them, although we acknowledge some of our investigations may be more resource intensive.

Helen Schneider is the AFP's Commander for Human Exploitation and oversees the ACCCE.

I asked Commander Schneider about her thoughts as we deal with this wicked challenge.

She said, and I quote: "Globally, law enforcement is in a technological race against online predators and we will only stay in this race by collaboratively and responsibly weaponising technology to counter this threat.

The tech industry must own their role in this race.

Prevention has become a non-negotiable requirement in every Australian household, and law enforcement, community and industry must be inseparable allies in the delivery of this capability."

We believe the same standards that apply in the real world should apply in the online world.

For example, if a judicial officer decides there is reasonable suspicion that a serious crime has been committed, and it is necessary for law enforcement to access information to investigate that serious crime, tech companies should respect the rule of law and the order of a court, or independent judicial authority, and provide that information.

Because this is my concern: While end-to-end encryption provides solace for law abiding individuals, it can also provide criminals with an instant invisibility cloak.

My door is open to all relevant tech CEOs and chairmen, including Elon Musk and Mark Zuckerberg.

I know we can find common ground because, put simply, tech is supposed to make our lives easier and safer, and not the opposite.

It seems counter-intuitive that, in a world of advancements, our safety is at risk.

Mike and I could not be clearer today about the challenges before us. However, regardless of how social media companies respond, our mission of protecting Australians and Australia's interests will not be diluted.

As Commissioner, I am focussed on ensuring there are no criminal safe havens.

And we should be able to lock criminals out of our online world, just like we can lock them out in the real world, while still protecting privacy and security.

Finally, I know some of this content has been distressing. If you or someone you know needs support, there are resources available on the ACCCE website at accce.gov.au

Thank you and we are happy to take questions.
