Crackdown on image-based sexual abuse influenced by Durham research

Durham University

[Image: close-up of a man using his mobile phone]

Professor Clare McGlynn from our Durham Law School, along with colleagues, has spent many years researching image-based sexual abuse, showing how the existing law fails victims and why change is needed.

Clare’s work has been shaping and influencing law reforms in this area.

And now the UK Government has announced changes in legislation which will make it easier to prosecute those sharing intimate images without consent, including deepfake porn, where a person’s face is put on someone else’s body. 

Clare talks us through the changes, what they mean and what more needs to be done.

What does the new law mean in practice?

The new law means that whenever an intimate image is deliberately shared without consent, this constitutes a criminal offence. At the moment, it’s only an offence if it can be shown the offender did so with the aim of causing distress to the victim. This is not always the case and, even when it is, it is often difficult to prove. So, this new law should make it easier to prosecute cases. 

The new law will also now cover images that are altered to make them pornographic, often called deepfake porn. The issue of deepfakes is a huge problem and something that could happen to any one of us. There are websites and apps that will turn an ordinary image into a nude image, and millions of people access them.

Under the new law, distributing deepfake porn without consent will also be a criminal offence.

What is your reaction to this proposed new law?

The announcement is a welcome recognition of the harms of taking and sharing intimate images without consent and it is testament to the many victims who have been calling for reform. 

But this is only the start. For this new law to have real effect, victims must be granted anonymity when reporting to police. The Online Safety Bill must also mandate internet platforms to take action to tackle violence against women and girls, removing non-consensual porn and deepfakes.

How has your work alongside others fed into this new law?

My research has involved working with victims of image-based sexual abuse to understand their experiences of the abuse and of the criminal justice system. We found that the law is comprehensively failing victims. We also highlighted the serious harms of this form of abuse, which often has life-shattering impacts.

For many years, I have worked closely with victims, as well as politicians and women’s organisations, to make the case for legal reforms and change. My research has helped to shape public debates and the Law Commission proposals on which the Government announcement is based. 

What should be the next steps in terms of law reform?

This announcement is a welcome start, but so much more needs to be done. First, victims need to be granted automatic anonymity when reporting to the police. We know that the lack of anonymity – which is granted to all other victims of sexual offences – means women are reluctant to report to the police and to continue with prosecutions. 

We also need the Government to bring forward the Online Safety Bill and ensure that it mandates internet companies to take violence against women and girls seriously. 

What else should be done aside from further law reform?

The Government needs to provide sufficient and sustained resourcing for organisations supporting victims, particularly the Revenge Porn Helpline. 

/Durham University Public Release. This material from the originating organization/author(s) may be of a point-in-time nature, edited for clarity, style and length. The views and opinions expressed are those of the author(s).