Deepfake Porn: Criminalize Creation, Not Just Sharing

Deepfake pornography - where someone's likeness is superimposed onto sexually explicit images using artificial intelligence - is alarmingly common. The most popular website dedicated to sexualised deepfakes, usually created and shared without consent, receives around 17 million hits a month. The content almost exclusively targets women. There has also been an exponential rise in "nudifying" apps, which transform ordinary images of women and girls into nudes.

Author

  • Clare McGlynn

    Professor of Law, Durham University

When Jodie, the subject of a new BBC Radio File on 4 documentary, received an anonymous email telling her she'd been deepfaked, she was devastated. Her sense of violation intensified when she found out the man responsible was someone who'd been a close friend for years. She was left with suicidal feelings, and several of her other female friends were also victims.

The horror confronting Jodie, her friends and other victims is not caused by unknown "perverts" on the internet, but by ordinary, everyday men and boys. Perpetrators of deepfake sexual abuse can be our friends, acquaintances, colleagues or classmates. Teenage girls around the world have realised that their classmates are using apps to transform their social media posts into nudes and sharing them in groups.

Having worked closely with victims and spoken to many young women, it is clear to me that deepfake porn is now an invisible threat pervading the lives of all women and girls. Deepfake pornography or nudifying ordinary images can happen to any of us, at any time. And, at least in the UK, there is nothing we can do to prevent it.

While UK laws criminalise sharing deepfake porn without consent, they do not cover its creation. The possibility of creation alone implants fear and threat into women's lives.

Deepfake creation itself is a violation

This is why it's time to consider criminalising the creation of sexualised deepfakes without consent. In the House of Lords, Charlotte Owen described deepfake abuse as a "new frontier of violence against women" and called for creation to be criminalised.

It's also a debate taking place around the world. The US is considering federal legislation to give victims a right to sue for damages or injunctions in a civil court, following states such as Texas that have criminalised creation. Other jurisdictions such as the Netherlands and the Australian state of Victoria already criminalise the production of sexualised deepfakes without consent.

A common response to the idea of criminalising the creation of deepfakes without consent is that deepfake pornography is a sexual fantasy, just like imagining it in your head. But it's not - it is the creation of a digital file that could be shared online at any moment, whether deliberately or through malicious means such as hacking.

It's also not clear why we should privilege men's rights to sexual fantasy over the rights of women and girls to sexual integrity, autonomy and choice. This is non-consensual conduct of a sexual nature. Neither the porn performer nor the woman whose image is imposed into the porn has consented to their images, identities and sexualities being used in this way.

Creation may be about sexual fantasy, but it is also about power and control, and the humiliation of women. Men's sense of sexual entitlement over women's bodies pervades the internet chat rooms where sexualised deepfakes and tips for their creation are shared. As with all forms of image-based sexual abuse, deepfake porn is about telling women to get back in their box and to get off the internet.

Taking the law further

A law that only criminalises the distribution of deepfake porn ignores the fact that the non-consensual creation of the material is itself a violation. Criminalising production would aim to stop this practice at its root.

While there are legitimate concerns about over-criminalisation of social problems, there is a worldwide under-criminalisation of harms experienced by women, particularly online abuse.

And while criminal justice is not the only - or even the primary - solution to sexual violence due to continuing police and judicial failures, it is one redress option. Not all women want to report to police, but some do. We also need new civil powers to enable judges to order internet platforms and perpetrators to take down and delete imagery, and to require that compensation be paid where appropriate.

As well as laying the foundation for education and cultural change, the criminal law can impose greater obligations on internet platforms. If the creation of pornographic deepfakes were unlawful, it would be difficult for payment providers to continue to prop up the deepfake ecosystem, difficult for Google to continue returning deepfake porn sites at the top of searches, and difficult for social media companies such as X (formerly Twitter) or the app stores to continue to advertise nudify apps.

The reality of living with the invisible threat of deepfake sexual abuse is now dawning on women and girls. My women students are aghast when they realise that the student next to them could make deepfake porn of them, tell them they've done so and say they're enjoying watching it - yet there's nothing they can do about it, because it's not unlawful.

With women sharing their deep despair that their futures are in the hands of the "unpredictable behaviour" and "rash" decisions of men, it's time for the law to address this threat.

The Conversation

Clare McGlynn does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

/Courtesy of The Conversation. This material from the originating organization/author(s) may be of a point-in-time nature and edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).