Tech Firms Must Remove Abusive Images in 48 Hours

UK Gov

New law requires tech platforms to take down non-consensual intimate images within 48 hours or face fines.

  • Government orders tech platforms to detect and remove intimate images shared without consent
  • Firms put on notice that any non-consensual intimate image that is flagged to them must be taken down in under 48 hours
  • Government clear that intimate image abuse should be treated with the same severity as child sexual abuse material and terrorist content

Tech companies will be ordered to take down intimate images shared without a victim's consent within 48 hours, under new laws to protect women and girls from this distressing abuse.

Through an amendment to the Crime and Policing Bill, companies will be legally required to remove this content no more than 48 hours after it is flagged to them. Platforms that fail to act could face fines of up to 10% of their qualifying worldwide revenue or have their services blocked in the UK.

The government is determined to make sure that victims will only need to report an image once. This would mean that where an image is reported, it is removed across multiple platforms in one go and, from then on, automatically deleted at every new upload.

As part of that work, Ofcom is considering plans for these kinds of images to be treated with the same severity as child sexual abuse and terrorism content, digitally marking them so that any time someone tries to repost them, they are automatically taken down.

In a further step to protect victims, we will publish guidance for internet providers setting out how they should block access to sites hosting this content, targeting rogue websites that may fall outside the reach of the Online Safety Act.

In recent years, there has been a worrying trend of intimate images being used to threaten, intimidate and distress, and the Prime Minister is determined to hand back control to victims and end their fear that even when an image is taken down, it will only be put up somewhere else.

Prime Minister Keir Starmer said:

As Director of Public Prosecutions, I saw firsthand the unimaginable, often lifelong pain and trauma violence against women and girls causes. As Prime Minister, I will leave no stone unturned in the fight to protect women from violence and abuse.

The online world is the frontline of the 21st century battle against violence against women and girls. That's why my government is taking urgent action against chatbots and 'nudification' tools.

Today we are going further, putting companies on notice so that any non-consensual image is taken down in under 48 hours.

Violence against women and girls has no place in our society, and I will not rest until it is rooted out.

Technology Secretary Liz Kendall said:

The days of tech firms having a free pass are over. Because of the action we are taking, platforms must now find and remove intimate images shared without consent within a maximum of 48 hours.

No woman should have to chase platform after platform, waiting days for an image to come down. Under this government, you report once and you're protected everywhere.

The internet must be a space where women and girls feel safe, respected, and able to thrive.

Minister for Violence Against Women and Girls, Alex Davies-Jones said:

Intimate image abuse devastates lives. These new measures send a clear message: tech platforms can no longer drag their feet. When harmful content is flagged, it must come down, and fast.

By requiring companies to remove non-consensual intimate images within 48 hours, we are finally putting the onus where it belongs - on the tech firms with the power and resources to act.

It's a vital step towards making the online world safer, fairer, and more respectful for women and girls.

The government was elected on a pledge to recognise violence against women and girls (VAWG) as a national emergency, and halve this crime in the next decade. Central to this pledge is keeping women and girls safe online.

Just weeks ago, the government called out abhorrent non-consensual intimate images being shared on Grok, which led to the function being removed. Ministers are also legislating to make 'nudification' tools illegal and bringing chatbots - like Grok - within scope of the Online Safety Act.

Creating or sharing non-consensual intimate images will also become a 'priority offence' under the Online Safety Act, meaning this crime is treated with the same seriousness as child abuse or terrorism.

This builds on the government's VAWG strategy, the first step in the government's plan to transform how society responds to these awful crimes. The strategy included more than 200 pledges spanning prevention, supporting victims and pursuing offenders, and laid out a whole-of-government and whole-of-society approach.

The Prime Minister has been clear this is the first step in the mission to halve violence against women and girls in the next decade, and his government is now focused on delivery.

DSIT
