Report: Real victims’ imagery used in highly realistic ‘deepfake’ AI-generated videos

We work to stop the repeated victimisation of people abused in childhood and make the internet a safer place by identifying and removing global online child sexual abuse imagery.


Report suspected online child sexual abuse images or videos here.

Find out why we use the term 'child sexual abuse' and not 'child pornography'.

Think Before You Share

Our Think Before You Share campaign aims to help young people understand the harm of sharing explicit images and videos of themselves and others, and to encourage parents and educators to start timely conversations with children and young people.

We encourage you to share our campaign using #ThinkBeforeYouShare and by following, liking and sharing the campaign on our social channels.

Latest news

‘Exponential increase in cruelty’ as sextortion scams hit younger victims

Reports involving sexual extortion are on the rise as criminals become more ‘adept’ at targeting younger children.

23 August 2024 News
Meta failing to stop spread of child sexual abuse imagery in wake of Huw Edwards scandal

Child protection groups warn there’s nothing to stop imagery sent to Edwards spreading further on WhatsApp.

16 August 2024 News
Susie Hargreaves OBE to leave IWF after 13 years’ 'distinguished service'

After 13 successful years at the helm of the Internet Watch Foundation (IWF), Susie Hargreaves OBE is leaving to take up a new opportunity.

26 July 2024 Statement
How the sending of one photo led an 11-year-old girl to become a victim of physical sex abuse

The girl sent a photo to a boy in her class before the image and her phone number were added to all-male online chat groups. She later began going missing before being abused by "unknown men".

23 July 2024 IWF In The News
AI advances could lead to more child sexual abuse videos, watchdog warns

The IWF warns of more AI-made child sexual abuse videos as the tools behind them become more widespread and easier to use.

22 July 2024 IWF In The News
AI being used to generate deepfake child sex abuse images based on real victims, report finds

The tools used to create the images remain legal in the UK, the Internet Watch Foundation says, even though AI child sexual abuse images are illegal.

22 July 2024 IWF In The News

Our podcast

In Conversation With

Listen now
Our tech

IntelliGrade: Ground-breaking tech from IWF

 


IntelliGrade, from the Internet Watch Foundation, is helping companies and law enforcement bodies to fight back against criminals who trade, store and upload images and videos showing the sexual abuse of children. 

It is a powerful new tool that enables our analysts to accurately grade child sexual abuse images and videos, and create hashes (digital fingerprints) that are compatible with child sexual abuse laws and classifications in the UK, US, Canada, Australia, New Zealand and the Interpol Baseline standard.
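IntelliGrade's internal workings are not described here. As a rough illustration only of what a hash or "digital fingerprint" means, the minimal Python sketch below computes a SHA-256 digest of a file; the file_fingerprint helper is hypothetical and is not part of IntelliGrade or any IWF tool. Real hash-matching systems typically also use perceptual hashes, which can match visually similar copies of an image rather than only byte-identical files.

# Minimal sketch (not IWF's implementation): a cryptographic "digital
# fingerprint" of a file, computed as a SHA-256 digest.
import hashlib

def file_fingerprint(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Byte-identical files always produce the same digest, so a known hash can be
# matched against new uploads without storing or viewing the image itself.

The key idea is that once an image has been assessed and hashed, only the fingerprint needs to be shared with companies and law enforcement, allowing known material to be blocked or removed without the imagery itself being redistributed.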