“We’re working in an online environment that’s shifting at incredible speed, where new threats such as AI and nudifying apps are emerging all the time,” says Chris Hughes, Hotline Director at the IWF. “Offenders are quick to exploit the latest technology, platforms and tools to share images of the sexual abuse of children and to cover their own tracks.”
We constantly develop our technology and services to tackle these new threats. Today, we not only remove images of child victims at source, we also provide specialist products and services to more than 200 companies to ensure that criminal imagery has nowhere to hide.
We create, curate, package up and deliver services and datasets that stop images of child sexual abuse being uploaded, shared, stored and sold.
Our Members have access to a range of services that allow them to deal with images of child sexual abuse in the way that’s most appropriate to their sector.
These services enable our Members to work proactively, scan their platforms and networks, detect criminal content and take it down.
We know that our methods work. Less than 0.34% of the child sexual abuse imagery that the IWF found in 2024 was hosted in the UK, and the figure has remained below 1% every year since 2003. We’ve helped make the UK one of the most difficult places in the world to host criminal images of children.
“We’re the largest hotline for the detection and identification of child sexual abuse imagery in Europe. Our datasets are regarded as robust, well curated and, in some instances, unique,” says Chris. “They are trusted and used globally by tech companies, law enforcement agencies and other hotlines to stop the proliferation of child sexual abuse images online.”
Our speed of response is crucial. We provide twice-daily data updates, which help our Members to stop criminal images from going viral. “At midday each day, our analysts share details of new content they’ve discovered that morning,” says Chris. “At the end of the day, they do the same for content detected that afternoon.”
In our work with law enforcement agencies, the richness of our data helps investigators go to a new level. “When we send a referral over to the Victim ID Team at the National Crime Agency, we haven’t just removed the content from the internet,” says Chris.
“We’ve also gathered additional data that helps the agency to identify and safeguard the child. This is a young person who, without our involvement, may never have been discovered and could have been subjected to ongoing abuse.”
It’s a strength of the IWF that we look for criminal activity in places where other people aren’t looking. “This year, our analysts were the first to notice that certain videos contained hidden messages that led to criminal content,” says Chris. “We flagged this up with the platform, which is now able to address that loophole.”
In a similar case, after seeking professional legal advice, we were able to find a way to tackle non-clickable links that referenced where to find child sexual abuse images online. These are classed as ‘inchoate offences’ and therefore criminal, as these links provide information that would enable others to commit an offence.
As for the future? “We’ll continue to develop new services for industry, law enforcement agencies, other hotlines and the safety tech sector,” says Chris. “Whatever it takes, we’ll work to end the repeated victimisation of children and make the internet a safer place for everyone.”
Discover more about the services we provide for our Members.