New tool allows police around the world to scan for known abuse imagery in 'seconds'

Published:  Mon 20 Nov 2023

A new tool will help police forces across the world detect illegal child sexual abuse imagery on suspects’ devices in a matter of seconds.

In partnership with digital forensics company Cyacomb, the Internet Watch Foundation (IWF) has made the digital fingerprints, known as hashes, of 1.7 million of the worst child sexual abuse images available as a secure-by-design "Contraband Filter".

Law enforcement can now use the largest global Contraband Filter of child sexual abuse material (CSAM) to identify these illegal files on suspect devices faster and more reliably than ever.

For the first time, police forces all around the world will have access to anonymised hashes of known child sexual abuse images. The images have all been assessed and graded by skilled IWF analysts. 

The IWF is the UK’s frontline against online child sexual abuse imagery. Its analysts are uniquely placed to identify videos and images of children suffering sexual abuse.

The IWF turns these images into “hashes” – digital fingerprints – which are now built into the IWF Contraband Filter.

This cutting-edge technology created by Cyacomb is secure by design. Once a Contraband Filter has been created, it can be used to match content found on suspects’ devices but cannot be used to recreate, search for, or otherwise link to the original material. This protects the security of the database and the privacy of victims.  
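The one-way matching property described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not Cyacomb's actual technology: it assumes known images are reduced to irreversible cryptographic digests (SHA-256 here), so the filter can confirm whether a file matches a known hash but cannot reconstruct or enumerate the original images.

```python
import hashlib


def file_fingerprint(data: bytes) -> str:
    """Compute a one-way digital fingerprint of a file's contents.

    SHA-256 is used here purely for illustration; the real system's
    hashing scheme is not described in this article.
    """
    return hashlib.sha256(data).hexdigest()


class ContrabandFilter:
    """Illustrative filter: stores only irreversible digests.

    Matching a candidate file is possible; recovering the original
    images from the stored digests is not.
    """

    def __init__(self, known_hashes):
        self._known = set(known_hashes)

    def matches(self, data: bytes) -> bool:
        # Hash the candidate file and test membership only.
        return file_fingerprint(data) in self._known


# Build a filter from two hypothetical "known" files, then scan content
# found on a device.
known = [file_fingerprint(b"known-image-1"), file_fingerprint(b"known-image-2")]
filt = ContrabandFilter(known)

print(filt.matches(b"known-image-1"))   # True: fingerprint is in the filter
print(filt.matches(b"harmless-photo"))  # False: unknown content, no match
```

Because only digests are stored and compared, sharing the filter with police forces does not expose the underlying database, which is the privacy property the article attributes to the secure-by-design approach.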

Dan Sexton, Chief Technical Officer at the IWF, said: “This is an important step in ensuring our world-leading expertise can be quickly drawn upon by police and law enforcement the world over. 

“The IWF’s hashes are trusted, reliable data which has come from our own human analysts. Their assessments of criminal imagery are second to none, and now police will have that information in their pockets, allowing them to make quick and decisive assessments when investigating suspect devices.”  

Ian Stevenson, Cyacomb CEO, said: “Our mission is to put the best capabilities for fighting harmful content wherever they are needed.  I’m delighted that this partnership will bring our technology together with IWF data in secure form, putting new capability into the hands of front line policing around the world.” 

Those using the tool will be able to identify CSAM in record time whilst keeping the original data collected by the IWF safe and secure.

This contributes to the speed and scale at which justice can be served, and even more importantly enables prompt action to safeguard children.

 
