Image and video analysis
We use our bespoke grading and hashing software, called IntelliGrade, to assess and grade individual images and videos. IntelliGrade uses mathematical algorithms to create a unique hash, a type of digital fingerprint, for each image or video that has been assessed as criminal.
Each hash is unique to the image or video it was generated from. Once an image has been hashed, the hash is shared with industry and law enforcement, enabling our partners to identify known child sexual abuse imagery from the hash values alone, without needing to view the criminal imagery.
Hashes can be matched quickly, which means thousands of criminal pictures can be blocked before they are ever uploaded to the internet. By using our Hash List, which is available in multiple formats including both perceptual and cryptographic hashes, tech companies can stop criminals from uploading, downloading, viewing, sharing or hosting known images and videos showing child sexual abuse.
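To illustrate the general principle of hash-list matching only (this is not the IWF's IntelliGrade software, nor the actual format of the IWF Hash List), the sketch below computes a cryptographic SHA-256 hash of an uploaded file and checks it against a set of known hash values; the function and variable names are hypothetical.

```python
# Minimal sketch of cryptographic hash-list matching (illustrative only;
# not IntelliGrade or the IWF Hash List format).
import hashlib

# Hypothetical set of known hash values, e.g. loaded from a hash list feed.
known_hashes: set[str] = set()


def sha256_of_file(path: str) -> str:
    """Return the hex-encoded SHA-256 digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known(path: str) -> bool:
    """True if the file's hash appears in the known-hash set."""
    return sha256_of_file(path) in known_hashes


# Example: a platform could reject an upload whose hash matches the list.
# if is_known("upload.jpg"):
#     ...  # block the upload (hypothetical platform-side handling)
```

A cryptographic hash such as SHA-256 only matches byte-for-byte identical files; perceptual hashes, by contrast, are designed so that visually similar copies of an image (for example after re-encoding or resizing) produce matching or near-matching values, which is why the Hash List is offered in both kinds of format.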
The IWF began hashing child sexual abuse images in 2015.
At the end of 2024 we had a total of 2,883,015 unique hashes available to Members as part of our IWF Hash List.
We have 44 Members who take our Hash List service.