Digital fingerprints – “hashes” – of child sexual abuse images are a revolutionary step for victims.
Internet giant Microsoft has teamed up with the Internet Watch Foundation (IWF) to create a system that stops the upload, storage and sharing of “potentially millions” of child sexual abuse images on the internet.
The system is called the IWF Image Hash List. A ‘hash’ is a unique code generated from the data in an image, like a digital fingerprint. The Image Hash List is a list of these individual codes (digital fingerprints) of known images of child sexual abuse.
By using the Image Hash List on their systems, internet companies across the globe will be able to stop the upload, sharing and storage of these hideous images.
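To illustrate the concept only, the sketch below shows how an image can be reduced to a fingerprint and checked against a list of known hashes. It uses an ordinary cryptographic hash (SHA-256) as a stand-in; the real Image Hash List is built on Microsoft PhotoDNA perceptual hashes, and the file paths and hash values here are hypothetical.

```python
import hashlib

# Hypothetical set of known hashes, standing in for the IWF Image Hash List.
# (The real list uses PhotoDNA perceptual hashes, not SHA-256.)
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def hash_image(path: str) -> str:
    """Generate a digital fingerprint (hash) from the raw bytes of an image file."""
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()


def should_block_upload(path: str) -> bool:
    """Return True if the image matches a hash on the list and must be blocked."""
    return hash_image(path) in KNOWN_HASHES
```

Note that a byte-for-byte hash like SHA-256 only catches exact copies of a file; PhotoDNA is designed to also match images that have been resized or slightly altered.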
Key facts:
- 125,583 images of confirmed child sexual abuse have been hashed and added to the IWF Image Hash List
- Every four minutes our analysts create a new hash
- 67% of the hashes are category A or B – the rape or sexual torture of children
- 3,040 of the hashes were assessed as babies and toddlers – two years old, or younger
IWF CEO Susie Hargreaves OBE says: “We’ve been working on the technology to make the Image Hash List a reality for some time. Microsoft provided a cloud-based solution to allow companies all over the world to use our hash list with minimal fuss and no expense to those who want to protect their customers, their brands and do the right thing for victims of sexual abuse.
“Now our Image Hash List, coupled with Microsoft’s Cloud technology, is an absolute game-changer. The service is unparalleled globally.”
IWF analysts have already created a huge number of hashes; to date the list stands at 125,583.
Susie continues: “Every eight minutes our analysts identify a new webpage showing a child being sexually abused. We always ensure that image is taken down. But in the past it could be uploaded again, and again. This was incredibly frustrating for us and dreadfully sad for those victims. Now our new technology allows us, and any company which uses the Image Hash List, to hunt out those abusive images, meaning internet companies can completely stamp out copies, stop the sharing, and even stop the image being uploaded in the first place.
“This is a major breakthrough. Each and every one of these images is the painful record of a child being sexually abused. Their suffering is very real. These victims have the right to know someone is fighting this important battle.”
Technical details: IWF/Microsoft collaboration
The IWF and Microsoft have teamed up to make it easy for companies to use the Image Hash List. Microsoft PhotoDNA has been the industry-standard hashing technology since 2009, but until now companies had to integrate hashes in-house, often incurring high costs because of the engineering involved.
From today, the Microsoft and IWF collaboration provides a cloud-based delivery solution. The obstacles of in-house engineering have been removed, meaning that proactive monitoring of a company’s platform is no longer just for the big industry members.
Using the cloud solution, companies can compare anything from a single image to the millions of images uploaded through their platforms daily. For most companies, manually checking every uploaded image is impossible because the numbers are too high, so the IWF Image Hash List, deployed through the cloud solution, can do it for them.
Companies can do all of this without affecting the users’ experience due to the speed and efficiency of the cloud solution. Any IWF member can integrate the cloud solution into their platform and be proactive in monitoring against the distribution of online child sexual abuse material.
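As a rough sketch of how such an integration might look at upload time: the endpoint URL, request format and response field below are hypothetical placeholders, not the actual Microsoft/PhotoDNA cloud API, and any real integration would follow the service’s own documentation.

```python
import requests

# Hypothetical endpoint and API key; the real cloud service's API will differ.
MATCH_SERVICE_URL = "https://example-hash-matching-service/api/v1/match"
API_KEY = "YOUR_API_KEY"


def check_upload(image_bytes: bytes) -> bool:
    """Send an uploaded image to the matching service before accepting it.

    Returns True if the image is safe to store, False if it matches a
    known child sexual abuse hash and must be blocked and reported.
    """
    response = requests.post(
        MATCH_SERVICE_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"image": ("upload.jpg", image_bytes, "image/jpeg")},
        timeout=5,  # keep latency low so the user experience is unaffected
    )
    response.raise_for_status()
    return not response.json().get("is_match", False)  # hypothetical response field
```

Because the matching happens in the cloud, the uploading platform only needs a small check like this in its upload path rather than its own hash-matching infrastructure.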
Corporate Vice President, Microsoft EMEA, Michel Van Der Bel says: “The impact of IWF’s work has been profound. They have made great progress, but there is more to be done. This is a journey.
“Microsoft was delighted to be able to employ the technology of our cloud-based service to support IWF’s Image Hash List. It’s great work and we are very happy to be involved.”
The pioneering Image Hash List was rolled out by Facebook, Google, Microsoft, Twitter and Yahoo! in 2015. It was created through a unique collaboration between the IWF and the UK police CAID* database. The two lists of confirmed child sexual abuse images have been re-assessed by IWF analysts to the IWF’s high standard. Each image is categorised, so a final Image Hash List can be tailored to a company’s or country’s needs.
Today the Image Hash List service is being rolled out to industry members and can be used by any company that allows people to upload, download, store, host or process images, or that offers filtering solutions. This means that children who were sexually abused and photographed, whose images were put on the internet and shared by abusers, have some assurance that their images could be completely removed. Furthermore, if offenders try to upload those images repeatedly, they’ll be prevented from doing so.
IWF Deputy CEO, Fred Langford says: “The collaboration with Microsoft and our access to the police database makes this development unparalleled globally. For us, this development could revolutionise the way we work and give peace of mind to the thousands of children, whose images have been shared again and again. We currently have more than 125,000 hashes on the list but we’re looking at potentially millions.”
ends
Contact: Lisa Stacey, Communications Team +44 (0) 1223 203030 or +44 (0) 7929 553679.
Notes to editors:
- Figures from the Image Hash List relate to a data collection in October 2016.
- To watch the Image Hash List animation: https://www.youtube.com/watch?v=5U4hQ4Bm6J0
- Interviews are available from the communications team: +44 (0) 1223 203030
- A written case study from an analyst is available on request.
* CAID. The Child Abuse Image Database (CAID) is a project led by the Home Office, which will enable law enforcement to assess, categorise and generate unique hashes for child abuse images and videos found during their investigations.
What we do:
We make the internet a safer place. We help victims of child sexual abuse worldwide by identifying and removing online images and videos of their abuse. We search for child sexual abuse images and videos and offer a place for the public to report them anonymously. We then have them removed. We’re a not-for-profit organisation and are supported by the global internet industry and the European Commission.
For more information please visit www.iwf.org.uk.
The IWF is part of the UK Safer Internet Centre, working with Childnet International and the South West Grid for Learning to promote the safe and responsible use of technology.