AI imagery getting more ‘extreme’ as IWF welcomes new rules allowing thorough testing of AI tools
The IWF welcomes new measures to help make sure digital tools are safe as new data shows AI child sexual abuse is still spreading.
Published: Wed 31 Aug 2016
Voting and judging are underway for the IWF Image Hash List in the Cloud Hosting Awards.
The IWF Image Hash List stops the upload, sharing and storage of child sexual abuse images. It was soft-launched last year with some of the biggest internet names – Google, Facebook, Microsoft, Twitter and Yahoo! – who also helped develop the ground-breaking safety product.
The comprehensive Image Hash List accurately targets images of child sexual abuse – even if they have been cropped or altered. It helps victims of child sexual abuse by preventing their images from being uploaded to the internet, and it removes those already online in greater numbers than ever before. It protects companies' brands, and shields their customers and staff from seeing these images, while setting a new standard for quality.
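The idea of matching images even after cropping or alteration is the hallmark of perceptual hashing. The sketch below is purely illustrative – the IWF's actual hashing technology is not described in this release – and uses a hypothetical "average hash", where each bit records whether a pixel is brighter than the image's mean, so small edits change only a few bits:

```python
def average_hash(pixels):
    """Compute a simple perceptual hash of a grayscale image.

    `pixels` is a 2D list of grayscale values (0-255). Each bit of the
    hash records whether a pixel is above the image's mean brightness,
    so uniform edits (e.g. brightening) leave the hash unchanged and
    small alterations flip only a few bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_hash_list(candidate_hash, hash_list, threshold=4):
    """True if the candidate is within `threshold` bits of any known
    hash -- this is why altered copies of an image can still match."""
    return any(hamming(candidate_hash, h) <= threshold for h in hash_list)

# Tiny 4x4 "images": an original and a uniformly brightened copy.
original = [[10, 200, 30, 220], [15, 210, 25, 215],
            [12, 205, 28, 218], [11, 202, 27, 216]]
brightened = [[p + 5 for p in row] for row in original]

known_hashes = {average_hash(original)}
print(matches_hash_list(average_hash(brightened), known_hashes))  # True
```

In a real deployment, matching would be done against a large curated list of hashes at upload time; the threshold trades off robustness to alteration against false positives.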
Vote here to show your support for this revolutionary solution. The IWF Image Hash List is named under the Product of the Year category. Your vote counts!
Want to know more? Email [email protected]
Contact: Emma Hardy – [email protected] +44 (0) 1223 203030 or +44 (0) 7929 553679.
We make the internet a safer place. We help victims of child sexual abuse worldwide by identifying and removing online images and videos of their abuse. We search for child sexual abuse images and videos and offer a place for the public to report them anonymously. We then have them removed. We're a not-for-profit organisation, supported by the global internet industry and the European Commission.
The IWF is part of the UK Safer Internet Centre, working with Childnet International and the South West Grid for Learning to promote the safe and responsible use of technology.