New AI child sexual abuse laws announced following IWF campaign
Tighter rules come as IWF warns AI-generated child sexual abuse imagery reports have quadrupled in a year.
Published: Wed 31 Aug 2016
Voting and judging are underway for the IWF Image Hash List in the Cloud Hosting Awards.
The IWF Image Hash List stops the upload, sharing and storage of child sexual abuse images. It was soft-launched last year to some of the biggest internet names – Google, Facebook, Microsoft, Twitter and Yahoo! – which also helped develop the ground-breaking safety product.
The comprehensive Image Hash List accurately targets images of child sexual abuse, even if they have been cropped or altered. It helps victims of child sexual abuse by preventing their images from being uploaded to the internet, and it removes images already online in greater numbers than ever before. It protects companies’ brands, shields their customers and staff from seeing these images, and sets an unmatched standard of quality.
Vote here to show your support for this revolutionary solution. The IWF Image Hash List is nominated in the Product of the Year category. Your vote counts!
Want to know more? Email [email protected]
Contact: Emma Hardy – [email protected] +44 (0) 1223 203030 or +44 (0) 7929 553679.
We make the internet a safer place. We help victims of child sexual abuse worldwide by identifying and removing online images and videos of their abuse. We search for child sexual abuse images and videos and offer a place for the public to report them anonymously. We then have them removed. We are a not-for-profit organisation, supported by the global internet industry and the European Commission.
The IWF is part of the UK Safer Internet Centre, working with Childnet International and the South West Grid for Learning to promote the safe and responsible use of technology.