
AI giving offenders ‘DIY child sexual abuse’ tool, as dozens of child victims used in AI models, IWF warns MPs
The IWF has welcomed upcoming new legislation while giving evidence in Parliament this week.
IntelliGrade, from the Internet Watch Foundation, is helping companies and law enforcement bodies to fight back against criminals who trade, store and upload images and videos showing the sexual abuse of children.
It is a powerful new tool that enables our analysts to accurately grade child sexual abuse images and videos, and to create hashes (digital fingerprints) compatible with child sexual abuse laws and classifications in the UK, US, Canada, Australia, New Zealand, and the Interpol Baseline standard.
The IWF combines technical know-how with a deep understanding of the human cost of this awful crime. The organisation's work creates scale and impact in tackling this issue, and Google is proud to work so closely with the IWF.