In conjunction with partners in the private and public sectors, we regularly run campaigns aimed at raising awareness of, and helping to prevent, child sexual abuse online.
The National Crime Agency estimates that between 550,000 and 850,000 people in the UK pose varying forms of sexual risk to children.
Expert analysts have taken action against 200,000 websites containing child sexual abuse material.
The European Parliament is taking a decisive stand against the rise of AI-generated child sexual abuse material (AI-CSAM), co-hosting a high-level briefing with the Internet Watch Foundation (IWF) to address this urgent threat. With a 380% increase in AI-CSAM reports in 2024, the Parliament is pushing for robust legal reforms through the proposed Child Sexual Abuse Directive. Key priorities include criminalising all forms of AI-generated CSAM, removing legal loopholes such as the “personal use” exemption, and enhancing cross-border enforcement. The IWF and the European Child Sexual Abuse Legislation Advocacy Group (ECLAG) urge the Council of the EU to align with Parliament’s strong stance to protect children and support survivors. This article highlights the scale of the threat, the evolving technology behind synthetic abuse imagery, and the critical need for updated EU legislation.
IWF is campaigning for an end to the use of the phrase ‘child pornography’. There’s #NoSuchThing. It’s child sexual abuse imagery and videos.
New pilot shows the way for smaller platforms to play a big part in online safety.
The IWF Reporting Portal in Tunisia shows the importance of working with multiple partners to fight child sexual abuse material efficiently.