Internet Watch Foundation sees the most extreme year on record in 2023 Annual Report and calls for immediate action to protect very young children online.
The National Crime Agency estimates that between 550,000 and 850,000 people in the UK pose varying degrees of sexual risk to children.
The UK’s Internet Watch Foundation (IWF) and the USA’s National Center for Missing & Exploited Children (NCMEC) announce a landmark agreement to better protect children whose sexual abuse images are shared and traded on the internet.
The findings will be ‘invaluable’ in turning the tide on the threat children are facing from online predators.
Expert analysts have taken action against 200,000 websites containing child sexual abuse material.
The European Parliament is taking a decisive stand against the rise of AI-generated child sexual abuse material (AI-CSAM), co-hosting a high-level briefing with the Internet Watch Foundation (IWF) to address this urgent threat. With a 380% increase in AI-CSAM reports in 2024, the Parliament is pushing for robust legal reforms through the proposed Child Sexual Abuse Directive. Key priorities include criminalising all forms of AI-generated CSAM, removing legal loopholes such as the “personal use” exemption, and enhancing cross-border enforcement. The IWF and the European Child Sexual Abuse Legislation Advocacy Group (ECLAG) urge the Council of the EU to align with Parliament’s strong stance to protect children and support survivors. This article highlights the scale of the threat, the evolving technology behind synthetic abuse imagery, and the critical need for updated EU legislation.
Research report by PIER at Anglia Ruskin University, providing insight into girls' and their parents' understanding of self-generated CSAM.
New report identifies honest communication as pivotal in battle to stop ‘self-generated’ child sexual abuse material.