One in five child abuse images found online last year were category A – report

Published: Tue 25 Apr 2023

Written by: Dan Milmo

The most extreme form of child sexual abuse material accounted for a fifth of such content found online last year, according to a report.

Category A abuse represented 20% of illegal images discovered online last year by the Internet Watch Foundation, a UK-based body that monitors distribution of child sexual abuse material (CSAM). It found more than 51,000 instances of such content, a category covering the most severe imagery, including rape, sadism and bestiality.

The IWF annual report said the 2022 total for category A imagery was double the figure in 2020 and the increase was partly due to criminal sites selling videos and images of such abuse.

Read more at The Guardian

AI image generators giving rise to child sex abuse material - BBC Newsnight

The BBC’s been investigating the rise in child sex abuse material resulting from the rapid proliferation of open-source AI image generators.

17 July 2023 IWF In The News
Charity wants AI summit to address child sexual abuse imagery

A leading children's charity is calling on Prime Minister Rishi Sunak to tackle AI-generated child sexual abuse imagery when the UK hosts the first global summit on AI safety this autumn.

17 July 2023 IWF In The News
Webpages containing the most extreme child abuse have doubled since 2020

Online abuse imagery involving children as young as seven has risen by almost two thirds.

25 April 2023 IWF In The News