One in five child abuse images found online last year were category A – report

Published: Tue 25 Apr 2023

Written by: Dan Milmo

The most extreme form of child sexual abuse material accounted for a fifth of such content found online last year, according to a report.

Category A abuse represented 20% of illegal images discovered online last year by the Internet Watch Foundation, a UK-based body that monitors the distribution of child sexual abuse material (CSAM). It found more than 51,000 instances of such content, which covers the most severe imagery, including rape, sadism and bestiality.

The IWF annual report said the 2022 total for category A imagery was double the 2020 figure, and attributed the increase in part to criminal sites selling videos and images of such abuse.

Read more at The Guardian

AI tools have put child sexual abuse ‘on steroids’, Home Secretary warns

The Home Office said fake images are being used to blackmail children and force them to livestream further abuse.

2 February 2025 IWF In The News
UK makes use of AI tools to create child abuse material a crime

Britain will make it illegal to use artificial intelligence tools to create child sexual abuse images.

1 February 2025 IWF In The News
Charity finds more than 500,000 child abuse victims

An analyst who removes child sexual abuse content from the internet says she is always trying to stay "one step ahead" of the "bad guys".

8 December 2024 IWF In The News