Child sexual abuse: Self-generated imagery found in over 90% of removed webpages

Published: Wed 17 Jan 2024

More than 90% of child sexual abuse webpages taken down from the internet now include self-generated images, according to the charity responsible for finding and removing such material.

The Internet Watch Foundation said that it discovered self-generated child sexual abuse material (CSAM) featuring children under 10 on more than 100,000 webpages in the last year. That figure is an increase of 66% on the year before.

In total, a record 275,655 webpages were confirmed to contain CSAM, the IWF said, an increase of 8%. The new data prompted a renewed attack on end-to-end encryption from the UK government, backed by the IWF.

Read the full article at The Guardian.

How the sending of one photo led an 11-year-old girl to become a victim of physical sex abuse

The girl sent a photo to a boy in her class, after which the image and her phone number were added to all-male online chat groups. She later began disappearing before being abused by "unknown men".

23 July 2024 IWF In The News
AI advances could lead to more child sexual abuse videos, watchdog warns

The IWF warns of more AI-made child sexual abuse videos as the tools behind them become more widespread and easier to use.

22 July 2024 IWF In The News
AI being used to generate deepfake child sex abuse images based on real victims, report finds

The tools used to create the images remain legal in the UK, the Internet Watch Foundation says, even though AI child sexual abuse images are illegal.

22 July 2024 IWF In The News