The net is closing on child sexual abuse images

Published: Thu 8 Jul 2021

Written by: WIRED

Each day, a team of analysts faces a seemingly endless mountain of horrors. The team of 21, who work at the Internet Watch Foundation’s office in Cambridgeshire, spend hours trawling through images and videos containing child sexual abuse. Each time they find a photo or piece of footage, it must be assessed and labelled. Last year alone, the team identified 153,383 webpages with links to child sexual abuse imagery. This creates a vast database of abuse imagery that can then be shared internationally in an attempt to stem the flow of such material. The problem? Different countries have different ways of categorising images and videos.

Read more at WIRED

Telegram plans child abuse crackdown following Pavel Durov’s arrest in Paris

Messaging app Telegram will deploy new tools to prevent the spread of images of child sexual abuse after teaming up with the Internet Watch Foundation.

5 December 2024 IWF In The News
Telegram U-turns and joins child safety scheme

After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).

4 December 2024 IWF In The News
Paedophile Hugh Nelson who made AI child abuse images from real pictures sent to him jailed for 18 years in 'deeply horrifying' landmark case

The images that Nelson made have been linked back to real children around the world. In some cases, he went on to encourage his clients to rape and sexually assault the children depicted.

28 October 2024 IWF In The News