The US Saw a Spike in Child Sexual Abuse URLs in 2021

Published: Mon 25 Apr 2022

CSAM hosting around the world rose 64 percent last year, and a surge in the United States made it the second-biggest host behind the Netherlands, a new report found.

Every year the number of photos and videos containing child sexual abuse found online increases—and 2021 was no exception. Investigators discovered record amounts of child sexual abuse material (CSAM) last year, new figures reveal.

Data from UK child safety nonprofit the Internet Watch Foundation (IWF) shows 252,194 URLs containing child sexual abuse imagery were identified in 2021, up 64 percent from 2020. As well as record overall numbers, the charity found a significant uptick in the amount of CSAM hosted in the United States. Chris Hughes, director of the IWF’s hotline, says the organization responds to reports of CSAM online and also proactively uses technology to hunt down abusive content. Most of the photos the IWF finds are on image-hosting websites, where people can upload content to share.

Since 2016, the Netherlands has hosted more abuse material than any other country the IWF has analyzed. (It is home to one of the world’s largest internet exchanges.) Last year, the Netherlands accounted for 102,676 confirmed reports of CSAM, 41 percent of everything the IWF found. That is down from 2020’s figures, but the decline coincided with a spike in reports linked to the United States.

Read more at wired.co.uk

Telegram plans child abuse crackdown following Pavel Durov’s arrest in Paris

Messaging app Telegram will deploy new tools to prevent the spread of images of child sexual abuse after teaming up with the Internet Watch Foundation.

5 December 2024 IWF In The News

Telegram U-turns and joins child safety scheme

After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).

4 December 2024 IWF In The News

Paedophile Hugh Nelson who made AI child abuse images from real pictures sent to him jailed for 18 years in 'deeply horrifying' landmark case

The images that Nelson made have been linked back to real children around the world. In some cases, he then went on to encourage his clients to rape and sexually assault the youngsters.

28 October 2024 IWF In The News