This Chatbot Aims to Steer People Away From Child Abuse Material

Published: Sun 28 Aug 2022

There are huge volumes of child sexual abuse photos and videos online—millions of pieces are removed from the web every year. These illegal images are often found on social media websites, image hosting services, dark web forums, and legal pornography websites. Now a new tool on one of the biggest pornography websites is trying to interrupt people as they search for child sexual abuse material and redirect them to a service where they can get help.

Since March this year, each time someone has searched for a word or phrase that could be related to child sexual abuse material (also known as CSAM) on Pornhub’s UK website, a chatbot has appeared and interrupted their attempted search, asking them whether they want to get help with the behavior they’re showing. During the first 30 days of the system’s trial, users triggered the chatbot 173,904 times.
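Neither Pornhub nor the IWF has published the keyword list or matching logic behind the trigger, but the basic pattern of a keyword-triggered interstitial can be sketched. The sketch below is purely illustrative, assuming a simple substring match against a curated term list; the `FLAGGED_TERMS` set and `handle_search` function are hypothetical, not the actual system.

```python
# Hypothetical sketch of a keyword-triggered search intervention.
# NOT the actual Pornhub/IWF implementation: the real term list is
# curated and confidential, and the real matching logic is not public.
FLAGGED_TERMS = {"example-flagged-term"}  # placeholder entries only

def normalize(query: str) -> str:
    """Lowercase and collapse whitespace so trivial variants still match."""
    return " ".join(query.lower().split())

def handle_search(query: str) -> str:
    """Return 'chatbot' to interrupt the search with the help prompt,
    or 'results' to serve the search normally."""
    q = normalize(query)
    if any(term in q for term in FLAGGED_TERMS):
        return "chatbot"  # show the deterrence chatbot instead of results
    return "results"

if __name__ == "__main__":
    print(handle_search("Example-Flagged-Term  video"))  # -> chatbot
    print(handle_search("ordinary query"))               # -> results
```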

“The scale of the problem is so huge that we really need to try and prevent it happening in the first place,” says Susie Hargreaves, the chief executive of the Internet Watch Foundation (IWF), a UK-based nonprofit that removes child sexual abuse content from the web. The IWF is one of two organizations that developed the chatbot being used on Pornhub. “We want the results to be that people don’t look for child sexual abuse. They stop and check their own behavior,” Hargreaves says.

Read the full article at WIRED

Telegram plans child abuse crackdown following Pavel Durov’s arrest in Paris

Messaging app Telegram will deploy new tools to prevent the spread of images of child sexual abuse after teaming up with the Internet Watch Foundation.

5 December 2024 · IWF In The News

Telegram U-turns and joins child safety scheme

After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).

4 December 2024 · IWF In The News

Paedophile Hugh Nelson who made AI child abuse images from real pictures sent to him jailed for 18 years in 'deeply horrifying' landmark case

The images that Nelson made have been linked back to real children around the world. In some cases, he then went on to encourage his clients to rape and sexually assault the youngsters.

28 October 2024 · IWF In The News