A Pornhub Chatbot Stopped Millions From Searching for Child Abuse Videos

Published: Thu 29 Feb 2024

For the past two years, millions of people searching for child abuse videos on Pornhub’s UK website have been interrupted. Each of the 4.4 million times someone has typed in words or phrases linked to abuse, a warning message has blocked the page, saying that kind of content is illegal. And in half the cases, a chatbot has also pointed people to where they can seek help.

The warning message and chatbot were deployed by Pornhub as part of a trial program, conducted with two UK-based child protection organizations, to find out whether people could be nudged away from looking for illegal material with small interventions. A new report analyzing the test, shared exclusively with WIRED, says the pop-ups led to a decrease in the number of searches for child sexual abuse material (CSAM) and saw scores of people seek support for their behavior.

“The actual raw numbers of searches, it’s actually quite scary high,” says Joel Scanlan, a senior lecturer at the University of Tasmania, who led the evaluation of the reThink Chatbot. During the multiyear trial, there were 4,400,960 warnings in response to CSAM-linked searches on Pornhub’s UK website—99 percent of all searches during the trial did not trigger a warning. “There’s a significant reduction over the length of the intervention in numbers of searches,” Scanlan says. “So the deterrence messages do work.”

Read the full article at WIRED.

Telegram plans child abuse crackdown following Pavel Durov’s arrest in Paris

Messaging app Telegram will deploy new tools to prevent the spread of images of child sexual abuse after teaming up with the Internet Watch Foundation.

5 December 2024 IWF In The News
Telegram U-turns and joins child safety scheme

After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).

4 December 2024 IWF In The News
Paedophile Hugh Nelson who made AI child abuse images from real pictures sent to him jailed for 18 years in 'deeply horrifying' landmark case

The images that Nelson made have been linked back to real children around the world. In some cases, he then went on to encourage his clients to rape and sexually assault the youngsters.

28 October 2024 IWF In The News