Telegram plans child abuse crackdown following Pavel Durov’s arrest in Paris
Messaging app Telegram will deploy new tools to prevent the spread of images of child sexual abuse after teaming up with the Internet Watch Foundation.
Published: Wed 17 Jan 2024
More than 90% of websites found to contain child sexual abuse featured "self-generated" images extorted from victims as young as three, according to an internet watchdog.
The Internet Watch Foundation (IWF) warned of a "shocking" rise in the number of under-10s being coerced, blackmailed, tricked or groomed into performing sexually online.
Data released by the anti-abuse charity shows a record 275,655 websites were found to contain child sexual abuse in 2023 - an 8% rise from the previous year.
Of those, 254,070, or 92%, contained "self-generated" images or videos, with children under the age of 10 featuring on 107,615 of the sites, and youngsters aged between three and six found on 2,500 of them.
After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).