Telegram plans child abuse crackdown following Pavel Durov’s arrest in Paris
Messaging app Telegram will deploy new tools to prevent the spread of images of child sexual abuse after teaming up with the Internet Watch Foundation.
Published: Wed 17 Jan 2024
More than 90% of child sexual abuse webpages removed from the internet now contain self-generated images, according to the charity responsible for finding and removing such material.
The Internet Watch Foundation said that it discovered self-generated child sexual abuse material (CSAM) featuring children under 10 on more than 100,000 webpages in the last year. That figure is an increase of 66% on the year before.
In total, a record 275,655 webpages were confirmed to contain CSAM, the IWF said, an increase of 8%. The new data prompted a renewed attack on end-to-end encryption from the UK government, backed by the IWF.
Read the full article at The Guardian.
After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).
In October, Hugh Nelson, from Bolton, was jailed for 18 years for using AI tools to create child sexual abuse images from photographs of real children. The images that Nelson made have been linked back to real children around the world. In some cases, he then went on to encourage his clients to rape and sexually assault the youngsters.