Telegram plans child abuse crackdown following Pavel Durov’s arrest in Paris
Messaging app Telegram will deploy new tools to prevent the spread of images of child sexual abuse after teaming up with the Internet Watch Foundation.
From MIT Technology Review, published Tue 26 Apr 2022, written by Rhiannon Williams:
Experts predict that without new legislation, the problem will only grow.
The US hosts more child sexual abuse content online than any other country in the world, new research has found. The US accounted for 30% of the global total of child sexual abuse material (CSAM) URLs at the end of March 2022, according to the Internet Watch Foundation, a UK-based organization that works to spot and take down abusive content.
The US hosted 21% of global CSAM URLs at the end of 2021, according to data from the foundation’s annual report. But that percentage shot up by nine percentage points during the first three months of 2022, the foundation told MIT Technology Review. The IWF found 252,194 URLs containing or advertising CSAM in 2021, a 64% increase from 2020; 89% of them were traced to image hosts, file-storing cyberlockers, and image stores. The figures are drawn from confirmed CSAM content detected and traced back to the physical server by the IWF to determine its geographical location.
That sudden spike in material can be attributed at least partly to the fact that a number of prolific CSAM sites have switched servers from the Netherlands to the US, taking a sizable amount of traffic with them, says Chris Hughes, director of the IWF’s hotline. The Netherlands had hosted more CSAM than any other country since 2016 but has now been overtaken by the US.
Read more at technologyreview.com
After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).
The images Nelson made have been linked back to real children around the world. In some cases, he went on to encourage his clients to rape and sexually assault the children.