
AI tools have put child sexual abuse ‘on steroids’, Home Secretary warns
The Home Office said fake images are being used to blackmail children and force them to livestream further abuse.
Published: Tue 11 Jan 2022
As our work and social lives went digital during the Covid-19 pandemic, a much darker trend was also unfolding: a sharp increase in the number of people accessing online child sexual abuse images. The Internet Watch Foundation (IWF), a charity dedicated to finding and removing these images from the web, detected 8.8 million attempts to access illegal material during the first month of the 2020 lockdown in the UK alone. The true scale of the problem is likely to be far larger.
For the IWF, the pandemic has exacerbated an already worrying rise in the volume of child abuse material being shared and viewed online. It says the number of images and videos it detects each year has grown 1,420% since 2011, and in November it said its analysts had detected 200,000 illegal images in 2021, the first time it had reached that grim milestone in a single calendar year.
But while technology is part of the problem, it can also be part of the solution. Tech Monitor spoke to the IWF’s chief technology officer, Dan Sexton, about how his team is developing bespoke software to support the charity’s work.
Read more at Tech Monitor.