Ian Critchley, NPCC lead for Child Protection, said: “The work of the IWF is crucial in the identification, removal and reporting of child sexual abuse material. This latest report shows how offenders are gaining access to even younger children, which is simply unimaginable for us all.
“But this isn’t just the responsibility of parents and carers - the biggest change we must see is from the tech companies and online platforms. Companies are still failing to protect children and continue far too often to put profit before child safety.
“I welcome the Online Safety Act, but it should not have required this developing legislation to change the negligent approach to child safety taken by too many companies.”
According to research published last week by Ofcom, about two in five parents of five- to seven-year-olds (42%) say they use social media sites and apps together with their child, while a third (32%) report that their child uses social media independently. The IWF says this only highlights the need for companies to be required to take action now.
Ms Hargreaves added: “We need a full society approach to make sure children are not groomed like this in the first place, but we also need to see measures in place to make sure this imagery cannot spread on the open web. We stand ready to help Ofcom and the technology sector find solutions.”
In 2023, nearly all of the webpages the IWF discovered (92% - or 254,070 URLs) contained self-generated images or videos where the victim had been coerced, blackmailed, or groomed into performing sexual acts over a webcam for an internet predator in a remote location.
Today’s first-of-its-kind analysis gives a startling insight into three- to six-year-old children who were abused in this way. IWF analysts discovered 2,401 individual self-generated images and videos of children in this age category that were hashed this year. Of these, 91% were of girls.
Analysts witnessed abuse happening in domestic locations including bathrooms and bedrooms, kitchens and dining rooms. They saw soft toys, games, books and bedding featuring cartoon characters appearing in the background of imagery depicting some of the most extreme kinds of sexual abuse.
The most extreme (Category A) forms of sexual abuse featuring three- to six-year-olds who had been groomed or coerced in this way appeared in 15% of this imagery (356 images and videos).
The provision of IWF datasets and technical tools to companies in scope of regulation will be vital to the successful implementation of the Online Safety Act.
Every instance of this imagery found by IWF analysts was hashed – a process where the image is assigned a digital fingerprint – and added to the IWF’s Hash List which is distributed to technology companies to prevent the upload, sharing and storage of this imagery on their services.
More than 200 companies from across the world currently partner with IWF to disrupt and stop the spread of child sexual abuse imagery online.
The IWF is discovering more child sexual abuse imagery online than ever before in its history. Overall in 2023, the IWF found 275,652 webpages containing child sexual abuse – a record-breaking amount. Each webpage can contain thousands of images or videos.
As well as highlighting the targeting of younger victims, today’s annual report also reveals child sexual abuse online is getting more and more extreme.
Today’s analysis shows there was a 22% increase in webpages containing Category A child sexual abuse material found in 2023, rising from 51,369 URLs in 2022, to 62,652 in 2023. This makes 2023 the most extreme year on record.
Category A material has risen each year for the past three years: between 2021 and 2023, the IWF saw a 38% increase in this imagery.