Taskforce will stop millions of the most severe child sexual abuse images and videos being shared online

Published: Wed 2 Jun 2021

A “vital” new taskforce will assess and grade millions of the most severe images and videos of child rape and sexual torture as analysts see record numbers of reports of illegal content.

The new team has been set up by the Internet Watch Foundation (IWF), the UK-based charity working internationally to identify and remove images and videos of child sexual abuse from the internet.

The analysts in this team will view, hash (create a digital fingerprint), and classify two million Category A and B images from the UK Government’s Child Abuse Image Database (CAID).

They will then distribute the hashes globally to tech companies, allowing them to be blocked or removed should anyone attempt to share them anywhere in the world.

Category A images involve penetrative sexual activity, sexual activity with an animal, or sadism, while Category B images involve non-penetrative sexual activity.

The IWF is the only non-law enforcement body with access to CAID. The work will boost the UK’s contribution to global efforts to stop the distribution of child sexual abuse images on the internet and help to keep the internet a safer place for all.

Hashing an image or video is a process which produces a unique code like a “digital fingerprint” so that it can be recognised and dealt with quickly by the IWF or its partners in the future.
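The announcement does not say which hashing algorithms the taskforce applies, and systems in this space often combine cryptographic hashes with perceptual hashes (such as Microsoft's PhotoDNA) that can also match visually similar copies. Purely as an illustration of the "digital fingerprint" idea, the sketch below uses Python's standard hashlib module to fingerprint a file with SHA-256 and check it against a hypothetical block list; the hash value, file handling and function names are invented for the example and are not the IWF's actual tooling.

```python
import hashlib
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Compute a SHA-256 'digital fingerprint' of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical hash list distributed to a platform (illustrative value only).
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(path: Path) -> bool:
    """Flag an uploaded file if its fingerprint matches a hash on the list."""
    return sha256_of_file(path) in known_hashes
```

A cryptographic hash like SHA-256 only matches an exact, bit-for-bit copy of a known file, which is one reason perceptual hashing is often used alongside it to catch re-encoded or resized versions. Either way the principle is the same: only the fingerprint needs to be shared with platforms, never the image itself.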

The work will enable tech companies to take swift action to prevent the spread of this abusive material, giving peace of mind to victims who often live with the knowledge that footage of their abuse may still be circulating among criminals around the world.

The IWF taskforce has been recruited thanks to a grant from international child protection organisation Thorn.

Susie Hargreaves OBE, Chief Executive of the IWF, said: “We’ve created this world-leading taskforce of highly trained analysts to help boost the global efforts to stop the distribution of child sexual abuse imagery online.

“Not only will this absolutely vital work help to create a safer internet for us all, but it will help those victims whose sexual abuse imagery is shared time and time again, preventing their continued revictimisation and exploitation.

“Thanks to the funding provided by Thorn, and the access to the UK Government’s Child Abuse Image Database, this will be a major step forward for internet safety.”

Safeguarding Minister Victoria Atkins said: “This government is determined to ensure that we are doing everything in our power to prevent child sexual abuse online and the innovative use of technology is central to this.

“I am pleased that Child Abuse Image Database (CAID) data is helping the IWF to carry out this valuable work towards reducing access to child sexual abuse material online and thereby preventing the re-victimisation of children.

“Our Tackling Child Sexual Abuse Strategy highlights that our investment in CAID will allow greater sharing of data to help safeguard more victims and bring more offenders to justice.”

Julie Cordua, CEO of Thorn, said: “IWF’s work to eliminate child sexual abuse images from the internet and end the cycle of re-victimisation is critical and tremendously difficult.

“We are grateful for their continued commitment to this work and are humbled to support their efforts.”

In 2020, the IWF dealt with a record number of reports of online child sexual abuse. Analysts processed 299,600 reports, including tip-offs from members of the public, up 15% from 260,400 reports in 2019.

Of these reports, 153,350 were confirmed as containing images and/or videos of children being sexually abused, a 16% increase on the 132,700 confirmed in 2019. Each report can contain anything from a single image to thousands of child sexual abuse images and videos, so this equates to millions of images and videos in total.

Images and videos of online child sexual abuse can be reported anonymously at https://report.iwf.org.uk/en

 The public is given this advice when making a report:

  • Do report images and videos of child sexual abuse to the IWF to be removed. Reports to the IWF are anonymous.
  • Do provide the exact URL where child sexual abuse images are located.
  • Don’t report other harmful content – you can find details of other agencies to report to on the IWF’s website.
  • Do report to the police if you are concerned about a child’s welfare.
  • Do report only once for each web address – or URL. Repeat reporting of the same URL isn’t needed and wastes analysts’ time.
  • Do report non-photographic visual depictions of the sexual abuse of children, such as computer-generated images. If this material is hosted in the UK, the IWF can get it removed.