Thousands of images and videos of child sexual abuse could be going undetected because internet analysts’ time is being taken up dealing with “false reports”, experts warn.
The Internet Watch Foundation, the UK charity which finds and removes online child sexual abuse material, acts on anonymous reports from the public to find and eradicate criminal content.
But they say thousands of inappropriate or false reports of non-criminal material are wasting their analysts’ time and potentially stopping them from finding and eliminating real abuse imagery on the internet.
According to the IWF, one individual alone has made 8,300 false reports since June 2019, despite having been repeatedly informed that what they are reporting is “off remit” for the charity.
A new reporting page launched today is intended to make it clearer to the public what is, and is not, appropriate to report to the IWF.
Peter, a senior analyst at the IWF, said: “We don’t expect people to be able to make their own assessments of criminal content on the internet – that’s what we’re here for.
“But reporting anything and everything to us, when we’re here to deal with one really serious online criminality, takes up time and resources and diverts our efforts away from the victims.
“Last year it took the equivalent of more than four years’ worth of analyst time to deal with false reports. Imagine what we could have achieved for victims of sexual abuse.
“There could have been thousands of criminal sites that we could be getting offline – thousands of illegal images of children being sexually abused we could be removing from the internet. We are instead dealing with reports of something that we know we can’t do anything about.”
Some of the reports wrongly sent to the IWF include non-criminal adult material from pornographic websites, or non-criminal images of children, such as “child modelling” or even holiday photos. Analysts are also sent other forms of content, including videos of beheadings or animal cruelty – material which is distressing to be exposed to.
Last year, the public made 106,830 reports where the person said they were reporting child sexual abuse material. Of these, IWF analysts processed 77,160 reports which turned out to be false.
The IWF estimates that dealing with these inaccurate reports costs it £150,500 a year and the equivalent of 4.3 years of analyst time.
Peter said: “We treat every report as though that person has legitimately stumbled upon child sexual abuse material until we’ve been able to verify otherwise.”
Susie Hargreaves OBE, CEO of the IWF, said people need to report online child sexual abuse to the IWF, but they also need to be aware of what actually constitutes criminal material.
Ms Hargreaves said: “If people stumble across these images online, they need to know we are a safe place they can turn to. You can report anonymously to us and we will get material analysed and removed.
“What we can’t do is remove material that is not actually against the law. Our analysts still have to look carefully at material to make sure there is nothing criminal hidden in there and, if people are reporting inappropriate things to us, it takes up a lot of their time.
“This is time they could be spending finding and eradicating child sexual abuse material from the internet.
“People must report child sexual abuse, but please check first to make sure what you’re reporting is something we can help with.”
Peter said it is important the public continues to blow the whistle on harmful online material. But he also said some people who act as “vigilantes” – deliberately searching out online child abuse material – need to be reminded of the law.
He said: “If they are caught actively searching for this, they will have no more of a defence than someone that’s doing it because they want to find it for their own gratification. It is not a defence in court going looking for this.”
Ms Hargreaves added: “We know that some people might out of desperation just want to report it somewhere, but we need to look after our analysts.
“We can prepare them for seeing images of children being sexually abused, but it’s harder to prepare for the unknown and unexpected, such as beheadings, or animal cruelty. It can have a real impact on our analysts.”
A gold-standard welfare package is in place to look after the analysts’ mental health as they carry out their challenging role.
The IWF works specifically to find and remove child sexual abuse material online. The IWF website provides a list of different organisations, websites and resources to help the public find the right person to speak to for material which falls outside this remit.
Images and videos of online child sexual abuse can be reported anonymously on the IWF’s new reporting page.
The public is given this advice when making a report:
- Do report images and videos of child sexual abuse to the IWF to be removed. Reports to the IWF are anonymous.
- Do provide the exact URL where child sexual abuse images are located.
- Don’t report other harmful content – you can find details of other agencies to report to on the IWF’s website.
- Do report to the police if you are concerned about a child’s welfare.
- Do report only once for each web address – or URL. Repeat reporting of the same URL isn’t needed and wastes analysts’ time.
- Do report non-photographic visual depictions of the sexual abuse of children, such as computer-generated images. The images the IWF can take action on must be pornographic, be grossly offensive, and focus on a child’s genitals or depict sexual activity involving or in the presence of a child. The IWF can get material of this nature removed if it is also hosted in the UK.