Analyst's ‘hunch’ leads to British schoolgirl’s rescue from online child sexual abuse

Published: Mon 9 Nov 2020

A hunch led to the “remarkable” rescue of a British schoolgirl from online sexual abuse as experts warn of entire online communities “devoted” to contacting and abusing children.

The Internet Watch Foundation (IWF) is warning parents to be vigilant to the possibility predators may be abusing children after contacting them through camera-enabled devices in their own bedrooms.

Analysts say there are online communities “devoted” to trying to find victims themselves “because they want to be the ones to have them perform these sexual acts live”.

This comes as the All-Party Parliamentary Group on Social Media announces an inquiry into the rise of “self-generated” indecent images of children online.

The UK Safer Internet Centre, a unique partnership of three world-leading charities (SWGfL, Childnet, and the Internet Watch Foundation), will lead the inquiry as secretariat of the APPG on Social Media.

In October, an IWF analyst successfully identified and safeguarded a British schoolgirl who had been targeted and groomed into performing sexually over the internet by strangers.

His training, experience, and expertise told him that, despite the very limited evidence he could gather from a reported video of child sexual abuse, there was more he could do to identify the victim and protect her from future harm.

The IWF is the UK charity responsible for finding and removing online child sexual abuse material.

On October 1, “Daniel”*, an internet content analyst for the IWF, processed an anonymous report containing a URL link to suspected online child sexual abuse.

When he investigated, he found the link led to a video of an 11- to 13-year-old girl in the UK who was engaged in Category B sexual abuse, the second most severe category there is.

The video was “self-generated”, meaning the victim could have been coerced into making the footage herself by an abuser who appeared to be telling her what to do over the internet.

Daniel said: “There was no one directly there who was making her do it, but certainly, from the video itself, it seemed she was being asked to do certain things.

“She wasn’t talking much, but you could see her looking towards the camera. I felt she was reading. I would say there was definitely a back and forth between her and someone telling her what to do.”

Daniel acted fast. He tracked the video down on an online forum where he found predators discussing the footage.

From this, he was able to gather enough information to help him identify the victim, which he could pass to the police.

The criminal material was quickly removed from the internet, and police and social services are now involved to make sure the girl does not fall victim to any future harm.

In October, the IWF revealed that in the first six months of 2020, 44% of all the child sexual abuse content it dealt with involved self-generated material.

This is up 15 percentage points on 2019 when, of the 132,676 webpages actioned, almost a third (38,424 or 29%) contained self-generated imagery.

Self-generated content can include child sexual abuse content, created using webcams, sometimes in the child’s own room, and then shared online.

In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves.

Daniel said he does not believe the girl’s parents had any idea of the abuse their daughter was being subjected to online.

He warned about online communities who are seeking to contact children over the internet so they can coerce and groom them into this kind of abuse.

He said: “There are communities that are devoted to not just finding child sexual abuse content, but actually trying to find the victims themselves because they want to be the ones to have them perform these sexual acts live. It is not uncommon.”

Daniel’s role is normally simply to analyse the content to have it safely removed from the internet.

He said he had a “hunch” that the footage had been shot in the UK and that quick action could lead to the girl being safeguarded from future harm.

Analysts would normally check for clues in footage, including plug sockets, titles of books on shelves, or posters on the walls, which may offer indications of when and where the abuse took place.

None of these clues were apparent in this video. But the fact the footage, which had been shot on a mobile device, was of very good quality led Daniel to believe it had been shot recently. It may, in fact, have been recorded only weeks earlier.

Susie Hargreaves, Chief Executive of the IWF, paid tribute to Daniel’s tenacity, saying analysts go the extra mile to make sure children are kept safe.

She said: “This was remarkable work which led to the rescue of a vulnerable child who was being preyed upon by people exploiting her online.

“That she is now safe and that the abuse has been exposed is as a direct result of Daniel’s tenacity. Every day, our hotline analysts watch and assess some of the worst abuse on the internet.

“They come into work, as they have throughout the entirety of the lockdown period, with the sole aim of helping children and doing their bit to make sure their abuse is rooted out from the internet.

“Children are re-victimised every single time their abuse is shared. It means they cannot move on even after the physical abuse is over.

“Tragically, what we are seeing more and more of is children being groomed or coerced into this abuse themselves.

“It is shocking that predators are actively looking to exploit children in this way, and I would urge parents to have frank discussions with their children about the potential dangers here.”

Ms Hargreaves said new online harms legislation should be brought in quickly to help ensure the welfare and protection of children.

She added: “The internet is an incredible thing, full of amazing potential, interesting people, and wonderful tools for learning and communicating. But we need to be aware of its dark side too.”

Daniel said engagement is at the heart of protecting children, and that parents need to explain to their children that there are people online who may try to exploit them, and who may not be who they say they are.

Daniel, who has been an internet content analyst at the IWF for more than four years, said this is the first time he has been able to ID a victim. He said knowing a victim is now safe thanks to his work is “a fantastic feeling”.

The APPG on Social Media is now launching an inquiry - “Selfie Generation”: What’s behind the rise of self-generated indecent images of children online?

The APPG, which was set up to make recommendations to assist Government and Parliament in regulating the digital space, will investigate the rise of imagery produced when offenders groom and coerce children into sexual activities via a webcam or livestream, or capture these images from live streams without any interaction with the child.

You can find out more about IWF analysts’ work by listening to the charity’s podcast - Pixels from a Crime Scene - which is available to download at www.iwf.org.uk/pixels-from-a-crime-scene.

Images and videos of online child sexual abuse can be reported anonymously at https://report.iwf.org.uk/en

The public is given this advice when making a report:

  • Do report images and videos of child sexual abuse to the IWF to be removed. Reports to the IWF are anonymous.
  • Do provide the exact URL where child sexual abuse images are located.
  • Don’t report other harmful content – you can find details of other agencies to report to on the IWF’s website.
  • Do report to the police if you are concerned about a child’s welfare.
  • Do report only once for each web address – or URL. Repeat reporting of the same URL isn’t needed and wastes analysts’ time.
  • Do report non-photographic visual depictions of the sexual abuse of children, such as computer-generated images. The IWF can get material of this nature removed if it is hosted in the UK.


*The analyst’s name has been changed to protect their identity.
