UK teen’s sex abuse imagery identified thanks to IWF analysts’ pioneering work with policing database

Published:  Wed 23 Apr 2025

As teens face a ‘crisis’ of online sexual abuse and exploitation, the IWF looks at why getting imagery of older teens removed can be difficult for some moderators.

Teenagers are facing a crisis of online sexual exploitation and risk, and sexual or nude imagery of them is spreading at an alarming rate on the internet, causing immeasurable harm. But getting this imagery taken down can be more difficult than imagery of younger children. Why?

The Internet Watch Foundation (IWF) is today (April 23) warning that the number of webpages showing sexual abuse imagery online, especially of teenagers under 18, has reached record levels – a trend highlighted in the IWF’s newly published Annual Data and Insights Report.

The escalating numbers are being driven in part by new threats, like AI-generated child sexual abuse, sexually coerced extortion, or “sextortion”, and malicious sharing of intimate imagery.

The IWF’s trained analysts work every day to find and remove child sexual abuse imagery from the internet. They are among only a select few in the world with the legal powers to proactively seek this content out.

They assess content according to UK law to determine whether an image or video depicts a child (anyone under 18) suffering sexual abuse (and, if so, what category of abuse it constitutes).

The analysts must make visual assessments about the children’s ages. Sometimes, with only the clues available in the image or video in front of them, this can be challenging, especially with older children.

While an analyst can say with confidence that imagery involving toddlers or younger children is criminal, it is sometimes harder to tell a 16- or 17-year-old from an 18-year-old unless there is additional information available.

Henry, Senior Internet Content Analyst at IWF (Illustrative Stock Image)

Henry (whose name we have changed to protect his identity) is a senior content analyst at the IWF.

“It can be really frustrating with some of the older teens,” he said.

“Unless you have that extra bit of information that can help us verify their age, some of them slip through, because it can be really difficult to distinguish them from adults.

“You know, it’s obvious if you’re looking at a younger child in images, but how can you tell if someone is 17 or 18? A content moderator on a site may struggle to make that call.

“If the police, or victims themselves self-refer, it gives us what we need to be able to identify that person, and then we are empowered to help prevent imagery being uploaded or spread again online.”

Whilst in the UK it is legal to have sex from the age of 16, imagery depicting sexual activity involving 16- and 17-year-olds is generally considered criminal. Anyone under the age of 18 is legally classified as a child, and is therefore protected by the law. Making an accurate assessment of the age of the person in the image is therefore vital.

Henry said the IWF works closely with the police, who can help identify victims, giving analysts the ability to determine with confidence whether a victim was under 18 at the time the abuse took place.

The ability to take reports directly from children and young people themselves via the dedicated Report Remove service also helps analysts get a clearer picture of whether imagery includes under 18s.

Henry said one recent victim, who has been seen multiple times in different imagery, would have slipped through the net and possibly been misidentified as an adult had the IWF not diligently checked her identity against the Child Abuse Image Database (CAID).

He said: “One particular girl, her imagery appears so often. To look at her, you would probably say she was in her 20s. The imagery of her was appearing on all sorts of sites, including adult entertainment sites.

“But in this case, the victim is actually a child from the UK. We were able to identify her ourselves by using CAID.

“We had seen her multiple times on revenge porn sites and extortion sites over the last few months but we were never sure of her age until recently.

“We immediately began the process of hashing and reporting her images with confidence, knowing she was a police-confirmed UK victim.”

The IWF has successfully age-verified many other victims this way.

In a separate case, Henry said the IWF was able to positively identify a young person after she self-referred imagery to the IWF and confirmed she had been 16 and 17 when the imagery was made. This was corroborated by the local police force she reported the incident to.

This allowed Henry and the rest of his team to take action to remove this imagery. He warned that smaller teams, or moderators employed by platforms, may not have the resources to make the same distinction, leading to imagery like this not being identified as imagery of children.

Henry said it can take more resources to accurately age older teens than young children, but it is important to do as much as possible to protect young people whose imagery is being misused and spread illegally online.

Henry said IWF analysts work diligently to age-verify underage victims wherever possible.

He said: “In another UK incident, we received an anonymous report, but the analyst was unable to verify the age of the female in the images.

“Despite this, they worked hard to narrow down the possible location of the individual in the images based on evidence from the images.

“The IWF sent a referral directly to the local police force, and they were able to confirm that the individual in the images had reported the incident to them and confirmed she was 15 or 16 at the time the images were taken.”

Henry said once the IWF has successfully identified and hashed an image of a teen (assigned it a digital fingerprint which can be recognised wherever in the world it appears), it can take steps to make sure that imagery cannot be uploaded again in the future. This means those young people do not need to grow up in fear that nude or sexual imagery of them is being spread around by strangers.
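In its simplest form, the hash-and-block approach described above can be illustrated as below. This is only a hedged sketch: it uses an exact cryptographic hash (SHA-256), and the `known_hashes` blocklist and `allow_upload` function are hypothetical names for illustration, not the IWF's actual systems.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return a hex digest acting as the file's digital fingerprint.
    An identical file always produces an identical digest."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of hashes of verified abuse imagery
# (in practice, bodies like the IWF distribute such hash lists to platforms).
known_hashes = {file_hash(b"previously-identified image bytes")}

def allow_upload(data: bytes) -> bool:
    """Reject any upload whose fingerprint matches a known hash."""
    return file_hash(data) not in known_hashes

print(allow_upload(b"previously-identified image bytes"))  # blocked: False
print(allow_upload(b"some unrelated file"))                # allowed: True
```

Real deployments typically also use perceptual hashes, so that resized or re-encoded copies of an image still match; an exact cryptographic hash like the one above only catches byte-for-byte duplicates.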

The harm this can cause, and the torment of worrying about who has seen this material, can be extremely damaging.

Henry said the IWF’s new Image Intercept tool, which will make millions of the IWF’s hashes available for free to thousands of smaller and medium-sized platforms, will help extend protection to more parts of the internet, meaning there are fewer places criminals are able to spread imagery of children and young people. 

The IWF’s work with Report Remove, the police and others, together with the diligence of analysts like Henry, ensures IWF hashes draw on as much additional information and intelligence as possible, so we can do all we can to protect children and young people, even when it is more difficult to do so.

Find out more about the IWF’s Image Intercept service here.
