With the exception of our tools and datasets, which work in a preventative way, most of our core hotline work happens after a sexual abuse event has ended. In some instances, we are taking action to have images and videos of child sexual abuse removed many years after the abuse originally took place.
In 2023, however, we were asked to assist on a case where child sexual abuse imagery featuring two UK-based children was ‘going viral’ online. It was a race against time to remove all the instances from various platforms and websites, and we relied on the prompt cooperation of our industry contacts and child protection partners to help us.
We received a public report telling us that videos and images of a boy and girl were on a popular social media platform. Our partners at SWGfL had received a call through their Professionals Online Safety Helpline confirming the people featured were both under 18 – something which wasn’t immediately clear when we assessed the content. Additionally, the imagery was going viral, and being used to create memes. The children’s school and the police were also involved.
Thanks to our ability to proactively search for child sexual abuse imagery, we immediately mobilised all our analysts to hunt for the material and seek its removal in an attempt to stop it spreading online.
Within a couple of hours, our analysts had found 72 instances online of videos or images containing sexual activity between these two children. As well as proactively searching, we received a number of reports from the public.
To date, we have assessed 121 reports and sought the removal of 116 URLs related to this case, as these contained sexual activity which breached UK law. Overall, we have seen 9 videos of different scenarios involving these two children, plus still images derived from them, shared repeatedly.
| Report origin | Number of reports processed |
|---|---|
| Public | 5 |
| Proactive | 116 |
| Total | 121 |
The table below shows how quickly we took action to get this content removed.
This demonstrates how critical reporting these instances is; the quicker we can find this content and take action, the greater the chance of preventing its widespread circulation online.
Timeframe of actioned reports:
| Days from 1st reported date | Number of reports |
|---|---|
| 1 day | 71 |
| 2 days | 24 |
| 3 days | 14 |
| Within 1 month | 2 |
| Within 4 months | 4 |
| Within 5 months | 1 |
| Total | 116 |
It is well known that when a video goes ‘viral’, social media is often the method used to distribute the material, and this case was no different.
We can immediately take action to remove the URL and give each video or image found a unique hash – a digital fingerprint – which means it can be identified, blocked and removed across the internet.
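For context, the sketch below illustrates the basic idea of a ‘digital fingerprint’: identical copies of a file always produce the same hash value, so a known image or video can be recognised wherever the same bytes reappear. This is a simplified, hypothetical Python example using standard cryptographic hashes; it is not the IWF’s own tooling, and in practice hash-matching services also rely on perceptual hashes that can recognise copies which have been resized or re-encoded.

```python
import hashlib
from pathlib import Path


def fingerprint(path: str) -> dict:
    """Compute simple cryptographic hashes ('digital fingerprints') of a file.

    Any exact copy of the file produces identical hash values, which is what
    allows a known image or video to be identified and blocked wherever it
    reappears. (Illustrative only; the function name and approach are
    hypothetical, not the IWF's actual system.)
    """
    data = Path(path).read_bytes()
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha256": hashlib.sha256(data).hexdigest(),
    }


# Example usage with a hypothetical local file:
# print(fingerprint("known_content.mp4"))
```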
Site types where this child sexual abuse imagery was seen:
| Category | Number of reports |
|---|---|
| Social Network | 85 |
| Website | 13 |
| Blog | 11 |
| Chat | 3 |
| Video Channel | 2 |
| Search | 1 |
| Redirector | 1 |
| Total | 116 |
We found most of the ‘actionable’ content on one well-known social media site, which removed it, in most cases within one day.
Though some non-criminal but related content remains, it is now much more difficult to find the criminal content in publicly accessible areas of the internet.
Sadly, the spread of this content on the internet can be likened to a wildfire: it moves fast and is hard to control. By the time the IWF was alerted to this content, the situation was already well developed and much of the online commentary had already happened. It is also difficult to estimate how much of this content still exists in inaccessible areas of the internet, such as on encrypted platforms.
The repeated sharing of videos and images of these two children is obviously very distressing, and the associated commentary from people online, which may not itself contain child sexual abuse but still inflames the situation, only serves to spread the ‘story’ and expose and abuse the children further. Content from this case, including tags, commentary and associated videos and images, produced viewing figures totalling in excess of a million. Many of these contained images of the children; not all of this content breached UK law, but it was still harmful to the children involved.
Our partners at the social media sites acted quickly to ensure the content we found was removed from their platforms. One social media company told us they performed a further ‘sweep’ of their platforms for related content. Cooperation and good communication with social media companies was critical to fast and effective content removal.
This is a great example of how the IWF can be reactive to a ‘live’ situation in which speed of action is imperative. The sooner we find and take action to remove instances of child sexual abuse imagery, the sooner we can protect the abused children involved. Seeing this content disappear from the internet is what fuels our analysts to continue the often very difficult work that they do.