The horror and the heartbreak - how one child sexual abuse survivor’s torment will never end thanks to AI

Published: Mon 22 Jul 2024

“Anyone trained a LoRA for ‘Olivia’ yet? Would be really cool to see”.

This seemingly innocuous question is infused with a cheerful curiosity that makes its true meaning at once horrific and heartbreaking.

The query was posted, without any apparent sense of shame or guilt, by a visitor to a dark web forum where images and videos of children being sexually abused are routinely shared and discussed.  

The ‘LoRA’ (low-rank adaptation) is a type of AI model that has been fine-tuned to generate bespoke computer-generated images of child sexual abuse.

And ‘Olivia’? Olivia is a pseudonym provided by the Internet Watch Foundation (IWF) for a known survivor of child rape and torture.  

The chilling excerpt is from a new IWF report update that delves into what analysts at the child protection charity currently see regarding synthetic or AI-generated imagery of child sexual abuse.  


AI-generated child sexual abuse material (AI CSAM) is a growing and highly concerning trend because of the speed at which hundreds of images can be spewed out at the click of a button and, in some cases, the near-flawless, photo-realistic quality of the pictures being generated.

But offenders haven’t stopped at generating images of fake children; they are also using AI tools to make new images of existing victims, such as Olivia.

Olivia is well known to analysts at the IWF – whose mission it is to find and remove imagery of child sexual abuse online. She suffered terrible sexual abuse at the hands of someone she trusted from the ages of about three to eight years old, and images of her abuse are widely circulated on the internet.

Olivia was rescued by police in 2013, five years after her abuse first began. In 2018, the IWF highlighted Olivia’s story in its annual report. Analysts were seeing Olivia every day through their work; through cruel images and videos of repeated rape and sexual torture, they had watched her grow up from preschooler to eight-year-old.

Although now free of her abuser, Olivia, like many other survivors, is repeatedly victimised every time imagery of her abuse continues to be shared, sold and viewed online.  

This torment has now reached a new level because of the advent of generative text-to-image AI, which offenders are exploiting.

Offenders online are known to collect and share images of favourite child sexual abuse victims, Olivia being one of them. Some offenders profit from selling the imagery to those who want to acquire a particular set of images to complete their ‘collections’. 

These collections of images of named victims are now being harvested to train the LoRA models mentioned above: fine-tuned models that plug into foundation AI models, allowing the user to generate images of the victim in any setting, scenario or sexual activity they can imagine.

According to the IWF report, analysts have now found an AI model for generating novel images of Olivia. It is available to download for free. So the answer to the sickening question at the beginning of this article is, sadly, yes.

Fine-tuned models like the ‘Olivia’ one have been trained on the very imagery that IWF analysts were seeing daily but, despite best efforts, were unable to eradicate. This means the suffering of survivors is potentially without end, since perpetrators can generate as many new images of the children as they want.

One dark web forum user shared an anonymous webpage containing links to fine-tuned models for 128 different named victims of child sexual abuse.  

Nor are these models limited to images of known victims. The IWF has also found models for generating AI CSAM of celebrity children. While deepfake images featuring well-known individuals have been seen many times before, the IWF is now identifying entirely AI-generated images produced using fine-tuned models.

Unfortunately, UK legislation is falling behind advances in AI technology. While AI CSAM is illegal, and the IWF can take steps to have it removed, the same is not true of AI models fine-tuned on images of child sexual abuse. The tool for creating AI images of Olivia remains legal in the UK.

Olivia is now entering her 20s. The IWF knows, from talking to adults who have suffered repeated victimisation, that it is mental torture to know their imagery continues to circulate online. It can blight lives and affect survivors’ ability to leave the abuse in the past. For many, the knowledge that they could be identified, or even recognised, from images of their abuse is terrifying.

The IWF wants to change UK law so that it is fit for purpose to tackle the threat of AI CSAM. The act of building a LoRA model of a victim needs to be made illegal.

We must give Olivia, and others like her, a chance to escape their past. Perhaps knowing that legislative measures are in place could go some small way to granting them a sense of respite. 

Notes to editors:  

The IWF is the largest hotline in Europe dedicated to finding and removing child sexual abuse material from the internet.  

Contact: Cat McShane, IWF Press Officer [email protected] +44 (0) 7572 783227  

Parents and carers are encouraged to T.A.L.K to their children about the dangers. 

  • Talk to your child about online sexual abuse. Start the conversation – and listen to their concerns. 
  • Agree ground rules about the way you use technology as a family. 
  • Learn about the platforms and apps your child loves. Take an interest in their online life. 
  • Know how to use tools, apps and settings that can help to keep your child safe online. 

What we do:  

We make the internet a safer place. We help victims of child sexual abuse worldwide by identifying and removing online images and videos of their abuse. We search for child sexual abuse images and videos and offer a place for the public to report them anonymously. We then have them removed. We’re a not-for-profit organisation and are supported by the global internet industry.

For more information please visit www.iwf.org.uk  

The IWF is part of the UK Safer Internet Centre, working with Childnet International and the South West Grid for Learning to promote the safe and responsible use of technology. 

The IWF works globally to stop child sexual abuse imagery on the internet. If you ever stumble across a sexual image or video of someone you think is under 18, please report to the IWF. Reporting can be done anonymously and confidentially – we don’t need your details, just your help. 
