These models are not limited to using images of known victims. The IWF has also found models for generating AI CSAM of celebrity children. While deepfake images featuring well-known individuals have been seen before, the IWF is now identifying entirely AI-generated images produced using fine-tuned models.
Unfortunately, UK legislation is falling behind advances in AI technology. While AI CSAM itself is illegal, and the IWF can take steps to have it removed, the same is not true of the AI models fine-tuned on images of child sexual abuse. The tool used to create AI images of Olivia remains legal in the UK.
Olivia is now entering her 20s. The IWF knows, from talking to adults who have suffered repeated victimisation, that it is mental torture to know their imagery continues to circulate online. It can blight lives and affect survivors' ability to leave the abuse in the past. For many survivors, the knowledge that they could be identified, or even recognised, from images of their abuse is terrifying.
The IWF wants to change UK law so that it is fit for purpose to tackle the threat of AI CSAM. The act of fine-tuning a model, such as a LoRA, on images of victims needs to be made illegal.
We must give Olivia, and others like her, a chance to escape their pasts. Perhaps knowing that legislative measures are in place could go some small way towards granting them a sense of respite.