How we build awareness, protect and influence

An IWF Campaign advert showing our Think Before You Share artwork

We’re often the first to see new forms of online child sexual abuse, giving us a critical role in protecting vulnerable groups. Angela Muñoz Aroca, IWF Senior Campaigns and Communications Officer, explains: “That’s why it’s so important that we work with partners to create resources and campaigns that support children, parents and teachers, while raising awareness of child sexual abuse online.”

We’ve continued this work in 2024 through four major initiatives: Think Before You Share, Report Remove, and resources focused on AI-generated imagery and on financially motivated sexual extortion, often called ‘sextortion’.

‘Self-generated’ child sexual abuse material, including explicit images shared without consent or created under coercion, was found on 92% (254,071) of the webpages removed by IWF in 2023. Preliminary research with the International Policing and Public Protection Research Institute (IPPPRI) highlighted the normalisation of young people being exposed to or sharing sexual content, sparking concern among parents and carers.

In response, we launched our 2024 campaign Think Before You Share, phase five of our long-term effort to address this crisis. Funded by the Oak Foundation, with support from Qualcomm and advertising credits provided by TikTok and Snapchat, we aimed to help young people understand the risks of sharing nude images online and equip parents and educators to have honest, judgement-free conversations.

By meeting audiences where they are, on TikTok, Snapchat, Twitch, YouTube, Facebook, Instagram and digital radio, we reached millions. Over six weeks, the campaign achieved 122 million impressions, with 94% of YouTube viewers watching our videos to the end, 30% above industry standards.
 
The impact extended far beyond clicks and views. Reports from 13 to 15-year-olds using Report Remove rose by 90%, platform visits increased by 37%, and the number of sessions started grew by 60%. “These numbers represent real change,” says Angela, “with more young people knowing where to go to report explicit images.”

Developed with NSPCC’s Childline, Report Remove continues to be a game-changing resource for young people whose intimate images have been shared. Fast, private and empowering, it allows victims to report explicit content, see if it can be removed, and regain control.

Each report is reviewed urgently by the IWF. If the content is criminal, a unique digital fingerprint, or “hash”, is created, which ensures the image can be identified and removed from online spaces.

Meanwhile, Childline provides emotional and practical support, helping victims feel less isolated and overwhelmed. As well as keeping a child informed of the status of their report, they provide both immediate and ongoing assistance around the clock - whether that’s via live chat, telephone, peer support or resources available on their website.

In 2024, 1,142 reports were made via the Report Remove service, and as a result 2,009 images and videos were assessed as criminal. This marks a 44% increase in reports compared with the previous year.

Report Remove is also a vital tool for victims of sexual extortion, helping them regain control of their images. Often referred to as ‘sextortion’, this is a form of blackmail in which someone uses intimate, naked or sexual images or videos of a person to extort them, often for more images or for money.

Offenders, including organised criminal gangs, use social media, gaming platforms and dating apps to build trust before coercing victims into sharing intimate images or videos. Teenage boys are often extorted for money, while girls are predominantly targeted for sexually explicit content.

The psychological toll is severe, leaving victims feeling trapped and ashamed. To combat this, we launched a series of resources, offering guidance to victims, families and educators. These resources emphasise:

  • Cutting off communication with offenders immediately.
  • Preserving evidence for law enforcement.
  • Seeking support from trusted adults.

As ‘sextortion’ rises, we are also addressing another growing challenge. AI-generated child sexual abuse images, once easily identifiable as computer-generated, became increasingly realistic by late 2023, with 90% meeting the same legal classification as genuine abuse content.

Despite its illegality, the tools used to create AI-generated child sexual abuse content remain largely unregulated, allowing offenders to produce and distribute imagery at scale.

In response to the rising severity of this content, in July we launched our 2024 AI report, which calls for:

  • Legislation criminalising the use of AI to create explicit images of children.
  • Platforms to adopt Safety-by-Design principles.
  • Greater public awareness of AI’s misuse.

“IWF is always on the lookout for developing trends, and we will continue our efforts to raise awareness and build resilience throughout 2025,” says Angela.

Find out more about our awareness campaigns, or explore our online safety advice and resources.