Search Results

129 results
  1. How AI is being abused to create child sexual abuse imagery

    IWF research into how artificial intelligence (AI) is increasingly being used to create child sexual abuse imagery online.

  2. Under 10s groomed online ‘like never before’ as hotline discovers record amount of child sexual abuse imagery

    Alarming increase in online grooming and child sexual abuse imagery, particularly among under 10s, in 2023 as reported by the IWF.

  3. Annual Report 2023

    Discover the latest trends & data in the fight against online child sexual abuse imagery in the 2023 Annual Report from the Internet Watch Foundation (IWF).

  4. Our live-streaming report: The case studies

  5. Talk Trust Empower

    Research report by PIER at Anglia Ruskin University, providing insight into girls’ and their parents’ understanding of self-generated CSAM.

  6. Where to report non-child sexual abuse content

    A list of where to report some of the other types of harmful content you may see online.

  7. UK Safer Internet Centre public report released

  8. Everton FC team up with UK Safer Internet Centre in a first for British football

  9. Teenage boys targeted as hotline sees ‘heartbreaking’ increase in child ‘sextortion’ reports

    The IWF and NSPCC say tech platforms must do more to protect children online as confirmed sextortion cases soar.

  10. Tunisia takes ‘major step’ in global fight against online child sexual abuse material

    A new IWF portal will, for the first time, give people in Tunisia a safe and anonymous place to report illegal videos and images.

  11. Portal to call out child sexual abuse material in Kenya ‘could lead to the rescue of a young victim’

    The Kenyan public will now have a safe and anonymous place to report suspected images and videos of children suffering sexual abuse.

  12. Taskforce will stop millions of the most severe child sexual abuse images and videos being shared online

    A specialised new team will take ‘digital fingerprints’ of millions of images so companies and organisations around the world can spot them and have them removed.