Protecting children is at the heart of everything we do. For over 28 years, since the early days of the internet, our job has been to help child victims of sexual abuse by hunting down and removing any online record of the abuse.
It’s a tough job. Our image analysts are amongst the best in the world. The children in the pictures are real. Their abuse and suffering are very real. Our experts never forget that.
The criminals who sexually abuse children, then record their suffering and share the horror online, are ruthless. Sometimes they create images with audiences in mind. Victims range from babies to young teens. Abusers are often experienced at online grooming and skilled at manipulating young minds. Sometimes victims don’t even realise they are being abused until it’s too late.
Sadly, the internet makes it easier to share these images. We use advanced technology and human expertise to help young victims. If we can remove the record of suffering online and stop those images circulating, then we can stop the abuse being perpetuated. This makes the internet a safer place for all children and adults.
We are an independent not-for-profit organisation. Tech companies (our ‘Members’) and the public fund our work. Our donors are extraordinary people who care about keeping children safe online. They’re our IWF heroes.
Our experience and data are unique. We use this expertise to help governments shape new laws: laws that will benefit victims of child sexual abuse and give the best possible protection to children online. Our Policy Team is supported by IWF Champions, UK Members of Parliament who have signed up to help spread the word about protecting children online.
We also run public campaigns to share what we’ve learned: encouraging teenage boys who might have accidentally stumbled on child sexual abuse images to report them to our Hotline, or helping parents have difficult conversations with their children about staying safe online.
At IWF we recognise that online child sexual abuse imagery is a global problem, which demands a global solution. So we’ve taken our fight to countries that have nowhere to report online child sexual abuse. Working in partnership with local people, we provide scaled-down Reporting Portals which feed directly to our expert analysts in the UK. Today we have nearly 50 of these portals.
It’s all part of our mission to help victims of child sexual abuse worldwide by identifying and removing the online record of their abuse. If you share our vision, why not consider making a donation, or, if you represent a company, take a look at memberships and partnerships.
We use the term ‘child sexual abuse’ to reflect the gravity of the images and videos we deal with. ‘Child pornography’, ‘child porn’ and ‘kiddie porn’ are not acceptable descriptions. A child cannot consent to their own abuse.
Why the Internet Watch Foundation exists, what it was set up to do, and how it does it.
The processes the IWF uses to assess child sexual abuse imagery online and have it removed from the internet.
The IWF is made up of more than 50 people working in a variety of disciplines, including our team of front-line analysts.
New online safety guidelines need to be more ambitious if the “hopes of a safer internet” are to be realised, the IWF warns.
Local MP Ian Sollom learned about the herculean task faced by analysts at the Internet Watch Foundation (IWF) who find, assess and remove child sexual abuse material on the internet.
Messaging app Telegram will deploy new tools to prevent the spread of images of child sexual abuse after teaming up with the Internet Watch Foundation.
The Internet Watch Foundation and the NSPCC have won an award that recognises the vital service that the Report Remove tool offers children in the UK.
IWF data and tools will help prevent the platform’s users from being exposed to child sexual abuse imagery.
After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).