“IWF is a unique place to work as a technologist,” says Dan Sexton, Chief Technology Officer at the IWF, “because we know that every time we make a system faster or gather some important bit of data, it has a direct positive effect on the internet. We’re helping to protect children and enabling policy makers to improve child safety. You feel your work matters. It’s helping to make the world a slightly better place.”
“The problem is vast and the scale is massive,” says Dan. “It’s not something we can fix on our own. It’s essential that we create and leverage effective technology to address the scale of the issue.”
The IWF constantly innovates so we can provide our analysts and our Hotline with tools that help them work faster, more efficiently and more accurately. We also work closely with our industry partners so we can understand the harms they’re seeing and the data they need to help protect their platforms.
“By cooperating with our Members, including providing datasets and technical innovations, we can have images removed not just once, but removed and blocked millions of times,” says Dan.
In recent years, our relationship with Nominet, the guardian of .UK domain names, has enabled a variety of technical innovations at the IWF. Nominet provides us with funding specifically for transformative projects to help counter online harm.
In 2024, technology funded by Nominet allowed us, for the first time, to count thousands of additional child victims of sexual abuse. Thanks to the new ‘Multichild’ feature in our Intelligrade system, we have been able to record 70,898 children who would otherwise have been ‘invisible’.
Previously, technical limitations meant that if an image featured more than one child, we could only record information about the youngest. Now, our analysts can easily record information about every child shown in an image, including their age, sex and skin tone.
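To illustrate the kind of change described above, the sketch below models an assessed image holding a list of per-child records rather than a single record. It is a minimal, hypothetical example only: the class names, fields and category values are assumptions for illustration and are not drawn from the actual Intelligrade system.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch only; Intelligrade's real data model is not public.
# The point is the move from one child record per image to a
# one-to-many relationship, so every child in an image is counted.

@dataclass
class ChildRecord:
    age_range: str   # e.g. "7-10" (illustrative banding)
    sex: str         # e.g. "female", "male", "unknown"
    skin_tone: str   # recorded to support victim identification

@dataclass
class AssessedImage:
    image_id: str    # identifier for the assessed image
    children: List[ChildRecord] = field(default_factory=list)

    def add_child(self, child: ChildRecord) -> None:
        """Record another child visible in the same image."""
        self.children.append(child)

# Before 'Multichild', only one record could be kept per image;
# now each child shown can be recorded and counted.
image = AssessedImage(image_id="example-001")
image.add_child(ChildRecord(age_range="7-10", sex="female", skin_tone="light"))
image.add_child(ChildRecord(age_range="11-13", sex="male", skin_tone="medium"))
print(len(image.children))  # 2 -- both children are now counted
```

In a structure like this, counting victims becomes a matter of summing the length of each image’s list, rather than being capped at one child per image.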
“We’re the only organisation in the world that records information at that level of detail,” says Dan. “It means we have incredibly comprehensive data about imagery of child sexual abuse that simply doesn’t exist anywhere else.
“Our hope is that we can now use that data to tell powerful stories that will help policy makers, regulators and industry to make good decisions and create effective technical and educational interventions. If you know precisely which children are being targeted, you can begin to understand how to end that abuse.”
Looking to the future, we have plans to build on our innovative tech tools to make it even easier and quicker for our analysts to identify and remove images of child sexual abuse from the internet, wherever we find them. We’re also looking forward to forging new alliances with Members, funders and regulators to protect vulnerable children and make the internet a safer place for everyone.