Celebrating the people that make IWF great - Joe: It’s not something people like to talk about in a social setting

Published: Tue 23 Jul 2019

It’s one of the most common questions people use to start a conversation. “So, what is it that you do for work?”

When I first started, I tried to evade the question, thinking of something close enough that it wasn’t entirely a lie, but not so close that it gave everything away. The truth inevitably leads to uneasiness and strange questions, and, to be fair, it’s not something people like to talk about in a social setting. I would say, “I do IT for a local charity.” Maybe it’s the awkward way I say it, or the way I make it sound like the most mundane thing ever, but that would usually be the end of it, and the conversation would move on to something else.

I work for the Internet Watch Foundation. The IWF is the UK hotline for removing images of child sexual abuse from the internet. We make the internet a safe place for everyone but also, we help victims by removing images and videos of the abuse they have suffered.

My professional background is in IT. I’ve worked in the commercial sector doing all sorts of IT work, including technical support and systems administration. I’ve even worked for the US military. Working for the IWF has given me so much satisfaction which I know I couldn’t get working anywhere else. At the core of what we do is not making money or building a huge company. What we do is deeper than any of those things. We prevent internet users from seeing the most heinous crimes being committed. Every image of child sexual abuse we assess and take down is a crime scene where the most horrific abuse is inflicted on the most vulnerable in our society.

We are currently working on some exciting technology at the IWF. Artificial intelligence could help us find criminal content more efficiently, including previously unseen images of child sexual abuse, allowing us to take them down as quickly as possible before they are disseminated and shared across the internet. We are also working to use automation to take on some of the more repetitive work our highly trained analysts do, freeing them to focus on proactively seeking out more criminal images for removal while also protecting their welfare, so that they don’t have to see the same criminal content over and over. We need to stay on the cutting edge of technology because criminals are trying to do the same. It’s a constant and difficult battle, but we never lose sight of who we do this for: the children.

I’ve always thought it’s a privilege to be able to say that what you do for work makes people’s lives better. And here at IWF, we have helped safeguard children from further abuse, making our work truly life-changing for the victims. I am proud of what I do at the IWF. So when someone asks me what I do for a living, I proudly say, “I’m a Technical Projects Officer for the Internet Watch Foundation.”
