“Our work protecting victims and survivors of child sexual abuse is a vital piece of the puzzle”

Published:  Wed 10 Jul 2019

Written by:  Mike Tunks

Like many organisations with an interest in creating a safe online environment, we’ve submitted our response to the Government’s Online Harms White Paper. It’s one of the first attempts globally to introduce a new regulatory framework for the online environment. 

We welcome its strong emphasis on dealing with child sexual exploitation and abuse and its ambitions to deal with emerging issues such as grooming and live streaming, problems for which technical solutions have so far proven elusive.

However, despite the paper’s ambition to tackle new harms and develop a new regulatory framework that’s more transparent and accountable, we believe that more clarity is needed for the future regulator, companies and others working in this space to ensure that this proposed legislation will not have unintended consequences.

We've been clear in our response to the Government that any future regulatory requirements must take account of the good work already occurring in this space, and must build on and improve those arrangements.

At the IWF, we ensure that there's an anonymous place for people to report suspected child sexual abuse images and videos online. Our analysts also use the latest technology to proactively search the internet for this content. Over the past 23 years we've removed millions of images from the internet, and the UK now hosts relatively little of this content. Last year, we and our public reporters identified just 41 URLs hosted in the UK – a mere 0.04% of the world's known child sexual abuse imagery. In 1996, the year we were founded, the UK was responsible for 18% of the world's known child sexual abuse imagery.

One of our significant concerns with the introduction of a regulatory framework, however, is that if the Government seeks to levy money from the internet industry to pay for the regulator, then without careful consideration of how this is done, it may undermine existing initiatives, such as the funding we receive directly from the industry. In turn, this could have a catastrophic impact on the amount of child sexual abuse imagery we're able to remove.

Our ability to receive public reports, to proactively search the internet and seek the swift removal of this imagery, must continue in the new regulatory framework. Our work protecting victims and survivors of child sexual abuse is a vital piece of the puzzle. 

Secondly, we believe that the regulator must be equipped with the right technical knowledge, skills and expertise in order to be effective. The regulator must know the technology sector well in order to adapt to the fast-moving nature of the industry. It must ensure that start-up companies have the same access as the very largest companies to tools and services that can assist them in controlling illegal and harmful content online. We've facilitated this for years through a sliding fee structure in which the largest companies pay the most (currently £78,000) and the smallest pay the least (as little as £1,040). This means that all companies gain access to every service we offer at a fee based on their size and sector. We also provide a forum in which they can discuss the issues they're facing on their platforms with other companies, share best practice, and assist us in designing technical solutions.

Thirdly, we believe that the Government must consider the global nature of the internet in the design of its new regulatory framework. One of the challenges for us at the IWF is that many of the companies responsible for hosting child sexual abuse imagery are based outside of the UK, and the Government needs to consider how it would get these companies to comply with UK regulation. We have a great deal of expertise in this area. We provide a list of webpages for blocking whilst we seek removal at source of criminal content hosted outside of the UK. We do this through the INHOPE network of hotlines, or through Law Enforcement contacts if a country doesn't have a hotline. We also operate reporting portals in some of the most under-developed countries in the world, with an expected 50 portals by 2020. We might be based in the UK, but we're global in our working practices and reach.

We also have concerns about how the new regulatory framework will operate in relation to the European Union's E-Commerce Directive and Child Sexual Abuse Directive, both of which are likely to be retained after Brexit. A further concern is that unless there is international consensus around the right approach to regulation, current models for international collaboration could be impacted. We're therefore urging the Government to have conversations through the forthcoming Five Eyes summit and with the European Commission.

Above all, we need the right thing to happen for victims and survivors who’ve suffered horrendous sexual abuse and had their suffering compounded by having their imagery shared online. 
