IWF welcomes Ofcom duties, but warns more needed to realise ‘hopes of a safer internet’

Published:  Mon 16 Dec 2024

Written by:  Derek Ray-Hill, Interim CEO

New online safety guidelines need to be more ambitious if the “hopes of a safer internet” are to be realised, the IWF warns.

Ofcom has today (December 16) published its first-edition codes of practice and guidance on tackling illegal harms, including child sexual abuse, under the UK’s Online Safety Act.

As the UK’s front line against the spread of child sexual abuse material on the internet, the Internet Watch Foundation has welcomed positive steps laid out by Ofcom, including mandating platforms to scan for known child sexual abuse imagery.

The IWF is continuing to examine the detail of Ofcom’s published documents, but says more needs to be done to prevent child sexual abuse material being shared in private communications.

Derek Ray-Hill, Interim CEO at the Internet Watch Foundation, said: “A safe internet should mean there is nowhere for criminals to get away with sharing imagery of children being raped, tortured, and sexually abused. While laying some strong foundations, these regulations fall short of that ambition.

“The Huw Edwards scandal highlighted how criminals are sharing images and videos of children being abused in private communications. But these regulations would do nothing to detect and prevent crimes like this in these spaces.

“Mandating the use of hash matching to detect child sexual abuse is a positive step, and something the IWF stands ready to help platforms with. But unless we can see similar measures brought to private communications, these guidelines create a risk of platforms designing themselves out of scope of the regulations.

“There are already effective ways to prevent child sexual abuse material being uploaded and shared in these spaces. We'd like to see a more ambitious approach with these tools being widely deployed.

“Last year, the IWF discovered 275,652 webpages containing child sexual abuse, more than ever before in its history. This year is set to surpass even this grim milestone. We have also seen younger and younger children being targeted and groomed online in their own homes. Children as young as three. There is still a lot of work left to be done if we are to realise these hopes of a safer internet.”

Platforms in scope of the new laws have from today until March 16, 2025 to complete an assessment to understand the risks illegal content poses to children and adults on their platform. Sites and apps will then need to start implementing safety measures to mitigate those risks, such as hash matching to prevent the spread of child sexual abuse material.

Dame Melanie Dawes, Ofcom’s Chief Executive, said: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.

“The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.

“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”
