Meta failing to stop spread of child sexual abuse imagery in wake of Huw Edwards scandal

Published:  Fri 16 Aug 2024

Child protection groups warn there’s nothing to stop imagery sent to Edwards spreading further on WhatsApp.

  • Frontline child sexual abuse charities warn abuse ‘thrives in secrecy’ and call on Meta to outline how it will proactively detect and prevent child sexual abuse imagery in the future.
  • Safeguarding Minister Jess Phillips says social media companies must bring in ‘robust’ measures to prevent platforms being ‘safe spaces’ for criminals.
  • National Crime Agency Chief blasts technology companies’ ‘refusal to take responsibility for serious criminality enabled by their platforms’.

Sexual images of children sent to Huw Edwards could still spread on WhatsApp “today, tomorrow, and the next day” amid warnings Meta is failing to stop the spread of child sexual abuse material.  

Edwards admitted having indecent imagery of children as young as seven, including Category A imagery, the most extreme category of child sexual abuse imagery in the UK which can include penetration, rape, sadism, or even bestiality.

The material was sent to Edwards via WhatsApp, an end-to-end encrypted messaging service, where even the company itself cannot see, let alone block, the criminal files being shared.

Now Safeguarding Minister Jess Phillips has joined child protection charities and the National Crime Agency in calling on Meta to do more to stop images and videos of child sexual abuse from being shared on its platforms.

Meta, which owns WhatsApp, says current methods of detecting and blocking child sexual abuse imagery are incompatible with end-to-end encryption, which the company says it uses to help keep people, including children, safe.

Dan Sexton, Chief Technology Officer at the Internet Watch Foundation (IWF) said: “I’d like to ask this question. How is Meta going to prevent this from happening again? What is stopping those images being shared again on that service today, tomorrow, and the next day?

“Right now, there is nothing stopping those exact images and videos of those children being shared on that platform, even though we know about it, and they know about it, and the police know about it. The mechanisms are not there. That’s what I’d like to see changed. 

“We should not be seeing this in the news time and time again. It is a solvable problem.

“There are tried, trusted, and effective methods to detect images and videos of child sexual abuse and prevent them from being shared in the first place. But in WhatsApp, these safeguards are effectively switched off, with no alternative measures in place.

“This was a technology-enabled crime against children and I think this is where we need to see change.

“We must not forget the victims are at the heart of this scandal, and everyone, including big internet companies and platforms, owes it to them to make sure their imagery cannot spread even further. At the moment, Meta is choosing not to.”

Dan Sexton, Chief Technology Officer at the Internet Watch Foundation (IWF)

Safeguarding Minister Jess Phillips said: “Child sexual abuse is a vile crime that inflicts long lasting trauma on victims.

“UK law is crystal clear – the creation, possession and distribution of child sexual abuse images is illegal and we continue to invest in law enforcement agencies to support their efforts in identifying offenders and safeguarding children.

“Technology exists to detect and prevent the abhorrent abuse of thousands of children and ensure victims are given privacy by stopping the repeated sharing and viewing of images and videos of their abuse.

“Social media companies must act and implement robust detection measures to prevent their platforms being safe spaces for criminals.”

Jess Phillips, Safeguarding Minister

Rick Jones, Acting Director of Intelligence at the National Crime Agency said: “It is fundamentally not acceptable for technology companies, with the resources they have at their disposal, to consciously step away from preventing the distribution of indecent images of children across their platforms.

“Technology is available to identify these images, but most companies are choosing to design their platforms in a way that does not allow it to be used either at all, or to its full effectiveness.

“When end-to-end encryption (E2EE) is used, technology companies cannot protect their customers, millions of whom are children, as they simply cannot see illegal behaviour on their own systems.

“This is not a UK only issue.  In a statement issued in April this year, the NCA and 32 Police Chiefs from across Europe called on technology companies, such as Meta, to do more to ensure public safety measures are in place across their platforms.

“It is not morally defensible for platforms to put the onus on victims, especially children, to identify and report abuse that they are being subjected to.

“Criminals are highly manipulative and often exert severe pressure on victims to not report what is happening to them - to anyone at all.  By taking this stance, the technology companies are refusing to take responsibility for some of the most serious criminality that is enabled by their platforms. 

“Criminals actively track which platforms are used by the greatest number of potential victims, and which offer them the greatest protections from being caught.  They are watching the choices that big technology companies are making.  

“Privacy and public safety need not be mutually exclusive.  Solutions need to, and can, be found that deliver both.  We all have a responsibility to ensure that those who seek to abuse these platforms are identified and caught, and that platforms become more safe over time, not less.  We cannot let ourselves be blinded to crime.”

Deborah Denis, CEO of the Lucy Faithfull Foundation

Deborah Denis, CEO of the Lucy Faithfull Foundation, a charity which helps offenders, and people worried they may offend, to address and change their behaviour, said: “What happened with this scandal is really shocking news for many people, but we see this all too often in our work.

“It shows that people who offend online come from all backgrounds and walks of life. It also shows the huge consequences of this illegal behaviour.

“Edwards’ offending has destroyed his reputation, his career, and caused deep hurt to the people that love him most. It could all have been avoided.

“What might not be as obvious is that, as long as this imagery is being shared, children are suffering. When they are abused, the harm does not end there, it continues. Edwards has added to the suffering of the children in those images.

“There is a responsibility on everyone to not offend, but it’s all too easy to share and find these illegal images, and tech companies need to do more to stop that. We know from our decades of experience that abuse thrives in secrecy and that end-to-end encrypted platforms are easy places for people to offend.

“We urge Meta to consider the many thousands of young victims of online sexual abuse every year and to ensure the necessary child safety measures are put in place to keep their platforms safe.

“Preventing harm from happening in the first place must be something we all strive for. Anyone worried about their own behaviour needs to know that there is anonymous support available to change, including from our Stop It Now helpline. If Edwards had had the courage to reach out when he first suspected he had a problem, it is possible he could have addressed his behaviour.

“We must encourage the kind of environment where people can, confidently and confidentially, take steps to change their own behaviour at the earliest opportunity.”
