AI-created child sexual abuse images ‘threaten to overwhelm internet’
Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law.
Sextortion is a form of blackmail in which a child is tricked into sending sexual images of themselves to abusers, who then threaten to share the pictures with friends, family or more widely on the internet if they are not paid money.
As new data shows EU servers are being targeted by criminals to host this imagery, EU legislators must pass vital new legislation to get a grip on the worsening situation and prevent the abuse of EU servers by criminals profiting off child sexual abuse imagery, writes Susie Hargreaves.
A leading children's charity is calling on Prime Minister Rishi Sunak to tackle AI-generated child sexual abuse imagery when the UK hosts the first global summit on AI safety this autumn.
The BBC has been investigating the rise in child sexual abuse material resulting from the rapid proliferation of open-source AI image generators.
Images of children aged as young as seven being abused online have risen by almost two thirds.
Internet Watch Foundation says amount of material showing most extreme form of sexual abuse has doubled since 2020
Jordan King, reporter for Metro, looks at IWF transcripts showing actual conversations between groomers and child victims
Senior writer at WIRED, Matt Burgess, looks into Pornhub trialling a new automated tool that pushes CSAM-searchers to seek help for their online behaviour
Susie speaks to Aasmah Mir about the increase in self-generated child sexual abuse imagery online among 7- to 10-year-olds