AI being used to generate deepfake child sex abuse images based on real victims, report finds
The tools used to create the images remain legal in the UK, the Internet Watch Foundation says, even though AI child sexual abuse images are illegal.
Published: Mon 22 Jul 2024
Paedophiles are exploiting advances in artificial intelligence to produce AI-generated videos of child sexual abuse, and the volume of such material could grow as the technology improves, according to a safety watchdog.
The majority of such cases seen by the Internet Watch Foundation involve manipulation of existing child sexual abuse material (CSAM) or adult pornography, with a child's face transplanted onto the footage. A handful of examples involve entirely AI-made videos lasting about 20 seconds, the IWF said.
The organisation, which monitors CSAM around the world, said it was concerned that more AI-made CSAM videos could emerge as the tools behind them become more widespread and easier to use.
Read the full article at The Guardian.