Last year was the “most extreme year on record” for child sexual abuse online, the UK-based charity Internet Watch Foundation warned.
AI tools used to generate deepfake images of child sexual abuse rely on photos of real victims as reference material, a report has found.
Sextortion is a form of blackmail in which a child is tricked into sending sexual images of themselves to abusers, who then threaten to share the pictures with friends, family or more widely on the internet if they are not paid money.
As new data shows criminals are targeting EU servers to host this imagery, EU legislators must pass vital new legislation to get a grip on the worsening situation and stop those profiting from child sexual abuse material, writes Susie Hargreaves.
Paedophiles are using artificial intelligence (AI) to ‘de-age’ celebrities and create images of them as children.
A leading child protection organisation has warned that abuse of AI technology threatens to "overwhelm" the internet.
Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law.
The volume of material children are coerced or groomed into creating has prompted a renewed attack on end-to-end encryption.
The Internet Watch Foundation (IWF) warns of a "shocking" rise in primary school children being coerced into performing sexual acts online.
Every time someone in the UK searched for child abuse material on Pornhub, a chatbot appeared and told them how to get help.
A new campaign warning children of the dangers of sharing sexually explicit images and videos has been launched, with an appeal for parents and young people to openly discuss these issues.
Criminals are tricking young people into sending intimate images, then demanding money. Experts explain how to deal with these attacks.