Tech companies and protection experts call for EU to act now to plug gap in online safety laws
Act now or see ‘fewer children safeguarded, fewer perpetrators held accountable, and offenders re-established on mainstream platforms’, lawmakers warned.
Standard encryption is widely used across the internet to keep private information private, in services such as banking apps, health records and messaging. End-to-end encryption (E2EE) goes a step further: only the sender and the intended recipient hold the keys needed to ‘unlock’, or decrypt, a message.
Over recent years, an increasing number of digital platforms have implemented E2EE and, in doing so, have stopped proactive detection: securely scanning imagery against lists of known illegal content. This makes it far harder to detect criminal material such as CSAM (child sexual abuse material).
IWF’s explainer delves into how platforms can prevent the upload of CSAM in E2EE environments in a privacy-preserving way. Upload prevention is a method that works and is already used by companies to check for other types of content; it is time these safety checks also extended to CSAM. Governments must require companies to implement upload prevention on their E2EE services, reducing the risk of known CSAM being sent and shared. Doing so will be a crucial step in stopping offenders from sharing known CSAM over E2EE messaging.
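In outline, upload prevention means checking content on the sender’s device against a list of hashes of known illegal material before the message is encrypted and sent. The sketch below is a simplified illustration of that idea, not any platform’s actual implementation: real deployments use robust perceptual hashes of known CSAM supplied through bodies such as the IWF, whereas plain SHA-256 is used here only to keep the example self-contained, and the hash list entry is a placeholder.

```python
import hashlib

# Hypothetical blocklist of known-content hashes. In practice this would be
# a vetted industry hash set of known CSAM; the entry below is the SHA-256
# of the byte string b"foo", used purely as a placeholder.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def allow_upload(data: bytes) -> bool:
    """Return True if the content may be sent.

    The check runs on the sender's device *before* end-to-end encryption,
    so the service never sees the message contents - only whether the
    upload was permitted.
    """
    digest = hashlib.sha256(data).hexdigest()
    return digest not in KNOWN_HASHES
```

Because the comparison happens client-side and only against a fixed list of known material, the encrypted channel itself is untouched: the service cannot read messages, and unmatched content is never inspected or reported.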
Services that adopt end-to-end encryption must also adopt upload prevention, ensuring that known CSAM is detected and blocked before it can be shared. In doing so, platforms can uphold both the security of private communications and the fundamental rights of victims and survivors.