New pilot shows way for smaller platforms to play big part in online safety
Even the smallest platforms can help prevent child abuse imagery online.
The phenomenon of self-generated child sexual abuse material (SG-CSAM) has escalated in recent years, driven by the proliferation of smartphone cameras and widening internet access. The COVID-19 pandemic, with its prolonged periods of lockdown, compounded the issue. SG-CSAM comprises intimate or sexually explicit content created by and featuring minors, which may be shared voluntarily or through coercion, grooming or blackmail. This report, funded by the Oak Foundation and conducted in collaboration with the Policing Institute for the Eastern Region (PIER) and the Internet Watch Foundation (IWF), aims to build an evidence base to inform targeted prevention campaigns. Its primary objectives were to investigate what makes public awareness campaigns effective, to design and deliver targeted campaigns, and to evaluate how well they educate children, parents, carers and educators about SG-CSAM.
Social media's central role in young people's lives demands a thorough understanding of the challenges they face online. This project emphasises the importance of incorporating the perspectives of children, young people, parents and educators in developing sensitive and effective responses to SG-CSAM. By exploring how children and young people perceive, understand and navigate these issues, the report seeks to convey the complexity and gravity of the problem. It underscores the need for campaigns that do not focus solely on abstinence but also address safe sharing practices and the realistic contexts in which children and young people operate online. The findings presented here mark the culmination of the project's research phase and aim to contribute to a more informed and responsive approach to safeguarding young people in the digital age.