For many years, the IWF has worked with a range of stakeholders including politicians and campaigners to advocate for strong regulations aimed at combating child sexual abuse material (CSAM) and improving safety online. We welcomed the Online Safety Act as a pivotal opportunity to protect children from the harms the internet can expose them to.
Regulation can help ensure that children are consistently provided with age-appropriate experiences that prioritise their safety online. By holding tech companies accountable, it shifts the responsibility to these platforms to minimise harm and deliver more positive outcomes for children and young people.
To help understand this landmark legislation and its impact on the IWF’s mission, we’ve created the following guide:
The Online Safety Act (2023) is a new law that introduces measures to protect children and adults online. It requires tech companies that provide user-to-user services or search engines to improve online safety by removing illegal content (which includes CSAM), addressing material that is harmful to children, and enforcing age limits for adult content.
Under this law, these companies are required to safeguard UK users by evaluating potential risks of harm and taking steps to address them. The new regulation applies to all relevant services with a significant number of UK users or those targeting the UK market, irrespective of the companies’ locations.
The Act officially became law in October 2023.
Ofcom, the independent communications regulator, has since been working to implement the new legislation. It has drafted codes of practice setting out specific steps providers can take to meet their safety obligations. Once the codes are finalised, Ofcom will have the power to assess service providers' compliance with the new framework and to enforce it. The first new duties will take effect towards the end of 2024.
For more details of the timeline of the Act’s implementation, see Ofcom’s updated roadmap.
The Online Safety Act represents a crucial advancement in safeguarding children from online sexual abuse.
In 2023, the IWF’s team of analysts acted on over 275,000 web pages displaying this illegal content, with each page containing hundreds, if not thousands, of indecent images of children. It was the most extreme year on record, and we are discovering more child sexual abuse imagery online than ever before in our history.
To tackle the threat of CSAM, the Act establishes safety duties for online services, requiring them to conduct risk assessments to evaluate the likelihood and impact of child sexual exploitation and abuse on their platforms. These services must take appropriate measures to mitigate identified risks, as well as actively identify and remove illegal content.
The greater the risk on a service, the more measures and safeguards that service will need to put in place to keep users safe from harm and to prevent the grooming and exploitation of children. With the implementation of the Act, the sharing of CSAM should become significantly more difficult.
The Act will raise the bar of current industry standards by incorporating robust age assurance measures, safer algorithms, and comprehensive tools designed to support children's safety online.
Companies that fail to comply with the new regulations face fines of up to £18 million or 10% of their qualifying worldwide revenue, whichever is greater. Criminal action may also be pursued against senior managers if their company does not ensure compliance with information requests from Ofcom. The regulator will also have the authority to hold companies and senior managers criminally liable if they fail to adhere to enforcement notices regarding specific child safety duties related to child sexual abuse and exploitation on their platforms.
As it stands, the draft Codes would significantly expand the number of high-risk services that use hash matching and bring about a step change in the detection and removal of CSAM.
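Hash matching works by comparing a digital fingerprint of each uploaded file against a list of fingerprints of known abuse imagery, such as the IWF Hash List. As a rough illustration only, the minimal Python sketch below uses an exact cryptographic hash (SHA-256) and a placeholder hash list; real deployments typically rely on perceptual hashing (for example Microsoft's PhotoDNA), so that resized or re-encoded copies of an image still match, and on vetted hash lists rather than the stand-in values shown here.

```python
import hashlib

# Placeholder hash list. Real services match against curated lists of
# verified hashes, such as the IWF Hash List; this value is a stand-in.
KNOWN_HASHES = {
    "0" * 64,  # hypothetical entry, not a real hash of abuse imagery
}

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_list(upload: bytes) -> bool:
    """Check an upload's fingerprint against the known-hash list.

    Exact cryptographic hashing only catches byte-identical files;
    production systems use perceptual hashes (e.g. PhotoDNA) so that
    resized or re-encoded copies of an image still match.
    """
    return fingerprint(upload) in KNOWN_HASHES

if __name__ == "__main__":
    sample = b"bytes of an uploaded file"  # stand-in for a real upload
    if matches_known_list(sample):
        print("Match: block the upload and report it.")
    else:
        print("No match in the hash list.")
```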
Section 121 of the Act sets out Ofcom’s powers to require services to use accredited technology, including in private messaging, to tackle child sexual exploitation and abuse (as well as terrorism). The section states that if Ofcom considers that it is necessary and proportionate to do so, the regulator may give a notice to a service provider. The notice requires the provider to use its “best endeavours to develop or source technology for use on or in relation to the service or part of the service”.
This proactive approach is critical for the detection of CSAM, ensuring that safeguards are in place wherever there is a risk to children or potential exposure to this material. By enforcing the requirement for companies to use their best endeavours in detecting CSAM, Ofcom can help ensure that technology is effectively deployed to combat these threats. It is crucial that Ofcom fully leverages its authority under Section 121 of the Act, compelling tech companies to take all necessary measures to detect, block, and prevent the upload, sharing, and storage of images and videos depicting child sexual abuse.
In November 2023, two weeks after the Act became law, Ofcom launched a major consultation on illegal harms online. The regulator laid out proposals requiring tech firms to use a range of measures to protect their users from illegal content online – including CSAM and grooming.
The illegal content duties are not only about removing existing illegal content; they are also about stopping such content from appearing at all. As the March 2025 deadline for the Illegal Harms Codes of Practice approaches, platforms need to think about how they design their sites to reduce the likelihood of them being used for criminal activity in the first place.
In May 2024, Ofcom published its second major consultation, which focused on its proposals for how user-to-user services and search services should approach their new duties relating to content that is harmful to children.
You can read our response to Ofcom’s Illegal Harms Codes of Practice here and Protection of Children Codes here.
We are calling on Ofcom to:
Ofcom's work is ongoing, with the final Codes of Practice and guidance on illegal harms expected to be published in December 2024. The Codes and guidance focused on the protection of children are set for release in April 2025. You can find more information about this here.
At the IWF, we believe the Online Safety Act has the potential to transform child safety online.
Companies looking to ensure their platforms comply with the provisions set out in the new legislation can join the Internet Watch Foundation as Members and do their part to help make the internet a safer place for everyone.
Find out more here or contact our team directly at [email protected].