Tech companies have been warned not to introduce encryption unless they can guarantee they can keep platforms free of illegal content, as MPs call for more to be done to protect children from online grooming and sexual abuse.
A new report published today (September 13) by the All-Party Parliamentary Group (APPG) on Social Media highlights the increasing dangers of children being bullied or coerced into producing images or videos of their own sexual abuse by adult predators on the internet.
The APPG calls on the Home Office to review all relevant legislation to ensure it is as easy as possible for children to have their images removed from the internet.
The report, “Selfie Generation: What’s behind the rise of self-generated indecent images of children online?”, sets out 10 recommendations the UK Government and the tech industry must adopt to safeguard children online.
The report is the result of an inquiry which drew on oral evidence from academics, children’s charities, law enforcement and industry.
The APPG for Social Media was established in 2018 to mitigate the harms and promote the benefits of social media. The UK Safer Internet Centre (of which the IWF is part) provides the Secretariat for the APPG and promotes the safe and responsible use of technology.
The APPG’s chairman, Labour MP Chris Elmore (Ogmore), said social media companies are “fundamentally failing” to keep children safe, and that companies need to “wake up and get a grip” in finding and removing images and videos of children suffering sexual abuse.
Mr Elmore said: “It is all too often the case that laws and lawmakers find themselves playing catch-up when it comes to effective regulation of the online media landscape.
“The pace of technological change has meant that policy, legal reform, and standards of best practice in this area are simply not fit for purpose. And this virtual game of cat and mouse has appalling real-world consequences.
“It’s high time that we take meaningful action to fix this unacceptable mess. Children are at real daily risk of unimaginable cruelty, abuse and, in some instances, death.
“Social media companies are fundamentally failing to discharge their duties, and simply ignoring what should be an obvious moral obligation to keep young users safe.
“They need to get a grip, with institutional redesign, including the introduction of a duty of care on the part of companies toward their young users. Firms must be more proactive and forthcoming when it comes to rooting out abuse images.
“There is an urgent need for social media platforms to be transparent with young users about the mechanisms available to them to remove and complain about these harmful images.”
Susie Hargreaves OBE, Director of the UK Safer Internet Centre, said: “This report serves as a stark reminder that we can all be doing more to make sure children are kept safe online.
“We see the fallout of abuse and, when children are targeted and made to abuse themselves on camera by criminal adult predators, it has a heartbreaking effect on children and their families.
“There is hope, and there are ways for children and young people to fight back. The Report Remove tool we launched this year with Childline empowers young people to have illegal images and videos of themselves removed.
“We have also campaigned to help children and young people, as well as their parents, understand the potential dangers which lurk online.
“New legislation will also help make a difference, and the forthcoming Online Safety Bill is a unique opportunity to make the UK a safer place to be online, particularly for children.”
The report comes as MPs and Peers begin their scrutiny of the Government's Draft Online Safety Bill.
Mr Elmore said the Bill will offer an opportunity for meaningful reforms, but said there must be robust age-verification requirements on websites hosting adult content.
He added: “Social media companies should not encrypt their service, unless they can guarantee that they can still remove illegal content and cooperate with law enforcement in the same way they do now.
“They need to stop putting profits before the safety of kids online, and accept that warm words and algorithms just won’t do the job.”
The report makes 10 key recommendations:
- Tech companies should not introduce encryption unless they can guarantee that they can still remove illegal content and cooperate with law enforcement in the same way they do now.
- The RSE (Relationships and Sex Education) curriculum should facilitate constructive conversations about healthy relationships in a digital age that avoid blaming children. The Department for Education and relevant devolved Education Departments must ensure that schools are well-resourced and that teachers receive appropriate training to deliver these messages. The APPG recommends that interventions are targeted at primary-aged children as well as older teenagers.
- The Home Office should review all relevant legislation to ensure it is as easy as possible for children to have their images removed from the internet and ensure that they can have confidence in the removal process.
- Tech companies should be proactive in taking responsibility for ensuring they act with a duty of care towards their users. They should cooperate constructively with Government and other stakeholders. Platforms should ensure there are clear ways for users to raise complaints and request images are taken down.
- “Self-generated” indecent imagery should be referred to as “first person produced imagery”.
- Clearer guidelines should be established for policing across England, Wales, Scotland and Northern Ireland relating to Outcome 21, to ensure a more consistent approach that does not blame or criminalise children unnecessarily.
- The Online Safety Bill and other relevant legislation, such as the Audiovisual Media Services Directive, should encourage age verification on adult websites to prevent children from accessing them.
- The Government should publish more information about the requirements in the Online Safety Bill as soon as possible, including how Ofcom will designate expert co-regulators in priority areas such as child sexual abuse.
- The Government should ensure that organisations working to remove illegal content or prevent offending are well funded and resourced, particularly in areas that were previously EU-funded.
- Platforms should take all possible measures to tackle harmful fake accounts, particularly those held by sex offenders.