Prof Farid said the report from Levy and Robinson made it clear that privacy does not have to come at the expense of child protection.
He said: “We have been made to believe there is a false choice here. Either you have privacy or you have security for kids, and I think that is a false choice.
“We routinely scan on our devices, on our email, on our cloud services for everything including spam and malware and viruses and ransomware, and we do that willingly because it protects us. It protects our devices and, without that, without the ability, even within end-to-end encryption, to scan for harmful content on our devices, we would be dead in the water.
“I don’t think it is hyperbolic to say that, if we are willing to protect ourselves, then we should be willing to protect the most vulnerable among us.
“It is the same basic core technology, and I reject those who say this is somehow giving something up. I would argue this is, in fact, exactly the balance that we should have in order to protect children online and protect our privacy and our rights.”
Prof Farid said that, in the light of the report’s findings, he would now urge Apple to revive an initiative it floated in 2021, when it said it would perform “on-device matching”, scanning images against a database of known child sexual abuse image hashes.
A hash is a unique digital fingerprint of a known child sexual abuse image. Under the proposal, images on devices in the US would have been flagged if their digital fingerprint strongly matched that of known abuse imagery in the database.
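In broad terms, matching of this kind works by reducing each image to a compact fingerprint and comparing it against a list of fingerprints of known material. The sketch below is illustrative only: it uses a simple “average hash” (a basic perceptual hash, standing in for Apple’s proprietary NeuralHash), and the function names, threshold and database are hypothetical.

```python
from PIL import Image  # Pillow; pip install Pillow

HASH_BITS = 64           # 8x8 grid of pixels -> 64-bit fingerprint
MATCH_THRESHOLD = 5      # hypothetical: max differing bits to count as a match

def average_hash(path: str) -> int:
    """Toy perceptual hash: greyscale, shrink to 8x8, threshold at the mean."""
    img = Image.open(path).convert("L").resize((8, 8), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two fingerprints differ."""
    return bin(a ^ b).count("1")

def is_known_match(image_path: str, known_hashes: set[int]) -> bool:
    """Flag an image if its fingerprint is close to any hash in the database."""
    h = average_hash(image_path)
    return any(hamming_distance(h, k) <= MATCH_THRESHOLD for k in known_hashes)
```

In practice the fingerprints must survive resizing, re-compression and small edits without producing false positives, which is why production systems use far more robust perceptual hashes than this toy example, matched against databases maintained by child protection bodies such as the IWF.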
The feature, called NeuralHash, was shelved after a backlash from campaigners concerned the move would be a breach of users’ privacy.
Prof Farid said: “The pushback was from a relatively small number of privacy groups. I contend that the vast majority of people would have said, sure, this seems perfectly reasonable, and yet a relatively small but vocal group put a huge amount of pressure on Apple, and I think Apple, somewhat cowardly, succumbed to that pressure.
“I think they should have stood their ground and said this is the right thing to do and we are going to do it. And I am a strong advocate of not just Apple doing this, but Snap doing this, and Google doing this – all the online services doing this.”
Prof Farid said the main reason steps like this have not yet been introduced is that there is no financial incentive for companies to take action.
He said: “We can’t turn a blind eye to the real harms we are seeing on these services, which are affecting tens of thousands, hundreds of thousands of kids around the world. I think we have to make a decision as to who we want to be as a society.”
Mike Tunks, Head of Policy and Public Affairs at the IWF, said: “For the last few years, the Government has been saying we want tech companies to do more about tackling child sexual abuse in end-to-end encrypted environments.
“As we know, at the minute, there is no technology that can do that, but this paper sets out some ways in which that can be achieved.”