“Let me be clear: there is no intention by the Government to weaken the encryption technology used by platforms, and we have built strong safeguards into the [Online Safety] Bill to ensure that users’ privacy is protected,” said Stephen Parkinson, the United Kingdom’s Minister for Culture, Communications and Creative Industries, in the House of Lords, the upper house of the UK Parliament, explaining the government’s stance on end-to-end encryption.
The debate around encryption was stirred up when the Online Safety Bill, first made public back in 2019, was tabled for its third reading in the House of Lords on September 6. Parkinson’s comment focused on clause 59 of the Bill, which holds digital platforms responsible for ensuring that child sexual abuse material (CSAM) is not transmitted through their services.
What is the Online Safety Bill?
This bill is the UK’s attempt to make the internet safer for children. As such, it introduces a wide range of regulations on digital platforms, including age verification, removal of all illegal content, and removal of content that doesn’t meet a platform’s terms and conditions, to name a few.
Also Read: Open Letter: WhatsApp & Other Messenger Apps Demand Revision Of UK’s Online Safety Bill
What does Clause 59 say about encryption?
Chapter 2, Clause 59 of the bill says that digital platforms offering user-to-user services in the UK must “operate the service using systems and processes which secure (so far as possible) that the provider reports all detected and unreported CSEA [child sexual exploitation and abuse] content present on the service to the NCA [National Crime Agency].”
While this section does not explicitly talk about breaking encryption, it makes digital platforms responsible for checking whether CSEA content is present on their service. Since end-to-end encrypted platforms have no access to the messages transmitted between their users, service providers would have to forgo encryption to comply with the bill.
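To see why providers have nothing to scan, consider what end-to-end encryption looks like at the protocol level. The following is a minimal, illustrative sketch using the PyNaCl library; the library choice and all names are our own for illustration, not anything referenced in the bill.

```python
# Minimal sketch: in an E2E-encrypted service, the platform's server
# only ever relays ciphertext it cannot read.
from nacl.public import PrivateKey, SealedBox

# The recipient generates a keypair on their own device; the private
# key never leaves that device.
recipient_key = PrivateKey.generate()

# The sender encrypts against the recipient's public key.
ciphertext = SealedBox(recipient_key.public_key).encrypt(b"hello")

# This opaque blob is all the platform's server sees, so there is
# nothing meaningful for the provider to check for CSEA content.
print(ciphertext.hex())

# Only the recipient's device, which holds the private key, can decrypt.
assert SealedBox(recipient_key).decrypt(ciphertext) == b"hello"
```

This is the crux of the conflict: the provider cannot report what it cannot read, so complying with clause 59 would mean weakening or removing the encryption itself.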
Why it matters:
Breaking encryption is a contentious issue; if the UK bill goes ahead with it, it could inspire copycat regulation in other countries, such as India. In a recent consultation, India’s telecom regulator (TRAI) discussed the idea of creating a licensing framework for communication platforms, similar to the one telcos in the country currently follow.
However, such a licensing framework could give the government a backdoor into end-to-end encrypted platforms, which organizations like the Global Encryption Coalition vehemently oppose. The main argument posed by experts is that encryption technology keeps people safe on the internet (including children, marginalized populations, women, and the elderly), and doing away with it would do more harm than good.
Also Read: Why Are Apple’s Plans To Scan iCloud Photos For Child Sexual Abuse Material Concerning?
Clarifications on clause 59:
Parkinson mentioned that the Office of Communications (Ofcom, the UK’s communications regulator) cannot proactively ask digital platforms to monitor private communications in order to comply with the bill. He specified that such monitoring can only be required by “issuing a notice to tackle child sexual exploitation and abuse content under Clause 122” (Clause 122 sets out requirements that Ofcom can enforce on digital platforms).
He further stated that a notice to monitor user communications can only be issued where technically feasible and when the technology being used to monitor communications has “been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content.” If appropriate technology does not exist, Ofcom cannot require its use.
Parkinson said that, under the provisions of the bill, Ofcom can require platforms to develop or source such technology. “It is right that Ofcom should be able to require technology companies to use their considerable resources and expertise to develop the best possible protections for children in encrypted environments. That has been our long-standing policy position.”
Why surveillance isn’t the answer to combating CSAM:
In August 2021, Apple announced that it would use advanced cryptographic methods to detect whether a user’s iCloud Photos library contained high levels of CSAM and pass this information on to law enforcement agencies. The plan received heavy criticism, with people pointing out that governments could ask Apple to use the same technology to censor other kinds of content. Apple assured that it wouldn’t let that happen, but by September of that year it had paused the rollout, before eventually abandoning the project.
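To make the censorship concern concrete, here is a highly simplified, hypothetical sketch of hash-based content matching, the general family of techniques Apple’s proposal belonged to. Everything in it (the hash set, the function name) is invented for illustration; Apple’s actual design used a perceptual “NeuralHash” combined with threshold secret sharing and private set intersection, not plain SHA-256 matching.

```python
# Toy, stdlib-only illustration of hash-based content matching.
import hashlib

# Hypothetical database of hashes of known illegal images, as would be
# supplied by a child-safety organization (this entry is made up: it is
# simply the SHA-256 of b"foo").
KNOWN_BAD_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def matches_known_content(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-bad set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(matches_known_content(b"foo"))  # True: its hash is in the set
```

The worry critics raised is visible even in this toy version: whoever controls the hash database controls what gets flagged, so a government could demand that hashes of political or religious material be added to it.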
Apple has since clarified that it abandoned the project because scanning every user’s iCloud data “would create new threat vectors for data thieves to find and exploit,” according to a report by Wired magazine. “Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types,” said Erik Neuenschwander, Apple’s director of user privacy and child safety, expressing the company’s position on the matter.
A similar argument was posed by WhatsApp and other end-to-end (E2E) encrypted messaging apps in June this year, when they wrote an open letter expressing privacy concerns about the United Kingdom’s Online Safety Bill. The apps said the bill could break end-to-end encryption, which may open the door to indiscriminate surveillance.
Note: The story was updated on September 22, 2023, at 11:59 to remove the wrongful attribution of clause 59 as the ‘spy clause’.