The European Commission on May 11 proposed new legislation to prevent and combat child sexual abuse online that would allow EU countries to order companies like Facebook and Apple to implement systems that can scan for, detect, and remove child sexual abuse content on their platforms.
“With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. […] The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children,” the Commission reasoned.
Despite the good intentions behind the legislation, privacy experts have criticised the rules because the provisions require messaging apps like WhatsApp and Facebook Messenger to scan private messages. Last year, Apple announced plans to enable child sexual abuse material (CSAM) detection on iCloud Photos and iMessage, but implementation was postponed indefinitely after backlash. The EU’s new rules are even more privacy-invasive than Apple’s, critics opined.
Who will the new rules apply to?
The rules will apply to online service providers offering services in the EU, namely:
- Hosting services
- Interpersonal communication services (messaging apps)
- App stores
- Internet access providers
- Applies to encrypted services as well: The Commission explained that the obligations are technologically neutral, meaning that even encrypted services will be required to carry out the proposed obligations. “A large portion of reports of child sexual abuse, which are instrumental to starting investigations and rescuing children, come from services that are already encrypted or may become encrypted in the future. If such services were to be exempt from requirements to protect children and to take action against the circulation of child sexual abuse images and videos via their services, the consequences would be severe for children,” the Commission said.
- Legal representative in the EU: Providers of hosting or interpersonal communication services not established in any EU Member State, but offering their services in the EU, will be required to designate a legal representative in the EU.
What content is covered under the legislation?
- Known CSAM: Re-uploaded photos and videos that have been previously identified as child sexual abuse material and reported to organisations like the US National Center for Missing and Exploited Children (NCMEC).
- New CSAM: Photos and videos of abuse that have not previously been identified.
- Grooming-related content: Grooming is the practice in which offenders build a relationship of trust and emotional connection with children in order to manipulate, sexually exploit, and abuse them.
What are the key features of the new legislation?
- New EU Centre on Child Sexual Abuse: As part of the legislation, an independent EU Centre on Child Sexual Abuse will be established. The EU Centre will maintain a database of digital “indicators” of child sexual abuse material and provide the same to service providers, receive and analyse reports from providers and submit them to law enforcement after checking that they are not erroneous, and provide support to victims.
- Mandatory risk assessment for hosting and messaging services: Providers of hosting or messaging services will have to assess the risk that their services are misused for child sexual abuse material or for grooming and will have to propose risk mitigation measures accordingly.
- Detection obligations when significant risk remains: The EU Member States will need to designate national authorities in charge of reviewing the risk assessment, and if such authorities determine that a significant risk remains despite the proposed risk mitigation measures, they can ask a court or an independent administrative authority to issue a detection order for known or new child sexual abuse material or grooming. When such orders are issued, service providers will use the indicators provided by the EU Centre (hashes/AI classifiers) in their detection efforts.
- How will detection work? “Technologies for detection of known child abuse material are typically based on hashing, which creates a unique digital fingerprint of a specific image. Technologies currently used for the detection of new child abuse material include classifiers and artificial intelligence (AI). A classifier is any algorithm that sorts data into labelled classes, or categories of information, through pattern recognition. Technologies for the detection of grooming in text-based communications make use of analysis of text technologies and/or analysis of metadata. Human review is already typically in place even for the most accurate technologies such as hashing,” the Commission explained. The proposed technologies to detect new CSAM and grooming content are the most controversial aspects of these new rules because they would require algorithms to analyse the context of pictures and text messages. (A simplified sketch of hash-based matching follows this list.)
- Safeguards on detection to prevent invasion of privacy: Companies that receive a detection order will only be able to detect content using indicators of verified CSAM provided by the EU Centre, which means the providers themselves cannot determine what is illegal in the EU. Detection technologies must only be used for the purpose of detecting child sexual abuse and will be limited in time. “Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible,” the Commission said. Critics have pointed out that such technologies do not exist (more below).
- Reporting detected CSAM content: Providers that have detected online child sexual abuse will have to report it to the EU Centre.
- Removal orders in case the content is not taken down swiftly: EU national authorities can issue removal orders if the child sexual abuse material is not swiftly taken down by providers themselves. The authorities can direct internet access providers to disable access to images and videos that cannot be taken down because they are hosted outside the EU in non-cooperative jurisdictions.
- Reducing exposure to apps used for grooming: App stores will be required to ensure that children cannot download apps that may expose them to a high risk of solicitation.
- Fines of up to 6% of turnover: Providers found to be in violation of these rules face fines of up to 6% of global annual turnover under the Commission’s proposal, but it would be up to the Member States to determine the exact level of any penalties.
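To make the hash-matching approach described above more concrete, here is a minimal, purely illustrative Python sketch. It uses an exact cryptographic hash (SHA-256) as the “digital fingerprint”; real deployments rely on robust perceptual hashes such as Microsoft’s PhotoDNA, which tolerate resizing and re-encoding, and the indicator list would be supplied by the proposed EU Centre. The KNOWN_INDICATORS set, the uploads folder, and the function names below are hypothetical stand-ins, not anything specified in the proposal.

```python
import hashlib
from pathlib import Path

# Hypothetical indicator database: under the proposal this would be the set of
# verified CSAM fingerprints distributed by the EU Centre. The value below is a
# placeholder, not a real indicator.
KNOWN_INDICATORS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: Path) -> str:
    """Return a SHA-256 'digital fingerprint' of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_material(path: Path) -> bool:
    """True if the file's fingerprint appears in the indicator database."""
    return fingerprint(path) in KNOWN_INDICATORS

if __name__ == "__main__":
    # Scan a hypothetical folder of uploaded files against the indicator set.
    for upload in Path("uploads").glob("*"):
        if matches_known_material(upload):
            # Under the proposal, a match would trigger human review and a
            # report to the EU Centre rather than automatic action.
            print(f"Possible match, queue for review: {upload}")
```

Detection of previously unseen material and of grooming cannot work from a lookup table like this; it requires trained image or text classifiers, which is why critics single those obligations out as the most error-prone part of the proposal.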
Why are privacy advocates not happy with the proposed legislation?
A new mass surveillance system
“This document is the most terrifying thing I’ve ever seen. It is proposing a new mass surveillance system that will read private text messages, not to detect CSAM, but to detect ‘grooming’,” cryptography professor Matthew Green tweeted after a leaked version of the proposed legislation (which is largely similar to the final version) surfaced last week.
Green took aim at the proposed systems to detect “grooming” because, he said, they will require algorithms that read actual text messages, at scale, to figure out what people are saying. He also criticised the EU “for demanding the existence of technology we don’t really have yet.”
“What’s terrifying is that once you open up ‘machines reading your text messages’ for any purpose, there are no limits,” Green said. Concurring with Green, Will Cathcart, Head of WhatsApp, opined: “If the EU mandates a scanning system like this be built for one purpose in the EU, it will be used to undermine human rights in many different ways globally.”
“It describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR. Not an exaggeration. By legally mandating the construction of these surveillance systems in Europe, the European government will ultimately make these capabilities available to every government.” – Matthew Green
Expressing similar concern, Jan Penfrat of digital advocacy group European Digital Rights (EDRi) tweeted: “This looks like a shameful general surveillance law entirely unfitting for any free democracy.”
In a separate Q&A page, the Commission explained that detection obligations cannot be used for mass surveillance because “what service providers will be able to do under this legislation will be very tightly ringfenced both before and after a detection order is issued.” The Commission outlined a three-pronged approach that will prevent abuse of detection obligations and will encourage only the least privacy-intrusive options, but critics are not buying this argument.
Damages end-to-end encryption
“Incredibly disappointing to see a proposed EU regulation on the internet fail to protect end-to-end encryption,” Will Cathcart tweeted. “As is, this proposal would force companies to scan every person’s messages and put EU citizens’ privacy and security at serious risk.”
“It goes without saying child sexual abuse material and the abusers who traffic in it are repugnant. There is much governments and technology companies can do to combat abuse. But far too often governments approach the challenge by trying to weaken privacy and security, instead of strengthening it,” Cathcart said.
Cathcart also pointed out that the legislation will actually take away privacy and safety from billions of people, including children. He suggested that legislators work with “experts who understand internet security so they don’t harm everyone, and focus on ways we can protect children while encouraging privacy on the internet.”
Echoing similar thoughts, Joe Mullin of the Electronic Frontier Foundation (EFF) said: “The new proposal is overbroad, not proportionate, and hurts everyone’s privacy and safety. By damaging encryption, it could actually make the problem of child safety worse, not better, for some minors. Abused minors, as much as anyone, need private channels to report what is happening to them.”
Takes away fundamental rights
By weakening encryption, the new legislation will take away users’ fundamental rights, critics warned.
“The EU Commission is laying the axe to fundamental rights with its chat control proposal. […] There are no backdoors that can only be used for the noble goal of protecting children. Journalists, lawyers, whistleblowers and all others who rely on confidential communication will be exposed to considerable risks through chat control, which cannot be justified in light of fundamental rights,” Germany’s Society for Civil Rights said.
Technically infeasible
“There was uproar when Apple was suggesting something similar for finding known [CSAM] content. But if you introduce ambiguity and these context-dependent scenarios, [relying on] AI-based tools which are notoriously unreliable, the challenges are much greater. You only have to look at how dodgy spam filters are. They’ve been around in our email for 20 years, but how many of us still get spam in our inboxes and miss legitimate emails? That really shows the limitation of these technologies,” Ella Jakubowska, a policy advisor at EDRi, told The Verge.
“This whole proposal is based around mandating technically infeasible — if not impossible — things,” Jakubowska said.
“The Commission omits, quite cleverly, depending on where you’re standing, just how they should do so. Effectively her [the Commissioner’s] message for companies is: Do the impossible, you get to decide how,” the Netherlands’ Bits of Freedom said.
Ineffective
Some privacy activists believe measures to erode encrypted communications would be ineffective. “Criminals are already using distribution channels that would not be affected by these scans and will easily escape scans in the future,” Linus Neumann of the German hacker collective Chaos Computer Club told CNBC.
What is the current mechanism for combating CSAM?
Currently, some online service providers detect online child sexual abuse on a voluntary basis. US service providers supply most reports that reach law enforcement, with the US NCMEC forwarding EU-related reports to Europol and national law enforcement, the European Commission said.
“Up to 95% of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem does not only exist on one platform alone,” the European Commission noted, adding: “Voluntary action is therefore insufficient to effectively address the misuse of online services for the purposes of child sexual abuse. A clear and binding legal framework is needed.”
This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
Also Read:
- Apple Says It Will Not Allow Governments To Use Its CSAM Detection System For Other Images, But Assurance Doesn’t Go Far Enough
- How WhatsApp Deals With Child Sexual Abuse Material Without Breaking End To End Encryption
- India Leads In Generation Of Online Child Sexual Abuse Material
- How End-To-End Encryption Impacts Human Rights? The Good And The Bad
