A controversial European Union legislative proposal to scan citizens' private messages for child sexual abuse material (CSAM) is a risk to the future of web security, warned Meredith Whittaker, head of the non-profit foundation that runs the end-to-end encrypted (E2EE) messaging app Signal, in a public blog post on Monday.
“Implementing such proposals in the context of end-to-end encrypted communications would be impossible without fundamentally undermining encryption and creating dangerous vulnerabilities in core infrastructure with global implications far beyond Europe,” she wrote.
The European Commission tabled its original proposal for mass scanning of private messaging apps to combat the spread of CSAM online in May 2022. MEPs have since united in rejecting that approach, proposing an alternative last fall that would exempt E2EE apps from scanning. But the Council of the European Union, the legislative body made up of representatives of member states' governments, continues to push for strongly encrypted platforms to remain subject to the scanning law.
The latest Council proposal, put forward in May under the Belgian presidency, includes a requirement that “providers of interpersonal communications services” (i.e. messaging apps) install and operate what the draft text describes as “technologies for upload moderation,” according to documents published by Netzpolitik.
Article 10a, which sets out the upload moderation plan, says these technologies would be expected to “detect the dissemination of known and new child sexual abuse material before it is transmitted.”
Last month, Euractiv reported that the proposed changes would require users of E2EE messaging apps to consent to being scanned for CSAM. Users who don't consent would reportedly be prevented from using any features that involve sending visual content or URLs, effectively downgrading their messaging experience to basic text and voice.
Whittaker's statement criticized the Council's plan as an attempt to use “rhetorical tactics” to rebrand client-side scanning, a controversial technology that security and privacy experts say is at odds with the strong encryption that underpins sensitive communications.
“Mandating mass scanning of private communications fundamentally undermines encryption. Full stop,” she stressed. “Whether this happens via tampering with, for example, an encryption algorithm's random number generation, or by implementing a key escrow system, or by forcing communications through a surveillance system before they're encrypted.”
“You can call it a backdoor, a front door, or even ‘upload moderation,’ but whatever you call it, each of these approaches creates vulnerabilities that can be exploited by hackers and hostile nation states, removing the protection of unbreakable math and putting a high-value vulnerability in its place.”
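For readers unfamiliar with the term, client-side scanning generally means checking content against a list of known-illegal material on the user's device before it is encrypted and sent. The Python sketch below is purely illustrative and is not drawn from the Council's draft text or from Signal's code; the hash list, function names, and reporting hook are all hypothetical. It simply shows why Whittaker argues the approach sits outside the end-to-end encryption guarantee: the plaintext is inspected, and can be reported, before encryption ever happens.

```python
import hashlib

# Hypothetical list of hashes of known prohibited images (placeholder values,
# for illustration only).
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def client_side_scan(attachment: bytes) -> bool:
    """Return True if the attachment's SHA-256 digest is on the hash list.

    Real schemes tend to rely on perceptual hashes or ML classifiers, which
    are fuzzier and more error-prone than this exact match.
    """
    return hashlib.sha256(attachment).hexdigest() in KNOWN_BAD_HASHES

def send_message(attachment: bytes, encrypt, transmit, report) -> None:
    """Illustrative 'upload moderation' flow: scan the plaintext, then encrypt."""
    if client_side_scan(attachment):
        # The content is inspected -- and can be flagged -- before the
        # end-to-end encryption layer is ever applied.
        report(attachment)
        return
    transmit(encrypt(attachment))

if __name__ == "__main__":
    # Toy stand-ins for the real encryption, network, and reporting layers.
    toy_encrypt = lambda data: bytes(b ^ 0x42 for b in data)  # NOT real crypto
    send_message(
        b"holiday photo",
        encrypt=toy_encrypt,
        transmit=lambda ct: print("sent ciphertext:", ct.hex()),
        report=lambda data: print("flagged before encryption"),
    )
```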
Patrick Breyer, a member of the European Parliament for the Pirate Party and an early opponent of the European Commission's controversial message-scanning plans, also slammed the revised Council proposal in a statement last month, warning: “The Belgian proposal means that the essence of the European Commission's extreme and unprecedented original chat control proposals will remain in place. In the 21st century, using messenger services purely for sending text messages is not an option.”
The EU's own data regulator has also expressed concerns, warning last year that the plans pose a direct threat to democratic values in free and open societies.
Meanwhile, the pressure on governments to force E2EE apps to scan private messages appears to be coming from law enforcement.
In April, European police chiefs issued a joint statement calling for platforms to design security systems in a way that lets them identify illegal activity and send reports on message content to law enforcement. The statement, which called for “technical solutions” to ensure “lawful access” to encrypted data, did not specify how platforms should achieve this without compromising encryption. But as we reported at the time, the lobbying pointed toward some form of client-side scanning. So it seems no coincidence that the Council published its “upload moderation” proposal just a few weeks later.
The draft bill contains several caveats that attempt to paper over the gaping security and privacy hole that “upload moderation” represents. These include a statement that “without prejudice to Article 10a, this Regulation does not prohibit or make impossible end-to-end encryption”; an assertion that service providers are not required to decrypt or provide access to E2EE data; a clause saying service providers should not introduce cybersecurity risks “for which they are unable to take effective measures to mitigate them”; and a statement that service providers should not be able to “infer the true nature of the content of communications.”
“These are all good points, but this proposal creates a self-negating paradox,” Whittaker told TechCrunch when asked for her reaction to these caveats. “Because what's being proposed – incorporating mandatory scanning into end-to-end encrypted communications – weakens the encryption and creates significant vulnerabilities.”
We contacted the European Commission and the Belgian presidency of the Council for a response to Whittaker's concerns, but had not received a reply from either at the time of writing.
It remains to be seen where the EU will ultimately land on CSAM scanning. If the Council agrees on a position, so-called trilogue talks between the Council, the Parliament, and the European Commission will begin to work toward a final compromise. It is also worth noting, however, that the composition of the Parliament has changed since MEPs agreed their negotiating mandate last year, following the recent EU elections.