The European Union has long had a reputation for strong privacy laws. But a legislative plan to combat child abuse, formally introduced by the bloc in May 2022, threatens to reduce the privacy and security of hundreds of millions of messaging app users in the region.
The European Commission, the EU executive body that drafted the proposal, has framed it as a plan to combat child abusers' misuse of mainstream technology tools and protect children's rights online. It argues that abusers are increasingly using messaging apps to distribute child sexual abuse material (CSAM) and even to groom new victims.
The approach adopted by the EU, perhaps as a result of lobbying from the child safety tech sector, is a techno-solutionist one: the Commission's effort focuses on regulating digital services (mainly messaging apps) by putting the onus on them to use technology tools to scan users' communications in order to detect and report illegal activity.
In recent years, mainstream messaging apps have operated under a temporary derogation from the bloc's ePrivacy rules, which deal with the confidentiality of digital communications; following the latest extension, the carve-out runs until May 2025. It allows them to voluntarily scan people's communications for child abuse material in certain scenarios.
However, the child abuse regulation would create permanent rules that essentially mandate AI-based content scanning across the EU.
Critics of the proposal argue it would force messaging platforms to scan users' private communications by default using imperfect technologies, with dire consequences for people's privacy. They also warn that the law would put the EU on a collision course with strong encryption, since it would force end-to-end encrypted (E2EE) apps to degrade their security in order to comply with content-screening requirements.
Concerns about the proposal run so deep that the bloc's own data protection supervisor warned last year that it represents a tipping point for democratic rights. According to a leaked assessment, the Council's own legal service also considers the proposal incompatible with EU law. If passed, the law would almost certainly face legal challenges, since EU law prohibits the imposition of general monitoring obligations.
So far, the EU's co-legislators have been unable to agree on a way forward on the file. But the proposal remains on the table, and so do all the risks it poses.
Expansive CSAM detection orders
The European Commission's original proposal stipulates that, once served with a detection order, platforms would be required to detect not only known CSAM (i.e., child abuse imagery that has previously been identified and hashed for detection) but also unknown CSAM (i.e., new abuse imagery). That further ratchets up the technical challenge of detecting illegal content with high accuracy and low false positives.
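The distinction matters technically: known material can be matched against a hash list, whereas unknown material requires a classifier to judge never-before-seen images, a far harder and more error-prone task. The minimal Python sketch below (all names hypothetical; real deployments typically use perceptual hashes, PhotoDNA-style fingerprints that survive resizing and re-encoding, rather than exact digests) shows why known-content matching is the comparatively tractable half of the requirement:

```python
import hashlib

# Hypothetical hash list, e.g. supplied out of band by a clearinghouse.
# Real systems use perceptual hashes robust to re-encoding; an exact
# SHA-256 match, as here, is defeated by changing a single byte.
KNOWN_HASHES: set[str] = set()

def is_known_image(image_bytes: bytes) -> bool:
    """Flag only byte-identical copies of previously identified images."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

Unknown CSAM has no equivalent lookup table: detecting it means running a machine-learning classifier, and that classifier's error rate is what drives the false-positive concerns discussed below.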
A further element of the Commission's proposal would require platforms to identify grooming activity as it happens. That means that, in addition to scanning uploaded imagery for CSAM, apps would need to be able to parse users' communications to try to determine when an adult user might be attempting to lure a minor into sexual activity.
Using automated tools to parse routine interactions between app users for signals that might portend abuse carries enormous potential for misinterpreting innocent conversations. Taken together, opponents of the proposal suggest, the Commission's expansive CSAM detection requirements would turn mainstream messaging platforms into mass surveillance tools.
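To see how easily such signals misfire, consider a deliberately naive sketch. This keyword heuristic is far cruder than any production classifier would be, but the failure mode it exhibits, context-free pattern matching on ambiguous language, is the same one critics point to:

```python
# Deliberately naive grooming heuristic: flags messages containing terms
# from a watchlist. Real detectors are ML-based, but share the core
# problem of judging intent without context.
WATCHLIST = {"secret", "don't tell", "meet up", "how old"}

def looks_like_grooming(message: str) -> bool:
    text = message.lower()
    return any(term in text for term in WATCHLIST)

# An entirely innocent exchange trips the detector:
print(looks_like_grooming("Don't tell mum, it's a secret birthday party!"))
# -> True (a false positive)
```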
“Chat Control” is the headline moniker critics have coined to encapsulate concerns about the EU passing a law that demands blanket scanning of citizens' digital messages, including screening of the text exchanges they send.
What about end-to-end encryption?
Notably, the European Commission's original proposal for a regulation to combat child sexual abuse does not exempt E2EE platforms from the CSAM detection requirements, either.
And since the use of E2EE means such platforms do not hold encryption keys, and therefore cannot access readable versions of users' communications, secure messaging services would face an obvious compliance problem if they were legally required to understand content they cannot see.
As such, critics of the EU's plan warn the law would force E2EE messaging platforms to degrade the flagship security protections they offer by implementing risky technologies such as client-side scanning as a compliance measure.
The Commission's proposal does not name the specific technologies platforms should deploy for CSAM detection; that decision is left to an EU Centre on Child Sexual Abuse the law would establish. But experts predict it would most likely be used to force the adoption of client-side scanning.
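For the unfamiliar, client-side scanning means content is checked on the user's device before it is encrypted, so the encryption itself is technically preserved while its privacy guarantee is hollowed out. The sketch below (Python, all names hypothetical, with stubbed crypto and reporting) illustrates the pattern critics are warning about, not any platform's actual implementation:

```python
import hashlib

KNOWN_HASHES: set[str] = set()  # hypothetical hash list pushed to the device

def encrypt_for(recipient: str, plaintext: bytes) -> bytes:
    """Stand-in for a real E2EE layer (e.g. a Signal-protocol session)."""
    return b"<ciphertext placeholder>"  # not real cryptography

def file_report(content: bytes) -> None:
    """Stub: in the warned-about design, flagged content leaves the device."""

def transmit(recipient: str, ciphertext: bytes) -> None:
    """Stub: hands ciphertext to the delivery server."""

def send_image(recipient: str, image_bytes: bytes) -> None:
    # The scan runs on-device, *before* encryption: the server never sees
    # plaintext, yet the message is screened while it is still readable.
    if hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES:
        file_report(image_bytes)
        return
    transmit(recipient, encrypt_for(recipient, image_bytes))
```

The security objection, as critics frame it, is that the scanning hook itself becomes a target: whoever controls the hash list or the reporting path can repurpose the mechanism, which is why they call the approach risky.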
Another possibility is that platforms committed to strong encryption could choose to withdraw their services from the region altogether. Signal, for example, has previously threatened to exit a market rather than be forced by law to compromise user security. That prospect could leave people in the EU without access to mainstream apps that use gold-standard E2EE protocols to secure digital communications, such as Signal, Meta-owned WhatsApp, and Apple's iMessage, to name a few.
Opponents of the proposal argue that none of the measures the EU has drafted would have the intended effect of preventing child abuse. Instead, they predict dire ramifications for app users, as the private communications of millions of Europeans are exposed to faulty scanning algorithms.
The result, they argue, is a risk of huge numbers of false positives: millions of innocent people could be wrongly implicated in suspicious activity, and bogus reports would flow down the pipeline and burden law enforcement.
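A back-of-envelope calculation shows why scale is the problem. The numbers below are assumptions for illustration, not figures from the proposal: even a classifier that wrongly flags just 0.1% of messages produces millions of false reports per day at EU-wide messaging volumes.

```python
# Illustrative base-rate arithmetic; both inputs are assumptions.
daily_messages = 10_000_000_000  # assumed EU-wide daily message volume
false_positive_rate = 0.001      # assumed 99.9%-specific classifier

false_flags_per_day = daily_messages * false_positive_rate
print(f"{false_flags_per_day:,.0f} innocent messages flagged per day")
# -> 10,000,000 innocent messages flagged per day
```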
The system envisaged by the EU's proposal would routinely expose citizens' private messages to the third parties tasked with checking the suspicious-content reports sent by platforms' detection systems. So even if a particular piece of flagged content never reached law enforcement for investigation, because it was identified as non-suspicious earlier in the reporting chain, it would still, inevitably, have been viewed by someone other than the sender and their intended recipient. RIP, privacy.
Keeping private communications secure once they have left a platform also poses an ongoing security challenge: poor security practices by any of the third parties involved in processing content reports risk exposing the reported content all over again.
There's a reason people use E2EE: they don't want their data accessible to a mass of intermediaries.
Where does this contentious plan stand now?
EU lawmaking is typically a three-way affair: the European Commission proposes legislation, and the bloc's co-legislators, the European Parliament and the Council, work with the EU executive to try to reach a compromise everyone can agree on.
In the case of the child abuse regulation, however, the EU institutions have so far taken very different views of the proposal.
A year ago, members of the European Parliament (MEPs) agreed on a negotiating position that proposed significant amendments to the Commission's proposal. Lawmakers from across the political spectrum backed substantial revisions aimed at mitigating the risks to rights, including support for carving E2EE platforms out of the scanning requirements entirely.
They also proposed limiting scanning to make it more targeted: adding a requirement that only the messages of individuals or groups suspected of child sexual abuse be screened once a platform is served with a detection order, rather than the law imposing blanket scanning of all users' messages.
Further changes backed by MEPs would limit detection to known and unknown CSAM, removing the requirement that platforms also detect grooming activity by screening text-based exchanges.
Parliament's version also asked for other types of measures to be included, such as requirements for platforms to improve user privacy protections by making profiles private by default, to reduce the risk of minors being discovered by predatory adults.
Overall, the MEPs' approach looks far more balanced than the Commission's original proposal. However, the composition of the Parliament has since changed following the EU elections, and the views of the new intake of lawmakers are less clear.
Questions also remain over what the Council, the co-legislative body made up of representatives of member states' governments, will do. It has yet to agree a negotiating mandate on the file, so talks with the Parliament cannot begin.
The Council ignored pleas from MEPs last year to fall in line with their compromise. Instead, member states appear to favor a position much closer to the Commission's original 'scan everything' stance. But opinion remains divided among member states over how to proceed, and so far enough countries have objected to the compromise texts put forward by the Council presidency to block agreement on a mandate.
Proposals leaked during Council discussions suggest member state governments are still intent on preserving the ability to scan content in bulk. A compromise text from May 2024 attempted to finesse this by rebranding the legal requirement imposed on messaging platforms as “upload moderation.”
This prompted Signal president Meredith Whittaker to make a public intervention, accusing EU lawmakers of indulging in “rhetorical games” in a bid to win support for the mass scanning of citizens' communications. Call it whatever you like, she warned: it still “fundamentally undermines encryption.”
A document leaked to the press at the time also reportedly floated the idea of asking messaging app users to consent to having their content scanned. Users who refused would have key features of their app disabled, losing the ability to send images or URLs.
In that scenario, messaging app users in the EU would essentially be forced to choose between protecting their privacy and enjoying a modern messaging experience. Those who chose privacy would be downgraded to a basic, dumb-phone-style feature set of text and audio only. Yes, that really is what the bloc's lawmakers have been considering.
There have been recent signs that support within the Council for pushing ahead with mass scanning of citizens' messages may be waning. Earlier this month, Netzpolitik reported on the Dutch government's announcement that it would abstain on another tweaked compromise, citing concerns over the implications for E2EE and the security risks posed by client-side scanning.
Also earlier this month, discussion of the regulation was pulled from another Council agenda, apparently because the text lacked the qualified majority needed to pass.
However, a number of EU countries continue to back the Commission's push to scan messages in bulk, and Hungary, which currently holds the rotating Council presidency, appears committed to keep working toward a compromise. So the risk hasn't gone away yet.
Member states could still land on a version of the proposal that enough governments are happy with to open the door to talks with MEPs, and if that happens, everything would be up for grabs in the EU's behind-closed-doors trilogue process. The stakes therefore remain high for the rights of European citizens, and for the bloc's reputation as a champion of privacy.