A controversial push by European Union lawmakers to legally require messaging platforms to scan citizens' private communications for child sexual abuse material (CSAM) could lead to millions of false positives per day, hundreds of security and privacy experts warned in an open letter Thursday.
Concerns about the EU's proposal have grown since the European Commission put forward its CSAM-scanning plan two years ago, with independent experts, members of the European Parliament, and even the EU's own data protection supervisor sounding the alarm.
The EU proposal would not only require messaging platforms served with a CSAM detection order to scan for known CSAM; they would also have to use unspecified detection technologies to try to identify unknown CSAM and detect grooming activity as it takes place, leading to accusations that lawmakers are indulging in magical-thinking levels of technological solutionism.
Critics contend the proposal asks for the technologically impossible and will not achieve its stated aim of protecting children from abuse. Instead, they say, it will wreak havoc on internet security and web users' privacy by forcing platforms to blanket-monitor all their users while deploying risky and unproven technologies, such as client-side scanning.
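For context, client-side scanning generally refers to checking content on the user's device before it is encrypted and sent, typically by comparing a fingerprint of the content against a database of known material. Below is a minimal conceptual sketch in Python; the hash choice, names, and placeholder database are illustrative assumptions, not any real system's design:

```python
import hashlib

# Hypothetical database of fingerprints of known material; a real
# deployment would ship a vetted, regularly updated list.
KNOWN_FINGERPRINTS = {"placeholder-fingerprint-of-a-known-image"}

def fingerprint(image_bytes: bytes) -> str:
    # Real systems use perceptual hashes that survive resizing and
    # re-encoding; a cryptographic hash keeps this sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_send(image_bytes: bytes) -> bool:
    """Return True if the outgoing image matches the database, i.e. it
    would be flagged before end-to-end encryption is ever applied."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

The objection the experts raise is that this check necessarily runs on plaintext, on the device, before end-to-end encryption is applied, which is why they argue it undermines E2EE's guarantees no matter how the rest of the pipeline is secured.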
Experts say no technology exists that can meet the law's requirements without doing far more harm than good. Yet the EU is pressing ahead regardless.
The latest open letter addresses amendments to the draft CSAM-scanning regulation recently proposed by the European Council, which the signatories argue fail to address fundamental flaws in the plan.
The letter's signatories (270 at the time of writing) include hundreds of academics, among them well-known security experts such as Professor Bruce Schneier of the Harvard Kennedy School and Dr. Matthew D. Green of Johns Hopkins University, along with researchers working at technology companies including IBM, Intel, and Microsoft.
An earlier open letter (last July), signed by 465 academics, warned that the detection technologies the legislation would force platforms to deploy are “seriously flawed and vulnerable to attack,” and would lead to a significant weakening of the vital protections provided by end-to-end encrypted (E2EE) communications.
Little support for counterproposals
Last fall, members of the European Parliament united to push back with a substantially revised approach that would limit scanning to individuals and groups already suspected of child sexual abuse; limit detection to known and unknown CSAM, removing the requirement to scan for grooming; and remove risks to E2EE by limiting scanning to platforms that are not end-to-end encrypted. However, the European Council, the other co-legislative body involved in EU lawmaking, has yet to take a position on the matter, and where it lands will influence the final shape of the law.
The latest amendments on the table were put forward in March by the Belgian Council presidency, which is leading discussions on behalf of EU member state governments. But in the open letter the experts warn that these proposals still fail to address the fundamental flaws baked into the Commission's approach, arguing that the amendments would still create “unprecedented capabilities for surveillance and control of internet users” and would “undermine a secure digital future for our society,” with potentially enormous consequences for democratic processes in Europe and beyond.
Tweaks up for discussion in the amended Council proposal include a suggestion that detection orders could be more targeted by applying risk categorization and risk mitigation measures, and that cybersecurity and encryption could be protected by ensuring platforms are not obliged to create access to decrypted data and by having detection technologies vetted. But the 270 experts suggest this amounts to tinkering around the edges of a security and privacy crisis.
“From a technical standpoint, to be effective, this new proposal will also completely undermine communications and systems security,” they warn. And relying on “flawed detection technology” to determine cases of interest, so that more targeted detection orders can be sent, will not reduce the risk of the law ushering in a dystopian era of “massive surveillance” of web users' messages, in their analysis.
The letter also tackles a Council proposal to limit the risk of false positives by defining “persons of interest” as users who have already shared CSAM or attempted to groom children, which it is envisaged would be done via an automated assessment: for example, waiting for one hit for known CSAM, or two hits for unknown CSAM or grooming, before a user is officially detected as a suspect and reported to the EU Centre that would handle CSAM reports.
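To make that mechanism concrete, here is a minimal sketch of such a hit-count threshold scheme in Python. The thresholds mirror the example above; everything else (names, data structures) is a hypothetical illustration, since the proposal does not specify an implementation:

```python
from collections import defaultdict

# Hypothetical thresholds mirroring the scheme described above:
# one hit for known CSAM, two for unknown CSAM or grooming.
THRESHOLDS = {"known_csam": 1, "unknown_csam": 2, "grooming": 2}

# user id -> category -> number of detector hits so far
hits = defaultdict(lambda: defaultdict(int))

def record_detection(user_id: str, category: str) -> bool:
    """Record one detector hit and return True if the user would now
    be flagged as a 'person of interest' under this scheme."""
    hits[user_id][category] += 1
    return hits[user_id][category] >= THRESHOLDS[category]

# A first unknown-CSAM hit does not flag the user...
assert record_detection("user123", "unknown_csam") is False
# ...but a second hit crosses the hypothetical threshold.
assert record_detection("user123", "unknown_csam") is True
```

The experts' point, developed below, is that with detectors that make errors, any such threshold trades off missed detections against a flood of innocent users crossing it by chance.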
Billions of users, millions of false positives
Experts warn that this approach could still generate a huge number of false alarms.
“The number of false positives due to detection errors is highly unlikely to be significantly reduced unless the number of repetitions is so large that the detection stops being effective. Given the large amount of messages sent on these platforms (on the order of billions), one can expect a very large number of false alarms (on the order of millions),” they wrote, noting that the platforms likely to end up served with a detection order could have millions, or even billions, of users, as in the case of Meta-owned WhatsApp.
“As there is no public information on the performance of the detectors that could be used in practice, let us imagine we had a detector for CSAM and grooming, as the proposal puts forward, with a false positive rate of just 0.1% (i.e., one in a thousand times it incorrectly classifies non-CSAM as CSAM), which is much lower than any currently known detector.
“Given that WhatsApp users send 140 billion messages a day, even if only 1 in 100 messages were tested by such a detector, there would be 1.4 million false positives every single day. To keep the false positives down to a few hundred, statistically one would need to identify at least five repetitions using different, statistically independent images or detectors. And this is only for WhatsApp; considering other messaging platforms, including email, the number of repetitions needed would grow significantly, to the point that CSAM-sharing capabilities would not be effectively reduced.”
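The headline figure is simple to verify. Here is the letter's arithmetic as a short back-of-the-envelope check in Python; the inputs are the numbers quoted above, and the uniform 1-in-100 sampling is the assumption stated in the quote:

```python
# Reproducing the letter's back-of-the-envelope figures. The inputs are
# the numbers quoted above; uniform sampling of messages is assumed.
messages_per_day = 140e9       # WhatsApp messages sent per day
scan_fraction = 1 / 100        # suppose only 1 in 100 messages is scanned
false_positive_rate = 0.001    # the hypothetical 0.1% detector

scanned_per_day = messages_per_day * scan_fraction        # 1.4 billion
false_positives_per_day = scanned_per_day * false_positive_rate
print(f"{false_positives_per_day:,.0f} false positives/day")  # 1,400,000
```

At 1.4 million false alarms a day from a single platform, even a detector far better than any known today would bury investigators in innocent users' messages, which is the letter's core statistical objection.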
Another Council proposal, to limit detection orders to messaging apps deemed “high risk,” is a useless revision in the signatories' view, since it is still likely to “indiscriminately affect large numbers of people.” Here they point out that exchanging CSAM requires only standard features, such as image sharing and text chat, that are widely supported by many service providers, meaning a high-risk categorization will “definitely impact many services.”
They also note that the adoption of E2EE is increasing, which they suggest will raise the likelihood of services that deploy it being categorized as high risk. “This number is likely to increase further due to the interoperability requirements introduced by the Digital Markets Act, resulting in messages flowing between low-risk and high-risk services. As a result, almost any service could be classified as high risk,” they argue. (Note: Messaging interoperability is a core plank of the EU's DMA.)
Backdoors for backdoors
Turning to the protection of encryption, the letter reiterates a message that security and privacy experts have been repeatedly telling lawmakers for years: “Detection in end-to-end encrypted services by definition undermines encryption protection.”
“The new proposal has as one of its goals to ‘protect cybersecurity and encrypted data, while keeping services using end-to-end encryption within the scope of detection orders.’ As we have explained before, this is contradictory,” they stress. “The protection given by end-to-end encryption implies that no one other than the intended recipient of a communication should be able to learn any information about the content of that communication. Enabling detection capabilities, even for data before it is encrypted, violates the very definition of confidentiality provided by end-to-end encryption.”
In recent weeks, police chiefs from across Europe have issued their own joint statement raising concerns about the expansion of E2EE and calling on platforms to design their security systems in such a way that illegal activity can still be identified and reports on message content sent to law enforcement.
The intervention is widely seen as an attempt to put pressure on lawmakers to pass legislation like the CSAM-scanning regulation.
Police chiefs have denied they are calling for encryption to be backdoored, but they have not explained exactly which technical solutions they want platforms to adopt to enable the “lawful access” they seek. Squaring that circle puts a very oddly shaped ball back in lawmakers' court.
If the EU continues on its current path – assuming the Council fails to change course as MEPs have urged – the consequences will be “catastrophic,” the letter's signatories go on to warn. “It sets a precedent for filtering the internet and prevents people from using some of the few tools available to protect their right to a private life in the digital space; it will have a chilling effect, in particular on teenagers who heavily rely on online services. It will change how digital services are used around the world and is likely to negatively affect democracies across the globe.”
An EU official close to the Council was unable to provide insight into the current discussions between member states, but confirmed that a working party meeting on May 8 would discuss the proposed regulation to combat child sexual abuse.