Recently, Wired reported on documents leaked in May from the Council of the European Union’s ongoing deliberations over the draft Child Sex Abuse Regulation. The proposal is controversial because, as presently written, it would require online service providers to proactively scan their services – including users’ private interpersonal communications – for CSA material (CSAM) and child grooming. This would be a significant escalation from current law, which merely permits services to detect those activities voluntarily (as many services do). The leaked documents reveal disagreement among EU member states over the proper scope of the regulation and how best to protect children’s safety while still upholding the rights of the 447 million people in the EU, children and adults alike.
The proactive detection requirement likely renders the CSA Regulation (called “Chat Control” by its opponents) illegal in its current draft form. That’s according to no less an authority than the Council’s own legal service (whose April analysis reaching that conclusion was also leaked) as well as the EU’s own data protection supervisor, which condemned the current draft last year. For one, existing EU law forbids the imposition of a general content monitoring obligation on online platforms; for another, the EU Charter of Fundamental Rights restricts how individuals’ rights may be limited by the state. Mandating suspicionless monitoring of all online communications for CSAM, particularly private interpersonal communications (as distinguished from publicly available content), would violate both the prohibition against general monitoring and Europeans’ fundamental privacy rights. The latter include the rights of European children, as I pointed out in comments I submitted last May. A year later, the debate over the CSA Regulation continues. One sticking point: what to do about end-to-end encryption.
A little background on end-to-end encryption
End-to-end encryption (E2EE) is a technology that renders a message unreadable by anyone except the sender and intended recipient(s) who have the necessary information to decrypt the message. Outsiders – whether that’s the platform transmitting the message, the police, nosy busybodies, or malicious snoops – can’t intercept and read E2EE messages; to them, the message is scrambled gobbledygook. Rebuffing prying eyes and ears makes E2EE a vital protection for digital privacy and security, not to mention national security, the economy, human rights, and other key interests.
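The property described above can be sketched in a few lines of toy code. This is purely illustrative (a one-time pad, not anything a real messenger would use; actual E2EE apps rely on vetted protocols such as the Signal Protocol): the point is simply that only the holders of the key can recover the plaintext.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # One-time-pad XOR: without the key, the ciphertext reveals nothing.
    return bytes(k ^ p for k, p in zip(key, plaintext))

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so the same operation decrypts.
    return bytes(k ^ c for k, c in zip(key, ciphertext))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared only by the two endpoints

ciphertext = encrypt(key, message)

# The platform relaying the message sees only scrambled bytes;
# the recipient, holding the key, recovers the original.
assert decrypt(key, ciphertext) == message
```

To everyone in the middle, `ciphertext` is exactly the “scrambled gobbledygook” described above; decryption works only at the endpoints where the key lives.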
However, E2EE also affects criminal law enforcement: Scanning digital communications and files to detect CSAM (or terrorist content, or politically verboten material, or evidence of crimes that used to be basic rights) doesn’t work when they’re end-to-end encrypted. The impact of E2EE on detection varies by the type of disfavored or illegal content in question. For most types, my research shows that online service providers have other methods of effective detection that don’t rely on scanning and thus work even in E2EE settings. The exception is CSAM, for which scanning is considered the most useful detection technique.
The prevalence of E2EE has complicated policymakers’ task of addressing the eternal tension in free societies between fighting crime and respecting individual freedom. Already, messaging apps that are E2EE by default are used by billions of people around the world, and that number will only continue to increase. Most of them are not criminals. Nevertheless, a tiny fraction of users do employ E2EE to shield criminal activity. That’s prompted regulators and law enforcement agencies around the world to take a hostile stance toward end-to-end encryption. Over the years, they have repeatedly sought to regulate or outlaw it, either directly or indirectly – typically while paying lip service to how important encryption is for cybersecurity, privacy, yada yada.
A little background on the CSA Regulation debate
The draft CSA Regulation is Europe’s entry in the long line of global regulatory efforts that, whether intentionally or incidentally, would jeopardize end-to-end encryption. The rationale for such proposals changes depending on time and place, but protecting children’s safety (especially from sexual harms) has emerged as a particularly potent justification in recent years. The gut-punch horror of child sex abuse lends gravitas to policy proposals such as the CSA Regulation that would have the effect of increasing digital surveillance while burdening the free availability of strong encryption technologies.
The regulation’s public face is EU commissioner Ylva Johansson, who has remained stubbornly resistant to criticism of “her” bill’s potentially devastating impacts. Take the EU regulatory process (which is slow and cumbersome in any circumstance), apply it to a hot-button topic like child sex abuse, and put an inflexible personality at the helm, and it’s no wonder the CSA Regulation has been such a protracted and fractious affair so far.
Given all of the above considerations, how should the CSA Regulation deal with E2EE? Imposing an affirmative obligation to detect CSAM, including in private interpersonal communications, is already legally dubious for at least two reasons, as said. But it is technically as well as legally sketchy if the obligation encompasses E2EE services: Providers can’t “read” the contents of E2EE communications – that’s the whole point of E2EE.
So how are E2EE service providers supposed to comply with the draft regulation’s detection obligation? Should E2EE services be excluded due to technical infeasibility? Or, as Johansson believes, would being subject to the mandate spur them to invent some way of complying (the magical-thinking “nerd harder” myth)?
Should the CSAM detection obligation even cover interpersonal communications services at all, whether E2EE or not, given the (likely illegal) magnitude of the intrusion on fundamental privacy rights?
Can regulators thread the needle of respecting both technical realities and existing EU law while also creating an effective legislative tool for achieving their child-safety goal?
Those are the questions discussed in the May document leak.
The May 2023 document leak
The leak consists of 20 EU member states’ responses to a survey from the Council to the Law Enforcement Working Party (LEWP), specifically its police subgroup. (The LEWP, composed of representatives from EU member states, assists with EU regulatory and policy matters involving crime and customs issues.)
That means these responses represent the views of law enforcement. It’s important to bear that in mind when reading the responses. In the eternal debate over encryption, law enforcement authorities reliably (and understandably) promote their own equities: preventing, detecting, investigating, and prosecuting crime. They tend to leave other interests to be championed by other stakeholders (e.g., human rights, the economy, intelligence and national security, international relations). It is the job of policymakers – those actually crafting the CSA Regulation’s language – to seek out and incorporate all those other equities into the final product. While policymakers tend to give law enforcement’s views a lot of weight, we should not expect that the LEWP responses alone will dictate how the CSA Regulation turns out in the end.
I say all that by way of preview to a general observation: whew do some of these cops really hate it that Europeans have the legal right and technical ability to talk to each other privately whereas they, the police, lack the legal or technical capabilities to surveil them indiscriminately en masse.
What the survey asked
These are the questions asked in the survey:
1. To what extent can encrypted CSA material be affected by a detection order? Are you in favour of including some wording in the Regulation excluding the weakening of E2EE (see, for example, recital 25 of Regulation (EU) 2021/1232)?*
2. Are you in favour of exploring if voluntary detection should be continued? If so, would you rather prolong the Temporary Regulation (EU) 2021/1232, or include its content in the CSA proposal?
3. Are you in favour of including audio communications in the scope of the CSA proposal, or would you rather exclude it as in Regulation (EU) 2021/1232?
4. With a view to detecting CSA, do you wish that detection be performed on interpersonal communications and publicly accessible content, or be limited to publicly accessible content?
* N.B.: Regulation (EU) 2021/1232 carved out a temporary exception from the EU’s stringent data privacy requirements so that online service providers could voluntarily scan for CSAM without breaking the law. This law (the “Temporary Derogation”) sunsets on August 3, 2024, which puts time pressure on the CSA Regulation deliberations and helps explain the leaked survey’s choice of questions. Recital 25 reads, “End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or weakening end-to-end encryption.”
Survey respondents could (and some did) also make additional comments and provide section-by-section feedback and suggestions on the draft regulation.
The respondents’ answers to questions 2 through 4 (as well as their additional comments) are of interest, but best left for a different write-up; this one is a helluva longread as-is. For now, let’s focus on the member states’ answers to question 1 about encryption.
Member states’ varying attitudes to encryption and communications privacy
The responses of the 20 member states (or rather, their LEWP representatives) cover a spectrum, from extremely anti-encryption and pro-police powers, to robustly pro-encryption, with quite a few countries occupying a middle ground best described as “encryption is important, but…”. I’ve sorted the 20 countries into categories according to my highly scientific classification of how I personally interpreted each country’s response.
A. Anti-Encryption, Pro-Police Powers Responses
Spain: Ban E2EE Altogether
Spain’s response has received press attention for stating outright that end-to-end encryption should be banned by law entirely. “Ideally, in our view,” they say, “it would be desirable to legislatively prevent EU-based service providers from implementing end-to-end encryption.” Spain is also against including any language in the CSA Regulation “excluding E2EE weakening,” saying (in essence) it should be up to each member state to decide how much data protection is too much to allow their citizens to have.
Spain’s stance is frankly shocking. It is vanishingly rare in 2023 to hear any democratic government – even any law enforcement agency – take such an extreme stance. For some years now, the debate about encryption has moved to a place where (as Will Cathcart recently commented during a discussion at RightsCon) “most people are trying to say they’re pro-encryption while they’re trying to take it away.” That is – as many other member states’ questionnaire responses exemplify – law enforcement authorities and other government representatives typically pay lip service to end-to-end encryption’s importance before going on to propose how it ought to be weakened or undermined.
To be sure, Spain also repeats many typical law enforcement talking points against encryption: that law enforcement has to be able to keep doing its job, that it’s “imperative” that they have access to data (for which they also say there should be data retention obligations) and “the capacity to analyze” it, “no matter how large the volume.” That is: they want the ability to surveil everything, and the more the better.
Nothing new about that. But for Spain to say they need to do their job the way they’ve always been able to, and that therefore E2EE should be illegal in the EU, is beyond the pale.
This is the most extreme example of a law enforcement position that disregards all other equities but their own. And really, banning encryption wouldn’t even be in law enforcement’s own interest, since encryption helps prevent crimes and can also be used by investigators to protect their own communications. Spanish law enforcement authorities are well aware of government officials’ need for good cybersecurity: remember, this is the country that planted spyware on the phones of Catalan politicians. That scandal alone ought to destroy the credibility of the Spanish position that E2EE should be banned.
“This is highly controversial,” Spain acknowledges. Instead, their proposed “solution” after banning E2EE is “that encryption with automatic decryption be carried out at some intermediate server of the communication,” with notice to the user that this is happening. That’s not a solution. That’s what we used to have, and it wasn’t good enough to protect the billions of people who use the Internet from the many criminal, governmental, and corporate threats to their privacy and security online.
It’s 2023, not 2013. It’s long overdue for law enforcement authorities to accept that strong encryption is here to stay and that they just need to adapt. Stop wishing for a time machine that could roll everything back to a time when our devices, web traffic, and communications mostly weren’t strongly encrypted. Those days are over. Deal with it.
Cyprus, Slovenia, Lithuania, Croatia, and Hungary all adopt a similar stance: Law enforcement access to E2EE content should be written into the CSA Regulation (and thus detection orders to E2EE services should be in-scope), because E2EE is used to shield child abuse offenses. Cyprus and Slovenia at least gesture in the direction of caring about privacy rights, whereas Lithuania thinks everyone should just trust the police. Croatia is skeptical that there are effective alternatives for CSAM detection in E2EE environments, and Hungary wants to mandate law enforcement access to data.
“Of course, such regulation should be balanced with the need to ensure the right to privacy, taking into account the jurisprudence of the European Court of Justice,” Cyprus says. Slovenia says something similar, in a little more detail: “For detection in an encrypted environment, we must use or develop technology that will interfere as little as possible with the right to privacy of those who do not commit sexual abuse.” However, Slovenia is concerned that such development could be impeded if the CSA Regulation includes language prohibiting the weakening of E2EE, so they don’t endorse that.
Lithuania begs to differ when it comes to respecting people’s privacy. They complain that European data protection regulators are too absolutist about people’s privacy rights and don’t strike the right balance between privacy and law enforcement. Mistrust in law enforcement is unwarranted, they say, because its activities “are subject to strict requirements.” Ask those spied-upon Catalan politicians (and the many other European politicians who’ve been subjected to spyware) how well they think guardrails on surveillance are working out.
Croatia says “it is of utmost importance to provide clear wording in the CSA Regulation that end-to-end encryption is not a reason not to report CSA material.” They mention the technical workshop evidently given to Member States during a prior round of CSA Regulation meetings, which apparently showcased technological tools that would supposedly enable service providers to detect CSAM even in E2EE environments. Croatia is not convinced: they say the workshop didn’t answer the question of “are there effective ways and strategies to bypass end-to-end encryption in order to identify CSAM materials and offenders distributing CSAM.” It’s nice that they’re skeptical of those hypothetical technical options, but unfortunate that they drew the exact wrong conclusion from that skepticism.
Hungary believes that law enforcement’s problems are “the result of the full end-to-end encryption used by online platforms, which makes classic data interception activities via electronic communication service providers impossible.” While they say the solution should be “proportionate to the fundamental principles of privacy and data protection,” this is pure lip service. Hungary wants “new methods of data interception and access … to maintain law enforcement capabilities, based on cooperation with major international online platforms and smart device manufacturers.” To that end, “[e]stablishing national jurisdiction would be essential to ensure data interception and access for online platform providers and smart device manufacturers.” That is: online platforms and smart devices should “cooperate” with the police by building in surveillance capabilities for law enforcement, and the CSA Regulation shouldn’t get in Hungarian law enforcement’s way by including language that prevents the weakening of E2EE. This is little better than Spain’s turn-back-time stance; they just aren’t as blunt about it.
B. Equivocal Responses: “Encryption Is Important, But…”
Belgium and Poland say they understand that strong encryption is important, but their responses indicate that they don’t actually understand how end-to-end encryption works.
“We believe in the motto, ‘security through encryption and despite encryption,’” Belgium’s response begins. If that phrase sounds familiar, it’s because it was an official resolution of the Council in 2020. “We are therefore in favour of excluding E2EE,” Belgium continues. Great! Right? “But…” Oh dear; nothing good ever comes after a government official says “encryption is important, but…”. So too here:
“...but [we] would, however, propose that service providers are responsible for the management of their own networks and encryption. Meaning that a service provider should be able to ‘deactivate’ their own encryption when a request from a judicial authority is submitted. … [I]n regard to E2EE, we would emphasi[ze] to place the responsibility on the providers.”
Poland’s response is similar. It says it favors regulatory language “aimed at avoiding the weakening of” E2EE and nods at E2EE’s importance to communications security. “However,” – ah, yes, there’s the “but…”:
“However, protecting E2EE should not be absolute and exposing children to threats. There are two important instances where E2EE can be lifted:
1. It should be made possible for the parent or the legal guardian to make an informed choice to decrypt the communication of the child being their own or under legal care.
2. By court order
In PL’s view no other concessions should be made in order to weaken encryption. Going further would probably [amount] to creating backdoors to undermine E2EE.”
That’s not how end-to-end encryption works. Providers can’t “lift” E2EE on demand for specific users or for specific messages, as both countries seem to think. If they did, the encryption wouldn’t be end-to-end anymore. It’s fundamental to E2EE that neither the user’s service provider nor the user’s mom can decrypt their messages. (That’s why children’s iCloud accounts don’t have an E2EE option, because they’re managed by a parental iCloud account.)
Both countries are saying they want one thing, then asking for its opposite. Wanting E2EE services to be able to remove E2EE upon court order is flatly incompatible with saying you favor regulatory language excluding the weakening of E2EE. And it means nothing to say you’re against “backdoors to undermine E2EE” when that’s what you're actually asking for!
The technology Belgium and Poland want simply does not exist. And that’s not for lack of trying: Computer security experts have tried for decades to invent it, because governments for decades have kept demanding it, and for decades those attempts have failed. After a quarter-century of failure to invent the magical, mythical “secure encryption that allows access for just the good guys and not the bad guys” technology, maybe it’s time for governments to accept that it can’t be done.
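The structural point here can be made concrete with a toy model (class and method names are hypothetical, for illustration only): in an E2EE design, the provider stores and forwards ciphertext but never possesses the decryption keys, so a court order to “deactivate” or “lift” the encryption has nothing to operate on.

```python
class Provider:
    """Toy model of an E2EE messaging provider. It relays opaque
    ciphertext; the decryption keys exist only on users' devices."""

    def __init__(self):
        self.relayed = []  # the provider only ever sees opaque bytes

    def relay(self, ciphertext: bytes) -> None:
        self.relayed.append(ciphertext)

    def comply_with_decryption_order(self) -> bytes:
        # There is no key to "lift": decryption is structurally
        # impossible, not a policy switch the provider could flip.
        raise LookupError("no decryption keys held for E2EE traffic")

provider = Provider()
provider.relay(b"\x8f\x02\xa1\x44")  # opaque ciphertext from a sender

try:
    provider.comply_with_decryption_order()
except LookupError as err:
    print(err)  # the only honest answer an E2EE provider can give
```

Any design in which `comply_with_decryption_order` could succeed would mean the provider holds keys after all, which is precisely the point at which the encryption stops being end-to-end.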
Several countries replied that they support regulatory language excluding the weakening of E2EE in view of its importance to privacy and cybersecurity, but that in order to be effective, the CSA Regulation should not exclude E2EE material from being subject to detection orders since E2EE services are known to be used for CSAM and grooming.
Denmark “finds it crucial that the proposal strikes the right balance between … respect for private and family life and the protection of personal data … [and] the legitimate intent to prevent and combat child sex abuse.” Denmark favors regulatory language clarifying that the Regulation doesn’t prevent providers from employing E2EE on their services. However, Denmark does not favor excluding E2EE services from the Regulation’s scope, as that “would compromise the proposal’s capacity to achieve its objective,” given that “CSAM often spreads through platforms that use E2EE” in the national police’s experience.
Romania’s nuanced response makes a different case for why E2EE services should be in-scope: Sometimes a detection order will work, sometimes not, depending on the circumstances and the particulars of the encryption technology. If the encryption is strong, then a detection order won’t work; but sometimes encryption is weak and can be broken, and in those cases, the authorities should have the legal tools to try to access the decrypted material. This explanation grounds Romania’s policy stance on technical realities rather than (as with Spain, for example) a blunt desire to maximize police powers.
In an implicit rebuttal of Belgium’s and Poland’s magical thinking, Romania goes on to say that laws mandating “backdoors” or “exceptional access” weaken overall cybersecurity, because they require adding vulnerabilities that could be exploited by malicious actors, not just by government agencies. (I don’t speak Romanian, but they’re speaking my language!) Thus, says Romania, “We agree that nothing in the proposed CSA Regulation should be interpreted as prohibiting or weakening end-to-end encryption, but…” – welp, sigh, there it is – “...also we don’t want [for] E2EE encryption [sic] to become a ‘safe haven’ for malicious actors. Therefore, we tip the scales towards protecting children.”
I was with them up until that last bit. As said, E2EE protects 447 million Europeans, the vast, vast majority of whom aren’t criminals; if anything, E2EE is a “safe haven” from malicious actors. Plus, they don’t explain what they mean by “tipping the scales”: are they saying they’re in favor of “exceptional access” despite its downsides? They don’t clarify. I’m choosing to charitably interpret their response as like Denmark’s: the Regulation shouldn’t weaken or ban E2EE, but that doesn’t mean E2EE services should be off-limits from detection orders.
Slovakia’s response evokes Romania’s “safe haven” language. Slovakia claims to agree that “end-to-end encryption is the main tool for guaranteeing information security and an essential means of enabling the digital economy and the protection of fundamental rights, including the right to privacy and freedom of expression.” They don’t really seem to mean that, though: they don’t think it’s “urgent” to include language about E2EE in the CSA Regulation. At most, they “could accept” wording “that does not go beyond that of recital 25” (bold emphasis theirs) – so long as it’s “non-operative.”
Slovakia believes “the use of end-to-end encryption (or any other forms of encryption) by a service provider cannot in itself justify non-compliance” with the CSA Regulation’s obligations, and they really want any regulatory language about E2EE to make that clear, in order to avoid “creating a legal loophole that might create a safe harbour for CSAM or grooming.” In short, they’re OK with, at most, weak, meaningless language in the CSA Regulation about E2EE – similar to their own weak, meaningless statement that E2EE is important.
Slovakia, like Croatia, is skeptical of hypothetical technical solutions for detecting CSAM in encrypted settings. Slovakia notes the “trade-off between their effectiveness in detecting illegal material and users’ privacy.” Rather than adopt those proposed solutions, Slovakia would prefer that the EU pass the CSA Regulation as a way to “stimulate” the “further technological development” of such tools, either by an EU-level center for child sex abuse prevention (which would be established by the regulation) and/or by online service providers themselves.
This is the “nerd harder” approach that Ylva Johansson believes in. To me, it’s reflective of a broader attitude by European regulators that they can simply regulate their desired reality into existence. In actuality, a “nerd harder” mandate would just create a “damned if you do, damned if you don’t” situation for E2EE service providers: If, due to encryption, they can’t comply with the CSA Regulation’s requirements to detect CSAM, they’ll be punished for that; and if they adopt a technology for CSAM detection that doesn’t adequately protect user privacy under European privacy law, they’ll be punished for that. “Protect privacy, except when we want you not to,” basically.
EU privacy law is why Ireland’s comments are pure equivocation. Ireland says E2EE services should be in-scope in the Regulation because they’re used for CSAM and “major service providers” plan to expand their use of E2EE. Nevertheless, Ireland “agree[s] with the principle that E2EE should not be prohibited or weakened.” And yet, Ireland “would be opposed, however” (aha, there’s the “but”) “to including any wording that might have the effect of restricting the effectiveness of the Regulation,” including by hindering the development of detection technology.
Ireland is in a uniquely weird spot among the 20 member states that took part in this survey. As the GDPR watchdog for U.S.-based Big Tech firms in Europe, Ireland knows it needs to step its game up on privacy enforcement. The Irish Data Protection Authority (DPA), which employs a whopping 42 people, just tried to prove it’s not toothless by fining Meta 400 million euro. Well, guess who Ireland meant when they referred to “plans by major service providers to expand the use of E2EE”? Yep, also Meta. Get better at privacy, Meta – but wait, not by using more E2EE!
So: Ireland is expected to robustly enforce EU privacy law, especially against American Big Tech companies like Meta. E2EE is a really damn good way to protect user privacy. Since Ireland’s DPA is ostensibly pro-privacy, that should mean being in favor of E2EE (and more of it). But Ireland’s law enforcement agency plainly isn’t a fan of E2EE ‘cuz of crime. But they can’t very well say tech companies shouldn’t be allowed to use or expand E2EE, because E2EE is good for privacy and Ireland is all about getting tech companies to do better at privacy.
And so Ireland’s head explodes. Because the European Union perennially wants to have both All The Privacy and Catching The Bad Guys, but they cannot for the life of them figure out how to do both at once. Witness the fact that, as noted above, the EU passed a privacy law that, oops, accidentally made it unlawful for platforms to scan voluntarily for CSAM, so the EU had to enact a special exception at the last minute – the Temporary Derogation – so that their privacy law wouldn’t literally make it illegal for platforms to help catch the bad guys.
The EU’s split-brain attitude toward privacy and law enforcement is one of the reasons the CSA Regulation is taking so long. It haunts this survey and the member states’ responses. And no part of the EU is getting pulled in two directions more strongly than Ireland. And that’s why Ireland gave this perfectly meaningless response to a survey question about encryption policy.
C. Pro-Encryption Responses
The Netherlands and Bulgaria both favor protecting E2EE, and both cited new technologies that have been discussed during the CSA Regulation deliberations that would allegedly enable the detection of CSAM even in E2EE environments. Bulgaria touches on those presentations only briefly, whereas the Netherlands’ lengthy response addresses them in greater detail.
Bulgaria says it does not support weakening E2EE, “as it is essential to ensure secure communications,” and believes “the inclusion of E2EE safeguards could be provided in the Regulation.” However, Bulgaria’s response seems to be informed by the technical presentations Member States apparently received at some point, in which “technologies were presented which are said to have the ability to detect illegal material in encrypted communication.” (I think this is the same “technical workshop” referenced by Croatia and Slovakia.) Apparently unskeptical of those technologies’ supposed abilities, Bulgaria seems to be OK with protecting E2EE in the Regulation only because the tools supposedly exist to get around E2EE anyway. Their response suggests they have not thought through the implications of those tools (assuming they actually work) – like whether, by effectively rendering E2EE irrelevant, they might undermine E2EE’s “essential” ability “to ensure secure communications” that Bulgaria seems to think is important.
The Netherlands’ essay-length response repeatedly underscores their national policy: “no making end-to-end encryption impossible.” The Dutch Parliament, they note, adopted a resolution in July 2022 “specifically instructing the Dutch government not to accept proposals which make end-to-end encryption impossible.” Thus, the Netherlands will not accept any policy outcome that (intentionally or not) forces companies to disable their E2EE in order to detect CSAM, and it is “of key importance” to them that the regulation include language addressing their concern.
To that end, the Dutch don’t think the Recital 25 language on E2EE goes far enough. Instead, they favor language clarifying that online service providers can’t be required to install or operate any CSAM detection technologies “that make end-to-end encryption impossible.” Alternatively, they propose to strengthen the Recital 25 language (bold emphasis theirs):
“End-to-end encryption is an important tool to guarantee the security, integrity and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or weakening end-to-end encryption. Any technology developed to detect CSAM as a result of this Regulation shall be fully compatible with the use of end-to-end encryption.”
How, then, “to tackle CSAM effectively” while ensuring “that end-to-end encryption is not made impossible”? The Netherlands thinks it “is neither desirable nor necessary” (emphasis theirs) for the Regulation to impose a decryption obligation on providers. That’s because, “subject to further research regarding their successful deployment on a large scale,” two technologies “may allow for automatic detection of CSAM while at the same time leaving [E2EE] intact,” both of which are “on-device solutions” that work by detecting CSAM “before the material is encrypted and sent.” Specifically, the Netherlands cites options 4(a) and 4(d) on page 309 of the EU Commission’s May 2022 impact assessment report on the draft CSA Regulation (which I suspect were also included in the “technical workshop” presentations referenced by Croatia, Slovakia, and Bulgaria).
These “on-device solutions” are varieties of what’s known as “client-side scanning,” or CSS. The Europeans have been pushing CSS as a solution to CSAM for years, as shown in a leaked 2020 draft document. This is a very dangerous trendline in their regulatory discussions. Client-side scanning technologies are unproven, as the Dutch acknowledge (indeed, both “solutions” are classed as “needs research” in the impact assessment). What’s more, even though they technically don’t break encryption, CSS techniques nevertheless undermine the privacy and security properties that people expect from using E2EE services, as the Internet Society wrote in a response to the draft CSA Regulation proposal. The tech has serious ramifications for freedom of expression: there’s no technical limitation confining its use to CSAM detection and not, say, dissident political expression too. As computer security experts (and others) have already warned, client-side scanning “neither guarantees efficacious crime prevention nor prevents surveillance. Indeed, the effect is the opposite.”
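The mechanism, and the freedom-of-expression problem with it, can be sketched in miniature. This toy uses exact SHA-256 matching where real deployments would use perceptual hashing (e.g., something PhotoDNA-like, robust to re-encoding); all the byte strings are hypothetical placeholders. The key feature is that the content is inspected in the clear, on the device, before any encryption runs, and nothing technical constrains what goes on the blocklist.

```python
import hashlib

# Opaque blocklist pushed to the client. The user cannot inspect it,
# and nothing technical confines it to CSAM; its operator could add,
# say, a dissident pamphlet just as easily.
BLOCKLIST = {
    hashlib.sha256(b"<known illegal image bytes>").hexdigest(),
    hashlib.sha256(b"<dissident pamphlet bytes>").hexdigest(),
}

def scan_before_encrypting(plaintext: bytes) -> bool:
    """True if the content matches the blocklist and would be flagged."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

# The scan runs on the plaintext, before the E2EE step ever executes:
# the encryption itself is left "intact," but the confidentiality users
# expect from E2EE is not.
assert scan_before_encrypting(b"<dissident pamphlet bytes>") is True
assert scan_before_encrypting(b"holiday photos") is False
```

This is why “doesn’t break encryption” is such a misleading defense of CSS: the surveillance simply happens one step earlier, against the plaintext.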
It’s unfortunate to see the Netherlands advocate for client-side scanning technologies. “No making end-to-end encryption impossible” comes across as a hollow vow, not a principled state policy, if your plan is to simply circumvent E2EE anyway by embedding surveillance technology on everyone’s phones.
It’s enough to make me wonder what I’m missing: The Dutch government is very cybersecurity-savvy. Their law enforcement’s investigatory and offensive security capabilities are highly sophisticated. So why the largely uncritical embrace of untested, theoretical technologies with major downsides? (For example, the Dutch compare client-side scanning to our phones’ spam filtering and auto-correct functionalities, without noting the obvious distinction that those work for the phone’s user, not against the user.) Is this simply an instance of law enforcement equities predominating over other considerations? I don’t know.
The brief, straightforward Czech Republic response advocates that the Regulation stay in its lane: the Regulation, they say, should be technologically neutral and “set out general boundaries.” They do, however, take the opportunity to add, “We consider encryption to be very important as it ensures secure communication in the online environment. … [W]e do not consider [it] appropriate to explicitly prohibit the use of encryption technologies.” Sure, they could’ve gone further and advocated for express wording excluding the weakening of E2EE, as per the survey question. But I’ll count it as a W.
Italy’s response concisely identifies several concerns: one, the assumption that the tools even exist to identify CSAM in encrypted communications (apparently they were skeptical of those technical presentations, unlike Bulgaria); two, that a general scanning obligation would be disproportionate (in violation of Europeans’ fundamental rights) because “it would represent a generalized control on all the encrypted correspondence sent through the web”; and three, that scanning everything would, in addition to privacy, also impact the police’s effectiveness by overwhelming them with a huge volume of reports (including a “considerable amount of false positives”). Italy’s practical concerns complement their legal concerns: why mandate a scanning regime that would violate everyone’s privacy rights when it wouldn’t even work well?
Malta gives a long, thoughtful, cautious response. It can be summarized as “let’s slow our roll”: respect rights, evaluate the options, look at the evidence, think about unintended consequences, and proceed conservatively. Since the current draft is likely illegal, Malta says, the Council should explore alternative approaches or additional safeguards that could cure the proposal’s legal problems and render it compliant with fundamental human rights. Some alternatives: re-using the Recital 25 language about E2EE, and replacing CSAM detection orders with “the next best measure,” namely “obligations [on providers] emanating from risk assessments including mitigation measures.” Would that be effective, they ask? And how about the current voluntary CSAM scanning legal regime – how well is that working?
Malta’s caution extends to the proposal’s potential impacts on encryption. Their response favors re-using the Recital 25 language safeguarding E2EE, and “calls for alternative solutions which will not indiscriminately interfere with encryption of telecommunication means.” Further, Malta raises the possibility of “a dedicated legislative proposal” to address law enforcement access to encrypted data pertaining to illicit content. It cautions the Council about “avoid[ing] unforeseen precedents in other areas.”
Malta is right: the CSA Regulation is not the appropriate vehicle to regulate for the entire EU such a huge, complex, controversial issue as law enforcement access to encrypted data. As demonstrated by the necessity of the Temporary Derogation permitting continued voluntary CSAM scanning, EU policymakers aren’t always great at seeing the collateral consequences of the laws they write. A law that’s supposed to be about children’s safety should not be allowed to become a stalking horse for expanding surveillance capabilities while eroding privacy and cybersecurity. (It would have been nicer, however, if Malta had gone a step further and opposed any legal mandate that providers give law enforcement access to encrypted data, whether it’s standalone or shoehorned into some other regulation.)
Funnily enough, the most strongly pro-encryption, pro-privacy, pro-cybersecurity responses to the survey came from (a) the two respondents that are directly next door to Russia – you know, the country with advanced offensive cyber capabilities? The one that started a land war in Europe last year after years of cyberattacks? – and (b) the country that’s deeply suspicious of police surveillance thanks to its Stasi past.
You’d think other countries with a ghastly recent past under dictatorships or Communism might be just as averse as Germany to mandating suspicionless mass surveillance of everything everybody says and does online, but you’d be wrong. Heck, you’d think the NATO countries would be just as vehement as Finland and Estonia about not enacting any regulation that would intentionally or inadvertently weaken their cybersecurity posture in an age of Russian impunity, but you’d be wrong. Yeah, I don’t get it either.
Estonia, uniquely, gave two sets of answers to the four survey questions: one from its LEWP representative, and another from its Ministry of Economic Affairs and Communications. (This illustrates the point I made above, that the survey responses for a given country come from law enforcement and may not represent other stakeholders’ equities.) The law enforcement response explains that detection orders served on E2EE services would be unenforceable, so if the EU mandates data decryption capabilities, then E2EE service providers in the EU will either reengineer their systems or simply shut down.
In its separate response, the economic ministry supports adding language akin to Recital 25 to prohibit the weakening of E2EE. “Estonia does not support the possibility of creating backdoors for end-to-end encryption solutions,” they say (bold emphasis theirs). “End-to-end encryption is an important tool to guarantee the security and confidentiality of the internet infrastructure and the communications of users. Any weakening of encryption could potentially be abused by malicious third parties. Therefore, end-to-end encryption should not be weakened.” Yes! Exactly! This response underscores why it’s crucial that policymakers seek input from multiple stakeholders and not just law enforcement.
Intriguingly, the economic ministry adds: “At the same time, we can support the use of privacy enhancing technologies (PETs) that allow the analysis of encrypted content without decryption, so that the reliability, security and integrity of digital services relying on encryption is preserved.” They don’t specify what technologies they mean, exactly (or whether they’re anything more than academic proof-of-concept right now). But they’re not embracing client-side scanning, which is a relief.
Finland’s response is even more emphatic. Online service providers have responsibilities to create a safer online environment, they say, but nevertheless, the CSA Regulation must not restrict strong encryption, endanger cybersecurity or communications security, or lead in practice to encryption backdoors.
The Finns frown on the murky draft language’s potential negative impacts on E2EE and communications confidentiality overall: “Considering the importance of encryption to confidentiality of communications (respect for private or family life), freedom of speech, high level of data protection as well as cybersecurity, this Regulation’s impact on end-to-end encryption should not remain unsatisfactorily ambiguous.” They note that the current draft might not conform to the EU Charter of Fundamental Rights (or, potentially, the Finnish Constitution). They’re equally skeptical about the technological feasibility of the CSA Regulation: they want more information “about the technical and organizational means behind the detection order[s],” and want the EU Presidency & Commission to provide more information “about measures and technologies that would not undermine use of encryption and would not jeopardize security of information services and systems, but that would help fight CSAM online.”
The Finnish response is a concise distillation of what I’ve been saying for seven years: Strong encryption is “an essential tool to guarantee trust in the online environment” that “secures digital systems” and “protects privacy and personal data of the users.” Therefore, the law should not impose “restrictions on strong encryption of electronic communications,” and “must not endanger cyber security or the security of communication and information systems” or “lead to undermining the security of communication systems and services”; after all, “any backdoors for justified purposes could potentially be abused by malicious third parties.”
Finally, Germany’s response is a reminder that encryption doesn’t just protect individuals’ privacy; it also protects the government’s own affairs. The Germans are dubious of the current draft’s detection order provisions; the CSA Regulation, they say, “must uphold fundamental rights, in particular when it comes to protecting the confidentiality and privacy of communication.” What’s more, “[f]or the Federal Government, a high level of data protection and cyber security, including complete and secure end-to-end encryption in electronic communications, is essential. With this in mind, Germany believes it is necessary among other things to state in the draft text that no technologies will be used which disrupt, weaken, circumvent or modify encryption.” That phrasing (which goes beyond “the weakening of E2EE” as per the survey question) suggests the German government isn’t on board with client-side scanning.
These three countries stand above the rest for treating privacy as a fundamental right and cybersecurity as a vital state interest, and insisting that the language of the Regulation reflect that. Of all the responses, they’re the most well-rounded, taking into account other important equities besides just those of law enforcement.
These countries’ voices will be absolutely critical as the CSA Regulation continues to trundle shakily along. To borrow from Greece (which apparently didn’t respond to this survey), Estonia, Finland, and Germany are like a chorus cautioning the Council about the perils of doggedly proceeding on its current path.
It is unacceptable in a democratic society to make digital intermediaries monitor everyone’s communications without any suspicion of wrongdoing. Child sex crimes are abhorrent, but that doesn’t justify discarding the fundamental rights of half a billion people (and everyone they talk to). Suspicionless mass surveillance is a wildly disproportionate invasion of individual privacy. And when democracies do it, they give repressive regimes an excuse to do the same – and for “crimes” far afield from child sex abuse. Europe should not set a precedent in this regard. (Especially since it’s debatable which of those two columns, democratic or repressive, certain EU members and aspirants – Hungary, Poland, and Turkey – belong in.)
Plus, the proposal’s deleterious effects on cybersecurity should be completely unacceptable to a Council that’s had front-row seats to a land war in Europe for over a year now. Ukraine was invaded without provocation by an aggressor that has some of the best hacking capabilities of any country in the world and isn’t afraid to use them. European nations – their governments, their businesses, their people – need all the digital security they can get, not a new law that undermines their ability to protect themselves.
The Council should go back to the drawing board on the CSA Regulation. The stakes are too high to get this wrong.
UPDATE, June 12, 2023: Since posting this, I've learned that Spain will assume the EU Presidency on July 1 (which, tbh, I should've known already), which is unsettling given their LEWP stance described above. More notably, there's a news report today that, on its way out the door, the Swedish Presidency just issued a revamped version (a "compromise text") of the draft CSA Regulation that allegedly includes language to protect encryption -- and, in grand tradition, the draft leaked.
The leaked draft can be found here. It anticipates a LEWP (Police) meeting scheduled for tomorrow (June 13). Its revisions to the CSA Regulation's draft language include the following:
"This Regulation shall not prohibit, make impossible, weaken, circumvent or otherwise undermine cybersecurity measures, in particular encryption, including end-to-end encryption, implemented by the relevant information society services or by the users. This Regulation shall not create any obligation to decrypt data."
The language is accompanied by the following footnote comment by the Presidency:
"[O]n the basis of the discussions in the LEWP meeting on 2 June, the PCY wishes to initiate an exchange of views on the way forward on a provision regarding encryption and the need to ensure that cybersecurity is not undermined. ..."
In another footnote, the Presidency suggests including the following recital:
“Cybersecurity measures, in particular encryption technologies, including end-to-end encryption, are critical tools to safeguard the security of information within the Union as well as trust, accountability and transparency in the online environment. Therefore, this Regulation should not adversely affect the use of such measures, notably encryption technologies. Any weakening or circumventing of encryption could potentially be abused by malicious third parties. In particular, any mitigation or detection measures should not prohibit, make impossible, weaken, circumvent or otherwise undermine cybersecurity measures irrespective of whether the data is processed at the device of the user before the encryption is applied or while the data is processed in transit or stored by the service provider.”
That last line is a pretty overt rejection of client-side scanning. I'm looking forward to hearing what goes down at the June 13 LEWP meeting. This is a very positive development (and I'm super glad I got this giant blog post out the door right before it happened!) - huge thanks to the member states that stood up for encryption, and to the many European civil society organizations and academics who've been sounding the alarm about the CSA Regulation draft for a year now. But the work isn't done; as the Regulation continues moving forward, we have to keep up the pressure to protect Europeans' fundamental rights and safeguard end-to-end encryption in the EU.