On Tuesday, June 23, Senators Graham (R-SC), Cotton (R-AR), and Blackburn (R-TN) introduced a bill that is a full-frontal nuclear assault on encryption in the United States. You can find the bill text here. It's been formally introduced as Senate bill 4051, which you can track here. (Other reactions to the bill so far: EFF, Techdirt.)
Dubbed the “Lawful Access to Encrypted Data Act of 2020” (acronym: LAED, which my fingers definitely do not mis-type as LEAD every single time), the bill is an actual, overt, make-no-mistake, crystal-clear ban: it prohibits providers from offering end-to-end encryption in online services, from offering encrypted devices that cannot be unlocked for law enforcement, and indeed from offering any encryption that does not build in a means of decrypting data for law enforcement.
The new bill applies to operating systems and apps and messaging and chat and social media platforms and email and cloud storage and videoconferencing and smartphones and laptops and desktops and your Xbox, and probably voting machines and IoT devices – basically any electronic device with as little as 1 GB of storage capacity. It isn’t just aimed at Apple, Google, Facebook, Signal, and the like, though it certainly applies to them; it goes well beyond, to include everyone from Box and Dropbox, to the full range of Microsoft’s products, to OEM handset manufacturers.
This bill is the encryption backdoor mandate we’ve been dreading was coming, but that nobody, during the past six years of the renewed Crypto Wars, had previously dared to introduce. Well, these three senators finally went there.
Yes, It’s Really Bad
I’m still trying to digest the bill, which is over 50 pages long, so this is my initial, quick-and-dirty impression of it. Take it with a grain of salt.
The bill’s wording is unambiguous: providers, across the spectrum of devices and information services, must design in the ability to decrypt data and provide it in intelligible form. The mandate applies to providers with a million or more U.S. users. It covers both stored data (whether held locally on a device or remotely) and data in motion (i.e., communications in transit). Providers of the former are covered if they have had a million or more users, or sold a million or more devices, in the U.S. in 2016 or any year since. For providers of the latter, the threshold is one million monthly active users (MAUs) in the U.S. in January 2016 or any month thereafter. Providers will bear the costs of the proactive redesign; they will not be compensated by the government.
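To make the two coverage thresholds concrete, here is a rough sketch of my reading of them. This is an illustration only, not language from the bill; the function and parameter names are mine, and the year/month cutoffs are as described above.

```python
# Hypothetical sketch of the LAED Act's two coverage thresholds, as I read them.
# All names here are my own illustration, not anything taken from the bill text.

def covered_for_stored_data(annual_us_counts: dict) -> bool:
    """Covered if the provider had >= 1M U.S. users, or sold >= 1M devices,
    in 2016 or any year since. Keys are years, values are annual counts."""
    return any(count >= 1_000_000
               for year, count in annual_us_counts.items()
               if year >= 2016)

def covered_for_data_in_motion(monthly_us_maus: dict) -> bool:
    """Covered if the provider had >= 1M U.S. monthly active users in
    January 2016 or any month thereafter. Keys are (year, month) tuples."""
    return any(maus >= 1_000_000
               for (year, month), maus in monthly_us_maus.items()
               if (year, month) >= (2016, 1))
```

Note the asymmetry this reading implies: because the lookback reaches to 2016 with no sunset, a provider that peaked at 1.2 million U.S. devices in 2017 stays covered even if its user base has since collapsed.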
The point of compelling providers to design for decryptability is so that, when/if they are served with a search warrant or other court order for someone’s (i.e. your) data or device, they will be able to decrypt and hand over the relevant data in legible form. The way that process will work is that, for stored data (either remotely or on a local device) law enforcement will go before a judge and apply for a court order requiring technical assistance from the provider (which they can do either concurrently with the search warrant application, or after getting the warrant). If law enforcement can show “reasonable grounds to believe” that the assistance “will aid in the execution of the warrant,” then the judge must issue the technical-assistance order; she does not have the discretion to reject the application so long as she finds the applicant has made that “reasonable grounds” showing.
For data in motion, the bill contemplates technical-assistance orders that will be issued to effectuate a different kind of court order from a search warrant. Search warrants are for stored data; for data in motion, we’re talking about orders for wiretapping communications (such as phone calls, text conversations, or email conversations) under the federal Wiretap Act, and orders for monitoring communications metadata pursuant to the federal Pen Register Act. Both of those statutes already contain provisions for the issuance of technical-assistance orders to providers, to help law enforcement implement the wiretap or pen register/trap-and-trace device. (Wiretaps intercept contents of communications; PR/TT devices capture metadata. For the former, think of the body of an email or text message; for the latter, think of the email header, or the phone numbers to/from which you send and receive texts.) This bill amends those statutes to expressly state that technical-assistance orders shall include decrypting the data that’s captured. Per the existing statutory language, these technical-assistance orders under the Pen Register and Wiretap Acts were already “shall issue” (no discretion by the judge so long as the requisite showing is made).
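The contents/metadata split described above can be pictured as a simple partition of a message’s fields. This is purely illustrative (the field names and buckets are my own shorthand for the statutory categories, not definitions from either statute):

```python
# Illustrative only: which parts of a message fall under which statute,
# per the contents (Wiretap Act) vs. metadata (Pen Register Act) split
# described above. Field names and bucket contents are my own shorthand.

WIRETAP_ACT_CONTENTS = {"subject_line", "body", "attachments"}
PEN_REGISTER_METADATA = {"from_addr", "to_addr", "timestamp", "routing_headers"}

def split_message(msg: dict) -> tuple:
    """Partition a message's fields into contents and metadata buckets."""
    contents = {k: v for k, v in msg.items() if k in WIRETAP_ACT_CONTENTS}
    metadata = {k: v for k, v in msg.items() if k in PEN_REGISTER_METADATA}
    return contents, metadata
```

Under the bill, the decryptability mandate reaches both buckets: a technical-assistance order can demand either one in decrypted form.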
For all of these technical-assistance orders (stored data and data in motion), there’s a limitation: the provider that receives the order must decrypt “unless the independent actions of an unaffiliated entity make it technically impossible to do so” – i.e., unless the data was encrypted by somebody else, not the provider.
So what would this mean if the bill were enacted? No longer will Facebook be allowed to respond that they lack the ability to decrypt WhatsApp messages; no longer will Apple be allowed to say they don’t have the ability to unlock an iPhone. If this bill passes, they will have to redesign those products so that they are able to decrypt. What’s more, Mozilla and Cloudflare better look out, too: Senator Graham didn’t forget about HTTPS and DNS over HTTPS. The encryption debate has been pretty much entirely about locked devices and E2EE messaging apps. Encrypted metadata has rarely been raised as a problem by U.S. law enforcement. And yet, the bill’s decryptability requirement also applies to metadata, in the context of technical-assistance orders under the Pen Register statute.
What if a provider hasn’t already designed a decryption capability? In that case, the Attorney General can simply command it to build one, using what’s called an “assistance capability directive.” (If it does already have that capability, the AG can use the directive to command it to maintain it.) That isn’t limited to the million-plus club; any provider can be served with such a directive. That is, the “big” providers have to proactively design for decryptability, and the “little guys” with fewer than a million U.S. users better gird their loins. (But unlike proactive redesigns, providers do get compensated for the cost of developing a capability in response to a notice, so at least they’ve got that going for them.) The provider can challenge the directive in court, but if the court upholds the directive (in whole or in part) and orders the provider to comply, then any refusal by the provider to comply with the order will constitute contempt of court.
To create a decryption capability, providers are allowed to farm the work out to contractors, but the contractor must be U.S.-based. (Sorry, Cellebrite.)
By the way, all this stuff about search warrants and pen registers and wiretaps is the part of the bill that pertains to domestic law enforcement surveillance. This whole regime of mandating decryptability also applies to electronic surveillance under the Foreign Intelligence Surveillance Act. This is a national security bill, not just a domestic surveillance backdoor bill. I’m going to admit, though, that I am an ignorant dunce about FISA, so I’m going to wait for someone who knows FISA better than I do to explain the bill’s ramifications for FISA surveillance. But at first glance, it looks like it goes even further than the domestic provisions do. Which is even scarier.
Oh, and because we have to gamify everything including your privacy, the bill also includes a prize competition “to incentivize and encourage research and innovation into solutions providing law enforcement access to encrypted data pursuant to legal process.” The prize can be (but need not be) awarded to “technological solutions that provide law enforcement access to encrypted data pursuant to legal process.” Why? Why bother? That prize only makes sense in the current environment, where there is no backdoor mandate and the DOJ has spent years complaining that not enough academics are doing the DOJ’s work for them by trying to come up with a secure golden backdoor key. If you pass a law forcing providers to figure out how to come up with a lawful-access mechanism, and at the same time prohibit the government from making providers adopt any particular solution that somebody might invent, then the need to incentivize this research goes away! It’s like passing a law mandating that everybody’s house has to be made out of super-fragile, highly transparent glass, so that the police can see what everybody is doing inside their homes and easily break down somebody’s wall if they see something they don’t like, and then creating a prize for the glazier whose glass doesn’t give the cops an owie when it shatters. The prize just doesn’t matter anymore once the mandate is in place.
The Domino Effect
God, what a mess. Essentially, this bill is CALEA for electronic devices and The Internet. Weirdly, though, the bill is so poorly drafted that it closes only the encryption carve-out in CALEA, Section 1002(b)(3), not the carve-out for “information services,” Section 1002(b)(2). Equally weirdly, much of this bill could have been achieved by amending the CALEA statute, but instead it spreads out its provisions among various parts of the U.S. Code. (Including in places where they don’t belong: the bill adds a technical-assistance provision to the Stored Communications Act, bringing it in line with its ECPA brethren the Wiretap Act and the Pen Register Act; but the bill doesn’t amend the part of the SCA where you’d expect that provision to go, instead adding it into the section of the U.S. Code about searches and seizures.)
The bill reads like an unholy combination of CALEA, Senators Burr and Feinstein’s 2016 “Compliance with Court Orders Act” bill (which went nowhere), and Australia’s 2018 Assistance and Access Act (which did pass). The “assistance capability directive” provisions seem to be modeled on the Australian law’s “technical capability notices” and “technical assistance notices,” which in turn were modeled on the UK’s 2016 Investigatory Powers Act. As I feared, Australia and the UK are the dominos that have tipped over onto their Five Eyes ally the United States. The U.S. Senate can point to Australia and the UK as evidence that it’s OK for a democracy to severely restrict people’s ability to communicate privately and secure their data.
This bill also has more in common with the Australian law than with the 2016 Burr/Feinstein bill in terms of how broad in scope it is. The CCOA bill applied only to devices. But that was 2016, a more innocent time; here we are in 2020. If this bill had come out 18 months ago, it’s questionable whether it would have targeted messaging apps as well as smartphones. Starting last year, though, the encryption debate in the U.S. expanded beyond devices to encompass E2EE messaging as well. Both were topics of the December Senate Judiciary Committee hearing at which the new bill’s sponsors threatened to introduce the legislation they have now put forth. (Nevertheless, it’s still curious to me that neither of the CCOA’s sponsors is sponsoring the new bill, so far at least.)
While they mostly remind me of the Australian and UK laws, these directives do have something in common with CALEA, which mandates that telcos make their networks wiretappable, but doesn’t let the government tell them exactly how to do so. So too here, with the assistance capability directives, the AG can issue the directive but can’t tell the provider exactly how to build the capability. Interestingly, that means this new bill would not permit the 2016 San Bernardino “Apple vs. FBI” order to Apple that, if it had not been vacated by the court, would have spelled out exactly what Apple was supposed to do in order to help the FBI get into that phone. That order would still be out-of-bounds under this new bill, just as it was out-of-bounds under the All Writs Act, which the new bill is evidently intended to replace in the context of compelling providers to decrypt data and devices.
I Am Petty Enough To Say I Told You So
About that December SJC hearing: I commented at the time that it sounded like law enforcement was offering to refrain from seeking regulation for E2EE, and would settle for a backdoor mandate for device encryption. This split is also what that Carnegie Endowment report from last September had suggested, a couple months prior to the hearing. At the hearing itself, regulating devices while leaving data in transit alone was what witness Matt Tait suggested, doing his best to warn the committee off from the latter. I also seem to recall that the witness for Facebook (i.e., encrypted messaging) threw the witness for Apple (i.e., encrypted devices) under the bus on exactly that point. (Though it was so quick that it was easy to miss it. Maybe I was imagining things.) That’s between them and their God.
I commented at the time, in that December post about the hearing, that I did not believe for a single moment that law enforcement or Congress would settle for only regulating encryption as to devices and not data in transit. And here we are: just as I predicted, the Senators who grilled the witnesses at that hearing have come up with a bill that mandates backdoors for devices and messaging. They didn’t even wait to roll the mandates out in two separate bills, as I thought they might do; they just put both backdoor mandates in at once. ¿Por qué no los dos?
It should come as no surprise that the Department of Justice refused to settle for going halfsies. The Attorney General is delighted with this new bill. As expected, they pushed for everything. As I explained in that December post, CALEA was a compromise too, and federal law enforcement agencies have been reneging on that compromise ever since. They were never going to settle for anything less than total access to everyone’s communications and devices. I’ve only been doing this job for less than five years, out of the quarter-century this debate has been dragging on, and even I wasn’t so damn naïve and delusional as to think the feds and Congress would settle for only screwing up encryption as to devices when they could screw it up for data in transit too.
“Exceptional Access Only With A Warrant” Is Not A Modest Proposal
And yet “Lawful Access to Encrypted Data” doesn’t sound so extreme, does it? Providers only have to decrypt data if they get a court order issued pursuant to a warrant, right? They can even appeal the order in court. Seems reasonable. “This bill allows exceptional access, and only with a warrant” is the kind of tagline we’re sure to hear about this bill in the coming days and weeks. But make no mistake: this is a sweeping bill. “Exceptional access” is a phrase that suggests some narrow, limited concept. In truth, what this bill would require is a mandatory built-in mass backdoor for practically every device or service you use that has a computer in it or touches the Internet at any point. If it passes, this bill marks the end of strong encryption for stored data on devices; those would now be illegal to sell in America. And it is an outright ban on offering E2EE in the U.S. Say goodbye to WhatsApp and Signal: they’ll be wiped from the Google and Apple app stores. iMessage will no longer be E2EE, either. And as for Zoom’s big plans to end-to-end encrypt video calls? If this passes, Zoom can put their pencils down on that one.
What’s more, “only with a warrant” doesn’t really mean that much given how easy it is to get a warrant. And because providers have to design for decryptability up front and across the board, rather than on a case-by-case one-off basis, so that they will definitely be able to decrypt if and when they receive a warrant or wiretap/PR/TT order for encrypted data, that design has to be built into all devices sold in the U.S., into everyone’s copy of an app or piece of software, into the OS update pushed out to all phones. That means the backdoor (the ability to decrypt; put another way, the security vulnerability in the product) will be present at all times. It doesn’t materialize only once there’s a warrant.
Decryption will be technically possible even without a warrant. And that will attract abuse. We will see faked court orders and faked warrants. It will be up to the provider to verify that the order is legitimate. And that’s just within the U.S. If providers have to build in a decryption capability to satisfy U.S. law, that capability will immediately be demanded by every other government on earth. That includes the not-so-nice ones who don’t even pretend to abide by the rule of law, the way the U.S. still somehow manages to say it does with a straight face. Plus, any country that gets a CLOUD Act agreement with the U.S. will be able to take advantage of the provider’s compliance with this bill. If Australia or the UK hadn’t already gotten providers to build in a decryption capability under their laws that I mentioned above, now the U.S. will do that for them. And in addition to those countries — the democracies that might have an independent judiciary and court orders kinda-sorta like ours — the other governments knocking on providers’ doors will be China, Russia, Bahrain, pick your poison.
If U.S. providers are forced by law to backdoor their encryption, it won’t just put an end to Americans’ electronic privacy and security. Innocent users in other countries who use the same products and services will be harmed as well. WhatsApp has 1.5 billion users in 180 countries. If the U.S. forces Facebook to backdoor WhatsApp, that affects roughly 20% of the entire population of Earth.
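As a quick sanity check on that 20% figure (the world-population estimate here is mine, a rough 2020 number, not from the post’s sources):

```python
# Back-of-the-envelope check: what share of humanity uses WhatsApp?
whatsapp_users = 1.5e9       # 1.5 billion users, per the figure above
world_population = 7.8e9     # rough 2020 estimate (my assumption)

share = whatsapp_users / world_population
print(f"{share:.0%}")        # just over 19%, i.e. roughly 20%
```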
Finally, even if this bill passes, and even if it has an impact on the privacy and security of literally billions of people, that still won’t catch every criminal. The crypto cat is out of the bag. Encryption technology is out there.
Passing a law in the U.S. won’t stop criminals and terrorists from finding other ways to encrypt their data and communications. Al Qaeda rolls its own encrypted messaging software. That developer team isn’t going to respond to a U.S. court order. What’s more, most entities that offer encrypted products are located outside the U.S., outside Congress’s jurisdiction. Those companies, too, could be expected to thumb their nose at legal process emanating from U.S. law enforcement. (Unless, of course, their country has a CLOUD Act agreement with the U.S. The incentive, then, is to switch HQ to a country that does not.)
And finally, what about open-source projects? Whom do you even serve with a court order in that situation? The bill is so broadly worded that I think it might apply to individual contributors to open-source projects, such as, say, the Linux kernel, but I’m not sure and I don’t want to scare everybody. Even so, even if the bill is really that broad, and the FBI did scare some individual contributor into writing backdoor code, how would the code ever make it past all the internal reviews and checks and balances into actual deployment?
Bills like this seem like utter lunacy in their total refusal to acknowledge the reality that it is impossible to put the crypto genie back in the bottle. But despite all the Sturm und Drang about catching criminals and terrorists, the bill’s sponsors and the law enforcement agencies backing the bill know perfectly well that the bill won’t catch all of them. It will just take away the strong privacy and security guarantees that encryption provides to the vast majority of normal, average users of iPhones, iMessage, WhatsApp, etc. If you backdoor the encryption used by most normal people, that means you also backdoor the encryption used by most normal criminals (the ones who are easiest to catch anyway, even without resorting to a backdoor mandate). Senator Graham is perfectly fine with selling out the privacy and security of huge numbers of normal, innocent, law-abiding people in order to catch the low-hanging fruit of the criminal element. But I won’t hold my breath for him to acknowledge that that is what he’s doing, or that his bill will have no effect on the sophisticated and savvy bad guys who will always be able to get their hands on strong encryption. Those individuals will just stop using the major U.S. tech providers’ products and services, and move to others – whether illicit apps and platforms (like Al Qaeda’s home-rolled app), or legit ones based outside the U.S. – that make them harder for U.S. authorities to track down.
The EARN IT Act Is Still Also Bad
None of this lets the horrible EARN IT Act bill off the hook.
You’ll recall that Senator Graham, the sponsor of this new bill, is also the sponsor of the EARN IT Act, which I have covered extensively in this blog and elsewhere. (You’ll also recall that Senator Cotton is the guy who believes that peaceful protesters exercising their First Amendment rights should be crushed by the full might of the U.S. military.) That bill is scheduled for mark-up tomorrow, June 25, in the Senate Judiciary Committee, and a number of amendments are expected, potentially even including something to address its encryption problem. However, it’s also anticipated that the mark-up will be delayed a week – to July 2, when many people (myself included) will be away for the July Fourth holiday and won’t be paying attention. Marking up a bill when everyone’s on holiday is a definite indicator that the bill’s sponsors know just how unpopular it is and are hoping to push it through at a moment when they’re likely to encounter the least resistance.
In any event, the introduction of another Graham bill immediately before the scheduled EARN IT mark-up is obviously intended to make EARN IT look more reasonable by comparison. Graham’s new bill is an extreme and poorly drafted piece of legislation that might as well just say “Fuck you, Apple, Google, and Facebook” in 100-point font instead of dragging on for 52 pages. It overtly and aggressively outlaws strong encryption (both for stored data and data in transit), rather than taking the roundabout approach that EARN IT does—an approach calculated to be more politically palatable, given how controversial the topic of regulating encryption is. Tellingly, the new bill has only three Republican sponsors, compared to the ten bipartisan co-sponsors that EARN IT had garnered by the time it was introduced. In short, it’s a dud, designed to be DOA. And indeed, nobody currently seems to think that the new bill is actually intended to make any progress at all, much less actually pass.
But other members of Congress need not buy into Graham’s transparent fallacy, that his new bill is so bad that they should vote for EARN IT instead. The LAED Act – which, because it’s so hard to type that acronym correctly, I guess I’ll just call the Big Bad Backdoor Bill – doesn’t make EARN IT acceptable. EARN IT is unacceptable, full stop. It will still be unacceptable even if the encryption problem gets fixed in amendments during the mark-up, given its numerous other problems.
It is possible, indeed super easy, to reject both that bill and this bill, rather than buying into Graham’s ploy of introducing a greater evil to make his lesser of two evils seem more acceptable. I know this Congress has really caused us to lower our standards for what we expect out of our government, but this isn't a forced binary that lawmakers must resolve one way or the other. They don’t have to choose between the two. They can choose to vote against both. Much worthier bills die on the vine in committee all the time, because congressmembers recognize that doing nothing is also a choice. The fact that Senator Graham is capable of having two bad ideas at the same time does not mean the rest of his colleagues must pick one.
Look, if you go out on a dinner date with a guy who's rude to the waitress and mansplains your job to you, and then you go on another date with someone who casually reveals that he is an actual serial killer, that doesn't mean you have to agree to a second date with the mansplainer. You could just delete the dating apps from your phone and spend your Friday night solo on the couch watching a reality TV show about competitive flower-arranging instead.
I urge the Senate to vote against EARN IT and the Big Bad Backdoor Bill. The lesser of two evils is still evil. With so many people now working, studying, and doing many other life activities online thanks to pandemic-induced shutdowns, now is really the absolute worst possible time to attack encryption and undermine cybersecurity.
Moreover, the timing of the new bill, and the insistence on pushing forward with EARN IT, feels particularly ill-suited to a moment when America is fed up with the current state of policing. With thousands of protesters nationwide risking their health to take to the streets and protest police violence and systemic racism, pushing to give law enforcement more power is tone-deaf at best and, at worst, suggests that these particular Senators could not care less about the racism in policing that already manifests as over-surveillance of Black and brown communities. The rest of the Senate should listen to the Senators who are pushing these two bills, Lindsey Graham in particular. They're telling us who they are, and we should believe them.