Ray Ozzie’s Key-Escrow Proposal Does Not Solve the Encryption Debate -- It Makes It Worse

Steven Levy, who’s been covering the crypto wars for a quarter-century, has a long article out in Wired about Ray Ozzie’s technical proposal for a key-escrow-based exceptional-access scheme to give law enforcement access to encrypted smartphones. (Levy interviewed me while he was writing the piece, though my comments didn’t make it into the end product.)

Ozzie’s proposal is riddled with flaws. Yet the article, for all its length, gives very, very little airtime to critiques. Those are confined to three paragraphs at the end of the nearly 5,000-word article, and consist mostly of high-level comments from computer science professors and noted cryptography experts Susan Landau, Matthew Green, and my colleague Dan Boneh. Green has put his specific critiques into a more detailed blog post, as have Errata Security’s Rob Graham and Mike Masnick at Techdirt. (Edit: So has computer security professor Steve Bellovin, and Landau has some indirect observations about the proposal on Lawfare; I hope we’ll hear more from her.) I imagine there will soon be more responses to this article as well.

While not naming Ozzie’s proposal expressly, a whitepaper I published in February made a lot of the same points those critics are making now. I won’t rehash the whitepaper’s arguments here. But I want to add a few points.

First, Ozzie intends his proposal to be a way to move the encryption debate forward from its entrenched state. He sees it as the plank that gets a stuck vehicle out of the mud so it doesn’t keep spinning its wheels, without any guarantee that it’ll reach its destination. In response to Green’s tweetstorm of critiques, Ozzie said on Twitter, “This isn’t The Answer, nor is there one. It’s all risks/tradeoffs.” I wish he’d holler that from a mountaintop rather than put it in a tweet reply, because as Techdirt notes, “The Answer” is exactly how law enforcement officials will tout Ozzie’s plan. Worse, not only will they misrepresent the proposal, but they’ll also use it to undermine and discredit critics of the plan (and of exceptional-access proposals more generally).

Second, Ozzie paints the “Keys Under Doormats crowd” as hysterically overstating the risks of exceptional-access schemes and as refusing, for politically motivated reasons, to discuss “a way forward” on encryption. (Levy largely adopts that tone in his article, and honestly, he should know better after covering these issues for so long.) This view is the same as that of the law enforcement officials Ozzie is helping, who, as I’ve discussed before, like to paint opponents of exceptional access as “absolutists” who are “refusing to have a mature conversation.”

But this debate isn’t stuck because tinfoil-hat-wearing extremists refuse to budge. As Green points out in the article, computer security is really hard and we are very bad at it. Thus, unsurprisingly, every time a “new” technical proposal for exceptional access comes along, it turns out to have flaws. Sometimes those flaws can be spotted early, as the Wired article describes cryptographer and CS professor Eran Tromer doing on the fly during one of Ozzie’s presentations. But sometimes they come to light only later, long after the flawed scheme has been implemented in the real world, as yet another cryptographer and CS professor (notice a trend?), Nadia Heninger, explained at an ACM event last year.

To continue the car metaphor, we may be stalled out and spinning our wheels on the encryption debate, but every way forward that we can see, Ozzie’s included, has perilous pitfalls. And even if we can’t see the pitfalls from where we are now, we know they’re out there, and we’d rather not discover them later by falling into them. So we don’t want to proceed down any of those paths. That’s why computer security experts are skeptical of exceptional-access schemes. And it’s why, as Ozzie concedes, there simply is no answer that will make everybody happy here.

Third, if Ozzie’s proposal is implemented, it will permanently brick every single phone law enforcement opens. The access mechanism Ozzie proposes would act as a kind of tamper-evident seal that, once “broken” by law enforcement, renders the phone unusable. Ozzie touts this as a security feature for the user, not a bug. True, it does have an upside in that the phone’s owner would unavoidably be notified that the phone had been unlocked and accessed; the police couldn’t keep that fact a secret. But from the consumer perspective, breaking every phone law enforcement examines is outlandish. It’s as if, every time the cops searched a car, the car fell apart once they were done collecting evidence. Sure, that guarantees the cops can’t secretly rummage through a car unbeknownst to the driver. But it’s cold comfort to a driver facing a useless pile of metal and plastic.
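
To make that tamper-evident-seal behavior concrete, here is a minimal sketch, in Python and purely for illustration. It models only what the paragraph above describes: a one-way transition in which an exceptional-access unlock notifies the owner and permanently disables the device. It assumes nothing about how Ozzie’s actual escrow mechanism, key handling, or notification channel would work, and the class and method names (Phone, exceptional_access_unlock, and so on) are hypothetical.

```python
from enum import Enum, auto


class SealState(Enum):
    INTACT = auto()   # normal, encrypted operation
    BROKEN = auto()   # exceptional access has been used; device permanently unusable


class Phone:
    """Toy model of the one-way, tamper-evident behavior described above."""

    def __init__(self, owner_contact: str):
        self.owner_contact = owner_contact
        self.seal = SealState.INTACT

    def exceptional_access_unlock(self) -> str:
        """Law-enforcement unlock: breaks the seal, notifies the owner, bricks the phone."""
        if self.seal is SealState.BROKEN:
            raise RuntimeError("seal already broken; the device is inert")
        self.seal = SealState.BROKEN  # one-way transition; nothing re-seals it
        self._notify_owner("This device was unlocked under an exceptional-access order.")
        return "<decrypted contents handed over to law enforcement>"

    def usable_by_owner(self) -> bool:
        """Once the seal is broken, the owner gets back a useless pile of metal and plastic."""
        return self.seal is SealState.INTACT

    def _notify_owner(self, message: str) -> None:
        # Stand-in for whatever notification channel a real scheme might use.
        print(f"notify {self.owner_contact}: {message}")
```

The point of the sketch is simply that the two properties are inseparable: the very transition that gives law enforcement access (and guarantees the owner finds out) is the one that destroys the device.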

This “feature” alone should consign Ozzie’s idea to the rubbish heap of history (which overfloweth with other people’s clever ideas for exceptional-access schemes). The whole reason smartphone encryption is ostensibly a problem is that it’s supposedly a large-scale one. If it were a minor issue that affected only a few cases, there wouldn’t be such a tizzy over finding a solution. But if it’s such a large-scale problem, then Ozzie’s solution necessarily means destroying a correspondingly large number of people’s phones—devices whose importance in our daily lives I don’t even need to explain. The FBI said it amassed about 7,800 smartphones in a single year that it couldn’t access due to encryption. Add in the numbers from other federal, state, and local law enforcement agencies—and still more in other countries—and the absurdity of this idea becomes evident. Ozzie’s scheme would basically require a self-destruct function in every smartphone sold, anywhere his proposal became law, that would be invoked thousands and thousands of times per year, with no compensation to the owners. That proposal does not deserve to be taken seriously—not in a democracy, anyway.

Fourth and finally, Ozzie’s proposal would open up a new way for law enforcement to pressure people to unlock their phones for the police. Imagine an officer asking you to do so. Maybe you’re a suspect in a crime, or the victim, or a family member or friend of the victim. The police believe your iPhone contains evidence relevant to their investigation, and they want you to consent to a search of your device. (That’s easier for them than getting a warrant from a judge and then going through the Ozzie unlock procedure with Apple.) But you’re reluctant. If you’re a suspect, you may wish to preserve your Fourth and Fifth Amendment rights. Even if you’re the victim or someone not directly affected by the crime, you may not want to hand over your phone, with all the personal, intimate information it contains about you, to the police. So you might balk at the request to unlock it for them.

If Ozzie’s proposal were implemented, it would give the police a way to lean on you to open the phone for them. “You can make it easy on yourself by unlocking the phone and giving it to us,” they might say. “But hey, we don’t need you. We can go to a judge and get a warrant, and then we can just have Apple unlock it for us. …Of course,” they’d continue, “that would brick it forever, so you couldn’t use your phone anymore, even after we gave it back to you eventually. You’d have to go out and buy a new one. …Say, how much does a new iPhone cost these days, anyway?”

It’s long been noted that exceptional-access schemes would threaten civil liberties and human rights by guaranteeing that any country’s government, no matter how totalitarian, could get access to anyone’s smartphone. But by including this self-destruct technical “feature,” Ozzie’s scheme would also open up another way of imperiling those rights. It would turn what should be a security-minded feature—a tamper-evident seal—into a cudgel the police can use to pressure people into giving up their rights and “consenting” to let police search their phones.

Ray Ozzie clearly meant well, and I sympathize with that. None of us wants to keep having this same debate endlessly. But not only is his proposal not The Answer to the encryption debate, it makes the debate a lot worse.
