The Manhattan District Attorney’s Office has released its fourth annual report on the impact of device encryption (i.e., smartphones and tablets) on the office’s investigations and prosecutions. The prosecutors in that office have made it their job to put this report out every year, and I guess it’s my job to critique it, no matter how tired we all are at the tail-end of the Year of Our Lord 2018 of making the same points and counterpoints over and over and over again in the endless encryption debate. It’s like when your kid reaches the age where they find out about knock-knock jokes and you have to dutifully say “Who’s there?” dozens of times a day, even though you know the punchline already and it wasn’t even funny the first time and also you’re screaming inside. So here goes!
Telling Stories, Tallying Numbers
Way back in January 2016, I wrote a guest post at Just Security wherein I discussed the Manhattan DA’s first smartphone encryption report, which came out in November 2015. That report attached the DA’s proposed bill for mandating “exceptional access” to encrypted devices (which was introduced in the New York legislature in 2016 but failed to pass).
In my post, I pointed out that in reports like these, law enforcement officials tend to highlight “worst of the worst” offenses—terrorism, murder, sex crimes involving children—to justify their demands for crypto backdoor legislation. I noted, however, that according to arrest and prosecution statistics, less-horrifying offenses such as larceny were much more common. That is, the Manhattan DA was playing to people’s repugnance at egregious, but uncommon, crimes in order to persuade lawmakers to give him the power to access encrypted phones, a power his office would then actually use principally to investigate other, more common crimes.
What I wrote back then of the first report is still largely true of the fourth. It cites horrific examples of crimes where the Office was unable to get into an encrypted device (even after trying the workarounds the Office says sometimes work): child rape, an infant’s suspected murder, a woman being stalked and murdered. These are abhorrent tragedies. Yet the report also shows that they are not the most common types of crimes for which investigators wish to access encrypted smartphones.
The report contains a chart of the number of locked phones and tablets the Office received during a four-month period in 2018, broken down by category of crime under investigation. The chart shows that far and away the #1 category was “larceny/forgery/fraud/cybercrime ID theft,” at 35.7% of locked devices. “Sex crime” came in second (20.8%), followed by “drug charge” (12.4%), “homicide/attempted murder” (10.3%), and “assault/robbery/burglary” (10.0%).
This chart in the fourth report reinforces (albeit imperfectly) the prediction I made after seeing the first report. The instances where the DA’s Office wanted to get into a locked smartphone involved something like theft or a forged check far more often than they involved the “worst of the worst” violent crimes the report specifically highlighted.
This is not to say that the Manhattan DA’s Office cannot still use “worst of the worst” crimes to advocate for legislation that would mandate law enforcement access to encrypted smartphones. It’s only to say that they should be more forthcoming about the fact that the power they’re asking for is much likelier to be used against, say, check-kiters and ID thieves than murderers. (That is, unless that power is restricted to serious, violent crimes—a limitation the DA’s 2016 bill proposal did not contain.)
The Manhattan DA wants the public and their elected representatives to decide they’re OK with undermining everyone’s data security and privacy in the name of more-efficient law enforcement investigations. That requires a frank discussion about whether the relative rarity of violent crime and terrorism is worth the trade-off. With violent crime rates in New York City near historic lows as of the end of last year, New Yorkers might just decide it’s not.
Exculpatory Evidence (That’s Mostly Not From Smartphones)
After the section on cases where encrypted data served as inculpatory evidence, the report turns to cases involving exculpatory evidence. The Office identified 17 cases where “evidence we recovered from a smartphone” led to reduced or dismissed charges. The report describes four specific cases to “demonstrate that electronic evidence is critical to the truth-seeking mission of law enforcement, not only to prosecute the guilty, but also to exonerate the innocent.”
It is certainly true that electronic evidence can help exonerate the wrongly accused. But the examples provided do not demonstrate the need for mandatory legislation requiring smartphone makers to backdoor their encryption in order to guarantee law enforcement access. That “solution” is the conclusion the report reaches, but it simply does not follow from these examples.
One case involved “evidence extracted from one of the defendant’s phones,” meaning that a digital forensics tool (such as Cellebrite or GrayKey) was successful in pulling data off the phone. The report cautions that these devices do not always work—but in this instance, there was no need for backdoored encryption.
Another example involved using cell phone data and social-media messages to corroborate the accused’s alibi that at the time of the crime, he was in police custody (on an unrelated matter). Using that data, “investigators were able to locate the precinct where he had been detained, and the date and time of his detention.” Why should this instance be read as supporting exceptional access to smartphones, rather than as evidence of shortcomings in the NYPD’s own recordkeeping system?
In sum, the four cases involve the following: (1) “evidence extracted from one of the defendant’s phones”; (2) “data from [a] social media app, as well as cell site data for the phone”; (3) “cell phone data and messages from the accused’s social media applications”; and (4) “evidence recovered from a cellphone” including photographs and “cell site data.”
Cell site data and data from social-media apps come up repeatedly in these cases. But those are types of information that are held by third-party service providers. Historical cell-site data would be acquired from the provider (with a warrant, thanks to Carpenter), not off the accused’s phone—the phone doesn’t contain cell-site data at all. Likewise, while social-media info can be retrieved from the phone used to access that platform, it typically can also be obtained with the proper legal process from the social-media company. (That could vary depending on the app, but the report doesn’t specify which apps were at issue.)
When exonerating information can be obtained from third-party service providers, that’s not an argument for needing to backdoor the encryption on smartphones. Rather, these instances provide support for a report from the Center for Strategic and International Studies (CSIS) that I blogged about in July. CSIS surveyed and interviewed members of federal, state, and local law enforcement agencies and asked them what their biggest digital-evidence challenge was. The #1 answer? Not encryption, but rather, identifying which provider would hold the relevant evidence in the first place. Here, that meant the cell-service providers and the social-media companies. They are already subject to legal process, and while the report is short on details, it sounds like they responded to legal process in the examples cited here.
Even though these aren’t very good examples, it’s smart of the DA’s Office to include instances of exculpatory evidence in the report. That will appeal to the criminal defense bar. But talking up how weakened smartphone encryption could exonerate the accused also reveals how, when law enforcement says, “Criminals hide behind encryption, and that’s why we should break encryption,” that’s only half the story. Law enforcement wants to break everyone’s encryption, not just criminals’. It is impossible to give law enforcement exceptional access only to criminals’ smartphones without also weakening the encryption used by innocent people. And even if it were possible, law enforcement would still demand more, in order to investigate crime.
Why? Because not everyone is a criminal, but anyone can be wrongly accused of a crime. In the American criminal justice system, defendants are presumed innocent until proven guilty. Inevitably, some individuals are erroneously accused. The Manhattan DA wants to break encryption for them too. What is more, anyone can become a victim of a crime. Again, the DA wants to break victims’ smartphone encryption, not just that of the offender who victimized them.
If not everyone is a criminal, but anyone can be wrongly accused and anyone can become a victim, then the only solution, from law enforcement’s viewpoint, is to break encryption for everyone.
When law enforcement officials try to persuade you that you should be OK with undermining encryption because the police should be able to get into bad guys’ phones, remember: what they really want is to be able to get into your phone, even if you’re not the bad guy.
The Manhattan DA Doesn’t Care About Your Privacy, Only His Own Power
A year ago, I noted in another blog post that with the tide of public opinion turning against tech companies, anti-crypto law enforcement officials smelled blood in the water and seized upon the opportunity to vilify tech companies for strong encryption. They’re at it again here. The Manhattan DA’s latest report condemns Facebook for the Cambridge Analytica scandal and Google for, among other things, considering a re-entry into the Chinese market with a censored version of its search engine. The report says that these incidents show that tech platforms care about enriching their shareholders, they don’t really care about users’ privacy, and it shouldn’t be up to them to decide how they handle customer data. It then points to the GDPR as an example of how to regulate tech companies’ handling of user data.
On the surface, these look like absolutely bizarre analogies to deploy in the context of asking for more police power to access people’s personal data. But underneath, they reveal a lot about the Manhattan DA’s thinking.
For one thing, the report highlights scandals involving Facebook and Google that have nothing to do with encryption, and tries to tar providers of strong encryption with the same brush (oh, those evil Silicon Valley tech companies, they’re all alike!). This “guilt by association” argument seems like a non sequitur—until you realize it’s the continuation of a longstanding pattern by New York law enforcement. It’s akin to putting innocent people in a gang database because they live in the same community and are of the same ethnicity as actual gang members. Or placing everyone who belongs to a particular religion under suspicionless surveillance because a few people who professed that religion committed terrorist acts. “Guilt by association” is just par for the course in Manhattan, where it’s an M.O. that’s usually used against populations that are far more vulnerable and marginalized than billion- and trillion-dollar companies.
Another oddity: of all the Silicon Valley scandals the DA’s Office could have chosen, Cambridge Analytica and Google/China seem like particularly odd choices to shore up a pro-law enforcement, anti-crypto argument. Encryption protects the security of your data, and it also protects your privacy. Americans have come to expect both of those needs to be protected by the tech companies to which we now entrust so much of our lives. Mandating backdoors in encryption, as the Manhattan DA wants to do, would compromise both security and privacy.
The outrage over Cambridge Analytica’s access to Facebook user data underscores how much people (and regulators) care about their privacy. The scandal prompted calls for U.S. law to impose stronger requirements for privacy protection on tech companies. So it makes little sense for the Manhattan DA to use that outrage to argue that tech companies should be forced to undermine user privacy by backdooring their crypto.
It seems just as weird for the DA to point to Google’s dipping its toe back into Chinese waters. (In a footnote, the report also reminds us that Apple complied with China’s iCloud data localization requirements.) The reason people are mad at Google (and Apple) about this is because China is an oppressive police state. Indeed, the report wrings its hands over China’s human rights abuses and censorship. Understandably, a lot of Americans want Silicon Valley to push back. So why would a bunch of prosecutors think it makes any sense to use the Google/China outrage to demand more police power?
After discussing Cambridge Analytica and Google/China, the DA’s report then turns to discussion of the European Union’s General Data Protection Regulation (GDPR). This also seems like a weird choice of example, because, as the report says, the GDPR “imposes stringent privacy regulations” on tech companies. That is, it’s about getting companies to better protect user privacy, not undermine it in the interests of law enforcement.
If anything, commentators have observed that the GDPR may pose significant hurdles to European law enforcement investigations—something the DA’s report vaguely acknowledges in a footnote. That footnote calls for additional legislation to force tech companies’ compliance with government demands. Yet the report evinces great envy for the GDPR as it stands, observing that it was passed in response to “calls for greater regulation” of tech companies, and that those companies rushed to try to comply, lest they face crushing fines. These passages are telling, because they show what the DA’s Office thinks the real meaning of the GDPR is. To the DA, it’s not about privacy; it’s about bending tech companies to the government’s will.
This section of the report concludes by saying tech companies need to be brought to heel: they should not get to decide “how and when customer data should be made available for criminal justice or public safety purposes.” But “criminal justice and public safety” aren’t what any of those examples — Cambridge Analytica, Google/China, GDPR — were about. The DA is just performing some sleight-of-hand here, twisting these examples around to try to fit them into his preferred talking point, and hoping you won’t notice.
The only way to make sense of the DA’s use of these examples, which are ostensibly about protecting privacy, is if you recognize the underlying message: “Silicon Valley tech companies don’t really care about your privacy, they’re just pretending to, so we in the DA’s office want to force them by law to abandon that pretense and actually give you less privacy than you want and have come to expect from them.”
What the report is really saying is that the Manhattan DA wants more power to get into your personal data and devices. He thinks he can use your anger at some companies like Facebook, over issues that are totally unrelated to encryption, to turn you against other companies like Apple because they do protect you through encryption. You don’t have to fall for it. You can demand that tech companies protect your data security and respect your privacy.
The DA’s fourth annual report concludes by saying that federal (not state) legislation mandating exceptional access to encrypted smartphones is “the only answer.” So far, legislators haven’t agreed. The proposed bill attached to the DA’s first report was one of three such bills that failed in three different states in a single year (2016)—along with a federal bill by Senators Burr and Feinstein, which the conclusion of this report suggests as a possible alternative to the DA’s own proposed language. The 2016 Burr-Feinstein bill was so roundly trashed by everyone (including me) that it was never even formally introduced in the Senate. Congress has shown no appetite since then for passing anti-crypto legislation, even though—or perhaps because—privacy and cybersecurity have been hot congressional topics this year.
Instead of trotting out this report year after year only for it to fall on deaf legislative ears, the DA’s resources would be better spent on learning to live in a world of ubiquitous encryption rather than railing against it: by working with tech companies to train investigators in what data is available from which sources, acquiring additional digital forensics equipment, strengthening partnerships with federal agencies, or learning from their counterparts in countries that have acknowledged the importance of strong encryption, to name a few examples. Those options all come with their own concerns, of course (privacy and civil liberties, data security, transparency and oversight…). But at least I wouldn’t have to read any more of these reports.
*The drug-crime number is lower than I would have expected back in January 2016. This pleasant surprise might be partially due to the DA’s Office’s “decline-to-prosecute” policy, about which the Office released statistics the same day it issued the new encryption report. The encryption debate has barely budged in the past three years, but nationwide, marijuana policy is light-years from where it was back then.