Banning Strong Encryption Does Not Mean Catching Criminals. It Only Makes You Less Safe from Them.

On November 17, 2019, Reuters reported that the Federal Bureau of Investigation (FBI) had drafted a resolution for the international police organization, Interpol, that condemned strong encryption and called for encryption backdoors. (Interpol later denied that “the General Secretariat” had any plans to issue such a statement, though that does not rule out a statement from some other working group or subdivision of Interpol.)

The reason for this condemnation? Encryption’s use by child sexual predators to share child sex abuse material (CSAM) with each other and to contact potential victims. Earlier this fall, after years of terrorism being the favored rationale for their endless war against strong encryption, law enforcement agencies in the U.S. (and other countries) suddenly changed their public-relations messaging to focus almost exclusively on CSAM. While it is not wholly clear what prompted the shift in public messaging, one main reason seems to be Facebook’s announcement earlier this year that it would upgrade the encryption in its Messenger app and Instagram direct messages to match that of WhatsApp, which is end-to-end encrypted by default. The Attorney General, joined by his counterparts in the UK and Australia, sent a letter to Mark Zuckerberg in early October imploring Facebook not to proceed with this plan. So far, Facebook has refused to back down.

In any event, CSAM is only the latest in a string of reasons given by law enforcement to justify their calls to outlaw any encryption that cannot provide “exceptional access” to law enforcement. The reason varies depending on what law enforcement in a particular place thinks its constituency might find convincing at a particular moment. In my few years working on encryption policy, the reasons I’ve heard have included terrorism, CSAM, corruption (Brazil), mob violence (India), hate crimes against refugees (Germany), and disinformation leading up to elections (multiple countries). CSAM is a uniquely sympathetic reason, because sex crimes against children are uniquely repugnant. But whatever the justification, the ultimate goal is always the same. Stated simply, that goal is: To make it illegal for you to have, and for entities like Facebook to provide, the strongest cybersecurity possible.

Many people would be all too happy to give up that right – not just for themselves, but for everyone else – if they were assured that banning strong encryption would mean catching child sex predators. But that’s the problem. It won’t. A ban on strong encryption won’t even work for its stated purpose.

Why not? Because the cat’s out of the bag. The genie is out of the bottle. The horse has fled the barn. Whatever metaphor you pick, here’s the truth: Strong encryption is here to stay, and making it illegal won’t make it go away. Even if the U.S. and our allies all ban strong encryption—and companies like Facebook knuckle under and comply—it will remain available. The software, and the underlying mathematical know-how, are out there. That would still be the case even if Facebook yanked WhatsApp tomorrow. And even if law-abiding companies and individuals stopped using strong encryption, the bad guys wouldn’t. Terrorists already write their own encryption software, and the purported creator of the TrueCrypt file-encryption program (which was favored by ISIS) turned into an international crime kingpin.
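To see how low the bar is, consider that anyone with a laptop can bolt strong, authenticated encryption onto their own tools in a handful of lines using freely available open-source libraries. The sketch below is purely illustrative, written in Python with the widely distributed cryptography package; it is not how any particular app or criminal crew operates, just a reminder that the building blocks are already published everywhere.

    # Illustrative only: the open-source "cryptography" package
    # (pip install cryptography) gives anyone authenticated
    # symmetric encryption in a few lines. No ban un-publishes this.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # 256 bits of random key material
    cipher = Fernet(key)             # AES plus HMAC under the hood

    token = cipher.encrypt(b"meet at the usual place")
    print(token)                     # gibberish to anyone without the key
    print(cipher.decrypt(token))     # b'meet at the usual place'

The point is not that this particular snippet is what anyone would use in practice; it is that the primitives are a free download away, well documented, and impossible to recall.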

It sounds cliché, but it’s true: If strong encryption is outlawed, only outlaws will have strong encryption. Were the U.S. to enact a mandate requiring Facebook to backdoor its messaging encryption for law enforcement, many criminals’ immediate move would be to switch to a different program. Savvy criminals, particularly traders in CSAM, have long been technologically sophisticated in cloaking their activities; they teach one another how to hide, and they respond to reports of compromised programs by shifting tactics. If law enforcement actually gets its way and strong encryption is banned, only the dimmest criminals would keep using programs known to be compromised. The FBI would find it easier to catch this low-hanging fruit, yes. But savvy criminals would continue to elude them.

Meanwhile, we law-abiding users of law-abiding companies would have weaker tools available to protect ourselves. What the police are asking for is a world where, by law, criminals have better security than law-abiding people, giving criminals more leverage over innocent people. That is what banning strong encryption means. It means legally requiring you to be weaker than the criminals. Banning strong encryption won’t stop the bad guys. It just hurts the rest of us. And yet, the police (who, remember, are supposed to protect us) are hell-bent on harming everyone’s security in the name of a measure that won’t actually stop savvy criminals.

When law enforcement officials say “We want lawful access,” what they mean is: “We want to make it illegal for you to have the best security possible. We are fine with the collateral damage to you, so long as we can catch a few dumb bad guys while the savvy ones use the tools we won’t let you have.”

But of course, a ban on strong encryption won’t rid you of the desire to protect yourself and your data. You’ll still want those tools too: end-to-end encrypted chat apps, software for encrypting files on your devices, etc. And as noted above, those programs will still be out there for you to download, if you can find them. If you do, congratulations: now you’re a criminal too.

But how will you get your hands on those programs, if Apple and Google are legally prohibited from letting you get them in their app stores? Well, you’ll have to sideload them. But without the vetting process that Apple’s and Google’s official app stores perform (often inadequately as-is), how can you know that the software isn’t malicious? You might not find out until too late, if ever. That will be the irony of taking self-help measures in a world where strong encryption is illegal: in trying to enhance your protection, you might end up doing just the opposite.
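About the best a careful person could do is compare the file they sideloaded against a checksum the developer published, assuming they trust the developer and the channel the checksum arrived over in the first place. Here is a minimal, purely hypothetical sketch in Python; the file name and published digest are made up for illustration.

    import hashlib

    def sha256_of(path: str) -> str:
        """Hash a downloaded file in chunks so large installers fit in memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical file and digest, for illustration only.
    PUBLISHED_DIGEST = "developer-published-sha256-digest-goes-here"
    if sha256_of("secure-messenger.apk") == PUBLISHED_DIGEST:
        print("Matches the published checksum.")
    else:
        print("Mismatch: do not install.")

A matching digest only tells you the file is the one its publisher intended to distribute; it says nothing about whether that publisher, or the page hosting the checksum, is itself trustworthy. That is precisely the gap an app store’s review process, however imperfect, is supposed to fill.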

This situation might sound familiar to people who, like me, were in college around the late ‘90s and early aughts. I recently read this oral history of the peer-to-peer file-sharing service LimeWire. As the article recounts, when users of LimeWire and other P2P services downloaded files from the anonymous strangers sharing them, they ran the risk that what they got was not the content they thought they were getting, but a virus that would infect their computer.

The entertainment industry’s take-no-prisoners war against copyright infringement holds lessons for the current encryption debate. In a world where the U.S. or other countries ban strong encryption, the software will still be out there, available to be found and installed. But if Google and Apple are not allowed to let you download that software in their app stores, that world will look a lot like it did during the file-sharing wars: If people can’t get what they want through legitimate channels, they’ll get it elsewhere, and put their security at risk.

All the entertainment industry’s war on file-sharing did was bring ruinous litigation onto college students and grandmas and kill off innovative companies such as LimeWire, until the world moved on, people changed how they consumed media, and it was all for naught. All because certain powerful interests just could not stand the loss of total, complete control.

Did the entertainment industry’s war against peer-to-peer file-sharing services and their users stop a few bad actors? Sure. Would banning strong encryption make it easier to catch some criminals? Sure. But the collateral damage was huge then, and it would be even bigger now. In an age of hacking crimes, ransomware, data breaches, and cyber espionage and conflict between nations, we literally cannot afford not to have strong encryption.

So, to recap: “lawful access” means that in the name of catching some criminals, it will be illegal for you to have stronger security than the cops want you to have. If you do so anyway, you’re a criminal too; if you don’t, you might get victimized. And the law won’t even work.

Sex crimes against children are abhorrent. I should not even need to say that, but the genius of this latest public-relations campaign by government officials is that it cannily equates “pro-encryption” with “anti-child safety,” forcing advocates of strong security and privacy into the position of having to affirmatively denounce these crimes. That is: we have to justify why preserving strong encryption is imperative for security, privacy, and personal safety; they do not have to justify why they are demanding a measure that would hurt everyone while not actually serving its stated purpose.

We don’t have to go along with this “pro-encryption equals anti-children” framing. It’s time we turn the tables and start asking questions.

I’ve begun by sending a Freedom of Information Act request to the Departments of Justice and Homeland Security, asking for all records relating to that open letter to Mark Zuckerberg. So far, all they’ve sent me in response is… a copy of the letter (which was already available on the DOJ website). It’s funny how federal law enforcement wants guaranteed access to my data, yet balks at giving me access to any of its files. I don’t know why they’re so reluctant to comply with FOIA. After all, the only reason to object to legally-mandated access to information is if you’ve got something to hide, right?

Comments

Should we extend the argument to financial transactions?

AML provisions are often justified by similar lines of reasoning, and the cost is that we sacrifice our entire financial privacy to the banks and ultimately to law enforcement (if they want it). Cash has long offered a safe haven for the anonymous, although it seems now that anonymous cryptocurrencies will be banned for the ease with which they permit money laundering and other nefarious financial behavior.

Should we have the right to transact in private from banks and government?