There’s a story in today’s Washington Post “Cybersecurity 202” newsletter that confirms that the Department of Justice is capitalizing on the techlash in order to build up congressional support for the DOJ’s long-desired goal of legislation that will restrict your freedom to encrypt your data and communications.
The Post reports that, according to assistant attorney general for national security John Demers, the DOJ has given up hope that tech companies will “voluntarily” backdoor their own encryption, as the agency had been pressing them to do since around 2016. Instead, the DOJ is now “focusing on getting legislation that forces companies to cooperate – and is hoping encryption-limiting laws in Australia and the United Kingdom will ease the path for a similar law in the United States.”
Why now? What’s changed since 2016, when we had the great Apple vs. FBI showdown? According to Demers, two things: (1) the “techlash” by Congress and the public “in the wake of myriad privacy scandals” and the 2016 election; and (2) Australia’s 2018 passage of the Assistance and Access Act, which followed on the heels of similar legislation in the United Kingdom in 2016. Demers “hopes these laws will create a model for how lawmakers in the United States might limit encryption.”
These two factors lay out, straight from the horse’s mouth, what I’ve been saying for a while. It comes as something of a relief for a high-ranking DOJ official to acknowledge publicly, at last, the playbook I could see they were running to try to get Congress to finally ban strong encryption. That doesn’t mean I’m happy about it.
I explained last month that the techlash has now gained enough momentum that law enforcement may have a fighting chance of getting its anti-encryption wish, under the guise of protecting children, in the form of a terrible bill called the EARN IT Act. That bill doesn’t look much like Australia’s Assistance and Access Act or the UK’s Investigatory Powers Act -- in fact it doesn’t mention the word “encryption” at all -- but right now it’s the lead contender for the DOJ to get an “encryption-limiting law” passed in the U.S.
Exploiting the techlash is a strategy I’ve been calling law enforcement out on since October 2017. It’s incredibly frustrating to watch this obvious ploy work so well. AAG Demers admitted that the DOJ thinks it can persuade congressmembers to be angry at tech companies over encryption because they’re already mad at those companies for violating users’ privacy. But this transitive rage, let’s call it, contradicts itself. Why? Because encryption protects user privacy.
Encryption does more than that, of course; information security experts have spent years pushing back against the overly simplistic “security versus privacy” framing, emphasizing that the encryption debate is primarily a question of “security versus security.” Nevertheless, privacy certainly is one of the main interests that encryption protects. And it doesn’t just shield your data and conversations from criminals and snoops: it can even shield them from the eyes of the entity that provided the encryption. For example, when you use a chat app such as WhatsApp that end-to-end encrypts your conversations by default, not even the app provider (Facebook, in the case of WhatsApp) can read your messages or listen in on your calls. So, if you’re mad at Facebook for invading your privacy, you should be glad that they use encryption that prevents them from snooping on your WhatsApp conversations, and that they’re planning the same for their other messaging services too.
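The end-to-end idea can be sketched in a few lines of code. This is a deliberately simplified one-time-pad illustration, not the actual Signal protocol that WhatsApp uses; the point is only that when the key lives solely at the two endpoints, the relay in the middle (standing in for the app provider) forwards bytes it cannot read:

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte (toy cipher)."""
    return bytes(a ^ b for a, b in zip(data, key))

# Alice and Bob share a secret key; the relay (the app provider) never has it.
key = os.urandom(32)

plaintext = b"meet at noon"
ciphertext = xor_bytes(plaintext, key)   # Alice encrypts before sending

# The relay sees and forwards only ciphertext -- gibberish without the key.
assert ciphertext != plaintext

decrypted = xor_bytes(ciphertext, key)   # Bob decrypts on his own device
assert decrypted == plaintext
```

In a real end-to-end messenger the shared key is negotiated by the endpoints themselves (via a key-exchange protocol) and rotated constantly, but the property is the same: the provider carries the ciphertext without ever holding the key.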
Thus, the DOJ’s strategy is obviously designed to sow confusion among the public and Congress by mixing up the issues: conflating tech companies’ privacy violations with tech companies’ privacy-protective encryption, as I pointed out in a recent press article. Even Senator Graham, the author of the EARN IT Act bill, admitted in that very same article that this doesn’t make any sense: “When asked whether he saw any tension between Capitol Hill’s ongoing effort to pass privacy legislation and its burgeoning push to mandate encryption backdoors,” Graham admitted he saw “‘a lot.’”
So, if even Senator Graham can see through the DOJ’s ploy to elicit what I’m calling transitive rage, why is it working? The answer might be: children. Per the Post today (and me last fall), “Justice officials have also shifted their messaging on encryption, talking less about the danger of terrorists recruiting and planning operations outside law enforcement's view and more about the threat of a surge in child predators sharing illicit images or luring children on social media.” Congress seems receptive to this child-safety messaging. Legislators expect Big Tech to protect the privacy of users, including children. Encryption shields users’ privacy. At the same time, they expect Big Tech to be able to detect the bad guys on their services, including those who are hurting children. But encryption shields the bad guys too.
How to resolve this dilemma? Previously, the answer from Congress was “do nothing,” both on passing an anti-encryption law -- something for which Congress has heretofore shown no appetite -- and on passing comprehensive federal privacy legislation. But the tide has shifted, the Hill is awash in the techlash, and the DOJ has succeeded in equating being pro-encryption with being anti-child safety. If pedophiles benefit from strong encryption built in by default to popular software and devices, then, according to Senator Graham, nobody should get that benefit anymore. (Never mind that it won’t work out the way he thinks.) In a Congress already dithering over passing a federal privacy law, the child safety rationale may prevail, at the expense of the many interests that encryption protects -- privacy not least among them.
Maybe Graham, in acknowledging the dilemma of demanding both privacy and encryption backdoors simultaneously, was really just tacitly admitting that when 327 million Americans’ privacy is pitted against the rhetorical power of “think of the children,” privacy loses. Overall, the attitude from Congress in 2020 seems to be, to paraphrase Michael Pollan: “Protect users. Not too much. Mostly kids.”
It is likewise unsurprising yet disappointing that the DOJ views Australia’s stupid law as clearing the way to make anti-encryption legislation palatable to the U.S. Congress. In October 2018, I warned that the passage of the Australian law (then a pending bill) would likely have a domino effect on other Five Eyes countries, including the U.S. By passing the bill in December 2018, “Australia set an example of a Western democracy passing legislation that undermined encryption, making it look like that’s normal and OK,” I said last summer. It’s not OK, even if it becomes normal. Of the DOJ officials currently rejoicing over the opening Australia and the UK have given them to finally shove anti-encryption legislation through Congress, how many have ever said to their children, “And if all your friends jumped off a bridge, would you jump too?”
The DOJ wants the U.S. to take a blinkered view of how governments should handle the topic of encryption. In July 2018, I predicted that the DOJ would place itself in an echo chamber where it would listen to “only other countries whose governments have adopted anti-encryption stances,” specifically Australia and the UK, while ignoring countries that have come out more strongly in favor of encryption, such as Germany. That seems to be what’s happening now: the DOJ wants America to imitate Australia, even as Germany’s federal Office of Information Security just today issued a set of proposed requirements for smartphones that include mandatory full-disk encryption. This shows that another path is possible besides the one chosen by the UK and Australia. The German approach may have much to teach the U.S. It is dangerous for the DOJ to urge Congress to stick its head in the sand and refuse to listen.
Yet here we are. With the disastrous EARN IT Act bill about to drop, the DOJ is openly and pointedly taking the gloves off in the encryption fight. But make no mistake: once the DOJ throws its knockout punch, it’ll be your privacy and security that hit the floor.