The EARN IT Act Is Here. Surprise, It’s Still Bad News.

Well, the dreaded day has come: the EARN IT Act was formally introduced in the Senate today. I wrote at length in January about the bill, which aims to kneecap encryption under the guise of protecting children online, while capitalizing on the techlash and the current unpopularity of Section 230 of the Communications Decency Act. As introduced, the bill has nine co-sponsors in total, up from the original two (Sens. Graham and Blumenthal).

This version of the bill is different from the version that I blogged about in January, and also from the intermediate version that Eric Goldman blogged about last month. I’ve attached a PDF to this blog post of the version as introduced. (Scroll down, it’s there at the bottom!) Here’s the Senate Judiciary Committee’s press release (which, maddeningly, does not link to the bill text, at least as of the time I wrote this). Here’s coverage from the New York Times, whose reporting last fall is credited with helping inspire the bill. Here’s a Wired story that quotes me.

I have some observations on how the bill as introduced has changed from the version I blogged about in January. You’ll want to read that blog post before this one, as this post assumes the reader’s familiarity with the initial January version of the bill. And, as in that blog post, I can’t hope to touch on even half of what’s going on in this bill. It’s still a sprawling mess that would take a roomful of lawyers and policy wonks, with many different kinds of expertise, to issue-spot everything that’s weird or problematic with it.

First, a round-up of responses from civil society:

Now, here goes:

  • This version of the bill is twice the length of the previous version. Most of that is due to a new Section 7 at the end of the bill, which replaces the outmoded and disfavored term “child pornography,” everywhere it occurs in federal law, with “child sexual abuse material,” to have the same legal meaning. (The bill spells out all of those instances in federal law, hence the length.)
    • While I am on board with getting rid of the term “child pornography,” it bears remembering that the “best practices” the bill contemplates go beyond images. CSAM is only a subset of the problem at hand. The bill now calls for best practices for fighting “the online sexual exploitation of children, including the enticement, grooming, sex trafficking, and sexual abuse of children and the proliferation of online child sexual abuse material.”
      • (That’s a long list, so throughout this blog post, I’m going to refer to all of those things as “CSAM” for the sake of convenience.) 
    • It’s an improvement that the bill is now a bit more specific; the original bill referred to “online child exploitation conduct” without saying what that meant, though it seemed to differentiate between “child exploitation” and “child sexual abuse.”
    • CSAM, enticement, grooming, sex trafficking, and sexual abuse are all different problems with different mitigation strategies that providers could deploy. Because these categories don’t just involve images, any best practices risk overbreadth and could result in the censorship of legal speech, as FOSTA has. (FOSTA, by the way, already covers sex trafficking; it’s unclear how this bill would interact with it.)
    • Even my saying “just images” oversimplifies the problem facing both the Commission (in drafting best practices) and providers (in implementing them). Think child sexual abuse imagery is a bright-line, clear-cut thing, both for First Amendment purposes and for detection/reporting purposes? Nope. It’s not.
  • The Commission is up to 19 members from the original 15.
    • Still includes the AG and the heads of DHS and the FTC. 
    • It now has 2 members with “current experience in matters related to constitutional law, consumer protection, or privacy.” This is an increase from 0 in the original version. So we’ve got that going for us, which is nice. But that “or” means the Commission could end up with 2 consumer-protection people who are experts in, I don’t know, car seat safety, who know nothing about privacy or the Constitution. What’s to stop that from happening?
    • The Commission still has 2 members who are experts in computer science, but the exact wording has become more specific: previously, it called for current experience in “computer science or software engineering”; now, it’s “computer science or software engineering related to matters of cryptography, data security, or artificial intelligence in a non-governmental capacity.”
      • This is the only place in the bill where cryptography comes up. By requiring a cryptography expert, this version of the bill basically confirms what we all knew to be true from the start: The “best practices” are going to target encryption. And again, that “or” in “cryptography, data security, or AI” means that the Commission could end up including zero cryptography experts.
    • The original bill called for 2 members to have “experience in providing victims services for victims of child exploitation”; the bill as introduced allots 4 members who shall either have that experience (“in a non-governmental capacity”) or “be survivors of online child sexual exploitation.” This is fascinating. I am strongly against so much about the EARN IT Act, but expressly reserving seats at the table for survivors (well, modulo that “or” again) is remarkable.
    • That said, all of the foregoing is basically window dressing: the best practices require the approval of only 14 of the 19 members. That means the Commission can completely ignore the 4 experts in privacy, constitutional law, and computer science, or the 4 survivors and child-abuse experts, and the best practices will still go onward into the hands of the Attorney General. (For the voting math, see the quick sketch after this list.)
  • The bill no longer gives unilateral power to the AG to write the “best practices” himself. Instead, after the Commission submits the recommended best practices to him, the AG, “upon agreement with” the heads of DHS and the FTC, shall either approve or deny them.
    • If approved, the best practices are published on the DOJ website and the Federal Register and submitted to Congress. If denied, the AG has to write up why he denied them. Then, the Commission gets a do-over: it “may resubmit recommended best practices.”
    • In other words: even though he no longer gets to rewrite the best practices at will, the AG still has thumbs-up/thumbs-down power over them (so long as the FTC and DHS heads agree). He -- along with the heads of FTC and DHS -- is also on the Commission. Why would the remaining members of the Commission ever bother writing any best practices that won’t please AG Barr? He still basically gets to dictate what the best practices say, because if they write some he doesn’t like, he can just deny them and make them resubmit revised versions that are more to his liking. And since AG Barr despises encryption (and because, as said, the 14-of-19 requirement means all the members who are experts in privacy, cryptography, and data security can be totally ignored), of course the best practices will go after encryption.
  • The bill has been rewritten in what appears to be an attempt to avoid the serious procedural defects present in the original leaked version.
    • The original version unconstitutionally delegated ultimate power to the Attorney General to decide what the “best practices” would be (and allowed him to rewrite whatever the unelected Commission recommended), sidestepping both the usual regulatory rulemaking procedures and congressional review. It wasn’t a law, it wasn’t an agency rulemaking, it was a set of “best practices” that just happened to also have the force of law, because not following them would open providers up to liability and oh also maybe land one of their executives in prison.
    • Now, the bill contains a process for the recommended best practices, after approval by the Attorney General, to be put into a “covered bill” in Congress (which must contain all of the best practices, not only some of them) and fast-tracked to a vote in each house of Congress.
      • Do not pass Go, do not collect $200, do not do all the usual things that would otherwise pose inconvenient democratic speed-bumps to the swift passage of a bad bill. A lot of the bill is devoted to spelling out exactly how the best practices will get to sidestep the usual Schoolhouse Rock stuff. The determination to get around normal congressional procedure is itself a huge red flag.
    • I’m not a legislative analyst, and the language is confusing to me, but I think the upshot of all this is that the “best practices” would get passed into an actual law enacted by Congress.
    • This has echoes of the CLOUD Act, which gives Congress the opportunity to disapprove of an executive agreement under the Act that the U.S. enters into with another country. But as with CLOUD, I expect that this process would be little more than a congressional rubber-stamp, particularly given the fast-tracking provisions that do away with the usual legislative processes. That is: the “best practices” are still pretty much up to the AG to determine.
    • Despite the attempt to fix it, the bill still seems to have nondelegation-doctrine and Administrative Procedure Act problems, as TechFreedom explains here.
    • If these "best practices" are now going to be codified into federal legislation, that might (might) address the bizarre nondelegation/administrative procedure issues with the original -- but it opens up a host of other problems.
      • The rewrite leaves the bill even more vulnerable to challenges on First and Fourth Amendment grounds.
        • Codifying “best practices” for online content means congressionally-required rules governing speech on the Internet. That just opens up a whole can of First Amendment worms, as Project DisCo describes.
        • The bill contains the following “rule of construction”: “Nothing in this Act or the amendments made by this Act shall be construed to require a provider of an interactive computer service to search, screen, or scan for instances of online child sexual exploitation.” That rule of construction is clearly aimed at trying to fix EARN IT’s “state actor” problem under the Fourth Amendment, but it might not be enough. TechFreedom did a great write-up of the Fourth Amendment issue so I don’t have to.
      • And if those “best practices” are indeed codified into federal law, then, as I’ve said before, they would conflict directly with CALEA to the extent that they involve encryption or law enforcement access to communications. But the bill still doesn’t even acknowledge that CALEA has any bearing on it.
  • There are two particular newly-added provisions that I believe the AG will use to go after encryption:
    • Section 4(a)(1) charges the Commission with developing a set of “recommended best practices” that providers “may choose to engage in” in order to fight CSAM. Those best practices shall include “alternatives that take into consideration” a variety of factors. One of the newly-added factors is “whether a type of product, business model, product design, or other factors related to the provision of an interactive computer service could make a product or service susceptible to the use and facilitation of online child exploitation.”
      • Because the AG continually lambastes end-to-end encrypted messaging for cloaking pedophiles’ exchanges of CSAM and grooming of child victims, this is code for “encryption is not a viable alternative best practice.” This will be used to discourage any “product design” that includes encryption that isn’t backdoored for law enforcement.
      • This factor is also utterly nonsensical, because -- leaving aside the vagueness of what “facilitate” means -- it applies to every single online service ever that lets a user (1) communicate with another person or (2) transmit or store any kind of file. In case you’ve forgotten, that is the entire point of the Internet. All email services! All cloud storage! All chat services! Wikipedia! Github! JIRA tickets! If you take out the word “online,” it’s everything. The telephone! The U.S. mail, FedEx, and UPS! Safe deposit boxes at the bank! Hotel rooms -- heck, private homes -- that have walls, and doors that close and lock! The past one hundred years of Boy Scout camps, for god’s sake!
      • Fundamentally, privacy and the ability to communicate with each other are “susceptible” to being “used” for, or “facilitating,” child exploitation. Undermining our online privacy and hampering our ability to communicate with each other are goals the DOJ would love to achieve with this bill.
    • Section 4(a)(4), later in that same section, sets out the “relevant considerations” that the Commission “shall consider” in developing best practices. The as-introduced version adds a troubling new one: “the impact [of implementing the best practice] on the ability of law enforcement agencies to investigate and prosecute child sexual exploitation and rescue victims.” 
    • Because law enforcement alleges that encryption hampers their investigative abilities, this, too, is just code for “the Commission shall decide that end-to-end encryption is a worst practice.” 
  • This is definitely an anti-encryption bill. The bill is 65 pages long, and yet what it doesn’t contain speaks volumes. If this is not a bill that is intended, as Berin Szoka put it over at Techdirt, as “a backdoor to a backdoor” to encryption, it could just say so. But it doesn’t. And that’s telling.
    • Compare this bill to the CLOUD Act. The “AG refers the best practices to Congress for them to rubberstamp” structure feels a little like the CLOUD Act’s aforementioned process for executive agreements between the U.S. and other countries. By contrast, however, the CLOUD Act expressly states that such agreements must be encryption-neutral: “the terms of the agreement shall not create any obligation that providers be capable of decrypting data or limitation that prevents providers from decrypting data.”
    • The drafters of EARN IT were equally capable of inserting similar language here: just replace “the terms of the agreement” with “the best practices.” But they never did, even though this bill has gone through at least three iterations.
    • The omission is even more glaring in light of the bill’s aforementioned “rule of construction”: “Nothing in this Act or the amendments made by this Act shall be construed to require a provider of an interactive computer service to search, screen, or scan for instances of online child sexual exploitation.” 
    • If the drafters of this bill could write a rule of construction about this not being a monitoring and filtering mandate (because that’s a Fourth Amendment no-no), they could certainly have written a rule of construction clarifying that the bill is also not an encryption-backdoor mandate. (And that it doesn’t impair any rights under CALEA, either.) But they didn’t.
  • The bill makes providers immune from liability for “compliance with a search warrant, court order, or other legal process.” This is in tension with the Stored Communications Act, which (with some relevant exceptions) allows providers to be sued civilly for improperly disclosing users’ communications. If a provider illegally discloses a user’s information in response to an illegal subpoena (or a fake court order that was Photoshopped to look real; this is not mere speculation), why should it be immune? This provision could have an innocent explanation -- for example, ensuring that providers do not face liability for merely dropping a CD containing CSAM in the mail to the DOJ -- but as written, it’s overbroad.
  • No change to the original version’s language that lowered the mens rea for civil liability by providers for having CSAM on their service from actual knowledge to recklessness. CDT explained why that lowered standard is problematic.
  • The as-introduced version of the EARN IT Act, weighing in at 65 pages, doesn’t fix the fundamental problem with the bill: It’s a big, unnecessary mess. As I wrote in January, if Congress thinks providers aren’t doing enough to fight CSAM, why not just amend the federal law that tells providers what they have to do about CSAM? This arcane, twisted, confusing, tortu[r]ous bill structure could be avoided in favor of a much simpler bill that just straightforwardly amends that statute to tell providers that they have to start doing x, y, and z additional things on top of their existing reporting and preservation duties.
    • That’s not to say that x, y, and z are totally fine and unproblematic. The bill calls for the best practices to address a list of matters that include, for example, “employing age rating and age gating systems to reduce child sexual exploitation.” That, as Project DisCo’s blog post on EARN IT notes, runs afoul of the First Amendment when mandated by law, as the Supreme Court already ruled over a decade ago. This weird “oh, it’s not a legal mandate, it’s just best practices that a Commission comes up with and tailors to please the AG who then sends them to Congress to rubberstamp so it looks an awful lot like a real law but we’re still going to pretend it’s totally voluntary” charade does not fix that. Can we please avoid relitigating the same damn First Amendment violations that were already struck down by the Supreme Court over ten, fifteen, even twenty years ago? 
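As promised above, here’s the voting math on that 14-of-19 threshold, as a minimal sketch in Python. The only figures taken from the bill are the 19-member Commission size and the 14-vote approval requirement; the dissenting “blocs” below are my own hypothetical groupings for illustration.

```python
# Toy model of the Commission's voting threshold: 19 members,
# 14 votes needed to approve recommended best practices.
COMMISSION_SIZE = 19
VOTES_REQUIRED = 14

def passes_despite(dissenters: int) -> bool:
    """True if the best practices can still pass when `dissenters`
    members all vote no (and everyone else votes yes)."""
    return COMMISSION_SIZE - dissenters >= VOTES_REQUIRED

# Hypothetical dissenting blocs (my groupings, not the bill's):
blocs = {
    "2 privacy/con-law members + 2 computer-science members": 4,
    "4 survivor/victims-services members": 4,
    "any 5 members at all": 5,
    "6 members (the minimum needed to block)": 6,
}

for label, size in blocs.items():
    print(f"{label}: passes anyway? {passes_despite(size)}")
# -> True, True, True, False
```

The upshot: it takes a bloc of 6 “no” votes to block approval, so either of those 4-member groups on its own, and indeed any 5 members, can simply be outvoted.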

Meanwhile: 11 Voluntary Principles

Meanwhile, on the same day the bill finally came out, the DOJ held a press conference about another new anti-CSAM initiative. As CNET and the NYT report, Attorney General Barr, appearing together with representatives from America’s Five Eyes partners, announced a set of 11 voluntary principles for providers to adopt to fight CSAM. Six companies -- Facebook, Google, Microsoft, Twitter, Snap, and Roblox -- have signed on, and Barr said during the press conference that other companies have indicated an interest in joining as well.

These principles are, naturally, pretty vague; I don’t see anything especially objectionable in them. Overall they seem to be about the kinds of things that the industry signatories, such as Google, are probably already doing in some form to combat CSAM anyway (though whether they’re doing enough is, of course, the crux of the dispute that got us to this point). The principles don’t overtly mention encryption at all. No surprise there. Any version that said anything about “limits” or “balance” or a “middle ground” on encryption would not have gained any tech company signatories. Something vague, anodyne, and non-binding is really the only possible outcome when governments and businesses work together to draft something everyone can agree to, especially about a hotly contested topic.

But that didn’t stop the Five Eyes speakers at the press conference, such as AG Barr, from using their airtime at the podium to talk mostly about encryption (and, in Barr’s case, Section 230). As the UK representative said, “Encryption remains the elephant in the room.” Their attitude today was in keeping with the Five Eyes’ joint statement from September 2018 that made their stance on encryption very clear. Nothing’s changed since then; as today’s presser made plain, the Five Eyes have confined themselves to the echo chamber I warned about, continually working to undermine strong encryption while ignoring any countervailing approaches from other governments. So while the principles may sound mostly harmless, the Five Eyes’ underlying motives are anything but.

These voluntary principles are the “speak softly” part of the DOJ’s encryption agenda, and the EARN IT Act bill is the “big stick.” The two-pronged approach, deployed simultaneously today, will further pinch tech companies, which have been under pressure for years from law enforcement agencies worldwide to backdoor the encryption in their devices and services. 

I am sure there are sound strategic reasons for the six signatories to have signed on to the voluntary principles. It’s good optics; it depicts the companies as trying in good faith to work constructively with law enforcement. And it doesn’t actually require them to do anything. 

But that’s exactly the problem. If those six companies think that agreeing to these voluntary principles will help relieve the pressure and keep the EARN IT Act from going anywhere, I’m not so sure. Congress and the public are fed up with industry self-regulation as a solution to various ills. True, self-regulation may be the only option when it comes to certain measures for fighting CSAM, because companies can voluntarily take measures (such as removing online speech that is abusive but not illegal, or warrantlessly scanning every file in everybody’s email or cloud storage account) that would violate the First and Fourth Amendments if they were mandated by law. That’s a big problem for the as-introduced version of the EARN IT Act, and the drafters of the bill know it.

But self-regulation is weak sauce in 2020. Self-regulation got us Cambridge Analytica, and the current online environment of ad-driven surveillance capitalism more generally. Heck, Section 230 is, in a sense, a pass for providers to engage in self-regulation, and now people hate Section 230 for that. Even if providers are in fact diligently working on the problem of abuse internally, Congress and the public don’t have great visibility into those internal efforts. What they see, from the outside, is providers endorsing yet another set of non-binding, vague principles, which will be talked up in a press conference and maybe a blog post… and then rarely heard of again.

That lack of transparency is understandable inasmuch as providers don’t want to hand bad actors on their services a roadmap to evading their anti-abuse measures. But I fear that signing on to these 11 principles won’t dissuade Congressmembers from voting for the EARN IT Act and approving “best practices” that ban strong encryption and, like FOSTA before it, impose overbroad speech regulations on the Internet. If they want to fend off a bad law telling them to do more on CSAM, tech companies need to show Congress how much they’re already doing. As said, what got us here is the public perception that platforms are sitting on their hands when it comes to CSAM. If that’s not the case, Congress needs to hear it (behind closed doors as necessary), and soon.

Even that might be too little, too late -- many Congressmembers, while they surely do sincerely care about the wellbeing of children, likely want to be seen to be doing something about CSAM (rather than deferring to the big bad tech companies), even if that something is a terrible bill like the EARN IT Act.

And while it’s certainly a necessary, urgent, and desirable goal to combat the scourge of online child exploitation, there are still limits on what tech companies should do. Stepping up to fight CSAM should not mean wholesale converting their services into even more powerful surveillance tools for law enforcement than they already are. Tech companies should take care that they don’t let accusations of inaction against child abuse, coupled with the specter of legislation, push them into falling all over themselves to strip away their users’ privacy and security and hand increased snooping capabilities to governments that, frankly, can’t be trusted with them, as the Snowden revelations showed.

That’s not a popular thing to say, because rhetorically, “we should do everything possible to fight the sexual abuse of children” sounds a lot better than “well, yes, protect children, up to a point.” The rhetorical power of this particular kind of harm is exactly what makes everything about the government’s current focus on CSAM so dangerous. Child sex abuse is a radioactive topic, and invoking it has the unfortunate tendency of shutting down nuanced discussion. It is practically taboo to suggest there should be any limits on what we should do to fight it. But that taboo doesn’t help anyone but power-hungry governments -- as our own government is well aware.

Undermining everyone’s privacy and security, strengthening governments’ already-excessive surveillance powers, nibbling away ever more at civil liberties and human rights -- if it’s in the name of protecting children from sexual abuse, then the DOJ and its Five Eyes partners expect we’ll not only accept it, but demand it from the online service providers we use. (The UK representative at today’s press conference commented that “putting our children at risk for what I believe are marginal privacy gains” -- i.e., from encryption -- “is something I really struggle to believe any of us want.”) As Cato’s Julian Sanchez said on Twitter, it’s a “gross, cynical” ploy by law enforcement to get tech companies to roll over on their users’ rights.

Hearteningly, there are reports that some Congressmembers, both Democrats and Republicans, are pushing back against EARN IT. Senator Ron Wyden (D-OR), an evergreen champion of Americans’ civil liberties, issued a statement against it and promised to offer legislation soon that would provide additional funding and resources for fighting CSAM. That’s a more sensible proposal, and it may scratch Congressmembers’ “I need to be seen to be doing something” itch. But they need to know that if they stand up for Americans’ privacy, security, and speech rights online, they have their constituents’ support. If you want to make sure that the EARN IT Act goes nowhere, contact your Congressmember and tell them you oppose it.
