The EARN IT Act Threatens Our Online Freedoms. New Amendments Don’t Fix It.

On July 2, the Senate Judiciary Committee held a full-committee hearing at which it made significant changes to the pending EARN IT Act bill, S.3398, about which I’ve written extensively on the CIS blog. While many Americans (myself included) were out on vacation for the Independence Day holiday, Senator Lindsey Graham (R-SC), who chairs the SJC, introduced a so-called “manager’s amendment” that largely overhauls the version Graham introduced in March. The Committee approved the manager’s amendment unanimously, and also approved an amendment by Sen. Patrick Leahy (D-VT) that modifies the manager’s amendment to provide some protections for encryption. The bill can now proceed to consideration by the full Senate. You can watch video of the hearing and read all the amendments here.

EARN IT is among a bumper crop of bills introduced in this Congress that would amend Section 230 of the Communications Decency Act of 1996, which largely immunizes the providers of online services (think: email, social media, websites, apps including messaging apps, you name it) from liability for the actions of their users on their services. That immunity, as relevant here, bars most civil lawsuits as well as prosecutions under state criminal law; it does not bar enforcement of federal criminal law. (Techdirt has a terrific Section 230 explainer for the uninitiated.)

The bills in this crop vary in how they would affect providers’ Section 230 immunity. EARN IT specifically targets providers’ immunity (against state criminal charges and civil lawsuits, that is) for a particular class of user content: child sexual abuse material (CSAM). In January, when analyzing an early version of this bill, I wrote a lengthy post about providers’ existing duties under federal statutes to report such material when they find it on their service.[1]

The July 2 manager’s and Leahy amendments attempt to respond to some of the concerns that I and others have raised about EARN IT. But they perpetuate the basic underlying problem: if passed, even in this amended form, the bill would still pose a serious threat to our freedoms online, especially freedom of speech. That threat is inherent to this legislation; no amount of amendments can fix it. And here’s the kicker: it still won’t guarantee children’s safety online.

What’s Changed in the Bill?

The manager’s amendment makes a number of changes to the previous version of EARN IT, some big, some small.[2] The primary change: it cuts off providers’ Section 230 immunity for CSAM, full stop. This gets rid of the earlier version’s incentive structure, which dangled before providers the carrot of continued 230 immunity if they jumped through certain hoops, and hit them with the stick of stripped immunity (that is, exposure to legal liability from civil plaintiffs and states for CSAM) if they didn’t. The manager’s amendment takes away the carrot and leaves only the stick. That’s a simpler structure, but straightforward curtailment of Section 230 immunity doesn’t make the bill better.

The CSAM carve-out from 230 immunity is also different in scope from what it was in the March version. The March version abrogated 230 immunity to allow civil claims and state criminal charges against providers if the underlying conduct (i.e., some user’s actions on the service) violated the core federal CSAM statutes, 18 U.S.C. §§ 2252 and 2252A, or their cognates under state law. The manager’s amendment still allows civil lawsuits against providers for users’ 2252 or 2252A violations, but it changes the scope of allowable state-law criminal charges and civil lawsuits against providers to cover state laws “regarding the advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material” (as defined in Section 2256(8)). That list of acts tracks language in Section 2252A(a)(3)(B). That is, the manager’s amendment allows state criminal and civil actions against providers for state-law analogs of that particular subsection of 2252A, rather than of any part of 2252 or 2252A. While that sounds narrower, it’s simultaneously very broad, especially multiplied across 50-plus jurisdictions. That list of acts “is a vague and broad list of potential offenses that will encompass a wide variety of state laws that apply different legal standards to the same conduct,” as CDT told Gizmodo.

Let’s delve into the removal of the carrot/stick structure, because it touches on several of the objections that I and others raised to the version of the bill introduced in March. In that version, retaining continued immunity for CSAM was conditional upon the provider’s certified compliance with the “best practices” recommended by a 19-member commission the bill would create. The commission and the best practices are still part of the manager’s amendment. The commission’s makeup, its duties, and the considerations it is to take into account are unchanged from the March version. However, the commission’s recommended best practices no longer have the legal force they would have had under the March version, which I discussed here.

Taking away the commission’s teeth is apparently responsive to multiple concerns raised about the earlier bill version. For one, it was extremely unusual to give an unelected commission’s recommendations the force of law. That raised due process problems, as I wrote; it also, per TechFreedom, seemed to violate core requirements of administrative law. The manager’s amendment makes the commission, and its work product, more like those of any other advisory body: recommendations without the force of law.

Importantly, the manager’s amendment also removes some of the Attorney General’s formerly broad power over the best practices. The AG still heads the commission (which has a number of other representatives from law enforcement), meaning he will still exert influence over what gets recommended. However, he no longer has the power to approve or disapprove the commission’s recommended best practices. In the manager’s amendment, the AG instead just has to publish the best practices on the Department of Justice’s website and in the Federal Register. Given how notoriously anti-encryption and pro-surveillance the current AG is, this change reduces Bill Barr’s ability to use the best-practices process to strong-arm providers into undermining users’ security and privacy and turning their services into streamlined DOJ spying machines, as I warned in January.

By making the best practices more explicitly voluntary, the manager’s amendment appears to be intended to address the earlier bill version’s Fourth Amendment “state actor” problem. I wrote about that issue here. (By implication, the amendment also admits that the original bill’s best-practices scheme was never really voluntary, after all.) The March version of the bill recognized that it had this problem, and tried to cure it with disclaimer language saying that nothing in the bill was to be construed to require providers “to search, screen, or scan” for CSAM. That disclaimer language has been removed from the manager’s amendment. To me, that indicates that Graham et al. believe that they’ve fixed the “state actor” issue by decoupling compliance with the best practices (or failure to do so) from either positive or negative consequences. Irrespective of compliance, nobody can qualify for continued 230 immunity for CSAM anymore. Providers are damned if they do (follow the best practices) and damned if they don’t.

Taking away the carrot and leaving only the stick is a strange approach. For one thing, Section 230 has never barred federal criminal law enforcement. That is, it never immunized any noncompliance by a provider with federal CSAM law in the first place, as I said in January and as Techdirt notes in a post today. If providers are flouting federal CSAM law, 230 already doesn’t prevent the federal government from going after them. So removing Section 230 immunity is not the most on-point way to get providers to change their approach to CSAM; the most direct way would be to amend the federal laws that tell providers what they have to do about it.

For another thing, how does “damned if you do, damned if you don’t” in any way “incentivize” providers to follow the best practices? If they’re liable either way, why invest a bunch of resources retooling their services in response to the commission’s nonbinding recommendations (and then do so again every five years, when the commission is supposed to submit updated recs)? With its teeth removed, as Techdirt observed, “The Commission just gets to shout into the wind.” It’s not clear why the bill is even still called “EARN IT” – with the carrot of continued immunity gone, there’s nothing left for providers to “earn”!

What the July 2 changes do incentivize is more encryption, because Leahy’s amendment tries to create an encryption safe harbor. His amendment to the manager’s amendment is intended to address widespread concerns that EARN IT is a sneaky way to ban encryption. In an effort to keep “cybersecurity protections” from giving rise to liability, the Leahy amendment spells out that a provider may not be held liable because it “utilizes full end-to-end encrypted messaging services, device encryption, or other encryption services”; “does not possess the information necessary to decrypt a communication”; or “fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.” In short: if, having lost 230 immunity for CSAM, a provider faces a civil lawsuit or state criminal prosecution over CSAM, the claim or charge has to be based on something other than encryption. Offering encryption would not automatically give rise to provider liability for CSAM on the service.

This amendment was reportedly intended to, and did, garner additional bipartisan support for the bill. It’s not as strong as it purports to be, as further explained below. But ironically, as Techdirt points out, the encryption safe harbor would have the effect of incentivizing providers to encrypt even more information in a form they can’t “see” (if such a design change is feasible, which isn’t universally true for all providers and all services). Doing so would reduce the effectiveness of providers’ current anti-CSAM efforts, such as the automated scanning tools many providers currently voluntarily use on files shared or stored on their services. I, for one, would be happy to see encryption become even more ubiquitous, but EARN IT’s backers might not recognize how this amendment creates an incentive at odds with the bill’s goals. Techdirt’s post delves further into the new flavor of “moderator’s dilemma” that the Leahy amendment creates.
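To make concrete why the amendment pulls against the bill’s stated goal, here’s a minimal sketch of how server-side scanning conceptually works. This is my own simplification, not any provider’s actual code: real deployments use perceptual hash matching (such as PhotoDNA) rather than a plain cryptographic hash, and the hash set below is just a hypothetical placeholder. The point is that the check needs the file’s plaintext, so a provider that holds only end-to-end-encrypted ciphertext has nothing to scan.

```python
import hashlib

# Hypothetical placeholder for a provider's list of hashes of known abuse
# imagery. Real systems use perceptual hashes (e.g., PhotoDNA) that survive
# resizing and re-encoding; SHA-256 appears here only to keep the sketch
# runnable.
KNOWN_ABUSE_HASHES: set[str] = set()

def scan_uploaded_file(plaintext: bytes) -> bool:
    """Return True if an uploaded file matches known abuse imagery.

    This check works only because the provider can read the plaintext.
    On an end-to-end encrypted service, the server sees ciphertext it
    cannot decrypt, so there is no plaintext to hash and the scan simply
    cannot run.
    """
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_ABUSE_HASHES
```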

The changes the manager’s amendment makes to the March version of EARN IT are a positive move toward addressing the concerns of EARN IT critics like me. But I still can’t rest easy. For one thing, encryption is still under threat despite the Leahy amendment. Graham made it clear at the July 2 meeting that he will not stop attacking encryption. He isn’t even bothering with sneak attacks anymore: a couple of days earlier, he’d introduced another bill, the Lawful Access to Encrypted Data (LAED) Act, a full-frontal assault on encryption that would swallow the Leahy amendment by forcing providers to proactively backdoor their encryption. And although the Leahy amendment (combined with the other changes noted above) and the LAED Act are clearly intended to make this new version of EARN IT seem more palatable and “reasonable” (the former by softening the bill, the latter by comparison), I still don’t and won’t support EARN IT. Here’s why.

What’s Still Wrong with the Bill?

The July 2 amendments make some positive changes, but they don’t fix this bill’s fundamental problems. The only way to fix a bill like this is to throw it out.

Leahy’s encryption amendment isn’t all it’s cracked up to be. There’s still skepticism among tech policy wonks and cryptographers alike about the amendment. It essentially gives providers a defense against liability, which is weaker than the up-front immunity from liability that Section 230 currently provides.

CDT predicts that it will invite prolonged litigation over whether potential liability is “because of” the provider’s use of encryption (if so, the case is barred) or because of some other reason (if so, no bar).[3] CDT told CyberScoop that the “consistent threat of litigation … will be a strong disincentive against providing [end-to-end encryption] and continuing to have to defend that decision in court.” With potentially wide variation in state CSAM laws, “the worry,” as Techdirt says, “is that we won't know whether or not offering end-to-end encryption would be seen as violating state laws until long and costly cases go through their lengthy process.” The Internet Society’s Joe Hall agreed, telling CyberScoop that the amendment is “a fig leaf of protection for strong encryption” that leaves providers “to fight it out in court, which is far from cementing protection and clarity for encryption, the bedrock of our lives on the internet and in the real world.” I couldn’t have said it better.

It’s not clear how many cases against providers would actually be precluded by Leahy’s amendment. Plaintiffs and state AGs could readily come up with other grounds besides encryption on which to premise liability for an encrypted service (at least as a pretext, even if encryption is really the ultimate reason they’re mad). CDT also points out that the Leahy amendment doesn’t stop the AG-headed commission from recommending anti-encryption best practices (as any commission with Bill Barr at the helm will likely do). That would’ve been a freebie for Leahy to throw in, especially with the commission’s fangs removed anyway.

I also think the Leahy amendment doesn’t go far enough. The carve-out’s section header is about “cybersecurity protections” not giving rise to liability. But the text is only about encryption. What about other kinds of cybersecurity protection? Those should not give rise to liability either; federal policy should incentivize providers to protect their users’ security, not dissuade them from doing so – especially now that COVID-19 has moved so many Americans’ work, school, and other life activities online. What about mechanisms for accessing otherwise-encrypted information that technically “don’t touch the encryption,” such as the ghost user proposal, client-side scanning, or the custom version of iOS that the FBI tried to force Apple to create in the 2016 “Apple vs. FBI” showdown? The way I read the Leahy amendment, I’m not sure it would preclude liability if a provider fails to take measures such as those.
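To illustrate what I mean by mechanisms that “don’t touch the encryption”: client-side scanning, for example, inspects a message’s plaintext on the user’s device before the end-to-end encryption step, so the encryption itself stays intact even though the content still gets examined. Here’s a purely illustrative sketch; the function names and the on-device hash list are my own hypothetical stand-ins, not any real messaging protocol or proposal.

```python
import hashlib

# Hypothetical hash list a provider might push down to users' devices under
# a client-side scanning scheme. Placeholder only.
ON_DEVICE_ABUSE_HASHES: set[str] = set()

def encrypt_end_to_end(plaintext: bytes, recipient_key: bytes) -> bytes:
    """Stand-in for a real end-to-end encryption step (e.g., the Signal
    protocol). The XOR below is NOT real cryptography; it only marks where
    genuine encryption would happen in this sketch."""
    return bytes(b ^ recipient_key[i % len(recipient_key)]
                 for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, recipient_key: bytes) -> bytes:
    # 1. The scan runs on the device, against the plaintext, BEFORE any
    #    encryption is applied. In a real scheme, a match would be reported
    #    to the provider here, pulling the content out of the E2E envelope.
    if hashlib.sha256(plaintext).hexdigest() in ON_DEVICE_ABUSE_HASHES:
        print("match found: a real scheme would report this to the provider")

    # 2. Only then is the message end-to-end encrypted and sent. The
    #    encryption itself is never weakened, yet the content was still
    #    inspected in the clear.
    return encrypt_end_to_end(plaintext, recipient_key)
```

As this sketch suggests, a provider that declines to build something like this wouldn’t obviously be shielded by the Leahy amendment’s encryption language, which is exactly the gap that worries me.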

The Leahy amendment is certainly better than nothing, and its language is more thorough than it might have been. But it’s not the silver bullet that some are holding it out to be when it comes to answering critics’ concerns about how EARN IT could discourage encryption and harm cybersecurity.

The bill still violates the First Amendment. Removing immunity for CSAM, full stop – with no chance to retain it by jumping through hoops – renders this version of EARN IT much more akin to the controversial FOSTA law of 2018, which removed providers’ 230 immunity for sex trafficking (but which, as I noted in January, merely hurt sex workers). But FOSTA violates the First Amendment, and so does EARN IT, as I’ve already discussed here.

That’s still true despite the manager’s amendment. This version of EARN IT would still result in massive over-removal by providers of completely legal user speech for fear of liability. As the ACLU said in a June 29 letter to the SJC, “Even if the speech covered by the law could be restricted without raising constitutional concern, the content moderation practices the companies will deploy to avoid liability risk will sweep far more broadly than the illegal content.”

Recall that provider liability can be premised either on users’ violations of the federal statutes, Sections 2252 and 2252A, or on violations of state laws “regarding the advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material.” Providers could reasonably fear that the liability threat would encompass a wide range of their services. Any service that gives users the ability to talk to each other (think social media, chat rooms, messaging apps, videoconferencing services, voice calling apps); the ability to post online ads seeking to buy or sell stuff (think craigslist); the ability to post images (think Tumblr); the ability to share files (think Dropbox) – all of these could be misused by users, rendering the provider liable under one or more laws.

You’ll note that those functions are basically everything we use the Internet for. That’s because the ability to talk to each other and share information online is inherently susceptible to abuse, both CSAM and other kinds of abusive online conduct (some of which are legal and protected by the First Amendment in the U.S., whether you like it or not). That is the downside of the Internet, and it can only be mitigated, not wholly avoided. But unconstitutionally inducing providers to suppress tons of legal user speech by allowing broad liability for the illegal bits is not the way to mitigate it.

What’s more, the manager’s amendment puts no limit on providers’ potential liability under state laws. The earlier March version would have created new civil penalties for providers’ reckless violations of Sections 2252 or 2252A (under which the current standard for criminal violations is actual knowledge). That language is gone from the July 2 manager’s amendment, but the improvement is undermined by the absence of any restriction on the mens rea a state law may require in order to hold a provider liable for the actions of its users. As a coalition letter about the manager’s amendment, spearheaded by the R Street Institute, commented, “If a state law makes it illegal to negligently or recklessly transport CSAM, interactive computer services will likely be unable to host user-generated content at all.” For providers of online services, liability for abuse is tantamount to liability for existing.

The result of the manager’s amendment, which curtails 230 immunity for CSAM full stop, will be the same as it would have been under the March version. Providers will be scared into shutting down large portions of their services, cutting off the perfectly legal conduct that users were engaging in on the service, for fear of litigation and state AG prosecutions under any of potentially 50 or more state laws. (This is precisely why Section 230 immunity was designed to apply to state criminal laws: to prevent dozens of jurisdictions, which vary widely in what they criminalize, from dictating the liability of an online provider for the actions of users located all around the country and the world.) And that’s why EARN IT is still unconstitutional.

The manager’s amendment of EARN IT is now very straightforwardly FOSTA redux, for CSAM instead of sex trafficking. EARN IT, like FOSTA before it, is just another variation on the same tired theme of Internet free speech-killing bills that Congress has passed (and the courts have kept striking down) since the ‘90s. You’d think that Congress would know better by now, particularly given how many members of the Senate Judiciary Committee have been there since the ‘90s.

The manager’s amendment may or may not fix the Fourth Amendment “state actor” problem. As noted, the manager’s amendment removes the carrot/stick incentives that raised Fourth Amendment issues in the March version. Does this change actually fix that problem? I’m not sure. On the one hand, everyone gets the stick, whether they follow the best practices or not. On the other hand, maybe the stick is smaller for those who do follow them.

The SJC must believe that the best practices to be recommended by the EARN IT commission would be more effective at fighting CSAM than whatever providers do currently. If so, following the best practices would translate into less liability exposure for the provider (because there’s less CSAM to give rise to liability) than if the provider ignores the best practices and keeps doing its own thing. In that sense, you could say that the government is still inducing providers to follow the best practices. The stick isn’t “exposure to liability, period” (as under the March version); it’s “exposure to less liability than if you don’t follow them.”

Let’s say that the best practices encourage proactive monitoring and filtering of user files to detect and block CSAM. (Remember, the March version’s disclaimer about that is gone now!) If the government tells a provider, “You’ll be hit with a big stick if you don’t follow our best practices, but you’ll be hit with a smaller stick if you do,” is that enough for a court to find that compliance with the best practices is “not primarily the result of [the provider’s] private initiative”? If so, then we might still have a Fourth Amendment “state actor” problem on our hands.

This question is a closer call under the manager’s amendment than it was under the March version of the bill. Even so, as I observed at the time, the government walks a very fine line whenever it passes laws governing providers’ CSAM duties. Cross that line, and the whole plan backfires: the result is suppression of evidence in a bunch of CSAM prosecutions, making it harder to convict those accused of the heinous crimes that motivated the introduction of the EARN IT bill in the first place. The Senate Judiciary Committee is taking a gamble that with the manager’s amendment’s revisions, it will win on the state-actor issue in court. Maybe it will, but those are the high stakes it’s willing to bet. Is sticking it to Big Tech really worth that risk?

Of course, this analysis is all based on the EARN IT bill itself and on the best practices the commission might create. Since the manager’s amendment to EARN IT removes Section 230 immunity from liability under every state’s CSAM laws, it will, if passed, incentivize providers to conform to the most restrictive, aggressive state law in order to minimize their newfound liability. I don’t know anything about state-level CSAM laws, but if there are any that would require monitoring and filtering in order for the provider to avoid liability, then a provider might choose to do so. Not only would that be a huge invasion of all users’ privacy, it also would convert the provider into a state actor. That is, by opening up liability under state laws, EARN IT may be creating a whole new Fourth Amendment state actor problem, separate and apart from the bill’s own language or the best practices.

EARN IT still won’t achieve its goal of online child safety. I am skeptical of the Committee’s assumption that this bill would have a net positive impact on providers’ fight against CSAM online. Even if there’s no more Fourth Amendment problem, the July 2 version of the bill could still harm rather than aid those efforts.

I explained before that the early version of EARN IT would not stop the flow of CSAM on online services. I observed that EARN IT would merely induce CSAM offenders to move from mainstream platforms – the U.S.-based Big Tech companies that comply with federal law – onto the dark web, where such imagery is the entire raison d’être for rogue services that do not comply with federal CSAM reporting obligations, do not qualify for Section 230 immunity in the first place, and do not care. No version of the EARN IT bill is going to make those dark web services bat an eye. As Gizmodo succinctly put it, entities already “are not legally protected under Section 230 if it’s found they’re participating in, or even wittingly allowing, the exchange of CSAM online,” so “[i]t’s unclear, frankly, why any of this [is] necessary.”

The manager’s amendment is no help. If the best practices really are voluntary and providers are free to ignore them, and if they’re no longer immune from liability in any event, then some providers (the deep-pocketed ones) might not change anything about how they fight CSAM, preferring to maintain their current strategy. If they change nothing, then they won’t find any more CSAM than they do already (and remember, they already find a lot), so there’s no additional child-safety benefit attributable to the bill.

On the other hand, a provider might make changes to follow the best practices – or at least remove huge swaths of user speech, as predicted above – because it thinks it’ll face less liability than it would under its current practices. If so, those changes risk pushing offenders off those mainstream services and onto the dark web, where they’re harder to track down. Meanwhile, the provider’s changes would likely stop a lot of totally legal user conduct. The provider’s service would become less viable for the legitimate, innocuous purposes that most of its users rely on, and offenders would just go trade horrible content with each other somewhere else.

Either way, the manager’s amendment (like FOSTA before it) won’t get rid of the problem it aims to solve. At best, it’s superfluous; at worst, it would just make the problem of online child sex abuse harder to fight while harming everyone’s online speech.

Conclusion

Fixing some of the original EARN IT’s problems can’t save a fundamentally bad bill. EARN IT is a danger to free speech on the Internet, which was already weakened by EARN IT’s older sibling, FOSTA. As Fight for the Future says, “The EARN IT Act is and always will be a threat to free speech and free expression online.”

The EARN IT bill has now passed out of committee and will head to a floor vote of the full Senate. With bipartisan backing, it stands a better chance of passing there. Please continue to contact your Senators to tell them to oppose EARN IT in any form, and start contacting your House reps as well. You can take action here.



[1] In that post, I speculated that if providers were flagrantly violating their federal statutory CSAM obligations, it would’ve been front-page news. For what it’s worth, I recently searched the federal court docket for the Northern District of California, where many major providers are located. If there had been any criminal charges filed against providers under those statutes for violating those duties, that court is where you’d expect to find them. I found zero.

[2] The small things include stuff like (1) changing some language that I thought was overbroad and in tension with the Stored Communications Act, clarifying that the language was indeed about what I suspected it was about (“to ensure that providers do not face liability for merely dropping a CD in the mail to the DOJ that contains CSAM”), (2) borrowing some language from a recent bill from Sen. Ron Wyden (D-OR) that doubles the length of time that providers must preserve the CSAM they report (from 90 to 180 days) and allows preservation beyond that (though Wyden, one of Section 230’s authors, was not impressed), and (3) adding a provision for “IT solutions relating to combating online child exploitation” that seems to be aimed at improving the DOJ’s slipshod handling of information about Internet crimes against children, but which looks like small potatoes compared to the Wyden bill.

[3] This happens all the time in garden-variety Section 230 litigation. In seeking to circumvent Section 230’s clear bar against treating a service provider as the “publisher or speaker” of information provided by a third party, plaintiffs frequently tie up defendant providers in litigation over whether that is in fact what their theory of liability does. The answer is often yes, so the case is often dismissed as barred by 230. But the defendant still had to spend time, money, and effort just to get to that point.

 
