
Ignoring EARN IT’s Fourth Amendment Problem Won’t Make It Go Away

By Riana Pfefferkorn

This is the latest entry in my lengthy archive of writing, talks, and interviews about the EARN IT Act.

A month ago, the controversial EARN IT Act sailed through a markup hearing in the Senate Judiciary Committee. If enacted, the bill would strip the providers of online services of Section 230 immunity for their users’ child sexual exploitation offenses, meaning they could be subject to civil suit by private plaintiffs and criminal charges under state law. The idea is that providers aren’t presently doing enough to combat child sex abuse material (CSAM) on their services, and that exposing them to more liability would goad them into better behavior.

A handful of committee members—Senators Lee, Coons, Ossoff, Booker, and Padilla (plus Leahy, kind of)—voiced concerns that, as written, the bill would have negative consequences for encryption, privacy, security, free speech, and human rights, and would further harm already at-risk populations at home and abroad, such as domestic abuse survivors, LGBTQ individuals, and journalists. (Two weeks after the hearing, Russia’s invasion of Ukraine underscored these high stakes.)

Despite their reservations, all of those senators joined their colleagues to vote the bill unanimously out of committee without any changes. They expressed confidence, however, that these issues could be addressed before the bill reaches the Senate floor (if it ever does).

I don’t share their professed optimism, whether it’s sincere or not. Either they truly believe the bill will receive amendments that assuage their concerns… or they don’t, but they’ll keep voting “yes” anyway for fear of being branded as sympathetic to pedophiles (or worse, Big Tech). If they’re sincere, they’re setting themselves up for disappointment when EARN IT’s sponsors refuse to fix its problems. If not, their failure to stand up against the bill may come back to haunt them later if, perversely, its passage helps set pedophiles free.

EARN IT’s Problems Aren’t an Accident, They’re the Point

The bill’s issues regarding encryption and privacy (to say nothing of all the other stuff) are unlikely to be fixed, because to its sponsors, they’re not bugs, they’re features.

As I explained when EARN IT was reintroduced, the current bill’s encryption-related language (which originated in the House) is less protective of encryption than the version (drafted by Leahy) that was in the bill the last time it passed out of this committee in mid-2020. And even that was skim milk, not full-fat.

Will this language be strengthened? Doubtful. On the eve of markup, and again during the hearing, bill sponsor Senator Richard Blumenthal finally dispensed with his years-long pretense that his bill is not about punishing providers that offer strong encryption to their users. He made it clear that he will not agree to an amendment that truly protects encryption, because, in his own words, he doesn’t want encryption to be a “get out of jail free card” for providers. (This is the same senator who, at the same time he was pushing EARN IT, got mad at Zoom for not using the strong encryption he wants companies to get punished for using.) Blumenthal’s recalcitrance bodes poorly for the other senators’ encryption concerns.

Their worries about privacy will fare no better: Blumenthal wants more surveillance by online service providers of their users’ files and communications. That’s the whole point of his bill, after all: hey, that CSAM isn’t gonna find itself. In the hearing, Blumenthal didn’t shy away from admitting that his goal is to get providers to start scanning all user content for CSAM if they aren’t doing so already. (Side note: “Scan all the things” is the constant drumbeat of surveillance-happy policymakers, but as my newly-published article describes, it’s not the only way to detect abuse online, or even the most effective in most contexts.) This objective had already been emphasized in a “myths vs. facts” document that was circulated with the bill language.

(Perplexingly, Blumenthal, along with several other EARN IT sponsors, endorsed a bill in 2015 that would have strengthened privacy protection for Americans’ online communications instead of undermining it as EARN IT would. Between that and his anger at Zoom, it seems the gentleman from Connecticut is happily unencumbered by the proverbial hobgoblin of little minds.)

In apparent reference to that “myths vs. facts” document, committee chair Dick Durbin (who’s also a bill co-sponsor) asked whether the bill requires providers to proactively inspect content for CSAM. Blumenthal replied that while there’s no “express” duty, there’s a “moral responsibility.” (Note the failure to deny that there’s an implied duty.) When the bill’s other lead author, Senator Lindsey Graham, suggested there should be an affirmative duty to inspect, Blumenthal proposed to work with Graham and Durbin on a future bill that would expressly contain such a mandate.

Those remarks were a mistake.

The Problem with Admitting on the Record to Wanting a CSAM Scanning Mandate

As I’ve explained before (and as Professor Jeff Kosseff explained far better in a brisk 2021 paper), forcing tech companies to scan for CSAM would upset the delicate arrangement that presently enables online service providers to find and report CSAM without running afoul of Americans’ legal privacy rights. Blumenthal and Graham are dissatisfied with the current setup’s shortcomings, and they claim Section 230 is to blame. But it’s not really Section 230 they’re mad at. It’s the Fourth Amendment.

Fourth Amendment government agency doctrine

The Fourth Amendment prohibits unreasonable searches and seizures by the government. Like the rest of the Bill of Rights, the Fourth Amendment doesn’t apply to private entities—except where the private entity gets treated like a government actor in certain circumstances. Here’s how that happens: The government may not make a private actor do a search the government could not lawfully do itself. (Otherwise, the Fourth Amendment wouldn’t mean much, because the government could just do an end-run around it by dragooning private citizens.) When a private entity conducts a search because the government wants it to, not primarily on its own initiative, then the otherwise-private entity becomes an agent of the government with respect to the search. (This is a simplistic summary of “government agent” jurisprudence; for details, see the Kosseff paper.) And government searches typically require a warrant to be reasonable. Without one, whatever evidence the search turns up can be suppressed in court under the so-called exclusionary rule because it was obtained unconstitutionally. If that evidence led to additional evidence, that’ll be excluded too, because it’s “the fruit of the poisonous tree.”

Fourth Amendment government agency doctrine is why lawmakers and law enforcement must tread very carefully when it comes to CSAM scanning online. Many online service providers already choose voluntarily to scan all (unencrypted) content uploaded to their services, using tools such as PhotoDNA. But it must be a voluntary choice, not one induced by government pressure. (Hence the disclaimer in the federal law requiring providers to report CSAM on their services that they know about, which makes clear that they do not have to go looking for it.) If the provider counts as a government agent, then its CSAM scans constitute warrantless mass surveillance. Whatever CSAM they find could get thrown out in court should a user thus ensnared raise a Fourth Amendment challenge during a resulting prosecution. But that’s often a key piece of evidence in CSAM prosecutions; without it, it’s harder to convict the accused. In short, government pressure to scan for CSAM risks letting offenders off the hook.
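(For readers who want a concrete picture of what that kind of scanning looks like mechanically, here is a minimal, hypothetical sketch. It is not PhotoDNA itself, which is a proprietary perceptual-hashing system; the exact-match SHA-256 check below, and every name in the snippet, are illustrative stand-ins only.)

```python
# Hypothetical sketch of provider-side hash matching, simplified for illustration.
# Real deployments use robust perceptual hashes (such as PhotoDNA) that tolerate
# re-encoding; an exact SHA-256 comparison from Python's standard library stands
# in here just to show the shape of the pipeline: hash each upload, check it
# against a set of hashes of previously identified images, and flag any match.

import hashlib

# Hypothetical hash set; real providers draw on industry clearinghouse lists.
KNOWN_IMAGE_HASHES: set[str] = {
    "0" * 64,  # placeholder entry, not a hash of any real image
}


def matches_known_image(file_bytes: bytes) -> bool:
    """Hash the uploaded bytes and check them against the known-hash set."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_IMAGE_HASHES


def handle_upload(file_bytes: bytes, uploader: str) -> str:
    # The provider runs this check because it chooses to (the voluntariness
    # courts rely on), not because a statute compels it to.
    if matches_known_image(file_bytes):
        return f"match: queue an internal review and report for {uploader}"
    return "no match: store the file as usual"


if __name__ == "__main__":
    print(handle_upload(b"example upload bytes", "user-123"))
```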

EARN IT has a “government agent” problem – and the Senate Judiciary Committee knows it

This brittle state of affairs is what’s at stake when lawmakers try to pressure private tech companies into looking harder for CSAM—the very thing Blumenthal and Graham are openly doing. As NetChoice’s Kir Nuthi explained in Slate after the markup hearing, Blumenthal’s phrasing is apropos: EARN IT is indeed a “get out of jail free card”… for CSAM offenders.

Lawmakers have been warned for years that EARN IT could end up backfiring in this way. Yet the senators on the Judiciary Committee unanimously decided to entrust Senators Blumenthal and Graham with fixing the problem. Fat chance: Blumenthal and Graham had just said during the hearing that they favor an affirmative monitoring requirement.

I doubt that’s the answer Chairman Durbin was fishing for when he asked them if their bill requires proactive content monitoring. As a fellow sponsor of the bill, he was presumably angling to get remarks on the record that EARN IT isn’t pushing providers to scan. But they said what they said – remarks that (along with their “myths vs. facts” document) will be attached as exhibits to countless motions to suppress if EARN IT passes. Durbin’s question showed that the committee is well aware of the bill’s Fourth Amendment problem. And Graham and Blumenthal’s responses showed they have no intention of fixing it.

And yet, despite that clear signal not to expect any meaningful amendments, the entire Judiciary Committee voted unanimously for the bill. In doing so, the committee members indicated that they’re willing to roll the dice in actual CSAM cases. Should the bill pass as-is, every CSAM defendant who walks free will have Congress to thank for getting them off the hook.

Senators Are Putting the Fate of Their Bill in the Hands of the Very People They’re Trying to Compel

By consciously leaving the bill vulnerable to the Fourth Amendment government agency argument, lawmakers will be putting the power to decide their bill’s fate into the hands of the very people whose behavior the bill is targeting. I’m not sure Congress has thought about who those people are, or how they might react if pulled into litigation over the “government agent” question.

As noted, the legality of CSAM scanning comes down to voluntariness. So far, major online service providers (many of whose legal departments are chock-full of former federal prosecutors) all choose to scan. Companies such as Microsoft, Yahoo, Google, Facebook, Dropbox, and AOL routinely have their personnel file declarations that support the government, not the defendant, when defendants move to suppress the evidence found via a CSAM scan. Courts tend to give a lot of weight to those declarations, which attest that the providers choose to scan for their own business reasons. These providers have an interest in not making the argument that they’re being forced to scan, because they don’t want to be declared government agents: If they were, they would have to shut off their scanning programs, which help them find unwelcome content. The providers, not just the government, have an interest in upholding the narrative that’s sustained the legality of their CSAM scanning regimes for many years: that all of this is totally, 100% voluntary. (Kosseff’s explanation is especially barbed on this point.)

The EARN IT bill explodes that polite fiction, even in its current version with supposedly nonbinding “best practices” that tell providers what they really ought to do to avoid liability. (In actuality, those best practices are hardly voluntary, as cogently explained here.) The bill would destroy that assertion—“we want to scan, we choose to scan, we are not being forced to scan”—that has been key in getting courts to shoot down so many CSAM defendants’ suppression motions.

Should EARN IT pass, we may start seeing provider affidavits that tell a different story: that the provider never chose to scan before, and that EARN IT is the only reason it scans now. Take, for example, Signal and Telegram, which provide end-to-end encrypted messaging functionality; Telegram also offers non-E2EE “channels” for broadcasting public messages. Both apps’ founders are ideologically opposed to censorship and surveillance (and both have historically taken a light touch to trust and safety issues). Or consider right-wing “alt” social networks Parler and Gettr: as this run-down of the EARN IT markup hearing explains, both have refused to use PhotoDNA because they just don’t want to know about CSAM on their apps.

Put simply, the providers of online services don’t all fall into the “lawful good” box on the D&D alignment chart. That’s exactly why Blumenthal and Graham feel the need to regulate them with EARN IT: to get them to do more than they do now. And yet they’re implicitly assuming that if the bill passes, not only would everyone knuckle under and start scanning for CSAM (as opposed to, say, leaving the U.S. market, as Signal has threatened to do), they’d also behave themselves if called into court during any resultant motions to suppress.

But if EARN IT is the thing that finally pushes Telegram or Parler to start scanning for fear of potential criminal and civil liability, why would their representatives say otherwise in court? Why play along and pretend the government isn’t forcing them to do searches they don’t want to do? What incentive would they have to pretend to a court that they freely chose to scan (which would, after all, border on perjury), especially knowing that if they tell the truth, they won’t have to keep scanning anymore? The fate of those CSAM prosecutions, and therefore of EARN IT itself, could very well hinge on what the representatives of such apps—apps that are known for not playing nicely with governments—decide they want to say in a sworn declaration. Congress will have placed all the power in the hands of the very providers it ostensibly deplores.

If, as its sponsors have candidly admitted, the goal of EARN IT is to goad more providers to start scanning, then the success of that pressure campaign (if any) will, paradoxically, also be the law’s downfall.

True, providers that were already scanning can probably keep doing so and have a solid argument that EARN IT didn’t suddenly convert them into government agents overnight. The EARN IT sponsors’ “myths vs. facts” document claims that “little will change under this bill” for providers like that. Assuming that’s true (and I’m dubious), EARN IT doesn’t move the needle as to those providers. At best, then, if the bill’s one and only goal is to bring about universal CSAM scanning, it’s superfluous.

But providers that weren’t already scanning are the ones most likely to be deemed government agents in Fourth Amendment challenges. So the CSAM cases where defendants will walk free will be the ones where the scans were conducted by the only providers whose behavior was affected by EARN IT. The net effect of EARN IT would be that fewer CSAM offenders are brought to justice, because Congress will have shattered the “voluntariness” premise on which so many CSAM convictions currently rest. The law would make the very problem it aims to solve worse, not better.

Mess? What Mess?

If Congress punts on EARN IT’s Fourth Amendment problem and the law ends up backfiring, will they have the intestinal fortitude to admit it, confront the harm they’ve done, and repeal EARN IT? Or will they deny there even was any harm, like they’ve done with the utterly disastrous FOSTA law that amended Section 230 in 2018—a law that multiple senators actually referred to as a victory during the markup hearing?

When it’s federal judges publishing opinions saying “EARN IT is to blame for this undesirable outcome,” not just some marginalized voices (i.e., sex workers harmed by FOSTA) that those in power can ignore and dismiss out of hand, will Congress listen then? If EARN IT prompts court rulings highlighting Congress’s role in helping people who victimize children to evade accountability, will Congress own up to its mistake?

With EARN IT’s risks just as foreseeable now as FOSTA’s were in 2018, just how badly does Congress want to pass the EARN IT Act? What’s it willing to sacrifice—and whom?

* * *

Postscript: the “binary search” and “no Fourth Amendment rights online anyway” theories

In the past, the government has made some arguments in CSAM cases that we might see it make again if EARN IT passes and defendants move to suppress based on the “government agency” theory. These aren’t slam-dunk winners, so it would be very risky for Congress to bet that one of these arguments will prevail in court every single time. (Remember, children are the bargaining chips.)

One argument contends that a scanning program that discloses nothing more than the presence or absence of CSAM might not count as a “search” at all for Fourth Amendment purposes, because CSAM is contraband and the courts have ruled that there’s no protectable privacy interest in contraband. In that case, providers’ CSAM scans would be permissible even if they were compelled rather than voluntary.

But opinions differ as to the viability of this “binary search” theory. A 2021 student note on EARN IT gives it more credence than do two student notes from 2018. Kosseff’s paper consigns it to a skeptical footnote. Even top Fourth Amendment scholar Orin Kerr has admitted that it’s not certain a court would endorse this theory to uphold a provider’s warrantless CSAM scanning at government behest. (If it were certain, then why does the aforementioned disclaimer in the CSAM reporting law come up again and again when courts rule that online service providers aren’t government agents?) While the idea has been raised occasionally in CSAM cases, it hasn’t caught on: the courts in those cases didn’t rely on it, and neither did the government.

In an even more extreme argument, discussed by Elizabeth Banker on Medium, the government has occasionally asserted in court that we Americans don’t even have Fourth Amendment privacy rights in our online files and communications: because we hand them over to third-party providers, we can’t reasonably expect them to stay private. (A related argument claims there’s no reasonable expectation of privacy because, in agreeing to a provider’s terms of service, users consent to the provider’s search of their account contents—even when the provider is acting as a government agent. But this argument only works if the language of the TOS backs it up. Apropos of nothing, here’s a link to Telegram’s TOS.)

“You have no Fourth Amendment rights online anyway” was Blumenthal’s previous retort to concerns about an earlier iteration of EARN IT. As I said then, this frightening position goes against the grain of contemporary Supreme Court jurisprudence in the digital era. Nor has it been universally adopted by the lower courts: while Banker cites some cases embracing it, others have vigorously rejected it. Even the Wolfenbarger opinion Banker cites, which agreed with the government’s argument, was later vacated and replaced with an opinion that avoided addressing the issue. In short, like the “binary search” theory, this is hardly a slam-dunk argument—but per Banker, we might see the government press it more often if EARN IT passes.

It’s still shocking to me that EARN IT’s authors endorsed this view. I wonder how many other members of Congress are willing to stand up in front of their constituents and tell them they can’t reasonably expect their emails, DMs, cloud storage, etc. to be private. If they really believe that, the proper response isn’t to use it to justify EARN IT. Rather (as Banker notes), it’s for them to strengthen our federal electronic communications privacy statutes.

That’s just what the bill I mentioned earlier, the one Blumenthal supported back in 2015, would have done. It has passed the House again and again (once unanimously, 419-0), only to grind to a halt in the Senate. If the Senate has repeatedly refused to pass an online privacy bill that’s been incredibly popular on the House side, why should we expect it to have much appetite for fixing EARN IT’s constitutional pitfalls before the bill reaches the Senate floor?

Maybe it will never get there, though: floor time is precious, this bill is controversial, and there are other things (COVID, inflation, budget, Russia/Ukraine...) on Congress’s plate. For senators who dislike the bill but don’t want to look soft on crimes against children, not having to make a decision about how to vote might be the best outcome they can hope for. Stay tuned.