The EARN IT Act Is Unconstitutional: Fourth Amendment

So far, I’ve covered what I believe to be some First Amendment and due process problems with the EARN IT Act bill. Last but not least, let’s talk about the Fourth Amendment. The EARN IT Act risks turning private companies – the providers of online services such as social media platforms, chat apps, email services, and cloud storage – into agents of the state for purposes of the Fourth Amendment. That would result, ironically, in courts’ suppression of evidence of the child sexual exploitation crimes targeted by the bill. That means the EARN IT Act would backfire for its core purpose, while violating the constitutional rights of online service providers and users alike.

Right now, in order to cut down on child sexual abuse materials (CSAM) on their services, many providers of online services (such as social media, webmail, and cloud storage) voluntarily scan files in users' accounts for CSAM. They do this by comparing the file the user is trying to send, upload, or store against a database of known child sex abuse images. (More specifically, by comparing the images' hash values. You can learn more about hash searches in this 2018 Stanford Law Review note, which discusses prior legal scholarship on the same topic.)
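To make the matching step concrete, here is a minimal sketch in Python, assuming a simple exact-hash comparison against a provider-maintained list of known hashes. (Production systems generally use perceptual hashes such as PhotoDNA, which survive resizing and re-encoding; the set contents, function names, and placeholder value below are hypothetical and purely illustrative.)

```python
import hashlib

# Hypothetical set of hash values for known, previously identified images.
# In practice this would come from a vetted hash database, not be hard-coded;
# the value below is a placeholder.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_image(path: str) -> bool:
    """Return True if the file's hash matches an entry in the known-hash set."""
    return hash_file(path) in KNOWN_HASHES
```

In a real pipeline, a match would typically be queued for review and then, as described next, reported to NCMEC.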

Some examples of providers that voluntarily and proactively search for CSAM: Microsoft (which helped pioneer the PhotoDNA technology that many companies use) scans files uploaded to OneDrive, Apple scans files uploaded to iCloud accounts, Gmail scans images attached to email messages, Twitter checks images attached to tweets, and Facebook checks photos that are uploaded or attached to (non-end-to-end encrypted) messages. (CSAM transmitted via end-to-end encrypted messaging is much more difficult to detect. I’ve written before about how this bill is a sneak attack on encryption dressed up in the guise of protecting children. But encryption isn’t my topic today.)

Federal law requires that providers report any CSAM they find to NCMEC, the National Center for Missing and Exploited Children, which reviews the reports and passes them onward to law enforcement. As of last year, the volume of reports to NCMEC had reached nearly one million per month. Most of that high volume is due to the automated CSAM-detection systems many providers have implemented.

The key word in the foregoing description is voluntarily. Why? Because of the Fourth Amendment. The Fourth Amendment protects Americans against unreasonable searches and seizures by the government. Normally, for a search to be reasonable, it requires a warrant supported by probable cause. That's why the police go get a warrant if they want to search and seize, say, the contents of your email account. Law enforcement does not get to conduct constant, warrantless searches of everybody's email accounts, cloud storage accounts, and so on. The reason private companies like Facebook can, and do, do exactly that is that they are not the government; they are private actors, so the Fourth Amendment doesn't apply to them.

Unless… the government makes a private company act on behalf of the government. The Fourth Amendment “does not constrain private parties unless they act as agents or instruments of the government.” United States v. Stevenson, 727 F.3d 826, 829 (8th Cir. 2013) (citation omitted). However, “[w]hen a statute or regulation compels a private party to conduct a search, the private party acts as an agent of the government. Even when a search is not required by law, however, if a statute or regulation so strongly encourages a private party to conduct a search that the search is not primarily the result of private initiative, then the Fourth Amendment applies.” Id. (citations and quotation marks omitted).

That is, a private company can become a “state actor” for Fourth Amendment purposes if the search occurred at the government’s behest, not because the private company chose to do it. In that case, the Fourth Amendment’s warrant requirement kicks in; if there was no warrant for the search, that makes the search “unreasonable.” It’s a Fourth Amendment violation.

The usual remedy for an unconstitutional search is exclusion. In the criminal proceeding against the person whose rights were violated, the court will exclude any evidence that was obtained as a result of the unconstitutional search from being used against the person at trial. The idea behind the exclusionary rule is to deter government misconduct. There’d be little to deter Fourth Amendment violations if the government faced no consequences for the violation.

The key piece of evidence in a prosecution for possession of CSAM is, well, the CSAM itself. If that evidence gets excluded as a remedy for a Fourth Amendment violation, there isn’t much left to prove the government’s case, and the court will likely dismiss it. That means that someone who may have committed one of society’s most deplored crimes will go free, because the government violated his constitutional rights. That is a hell of a deterrent to unlawful searches.

This is why that federal law I mentioned about CSAM reporting, 18 U.S.C. § 2258A, doesn’t require providers to search for CSAM, only to report whatever CSAM they just happen to “obtain actual knowledge” of to NCMEC. Congress is well aware that it is walking a very fine line with regard to the Fourth Amendment when it comes to passing laws governing what providers do about CSAM. If providers such as Google or Facebook were required by law to do the mass warrantless scans of all content on their systems that they presently do voluntarily, that would violate the Fourth Amendment. And the CSAM found and reported to NCMEC as a result of those compulsory warrantless scans would likely be excluded from any criminal proceedings against the users who were caught with it. Those defendants would likely go free, undermining the whole point of passing laws to fight online child sexual exploitation.

The “state actor” doctrine has come up frequently in CSAM cases that arose from the defendant’s use of an online service like Gmail or Dropbox to send or store illegal images. Defendants have often attempted to get that evidence excluded in court by accusing either the service provider, or NCMEC, or both of being “state actors” that conducted warrantless searches in violation of the Fourth Amendment.

As to the providers, this argument has historically failed. Courts have routinely held that email providers, such as Yahoo and Google, acted on their own initiative when their automated systems scanned defendants’ email accounts for CSAM. In the Stevenson case I quoted above, the defendant had allegedly sent CSAM from his AOL email account to his Google email account, which was caught by AOL’s automated filtering system and reported to NCMEC. He argued that the reporting requirements of Section 2258A (in combination with the subsequent section of law, Section 2258B) turned AOL into a state actor. The Eighth Circuit rejected this argument: “A reporting requirement, standing alone, does not transform an Internet service provider into a government agent whenever it chooses to scan files sent on its network for child pornography.” 727 F.3d at 830. Neither 2258A nor 2258B, the court said, “authorizes AOL to scan its user’s emails”; rather, the law “is silent regarding whether or how AOL should scan its users’ e-mail,” and “[t]he only subsection that bears on scanning makes clear that [a provider such as AOL] is not required to monitor any user or communication, and need not affirmatively seek facts or circumstances demonstrating a violation that would trigger the reporting obligation.” Id. That is, Congress’s careful drafting of those two statutes has so far successfully sidestepped the potential Fourth Amendment “state actor” pitfall as to providers.

Arguments accusing NCMEC of state-actor status have fared better. The 2016 case United States v. Ackerman involved another defendant who, like the defendant in Stevenson, allegedly sent CSAM using his AOL email account, triggering AOL's filters and thus a report to NCMEC. 831 F.3d 1292, 1294 (10th Cir. 2016). AOL reported the email, with four images attached, to NCMEC, where an analyst opened the email and viewed the attached images, confirming that all four were CSAM. Id. The Tenth Circuit agreed with the defendant that NCMEC was a "governmental entity or agent." Id. at 1294-95. It held that NCMEC's authorizing statutes (which charge NCMEC with performing "over a dozen separate functions") make it a governmental entity. Id. at 1297-98. And even if it weren't, the court went on, it is still an agent of the government, which implicates the Fourth Amendment because the government cannot authorize its agents to conduct searches that would be illegal if done by the government itself. Id. at 1300-03. The court concluded that NCMEC, a government agent, had conducted a search for Fourth Amendment purposes by opening and examining the defendant's email, and thus the four images should be suppressed. Id. at 1308.

Ackerman finally snapped the thin reed on which NCMEC and the federal government had for years been hanging the legal fiction that NCMEC didn't effectively stand in the same shoes as the government for Fourth Amendment purposes. After Ackerman, at least one district court has "assume[d], without deciding, that NCMEC is a state actor," partially "[i]n light of Ackerman." Thanks to Ackerman, the government, in these CSAM prosecutions, has faced a harder (though not insurmountable) task in persuading courts that the NCMEC search at issue in any particular case holds up to constitutional scrutiny. It can no longer rely on courts to dependably find that NCMEC is a private actor, rather than a government cut-out doing law enforcement's business (in all the various ways detailed in Ackerman).

These cases about providers and NCMEC show that Congress is walking on thin ice whenever it tries to regulate what providers ought to do about CSAM. One false step, and the careful legal construct that places NCMEC as a buffer between providers’ CSAM scanning and the government’s Fourth Amendment obligations could all come crashing down.

That brings us back to the EARN IT Act. The bill removes providers’ Section 230 immunity against civil lawsuits and state criminal prosecutions for CSAM unless providers either comply with a set of supposedly voluntary “best practices” aimed at fighting online child sexual exploitation or implement unspecified “reasonable measures” aimed at that same goal. (While certifying compliance with the best practices guarantees that a provider keeps 230 immunity, going the other route does not; the provider would have to litigate in court the question of whether its measures are “reasonable” and entitle it to continued 230 immunity.) The bill also reduces the mens rea for providers’ liability for CSAM on their services, from actual knowledge to recklessness. As CDT explains, “providers would become liable for a new set of federal civil penalties if they just recklessly (rather than knowingly) provide a service that people use to distribute CSAM.”

That complicated set-up boils down to the following: the bill exposes providers to potential civil and state criminal liability for CSAM on their services unless the provider complies with the “best practices” – while simultaneously lowering the bar for being held liable.

The byzantine structure is doubtless intentional. The drafters of the bill know it would definitely violate the Fourth Amendment for the bill to explicitly require providers to scan for CSAM on their services. That would unquestionably convert the providers into “state actors,” and those mass warrantless searches would violate the Fourth Amendment, triggering the exclusionary rule and letting suspected child predators go free. “In short, direct mandates could bring the entire system of cooperation between ... providers and law enforcement crashing down,” as TechFreedom observed in a letter to lawmakers.

Hence the bill’s weird, convoluted structure, as well as the following disclaimer it contains: “Nothing in this Act or the amendments made by this Act shall be construed to require a provider … to search, screen, or scan for instances of online child sexual exploitation.” There’s similar language in Section 2258A, to which courts including Stevenson have pointed to back up their conclusion that providers are not state actors for Fourth Amendment purposes.

Maybe that disclaimer would win the day in a Fourth Amendment "state actor" challenge to EARN IT, too. But maybe not. Remember, as the Stevenson court said, the "state actor" doctrine doesn't apply only when a statute compels a private entity to conduct a search, but also "when a search is not required by law," if the statute "so strongly encourages a private party to conduct a search that the search is not primarily the result of private initiative." 727 F.3d at 829 (citations and quotation marks omitted).

Stevenson said Section 2258A (with the similar disclaimer language) didn't do that. But a court might conclude that the EARN IT Act does. EARN IT says that compliance with the "best practices" to be promulgated under the Act is voluntary, but the scheme doesn't look very voluntary at all; it is clearly designed to railroad providers into complying with whatever the "best practices" are, on pain of significant legal liability. The bill's tortuous, byzantine structure, plus the disclaimer, is an attempt to put a fig leaf over that fact. But that won't necessarily stop a court from seeing through it.

So far, the government has been able to argue successfully that providers' automated CSAM systems are voluntary and providers aren't state actors under existing law. And until Ackerman, it was able to maintain the polite fiction that NCMEC (which is underfunded and overwhelmed, due to neglect by the DOJ) was not effectively an arm of the government. EARN IT wouldn't necessarily stand up to Fourth Amendment scrutiny the way existing law has. A court could find that, notwithstanding the bill's convoluted set-up and that disclaimer, the one-two punch of liability exposure and lowered mens rea effectively requires providers to follow the best practices, converting what the law pretends is voluntary conduct into state action subject to the Fourth Amendment. The charade might come to an end, as it finally did for NCMEC's supposed private status in Ackerman.

I’m far from the first to discuss the bill’s Fourth Amendment pitfalls. The bill had engendered vociferous opposition before it was even formally introduced last week. In response, to try to drum up support, EARN IT co-sponsor Senator Richard Blumenthal (D-CT) circulated a document on the Hill that purported to debunk myths about the bill. TechFreedom posted it online. One of the critiques to which the document responds is the foregoing Fourth Amendment “state actor” issue.

How did Senator Blumenthal respond to the contention that his bill violates the Fourth Amendment? By telling his audience – other members of Congress and their staff – that Americans have no right of privacy in their online information anyway. Yes, really. Specifically, Blumenthal quoted a statement in the Ackerman opinion, written by then-Judge Gorsuch, that the Supreme Court has “suggested that individuals lack any reasonable expectation of privacy and so forfeit any Fourth Amendment protections in materials they choose to share with third parties.”

This is not only frightening, it’s ahistorical, and it misstates Ackerman. We still have Fourth Amendment rights in our “papers and effects,” even though paper letters and filing cabinets have been replaced by email and cloud storage accounts held on third-party servers. Not even the Department of Justice takes Blumenthal’s position anymore – as it told the Supreme Court in 2017, it’s been DOJ policy since 2013 “always to use warrants” to get the contents of emails from providers.

What's more, Blumenthal conveniently ignores that the Supreme Court in recent years has been expanding Fourth Amendment privacy protections for people's digital information, including in its 2018 decision in Carpenter v. United States (in which Gorsuch dissented). That case held that the Fourth Amendment requires a warrant for the historical location data your cell service provider has on you, even though you ostensibly choose to share your location with the third-party cell phone company.

In fact, Blumenthal’s citation to Ackerman even leaves out what Gorsuch said later in the same paragraph Blumenthal quoted that mentioned the third-party doctrine: “But the district court didn’t rely upon third-party doctrine in ruling against Mr. Ackerman. Exactly to the contrary, throughout its decision the court assumed that Mr. Ackerman had a reasonable expectation of privacy in his email.” 831 F.3d at 1305. Funny how Blumenthal forgot to quote that part.

Senator Blumenthal’s contention that Americans have no online right to privacy from their government is utterly scandalous. And it also comes off as insincere: as TechFreedom’s leader Berin Szoka innocently asked on Twitter, if there’s no Fourth Amendment issue with his bill, why didn’t Senator Blumenthal “just write ‘best practices’ into legislation as direct mandates?” Now that you’ve read this post, you know the answer.

In sum: The EARN IT Act bill's authors apparently care about the Fourth Amendment only insofar as it might get in the way of their terrible bill. Meanwhile, their pretense that the bill doesn't pose a significant Fourth Amendment problem is a transparent charade. The EARN IT Act, if passed as-is, is at serious risk of being deemed unconstitutional under the Fourth Amendment, because it effectively mandates that online service providers conduct mass warrantless searches of everyone's digital documents and communications.

If passed, the bill threatens to knock the government’s ability to fight CSAM off of the narrow legal tightrope that it is currently walking. That delicate balance depends entirely on providers’ continued ability to freely choose to voluntarily, proactively scan content on their services for CSAM. That volition is the only thing keeping CSAM evidence from being excluded in court cases around the country. If the EARN IT Act upsets that balance, it will critically undermine prosecutors’ ability to get convictions in CSAM cases, meaning more suspected child sex abuse offenders will go free.

What is more, as cryptography professor Matt Green points out, “the real outcome of this bill” will be to replace providers’ intrinsic motivation to fight CSAM with the extrinsic motivation of a legal mandate. Right now, providers fight CSAM because they want to; no legitimate provider wants that horrible, radioactive stuff on its service. But once they are no longer free to choose how to look for CSAM, but instead, Congress “decide[s] to tell Silicon Valley how to do their job — at the point of a liability gun — you can bet the industry will revert to doing the minimum possible.” That is, they’d have the incentive to do the bare minimum that would check off the “best practices” boxes, and no more. The EARN IT Act would disincentivize investment in additional efforts to fight CSAM. Right now, if providers try something new, and it doesn’t work, they are free to abandon that method. But if, as Green says, “they could be mandated to deploy any new technology they invent, regardless of the cost” or effectiveness, why try?

The EARN IT Act would not only risk letting suspected child sex offenders go free, it would discourage providers from trying to get better at detecting them. That’s the exact opposite of what this bill is supposed to do: get providers to step up their efforts to combat child sexual exploitation online. The bill would backfire for its stated purposes, while also violating everyone’s constitutional rights, and probably banning strong encryption in the process. Heckuva job, Senate.

* * *

I’ve now written a series of three blog posts about how bad this bill is, and I still haven’t covered all the reasons why. The EARN IT Act bill is like a law school exam, where you have to spot all the potential issues in a given fact pattern. But because EARN IT is so incredibly, complexly bad, it’s less like a final exam from a single course, where you’re only responsible for the material from that particular class (say, Constitutional Law), and more like the bar exam, where one question may require you to spot issues across a half-dozen different areas of law. I can see, however dimly, that there are many more problems with this bill above and beyond those I’ve flagged so far. In short, to borrow a phrase from Cory Doctorow, the EARN IT Act is fractally terrible. Want to tell your congressperson you oppose it? Digital rights orgs Fight for the Future and EFF both have ways for you to take action.
