Recently, I wrote for Lawfare about Sen. Dick Durbin (D-IL)’s new STOP CSAM Act bill, S.1199. The bill text is available here. There are a lot of moving parts in this bill, which is 133 pages long. (Techdirt valiantly tries to cover them here.) I am far from done with reading and analyzing the bill language, but already I can spot a couple of places where the bill would threaten encryption, so those are what I’ll discuss today.
According to Durbin, online service providers covered by the bill would have “to produce annual reports detailing their efforts to keep children safe from online sex predators, and any company that promotes or facilitates online child exploitation could face new criminal and civil penalties.” Child safety online is a worthy goal, as is improving public understanding of how influential tech companies operate. But portions of the STOP CSAM bill pose risks to online service providers’ ability to use end-to-end encryption (E2EE) in their service offerings.
E2EE is a widely used technology that protects everyone’s privacy and security by encoding the contents of digital communications and files so that they’re decipherable only by the sender and intended recipients. Not even the provider of the E2EE service can read or hear its users’ conversations. E2EE is built in by default to popular apps such as WhatsApp, iMessage, FaceTime, and Signal, thereby securing billions of people’s messages and calls for free. Default E2EE is also set to expand to Meta’s Messenger app and Instagram direct messages later this year.
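To make the concept concrete, here is a minimal sketch (an illustration only, not how any particular app actually implements it) using the PyNaCl library’s public-key Box construction: the sender encrypts with the recipient’s public key, the service in the middle only ever handles ciphertext, and only the recipient’s private key can decrypt.

```python
# Minimal sketch of the core E2EE property, using PyNaCl's public-key Box.
# Illustrative only; real messaging apps layer much more on top of this
# (key verification, forward secrecy, group messaging).
from nacl.public import PrivateKey, Box

# Each user generates a key pair; the private key never leaves their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"see you at 7")

# The provider's servers relay `ciphertext` but cannot read it:
# without Bob's (or Alice's) private key, it is just opaque bytes.

# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"see you at 7"
```

The key point for the policy discussion is the last comment: the provider relaying the message never holds the keys, which is precisely what makes E2EE protective, and precisely what this bill treats as a problem.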
E2EE’s growing ubiquity seems like a clear win for personal privacy, security, and safety, as well as national security and the economy. And yet E2EE’s popularity has its critics – including, unfortunately, Sen. Durbin. Because it’s harder for providers and law enforcement to detect malicious activity in encrypted environments than unencrypted ones (albeit not impossible, as I’ll discuss), law enforcement officials and lawmakers often demonize E2EE. But E2EE is a vital protection against crime and abuse, because it helps to protect people (children included) from the harms that happen when their personal information and private conversations fall into the wrong hands: data breaches, hacking, cybercrime, snooping by hostile foreign governments, stalkers and domestic abusers, and so on.
That’s why it’s so important that national policy promote rather than discourage the use of E2EE – and why it’s so disappointing that STOP CSAM has turned out to be just the opposite: yet another misguided effort by lawmakers in the name of online safety that would only make us all less safe.
First, STOP CSAM’s new criminal and civil liability provisions could be used to hold E2EE services liable for CSAM and other child sex offenses that happen in encrypted environments. Second, the reporting requirements look like a sneaky attempt to tee up future legislation to ban E2EE outright.
STOP CSAM’s New Civil and Criminal Liability for Online Service Providers
Among the many, many things it does in 133 pages, STOP CSAM creates a new federal crime, “liability for certain child exploitation offenses.” It also creates new civil liability by making a carve-out from Section 230 immunity to allow child exploitation victims to bring lawsuits against the providers of online services, as well as the app stores that make those services available. Both of these new forms of liability, criminal and civil, could be used to punish encrypted services in court.
The new federal crime is for a provider of an interactive computer service (an ICS provider, as defined in Section 230) “to knowingly (1) host or store child pornography or make child pornography available to any person; or (2) otherwise knowingly promote or facilitate a violation of” certain federal criminal statutes that prohibit CSAM and child sexual exploitation (18 U.S.C. §§ 2251, 2251A, 2252, 2252A, or 2422(b)).
This is rather duplicative: It’s already illegal under those laws to knowingly possess CSAM or knowingly transmit it over the Internet. That goes for online service providers, too. So if there’s an online service that “knowingly hosts or stores” or transmits or “makes available” CSAM (whether on its own or by knowingly letting its users do so), that’s already a federal crime under existing law, and the service can be fined.
So why propose a new law that says “this means you, online services”? It’s the huge size of the fines that could be imposed on providers: up to $1 million, or $5 million if the provider’s conduct either causes someone to be harmed or “involves a conscious or reckless risk of serious personal injury.” Punishing online service providers specifically with enormous fines, for their users’ child sex offenses, is the point of re-criminalizing something that’s already a crime.
The new civil liability for providers comes from removing Section 230’s applicability to civil lawsuits by the victims of CSAM and other child sexual exploitation crimes. There’s a federal statute, 18 U.S.C. § 2255, that lets those victims sue the perpetrator(s). Section 230 currently bars those lawsuits from being brought against providers. That is, Congress has heretofore decided that if online services commit the aforementioned child sex offenses, the sole enforcer should be the Department of Justice, not civil plaintiffs. STOP CSAM would change that. (More about that issue here.)
Providers would now be fair game for 2255 lawsuits by child exploitation victims. Victims could sue for “child exploitation violations” under an enumerated list of federal statutes. They could also sue for “conduct relating to child exploitation.” That phrase is defined with respect to two groups: ICS providers (as defined by Section 230), and “software distribution services” (think: app stores, although the definition is way broader than that).
Both ICS providers and software distribution services could be sued for one type of “conduct relating to child exploitation”: “the intentional, knowing, reckless, or negligent promotion or facilitation of conduct that violates” an enumerated list of federal child exploitation statutes. And, ICS providers alone (but not software distribution services) could be sued for a different type of conduct: “the intentional, knowing, reckless, or negligent hosting or storing of child pornography or making child pornography available to any person.”
So, to sum up: STOP CSAM
(1) creates a new crime when ICS providers knowingly promote or facilitate CSAM and child exploitation crimes, and
(2) exposes ICS providers to civil lawsuits by child exploitation victims if they intentionally, knowingly, recklessly, or negligently (a) host/store/make CSAM available, or (b) promote or facilitate child exploitation conduct (for which app stores can be liable too).
Does E2EE “Promote or Facilitate” Child Exploitation Offenses?
Here, then, is the literally million-dollar question: Do E2EE service providers “promote or facilitate” CSAM and other child exploitation crimes, by making their users’ communications unreadable by the provider and law enforcement?
It’s not clear what “promote or facilitate” even means! That same phrase is also found in a 2018 law, SESTA/FOSTA, that carved out sex trafficking offenses from providers’ general immunity against civil lawsuits and state criminal charges under Section 230. And that same phrase is currently being challenged in court as unconstitutionally vague and overbroad. Earlier this year, a panel of federal appeals judges appeared skeptical of its constitutionality at oral argument, but they haven’t issued their written opinion yet. Why Senator Durbin thinks it’s a great idea to copy language that’s on the verge of being held unconstitutional, I have no clue.
If a court were to hold that E2EE services “promote or facilitate” child sex offenses (whatever that means), then the E2EE service provider’s liability would turn on whether the case was criminal or civil. If it’s criminal, then federal prosecutors would have to prove the service knowingly promoted or facilitated the crime by being E2EE. “Knowing” is a pretty high bar to prove, which is appropriate for a crime.
In a civil lawsuit, however, there are four different mental states the plaintiff could choose from. Two of them – recklessness or negligence – are easier to prove than the other two (knowledge or intent). They impose a lower bar to establishing the defendant’s liability in a civil case than the DOJ would have to meet in a federal criminal prosecution. (See here for a discussion of these varying mental-state standards, with helpful charts.)
Is WhatsApp negligently facilitating child exploitation because it’s E2EE by default? Is Zoom negligently facilitating child exploitation because users can choose to make a Zoom meeting E2EE? Are Apple and Google negligently facilitating child exploitation by including WhatsApp, Zoom, and other encrypted apps in their app stores? If STOP CSAM passes, we could expect plaintiffs to immediately sue all of those companies and argue exactly that in court.
That’s why STOP CSAM creates a huge disincentive against offering E2EE. It would open up E2EE services to a tidal wave of litigation by child exploitation victims for giving all their users a technology that is indispensable to modern cybersecurity and data privacy. The clear incentive would be for E2EE services to remove or weaken their end-to-end encryption, making it easier to detect child exploitation conduct by their users, in the hope that they could then avoid being deemed “negligent” on child safety for having used, ironically, a bog-standard cybersecurity technology to protect their users.
It is no accident that STOP CSAM would open the door to punishing E2EE service providers. Durbin’s February press release announcing his STOP CSAM bill paints E2EE as antithetical to child safety. The very first paragraph predicts that providers’ continued adoption of E2EE will cause a steep reduction in the volume of (already mandated) reports of CSAM they find on their services. It goes on to suggest that deploying E2EE treats children as “collateral damage,” framing personal privacy and child safety as flatly incompatible.
The kicker is that STOP CSAM never even mentions the word “encryption.” Even the EARN IT Act – a terrible bill that I’ve decried at great length, which was reintroduced in the Senate on the same day as STOP CSAM – has a weak-sauce provision that at least kinda tries halfheartedly to protect encryption from being the basis for provider liability. STOP CSAM doesn’t even have that!
Teeing Up a Future E2EE Ban
Even leaving aside the “promote or facilitate” provisions that would open the door to an onslaught of litigation against the providers of E2EE services, there’s another way in which STOP CSAM is sneakily anti-encryption: by trying to get encrypted services to rat themselves out to the government.
The STOP CSAM bill contains mandatory transparency reporting provisions, which, as my Lawfare piece noted, have become commonplace in the recent bumper crop of online safety bills. The transparency reporting requirements apply to a subset of the online service providers that are required to report CSAM they find under an existing federal law, 18 U.S.C. § 2258A. (That law’s definition of covered providers has a lot of overlap, in practice, with Section 230’s “ICS provider” definition. Both of these definitions plainly cover apps for messaging, voice, and video calls, whether they’re E2EE or not.) In addition to reporting the CSAM they find, those covered providers would also separately have to file annual reports about their efforts to protect children.
Not every provider that has to report CSAM would have to file these annual reports, just the larger ones: specifically, those with at least one million unique monthly visitors/users and over $50 million in annual revenue. That’s a distinction from the “promote or facilitate” liability discussed above, which doesn’t just apply to the big guys.
Covered providers must file an annual report with the Attorney General and the Federal Trade Commission that provides information about (among other things) the provider’s “culture of safety.” This means the provider must describe and assess the “measures and technologies” it employs for protecting child users and keeping its service from being used to sexually abuse or exploit children.
In addition, the “culture of safety” report must also list “[f]actors that interfere with the provider’s ability to detect or evaluate instances of child sexual exploitation and abuse,” and assess those factors’ impact.
That provision set off alarm bells in my head. I believe this reporting requirement is intended to force providers to cough up internal data and create impact assessments, so that the federal government can then turn around and use that information as ammunition to justify a subsequent legislative proposal to ban E2EE.
This hunch arises from Sen. Durbin’s own framing of the bill. As I noted above, his February press release about STOP CSAM spends its first two paragraphs claiming that E2EE would “turn off the lights” on detecting child sex abuse online. Given this framing, it’s pretty straightforward to conclude that the bill’s “interfering factors” report requirement has E2EE in mind.
So: In addition to opening the door to civil and/or criminal liability for E2EE services without ever mentioning the word “encryption” (as explained above), STOP CSAM is trying to lay the groundwork for justifying a later bill to more overtly ban providers from offering E2EE at all.
But It’s Not That Simple, Durbin
There’s no guarantee this plan will succeed, though. If this bill passes, I’m skeptical that its ploy to fish for evidence against E2EE will play out as intended, because it rests on a faulty assumption: the oft-repeated premise that online service providers can’t fight abuse unless they can access the contents of users’ files and communications at will, a capability E2EE impedes. My own research shows that premise to be false.
Last year, I published a peer-reviewed article analyzing the results of a survey I conducted of online service providers, including some encrypted messaging services. Many of the participating providers would likely be covered by the STOP CSAM bill. The survey asked participants to describe their trust and safety practices and rank how useful they were against twelve different categories of online abuse. Two categories pertained to child safety: CSAM and child sexual exploitation (CSE) such as grooming and enticement.
My findings show that CSAM is distinct from other kinds of online abuse. What currently works best to detect CSAM isn’t what works best against other abuse types, and vice versa. For CSAM, survey participants considered scanning for abusive content to be more useful than other techniques (user reporting and metadata analysis) that — unlike scanning — don’t rely on at-will provider access to user content. However, that wasn’t true of any other category of abuse — not even other child safety offenses.
For detecting CSE, user reporting and content scanning were considered equally useful. In most of the remaining 10 abuse categories, user reporting was deemed more useful than any other technique. Many of those categories (e.g., self-harm and harassment) affect children as well as adults online. In short, user reports are a critically important tool in providers’ trust and safety toolbox.
Here’s the thing: User reporting — the best weapon against most kinds of abuse, according to providers themselves — can be, and is, done in E2EE environments. That means rolling out E2EE doesn’t nuke a provider’s abuse-fighting capabilities. My research debunks that myth.
My findings show that E2EE does not affect a provider’s trust and safety efforts uniformly; rather, E2EE’s impact will likely vary depending on the type of abuse in question. Even online child safety is not a monolithic problem (as was cogently explained in another recent report by Laura Draper of American University). There’s simply no one-size-fits-all answer to solving online abuse.
From these findings, I conclude that policymakers should not pass laws regulating encryption and the Internet based on the example of CSAM alone, because CSAM poses such a unique challenge.
And yet that’s just what I suspect Sen. Durbin has in mind: to collect data about one type of abusive content as grounds to justify a subsequent law banning providers from offering E2EE to their users. Never mind that such a ban would affect all content and all users, whether abusive or not.
That’s an outcome we can’t afford. Legally barring providers from offering strong cybersecurity and privacy protections to their users wouldn’t keep children safe; it would just make everybody less safe, children included. As a recent report from the Child Rights International Network and DefendDigitalMe describes, while E2EE can be misused, it is nevertheless a vital tool for protecting the full range of children’s rights, from privacy to free expression to protection from violence (including state violence and abusive parents). That’s in addition to the role strong encryption plays in protecting the personal data, financial information, sensitive secrets, and even bodily safety of domestic violence victims, military servicemembers, journalists, government officials, and everyone in between.
Legislators’ tunnel-vision view of E2EE as nothing but a threat requires casting all those considerations aside — treating them as “collateral damage,” to borrow Sen. Durbin’s phrase. But the reality is that billions of people use E2EE services every day, of whom only a tiny sliver use them for harm — and my research shows that providers have other ways to deal with those bad actors. As I conclude in my article, anti-E2EE legislation just makes no sense.
Given the crucial importance of strong encryption to modern life, Sen. Durbin shouldn’t expect the providers of popular encrypted services to make it easy for him to ban it. Those major players covered by the STOP CSAM bill? They have PR departments, lawyers, and lobbyists. Those people weren’t born yesterday. If I can spot a trap, so can they. The “culture of safety” reporting requirements are meant to give providers enough rope to hang themselves. That’s like a job interviewer asking a candidate what their greatest weakness is and expecting a raw and damning response. The STOP CSAM bill may have been crafted as a ticking time bomb for blowing up encryption, but E2EE service providers won’t be rushing to light the fuse.
From my research, I know that providers’ internal child-safety efforts are too complex to be reduced to a laundry list of positives and negatives. If forced to submit the STOP CSAM bill’s mandated reports, providers will seize upon the opportunity to highlight how their E2EE services help protect children and describe how their panoply of abuse-detection measures (such as user reporting) help to mitigate any adverse impact of E2EE. While its opponents try to caricature E2EE as a bogeyman, the providers that actually offer E2EE will be able to paint a fuller picture.
Will It Even Matter What Providers’ “Culture of Safety” Reports Say?
Unfortunately, given how the encryption debate has played out in recent years, we can expect Congress and the Attorney General (a role recently held by vehemently anti-encryption individuals) to accuse providers of cherry-picking the truth in their reports. And they’ll do so even while they themselves cherry-pick statistics and anecdotes that favor their pre-existing agenda.
I’m basing that prediction on my own experience of watching my research, which shows that online trust and safety is compatible with E2EE, get repeatedly cherry-picked by those trying to outlaw E2EE. They invariably highlight my anomalous findings regarding CSAM while leaving out all the other findings and conclusions that are inconvenient to their false narrative that E2EE wholly precludes trust and safety enforcement. As an academic, I know I can’t control how my work product gets used. But that doesn’t mean I don’t keep notes on who’s misusing it and why.
Providers can offer E2EE and still effectively combat the misuse of their services. Users do not have to accept intrusive surveillance as the price of avoiding untrammeled abuse, contrary to what anti-encryption officials like Sen. Durbin would have us believe.
If the STOP CSAM bill passes and its transparency reporting provisions go into effect, providers will use them to highlight the complexity of their ongoing efforts against online child sex abuse, a problem that is as old as the Internet. The question is whether that will matter to congressmembers who have already made up their minds about the supposed evils of encryption and the tech companies that offer it — or whether those annual reports were always intended as an exercise in futility.
What’s Next for the STOP CSAM Bill?
It took two months after that February press release for Durbin to actually introduce the bill in mid-April, and it took even longer for the bill text to appear on the congressional bill tracker. Durbin chairs the Senate Judiciary Committee, where the bill was supposed to be considered in committee meetings during each of the last two weeks, but it got punted both times. Now, the best guess is that it will be discussed and marked up this coming Thursday, May 4. However, it’s quite possible it will get delayed yet again. On the one hand, Durbin as the committee chair has a lot of power to move his own bill along; on the other hand, he hasn’t garnered a single co-sponsor yet, and he might take more time to get other Senators on board before bringing it to markup.
I’m heartened that Durbin hasn’t gotten any co-sponsors and has had to slow-roll the bill. STOP CSAM is dense and complicated, and in its current form it poses a huge threat to the security and privacy of the Internet by discouraging the use of E2EE. There may be some good things in the bill, as Techdirt wrote, but at 133 pages, it’s hard to figure out everything the bill would actually do and whether the outcomes would be good or bad. I’m sure I’ll be writing more about STOP CSAM as I continue to read and digest it. Meanwhile, if you have any luck making sense of the bill yourself, and your Senator is on the Judiciary Committee, contact their office and let them know what you think.