Most observers cheered when the neo-Nazi Daily Stormer was booted from GoDaddy, Cloudflare, and other platforms around the Internet. At the same time, the site’s disappearance stirred anxiety about Internet companies’ power over online speech. It starkly illustrated how online speech can live or die at the discretion of private companies. The modern public square is in private hands.
Episodes like this have fueled calls to “regulate” Internet platforms. A recent op-ed suggested nationalizing companies like Google and Facebook. Others have said that a law currently under consideration in Washington, the Stop Enabling Sex Traffickers Act (SESTA), would be a step in the right direction. But SESTA wouldn’t reduce the power of Internet companies. It would greatly expand their role as hidden arbiters of online speech.
SESTA has admirable goals – it aims to help victims of sex trafficking. But the means its drafters chose to do so are dangerous. SESTA seriously undermines Section 230 of the 1996 Communications Decency Act (CDA 230), a core law protecting both Internet users and platforms. CDA 230 protects platforms by shielding them from liability for most user content. That protection is essential for contemporary Internet services. YouTube, for example, could never do meaningful legal review for the 400 hours of video it receives every minute. And even if YouTube could somehow do it, such efforts would be prohibitively expensive for smaller competitors.
Laws like CDA 230 effectively protect the rights of Internet users, too. Platforms routinely receive false claims that online speech is illegal. Competitors make them in order to hurt one another’s businesses, organizations like the Church of Scientology use them to silence criticism, and even governments try to dupe platforms into deleting online speech. Ecuador, for example, used false copyright claims to suppress footage of police abuse. Research shows that platforms all too often follow the path of least resistance in response to such claims. Rather than spend money on lawyers and investigations, or risk legal exposure, they simply remove users’ lawful speech. Mounting evidence suggests that over-removal disproportionately affects African Americans and other minority user groups.
As numerous courts have recognized, this situation threatens the constitutional and human rights of Internet users. Human rights officials have said that platforms should take down user speech only after a court has decided it is illegal. Letting platforms decide the fate of online content can make sense when they can clearly identify illegality on sight, as they do in the case of child sexual abuse images. But where legal judgment is called for, it should be provided by courts and other public, democratically accountable forums.
With SESTA, Congress would move away from such accountability. Although the bill aims to punish a particular set of bad actors – most importantly the classified advertising site Backpage.com, which is said to have colluded with traffickers – it actually does far more. It pushes decisions, under ambiguous legal standards, into the hands of platforms. Tech companies from Facebook to small infrastructure providers could be required to decide what counts as “facilitating” or “assisting” trafficking, and risk jail time or civil liability if they get the answer wrong. That is a recipe for abuse by those seeking to suppress online speech.
Imagine, for example, the creators of a new messaging app. If one user claims that another is running a trafficking operation using the app, should they believe the allegation, delete the user’s messages, and terminate the account? I’ve been a tech lawyer for almost two decades, and have spent hours trying to parse SESTA, but I don’t know the answer. Real-world entrepreneurs mostly won’t check with lawyers anyway. They’ll do the easy and safe thing: bar users or content from the platform.
Internet users deserve better legal protection than this. And Congress already drafted a law that provided it. The 2015 SAVE Act subjected Backpage.com and other purveyors of trafficking ads to federal prosecution – and it did so without SESTA’s collateral damage to Internet users. It is puzzling, then, why SESTA is needed at all.
No one questions the importance of helping trafficking victims. But doing it this way is a mistake. Congress has many other options – including pressing federal prosecutors to bring cases under the existing SAVE Act. If that fails, there are other ways to target traffickers without making tech companies the new speech judges. SESTA’s clumsy approach should be taken off the table.
Daphne Keller is Director of Intermediary Liability at the Stanford Center for Internet and Society, whose funders include Microsoft and Google. She was previously Associate General Counsel for Google.