Three Constitutional Thickets: Why Regulating Online Violent Extremism is Hard

Publication Type: White Paper / Report
Publication Date: September 12, 2019

The Program on Extremism Policy Paper series combines analysis on extremism-related issues by our researchers and guest contributors with tailored recommendations for policymakers.


Introduction

In May 2019, two months after an attacker horrified the world by livestreaming his massacre of worshippers in two New Zealand mosques, leaders of Internet platforms and governments around the world convened in Paris to formulate their response. In the resulting agreement, known as the Christchurch Call, they committed “to eliminate terrorist and violent extremist content online” while simultaneously protecting freedom of expression. The exact parameters of the commitment, and the means of balancing its two goals, were left vague – unsurprising in a document embraced by signatories from legal cultures as divergent as Canada, Indonesia, and Senegal. The U.S. did not sign, though it endorsed similar language through the G7 as recently as 2018 and will be asked to do so again in 2019.

What might a law designed to meet these goals look like? International models abound – most of them establishing rules that, in the U.S., would not pass muster under the First Amendment. Australia’s post-Christchurch law, enacted with just 24 hours of public review, imposes criminal penalties on executives of social media companies that do not swiftly remove “abhorrent violent material.” A U.K. plan lists “extremist content and activity” as one of many areas to be regulated under to-be-determined rules by a to-be-determined government agency. Germany’s NetzDG gives platforms 24 hours to remove user posts that “manifestly” constitute “public incitement to crime” or encourage “the commission of a serious violent offence endangering the state[.]” The EU’s draft Terrorist Content Regulation allows police to order platforms to take content down in just one hour; officials have said that non-violent videos and religious poetry are among the content that must be removed. Other drafts of the EU’s planned law go much further, requiring hosting platforms of all sizes to adopt content filters – despite widespread concern about problems with the filters platforms already use.

In this paper, I review U.S. constitutional considerations for lawmakers seeking to balance terrorist threats against free expression online. The point is not to advocate for any particular rule. In particular, I do not seek to answer moral or norms-based questions about what content Internet platforms should take down. I do, however, note the serious tensions between calls for platforms to remove horrific but First Amendment-protected extremist content – a category that probably includes the Christchurch shooter’s video – and calls for them to function as “public squares” by leaving up any speech the First Amendment permits. To lay out the issue, I draw on analysis developed at greater length in previous publications. That analysis concerns large user-facing platforms like Facebook and Google, and the word “platform” as used here refers to those large companies, not their smaller counterparts.

The paper’s first section covers territory relatively familiar to U.S. lawyers: the speech Congress can limit under anti-terrorism laws. This law is well summarized elsewhere, so my discussion is quite brief. The second section explores a less widely understood issue: Congress’s power to hold Internet platforms liable for their users’ speech. The third section ventures farthest afield, reviewing the constitutional implications when platforms themselves set the speech rules, prohibiting legal speech under their Terms of Service (TOS). I conclude that the paths forward for U.S. lawmakers who want both to restrict violent extremist content and to protect free expression are rocky, and that non-U.S. laws are likely to be the primary drivers of platform behavior in this area in the coming years.