The EU's Terrorist Content Regulation: Expanding the Rule of Platform Terms of Service and Exporting Expression Restrictions from the EU's Most Conservative Member States

The EU’s proposed Terrorist Content Regulation gives national authorities sweeping new powers over comments, videos, and other content that people share using Internet platforms. Among other things, authorities – who may be police, not courts – can require platforms of all sizes to take content down within one hour. The Regulation also requires even small platforms to build upload filters and attempt to proactively weed out prohibited material. Critics have raised serious concerns about the Regulation’s likely ineffectiveness in combating violent extremism, its collateral damage to human rights, its disparate impact on racial minorities, and the anti-competitive effect of requiring small businesses to adopt expensive and poorly understood filtering technologies.

While the Regulation expands EU Member States’ power to ban online expression and information, it simultaneously reduces their power to protect them. It significantly erodes national lawmakers' authority to uphold Internet users' fundamental rights to receive and impart information. It does so by greatly increasing platforms’ incentives to prohibit content using private Terms of Service (TOS) rather than national law, and to take down more material than the law actually requires. At the same time, it effectively increases the power of authorities in any one EU Member State to suppress information that is legal elsewhere in the EU. Authorities in Hungary and Sweden may disagree, for example, about whether a news organization sharing an interview with a current or former member of a terrorist organization is “promoting” or “glorifying” terrorism. Or they may differ on the legitimacy of a civil society organization's advocacy on complex issues in Chechnya, Israel, or Kurdistan. The Regulation gives platforms reason to use their TOS to accommodate whichever authority wants such content taken down – and to apply that decision to users everywhere.

This shift from most Member States' Rule of Law to platforms' Rule of TOS, and the resulting empowerment of the most speech-restrictive government actors, are part of an ongoing trend. But the trend is neither inevitable nor unstoppable. Lawmakers considering the Terrorist Content Regulation have other options. They should take the time to consider them, and to craft wise policy to protect both fundamental rights and public safety.

1. Rule of Law vs. Rule of TOS Today

Big platforms like Facebook or YouTube use their Terms of Service to prohibit an increasingly broad array of content. They take this prohibited-by-TOS material down globally, even if it is lawful in some countries. If content is permitted by the TOS but is prohibited by a country’s law, platforms typically block it only for users in that country.
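
For readers who think in code, this amounts to a simple decision rule. The sketch below is a toy illustration only – the function and its parameters are invented for this post, not any platform's actual system:

```python
# Toy model of the takedown logic described above: TOS violations are
# removed globally, while content that is lawful under the TOS but
# prohibited by a particular country's law is geo-blocked only there.
# All names here are hypothetical, invented for illustration.

def is_visible(viewer_country, violates_tos, countries_prohibiting_by_law):
    """Decide whether one piece of content is shown to a given viewer."""
    if violates_tos:
        return False  # prohibited by TOS: removed for all users, worldwide
    if viewer_country in countries_prohibiting_by_law:
        return False  # lawful under TOS, illegal here: blocked locally only
    return True       # otherwise visible

# Example: content lawful under the TOS but prohibited by Hungarian law
# disappears for viewers in Hungary, yet stays up for viewers in Sweden.
print(is_visible("HU", violates_tos=False, countries_prohibiting_by_law={"HU"}))  # False
print(is_visible("SE", violates_tos=False, countries_prohibiting_by_law={"HU"}))  # True
```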

Expressed visually, this relationship between public law and private Terms of Service for user expression in an EU Member State looks something like this.

[Diagram: overlapping circles showing content prohibited by a Member State's law and content prohibited by the platform's TOS]

Of course, that diagram will look different – the size and overlap of the circles will change – depending on which country, and which platform, you’re talking about. Where two EU countries disagree about the scope of expression rights, the diagram may look more like this.

[Diagram: the same circles drawn for two Member States whose laws prohibit different content]

These kinds of differences in national law are to be expected, given Member States’ divergent free expression traditions and their legal competence to define their own rules at the national (not EU) level. The differences may be particularly significant for the kinds of information covered by the Terrorist Content Regulation, since Member States also have their own approaches to, and responsibility for, public order and national security.

Despite this, the Regulation gives platforms of all sizes powerful new reasons to expand the range of content prohibited under their TOSes – eclipsing most Member States' protections for free expression and information, and prohibiting anything that might violate even one authority's interpretation of the law in one country. That would make our diagram look more like this.

[Diagram: a single expanded TOS circle covering every Member State's prohibitions]
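
In the same toy model (still purely hypothetical code, invented for this post), the incentive works like a union operation: one authority's prohibition becomes everyone's, because TOS removals apply globally:

```python
# Toy model of the incentive the Regulation creates: to avoid Orders and
# their filtering obligations, a platform folds every national authority's
# prohibitions into its global TOS. All names are hypothetical.

def de_facto_global_tos(prohibited_by_country):
    """Union of each country's prohibitions, applied to users everywhere.

    prohibited_by_country: dict mapping country code -> set of banned items.
    """
    banned_everywhere = set()
    for banned in prohibited_by_country.values():
        banned_everywhere |= banned  # one authority's ban becomes global
    return banned_everywhere

# Example: material lawful in Sweden but flagged by a Hungarian authority
# ends up removed for Swedish users too.
bans = {"HU": {"terrorist-interview"}, "SE": set()}
print("terrorist-interview" in de_facto_global_tos(bans))  # True: gone everywhere
```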

2. How the Regulation Increases the Rule of TOS and Exports Restrictions on Expression

The Regulation provides two tools for national authorities. The first, and easiest for all concerned, is a Referral, which requires the platform to expeditiously review content under its Terms of Service. The second is a binding Order, which requires the platform to take content down based on authorities’ determination that it violates the law. Referrals are the path of least resistance for law enforcement: they involve little or no legal analysis, and the paperwork is simpler. They’re easier for platforms as well – and complying helps maintain good relationships with authorities.

Beyond these basic and to some extent pre-existing incentives, the Regulation adds major new ones. A platform that rejects a Referral and receives an Order is effectively choosing to accept major and unpredictable new obligations. For smaller platforms that have not already invested in content filters, receiving an Order (or, in some drafts, several Orders) triggers the obligation to build them. That’s costly for any company, and may be financially insupportable for small ones. Companies that receive Orders also assume a new and poorly defined relationship with authorities. They must submit annual reports describing their filtering efforts, and make engineering or product design changes if authorities aren’t satisfied. Since no one knows for sure who these new de facto regulators will be or how well they will understand available technologies, it’s hard to predict what they’ll ask for.

The best way for platforms to avoid these costs and uncertainties is to accept all Referrals – even if that requires changing how they interpret their TOS, and taking down previously permitted expression. That means accommodating even the most aggressive Referrals from national authorities, letting their requests shape online information access throughout the EU and around the world.

Of course, some platforms, sometimes, will risk getting Orders under the Regulation. They may even challenge them in court – which the Regulation permits them to do after taking the content down. A few principled or poorly lawyered small platforms may gamble on that approach. For the most part, though, we should expect it only from platforms that can afford the legal risk and litigation cost; that are repeat players with an interest in clarifying the rules; and that have already invested in things like filters. In other words, giants like Facebook and YouTube will decide which questions of national speech law get referred to courts.

The Regulation’s drafters are aware of these problems. Some, including the IMCO and CULT Committees in Parliament, have proposed sensible amendments to address them. Unfortunately, the amendments that appear in the most recent official draft (from the Council) are largely cosmetic. They urge platforms and authorities to respect expression and information rights – but still incentivize them to do the opposite. They also establish a skeletal process for law enforcement “de-confliction” across Member State borders – which sounds good, but is unlikely to cure decades-old coordination problems.

3. Big Picture EU Legal Developments and Law vs. TOS

The Terrorist Content Regulation fits into a larger pattern, and a fundamental tension, in EU platform law today. Some lawmakers seem to want more Rule of TOS; some seem to want more Rule of Law; and some seem to want both at once. On the pro-Rule-of-TOS side are things like existing law enforcement Internet Referral Units and the Hate Speech Code of Conduct, under which platforms commit to using their Terms of Service to take down content.

On the pro-Rule-of-Law side, other European legal developments try to put governance back in the hands of democratically accountable institutions. The Audiovisual Media Services Directive, which lets users appeal some video platform takedown decisions to regulators or courts, is supposed to work this way. (Though that may backfire, causing platforms to use their Terms of Service even more in order to avoid public review.) Recent German lower court decisions requiring Facebook to reinstate content it had removed under its TOS also push in this direction.

It’s hard to imagine how these two trends can be reconciled. To many observers, the slide toward the Rule of TOS feels inevitable.

4. Alternatives to the Rule of TOS

The temptation for lawmakers to rely on private Terms of Service instead of public law is understandable. Drafting wise rules for speech on the Internet is hard. For EU lawmakers, faced with 28 (or 27) different sets of national speech rules, it’s even harder. But giving up and embracing the primacy of private companies' rules for expression and information is a short-sighted solution.

Resignation to the Rule of TOS may be driven in part by conflating issues that, if considered separately, might have better solutions. For example:

  • One reason to allocate adjudication to platforms may be that they have better resources than governments to handle the sheer volume of Internet content removal decisions. That seems like the beginning of a conversation about costs and capabilities, not the end. Some would argue that well-resourced companies should help fund public, accountable Rule of Law processes, instead of replacing them.
  • Another possibility is that lawmakers may invoke private TOSes in order to avoid addressing the thorny problems of jurisdiction and divergent speech laws on a global Internet. Those problems aren’t going away soon, though, and they will never go away without government engagement. The Internet and Jurisdiction project provides an important forum for governments, civil society, and platforms to discuss possible ways forward.
  • Some lawmakers may hope that national law can counteract the Rule of TOS by requiring platforms to protect lawful expression. If that’s the goal, then going down the road of encouraging or mandating TOS-based takedowns in the first place is a dubious proposition. In any case, as I discuss in a recent paper, requiring platforms to carry lawful expression may be extremely difficult in practice. 
  • On the other side, some lawmakers may like the Rule of TOS as a way to effectively prohibit more speech than the EU Charter and European Convention would let them ban by law. (The equivalent issue is a huge unspoken factor in American discussions, since many people believe platforms have a moral responsibility to eliminate offensive or harmful speech that, as a matter of law, is protected by the First Amendment.) Acknowledging that goal, if it is a goal, would make for a more honest and productive conversation.

These are just a few of the reasons for lawmakers to slow down the Terrorist Content Regulation’s progress and carefully consider their options. (Follow the links at the top of this post about security and human rights for others.) The democratic process can and should do better at balancing all the interests at stake: protecting society from violent extremism, protecting Internet users’ fundamental rights, and maintaining the democratic Rule of Law.
