Using Transparency to Fight Takedown Trolls – A Model from the DMCA

The Internet is full of trolls. So it’s no surprise that notice and takedown systems for online speech attract their fair share of them – people insisting that criticism of their scientific research, videos of police brutality, and other legitimate online speech should be removed from Internet platforms.

In the US, the Digital Millennium Copyright Act (DMCA) sets up a notice and takedown system for copyright claims. It’s the only notice and takedown tool our law provides – so it’s used for trollish removal demands, alongside the many legitimate ones submitted by copyright holders. Unless Internet intermediaries like Twitter or Etsy or Google (where I used to work) do a perfect job of identifying which claims are BS, legal speech gets removed from the Internet.  

How can we keep that from happening? The DMCA sets up some procedural rules that ought to help, but they often fail in practice. Internet users can “counternotice” if their online speech is wrongfully removed – but the process has problems and is rarely used. And the DMCA provides penalties for bad faith removal demands – but courts have set such a difficult standard for proving bad faith that few people pursue this remedy either.

In practice, the best tool against DMCA abuse may be the same thing we use for other kinds of hidden misbehavior: public transparency.  It turns out that letting the whole Internet know about content removals can work magic. For one thing, fear of exposure deters some trolls from bothering to try. For another, when bad takedown requests do succeed, transparency crowd-sources the work of finding and correcting them. With enough eyeballs, all bugs are shallow.

The Lumen Database at Harvard’s Berkman Klein Center makes one key part of DMCA transparency easy. Companies can send Lumen copies of the removal requests they receive, and Lumen will archive them in a public, searchable repository. This trove of data has enabled an amazing amount of scholarship, and helped us understand what works – and what doesn’t – about the DMCA. Internet platforms can go one important step further by letting users who go looking for online information know when it has been removed in response to a takedown request. Twitter does this with its “tweet withheld” notices in users’ feeds, for example, and Google does it with notices on its search results page.
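
For readers who want to explore that repository programmatically, Lumen also exposes a web API. The sketch below is only illustrative: it assumes a JSON search endpoint at https://lumendatabase.org/notices/search with a `term` query parameter, and real use may require an API token and different parameters or field names – check Lumen’s own API documentation before relying on it.

```python
# Illustrative sketch only: the endpoint, parameters, and response fields are
# assumptions about Lumen's public API and may differ from the live service.
# Real access may also require an API token issued by the Lumen team.
import requests

LUMEN_SEARCH_URL = "https://lumendatabase.org/notices/search"  # assumed endpoint


def search_notices(term, page=1):
    """Search the Lumen archive for takedown notices mentioning `term`."""
    response = requests.get(
        LUMEN_SEARCH_URL,
        params={"term": term, "page": page},
        headers={"Accept": "application/json"},  # ask for JSON rather than HTML
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("notices", [])


if __name__ == "__main__":
    for notice in search_notices("example.com"):
        # Field names here are guesses; inspect the actual response structure.
        print(notice.get("id"), notice.get("title"), notice.get("date_received"))
```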

Of course, Internet users can only spot bad removals if they know what content came down. Mere aggregate data – the kind included in companies’ own transparency reports – doesn’t work for that. Publishing the notices themselves does carry a cost, though: if a removal demand lists URLs of content that’s still online, determined users might use a publicized copy of the demand itself to find infringing content. (Though this would presumably be less efficient than more well-trodden paths to piracy.) Some copyright holders have argued that this risk outweighs the upsides of transparency and public review for DMCA removals.

Among Internet civil society groups around the world, though, transparency is increasingly discussed as a solution to an array of problems – many going well beyond copyright issues. At meetings like the Internet Governance Forum, it’s pretty much common currency that Internet companies should be more transparent about the other legal removal demands they comply with, as well as the content they remove under their own policies. And increasingly, civil liberties advocates are asking governments and rightsholders to support transparency as well. In these discussions, I’ve heard suggestions that:

  • In countries where intermediaries are not sure whether disclosing removal demands could get them in legal trouble, legal experts and lawmakers should make clear that such disclosures are welcome and protected by law. (This could include redacting certain personal information, as Lumen already does.)
  • Transparency should be expected from governments or private actors who seek content removal, not just the Internet companies that process their requests.

This kind of transparency has real potential to shine sunlight on over-reaching removals of all sorts. The experience and tools developed in response to DMCA abuse in the US could help combat trollish removals, and protect online speech, around the world.
