[Stanford's Daphne Keller is a preeminent cyberlawyer and one of the world's leading experts on "intermediary liability" -- that is, when an online service should be held responsible for the actions of its users. She brings us a delightful tale of Facebook's inability to moderate content at scale, which is as much a tale of the impossibility (and foolishness) of trying to support 2.3 billion users (who will generate 2,300 one-in-a-million edge cases every day) as it is about a specific failure. We're delighted to get the chance to run this after a larger, more prestigious, longer-running publication spiked it because it had a penis in it. Be warned: there is a willie after the jump. -Cory]
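(For the curious, a back-of-the-envelope check of that figure, under the assumption that each user takes roughly one moderatable action per day:

$$
2.3 \times 10^{9}\ \tfrac{\text{actions}}{\text{day}} \times 10^{-6}\ \tfrac{\text{edge cases}}{\text{action}} = 2{,}300\ \tfrac{\text{edge cases}}{\text{day}}
$$

If users average more than one post, comment, or photo a day, the count of genuine edge cases only grows.)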
Those of us who study the rules that Internet platforms apply to online speech have increasingly rich data about platforms’ removal decisions. Sources like transparency reports provide a statistical big picture, aggregating individual takedown decisions.
What we mostly do not have is good information about the individual decisions. For the most part, we don't know what specific statements, videos, or pictures are coming down. That makes it impossible to say how well platforms are applying the law or their own rules, or whether their overall decisions show bias against particular messages or particular groups. Instead, the public discussion is driven by anecdotes. The anecdotes almost invariably come from a self-selected group -- people who are upset or have a political agenda.
This dearth of public information probably explains why my own husband decided to turn me in to Facebook for breaking its rules. He clicked on a link, made some choices from a drop-down menu or two, asked Facebook to take down one of my posts, and it did -- permanently. At the time, I was mad. Now, though, I see it for the gift that it was. My husband was giving me an example, complete with screenshots, of the company's process for handling a modestly complex judgment call about online speech. Thanks, honey. He was also illustrating the limits of content moderation at Internet scale.
Read the full post at Boing Boing.
- Publication Type: Other Writing
- Publication Date: 09/27/2019