Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media

April 12, 2018, 12:50 pm to 2:00 pm

RSVP is required for this free event. 

Content moderation is such a complex and laborious undertaking that, all things considered, it's amazing that it works at all, and as well as it does. Moderation is resource intensive and relentless; it requires making difficult and often untenable distinctions; it is wholly unclear what the standards should be, especially on a global scale; and one failure can incur enough public outrage to overshadow a million quiet successes.

Even so, as a society we have once again handed over to private companies the power to set and enforce the boundaries of appropriate public speech for us. That is an enormous cultural power, held by a few deeply invested stakeholders, and it is being exercised behind closed doors, making it difficult for anyone else to inspect or challenge. Platforms frequently, and conspicuously, fail to live up to our expectations; in fact, given the enormity of the undertaking, most platforms' own definition of success includes failing users on a regular basis. It is time for the discussion about content moderation to shift, away from a focus on the harms users face and the missteps platforms sometimes make in response, to a more expansive examination of the responsibilities of platforms.

To do this, our thinking about platforms must change. It is not just that all platforms moderate, or that they have to moderate, or that they tend to disavow it while doing so. It is that moderation, far from being occasional or ancillary, is in fact an essential, constant, and definitional part of what platforms do. I mean this literally: moderation is the essence of platforms; it is the commodity they offer.

-----------------

Tarleton Gillespie is a principal researcher at Microsoft Research New England, and an affiliated associate professor in the Department of Communication and the Department of Information Science at Cornell University. For the past several years he has been studying how social media platforms moderate the content and behavior of their users, and how their approaches to moderation have broader implications for the character of public discourse. His book, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media (Yale University Press), will be published in May 2018. He is also the author of Wired Shut: Copyright and the Shape of Digital Culture (MIT Press, 2007), co-editor of Media Technologies: Essays on Communication, Materiality, and Society (MIT Press, 2014), and co-founder of the blog Culture Digitally.

Location: 
Stanford Law School - Room 280B
559 Nathan Abbott Way
Stanford, CA
