May 14, 2020
“Internet Speech Will Never Go Back to Normal,” declared the headline of a recent Atlantic article by law professors Jack Goldsmith and Andrew Keane Woods. The piece argues that the U.S. must learn from China in regulating the internet. “[S]ignificant monitoring and speech control are inevitable components of a mature and flourishing internet,” the authors write, “and governments must play a large role in these practices to ensure that the internet is compatible with a society’s norms and values.”
But is this conclusion the only one available from the fallout of the coronavirus crisis? Or are there other ways to ensure a mature and flourishing internet in which free speech and public health can coexist? And could Facebook’s new Oversight Board be one of the answers?
Here to discuss the issue are two of the biggest experts on the subject of internet law and platform regulation: Daphne Keller, Platform Regulation Director at the Stanford Cyber Policy Center (formerly an Associate General Counsel at Google); and Kate Klonick, assistant professor at St. John’s University teaching internet law and information privacy, and a fellow at Yale Law School’s Information Society Project.
In this episode we discuss:
- Whether social media platforms like Facebook, Twitter, and YouTube have been paragons of responsibility or lapdogs of censorious governments when it comes to content moderation
- Whether the current crisis justifies lowering the threshold for deeming content “harmful,” or whether we should be even more vigilant about what stays online
- Whether there are specific problems with the policies and guidelines of health authorities like the WHO and CDC, which have changed their positions on issues like face masks and included inaccurate information about the nature of COVID-19
- What the pandemic has revealed about the performance, impact, and limits of automated content moderation
- Whether democracies, particularly European ones, have weakened online freedom by responding to legitimate concerns about hate speech, disinformation, and terrorist content with illiberal laws