Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights.
Whether and when communications platforms like Google, Twitter and Facebook are liable for their users’ online activities is one of the key factors that affects innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users’ undesirable online content in order to suppress dissent, hate speech, privacy violations and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users’ online activity are a form of censorship-by-proxy, and thereby imperil both free expression and innovation, even as governments seek to resolve very real policy problems.
In the United States, the core doctrines of Section 230 of the Communications Decency Act and Section 512 of the Digital Millennium Copyright Act have allowed online platforms hosting user-generated content to flourish. But immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.
To contribute to this important policy debate, CIS studies international approaches to intermediary liabilities, immunities, and safe harbors concerning users' copyright infringement, defamation, hate speech, and other unlawful content; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that protect innovation, freedom of expression, privacy, and other user rights.
Joan Barata is an international expert in freedom of expression, freedom of information, and media regulation. As a scholar, he has spoken and conducted extensive research in these areas, working and collaborating with universities and academic centers across Asia, Africa, and the Americas, authoring papers, articles, and books, and addressing specialized parliamentary committees.
Annemarie Bridy is a Professor of Law at the University of Idaho. She is also an Affiliated Fellow at the Yale Law School Information Society Project and a former Visiting Associate Research Scholar at the Princeton University Center for Information Technology Policy. Professor Bridy specializes in intellectual property and information law, with specific attention to the impact of new technologies on existing legal frameworks for the protection of intellectual property and the enforcement of intellectual property rights.
Giancarlo F. Frosio is a Non-Residential Fellow at the Center for Internet and Society at Stanford Law School. Previously he was the Intermediary Liability Fellow with Stanford CIS. He is also an Associate Professor at the Center for International Intellectual Property Studies (CEIPI) at Strasbourg University. Giancarlo also serves as Affiliate Faculty at Harvard CopyrightX and Faculty Associate of the Nexa Research Center for Internet and Society in Turin. Giancarlo is a qualified attorney with a doctoral degree (S.J.D.) in intellectual property law from Duke University Law School.
Stanford Law School today announced the appointment of Joan Barata Mir, an international expert in freedom of expression, freedom of information, and media regulation, as the Consulting Intermediary Liability Fellow at the Center for Internet and Society (CIS). Barata will pursue international and comparative approaches to intermediary obligations, focusing particularly on the implications for the exercise of the rights to freedom of expression and freedom of information.
Two important current trends in Internet law go together in ways that aren’t getting enough attention. They should, though, because the overlap is well on its way to messing up the Internet further.
Are Internet platforms distorting our political discourse by silencing conservatives? If they were, could Congress pass a law forcing them to play fair?
Public demands for internet platforms to intervene more aggressively in online content are steadily mounting. Calls for companies like YouTube and Facebook to fight problems ranging from “fake news” to virulent misogyny to online radicalization seem to make daily headlines. British Prime Minister Theresa May echoed the politically prevailing sentiment in Europe when she urged platforms to “go further and faster” in removing prohibited content, including through the use of automated filters.
Thursday evening, the Attorney General, the Acting Homeland Security Secretary, and top law enforcement officials from the U.K. and Australia sent an open letter to Mark Zuckerberg. The letter emphasizes the scourge of child abuse content online, and the officials call on Facebook to press pause on end-to-end encryption for its messaging platforms.
Stanford's Daphne Keller is a preeminent cyberlawyer and one of the world's leading experts on "intermediary liability" -- that is, when an online service should be held responsible for the actions of its users. She brings us a delightful tale of Facebook's inability to moderate content at scale, which is as much a tale of the impossibility (and foolishness) of trying to support 2.3 billion users (who will generate 2,300 one-in-a-million edge cases every day) as it is about a specific failure.
This past week, with some fanfare, Facebook announced its own version of the Supreme Court: a 40-member board that will make final decisions about user posts that Facebook has taken down. The announcement came after extended deliberations that have been described as Facebook’s “constitutional convention.”
The complicating factor for the President in this case is that, well, he is the President. Not only that, but he uses his account to conduct government business. Because the government is the only body that can violate the First Amendment, that puts Trump's Twitter habits on tricky legal footing, says Danielle Citron, professor of law at the University of Maryland and author of the book Hate Crimes in Cyberspace. "He’s the President. Whenever the government creates zones of public discourse, they have very special obligations under the First Amendment," Citron says.
“It’s really important to understand how much Europe is in the driver’s seat,” says Daphne Keller, director of Intermediary Liability at the Center for Internet and Society, as well as former associate general counsel at Google. “It kind of doesn’t matter what U.S. law says for a lot of things. Europe is extracting agreements by companies — they're going to enforce those agreements publicly.”
But that doesn't mean these videos aren't bullying. Shaheen Shariff is a professor at McGill University and the director of the Define the Line research program, which examines cyberbullying.
"I always talk about drawing the line. In this case they have definitely crossed the line and can be sued under various different legal options."
Register here: http://web.stanford.edu/dept/law/forms/conlawmay2019.fb
Friday, May 24
How Should Free Speech Principles Apply to the Content Policy of Internet Platforms?
• Danielle Citron, University of Maryland Carey School of Law
• Niall Ferguson, Stanford University
• Mary Anne Franks, University of Miami Law School
• Eugene Volokh, UCLA Law School
Moderator: Nate Persily, Stanford Law School
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm