Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights.
Whether and when communications platforms like Google, Twitter and Facebook are liable for their users’ online activities is one of the key factors that affects innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users’ undesirable online content in order to suppress dissent, hate speech, privacy violations and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users’ online activity are a form of censorship-by-proxy, and thereby imperil both free expression and innovation, even as governments seek to resolve very real policy problems.
In the United States, the core doctrines of section 230 of the Communications Decency Act and section 512 of the Digital Millennium Copyright Act have allowed user-generated content on these online intermediary platforms to flourish. But immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.
To contribute to this important policy debate, CIS studies international approaches to intermediary liability, immunity, and safe harbor regimes governing users' copyright infringement, defamation, hate speech, and other unlawful content; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that will protect innovation, freedom of expression, privacy, and other user rights.
Joan Barata is an international expert in freedom of expression, freedom of information, and media regulation. He has spoken and conducted extensive research in these areas, working and collaborating with universities and academic centers from Asia to Africa and the Americas, authoring papers, articles, and books, and addressing specialized parliamentary committees.
Annemarie Bridy is a Professor of Law at the University of Idaho. She is also an Affiliated Fellow at the Yale Law School Information Society Project and a former Visiting Associate Research Scholar at the Princeton University Center for Information Technology Policy. Professor Bridy specializes in intellectual property and information law, with specific attention to the impact of new technologies on existing legal frameworks for the protection of intellectual property and the enforcement of intellectual property rights.
Giancarlo F. Frosio is a Non-Residential Fellow at the Center for Internet and Society at Stanford Law School. Previously he was the Intermediary Liability Fellow with Stanford CIS. He is also a Senior Lecturer and Researcher at the Center for International Intellectual Property Studies (CEIPI) at Strasbourg University. Giancarlo also serves as Affiliate Faculty at Harvard CopyrightX and Faculty Associate of the Nexa Research Center for Internet and Society in Turin.
You are CEO of Google. When you wake up tomorrow morning, your general counsel calls you: "We've been sued in the E.U. for copyright infringement! The claim: our search results for Le Parisien and dozens of other newspapers used more than one word or went beyond a 'short extract.'" Your response: "Is this April Fools' Day?"
This submission provides comments and recommendations regarding a very specific provision included in the document mentioned in the title, namely the proposed duty for intermediaries to:
“deploy technology based automated tools or appropriate mechanisms, with appropriate controls, for proactively identifying and removing or disabling public access to unlawful information or content.”
I have a new article coming out, called Who Do You Sue? State and Platform Hybrid Power over Online Speech. It is about free expression rights on platforms like Facebook or Twitter, which the Supreme Court has called “the modern public square.” One section is about speakers suing platforms. It looks at cases – over thirty so far – where users argue that companies like Facebook or Twitter have violated their free expression rights by taking down legal speech that is prohibited under the platforms’ Community Guidelines.
Tool Without A Handle: “A Mere Gallimaufry”
This blog has spent a good deal of real estate discussing networked information technologies as tools, but has not yet dealt thoroughly with the qualifier in its title: tools "without handles." The addition of "without a handle" is intended to signal that my primary metaphor of a tool in the control of a user (and thus my general preferred approach to Internet policy and regulation, favoring individual control and accountability for uses of tools) needs to be leavened a bit.
Tighter regulation of social media and other online services is now under discussion in several European countries, as well as in the UK, where the government has released a white paper outlining its proposed approach to tackling online harm.
The Internet was going to set us all free. At least, that is what U.S. policy makers, pundits, and scholars believed in the 2000s. The Internet would undermine authoritarian rulers by reducing the government’s stranglehold on debate, helping oppressed people realize how much they all hated their government, and simply making it easier and cheaper to organize protests.
The complicating factor for the President in this case is that, well, he is the President. Not only that, but he uses his account to conduct government business. Because the government is the only body that can violate the First Amendment, that puts Trump's Twitter habits on tricky legal footing, says Danielle Citron, professor of law at the University of Maryland and author of the book Hate Crimes in Cyberspace. "He's the President. Whenever the government creates zones of public discourse, they have very special obligations under the First Amendment," Citron says.
"It's really important to understand how much Europe is in the driver's seat," says Daphne Keller, director of Intermediary Liability at the Center for Internet and Society, as well as former associate general counsel at Google. "It kind of doesn't matter what U.S. law says for a lot of things. Europe is extracting agreements by companies — they're going to enforce those agreements publicly."
But that doesn't mean these videos aren't bullying. Shaheen Shariff is a professor at McGill University and the director of the Define the Line research program, which examines cyberbullying.
"I always talk about defining the line. In this case they have definitely crossed the line and can be sued under various different legal options."
Register here: http://web.stanford.edu/dept/law/forms/conlawmay2019.fb
Friday, May 24
How Should Free Speech Principles Apply to the Content Policy of Internet Platforms?
• Danielle Citron, University of Maryland Carey School of Law
• Niall Ferguson, Stanford University
• Mary Anne Franks, University of Miami Law School
• Eugene Volokh, UCLA Law School
Moderator: Nate Persily, Stanford Law School
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
CIS Affiliate Scholar David Levine interviews The Guardian's Julia Powles and Prof. Ellen Goodman of Rutgers Law School, on the "Right to Be Forgotten."
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center for Internet and Society to discuss the collision between these two important principles.
CIS Affiliate Scholar David Levine interviews Prof. Jacqueline Lipton of The University of Akron Law School, author of Rethinking Cyberlaw: A New Vision for Internet Law.