Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights.
Whether and when communications platforms like Google, Twitter and Facebook are liable for their users’ online activities is one of the key factors that affect innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users’ undesirable online content in order to suppress dissent, hate speech, privacy violations and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users’ online activity are a form of censorship-by-proxy, and thereby imperil both free expression and innovation, even as governments seek to resolve very real policy problems.
In the United States, the core doctrines of section 230 of the Communications Decency Act and section 512 of the Digital Millennium Copyright Act have allowed online intermediary platforms hosting user-generated content to flourish. But immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.
To contribute to this important policy debate, CIS studies international approaches to intermediary liabilities, immunities, and safe harbors concerning users’ copyright infringement, defamation, hate speech, and other unlawful content; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that will protect innovation, freedom of expression, privacy and other user rights.
Joan Barata is an international expert in freedom of expression, freedom of information and media regulation. He has spoken and conducted extensive research in these areas, working and collaborating with universities and academic centers across Asia, Africa and the Americas, authoring papers, articles and books, and addressing specialized parliamentary committees.
Annemarie Bridy is a Professor of Law at the University of Idaho. She is also an Affiliated Fellow at the Yale Law School Information Society Project and a former Visiting Associate Research Scholar at the Princeton University Center for Information Technology Policy. Professor Bridy specializes in intellectual property and information law, with specific attention to the impact of new technologies on existing legal frameworks for the protection of intellectual property and the enforcement of intellectual property rights.
Giancarlo F. Frosio is a Non-Residential Fellow at the Center for Internet and Society at Stanford Law School. Previously he was the Intermediary Liability fellow with Stanford CIS. He is also an Associate Professor at the Center for International Intellectual Property Studies (CEIPI) at Strasbourg University. Giancarlo also serves as Affiliate Faculty at Harvard CopyrightX and Faculty Associate of the Nexa Research Center for Internet and Society in Turin. Giancarlo is a qualified attorney with a doctoral degree (S.J.D.) in intellectual property law from Duke University Law School.
David Drummond introduced the panel. The significance of this panel is that geoblocking tools are in many ways defining and enforcing jurisdiction.
A number of themes emerged from the panel.
David G. Post, reviewing what the original Law and Borders paper got right (and what it got wrong), noted that the central dilemma it had identified—the conflict between an a-territorial global network and an international legal system with territoriality at its core—had certainly proved to be a profoundly challenging one. He suggested that the failure (thus far) to make much headway on these problems of “governance on the Internet” (in Bertrand de la Chapelle’s phrase) may be pushing these problems “upward,” to the institutions (e.g., ICANN) concerned with “governance of the Internet,” as they face increasing pressure to leverage their control over critical infrastructure to exercise greater control over online content and conduct.
The essay below serves as introduction to the Stanford Center for Internet and Society's Law, Borders, and Speech Conference Proceedings Volume. The conference brought together experts from around the world to discuss conflicting national laws governing online speech -- and how courts, Internet platforms, and public interest advocates should respond to increasing demands for these laws to be enforced on the global Internet.
The Law, Borders, and Speech conference at Stanford’s Center for Internet and Society asked an important question: Which countries’ laws and values will govern Internet users’ online behavior, including their free expression rights? The conference used the landmark 1996 article by David G. Post and David R. Johnson to examine whether, twenty years on, their conclusions still held true. Post and Johnson had concluded that “[t]he rise of the global computer network is destroying the link between geographical location and: (1) the power of local governments to assert control over online behavior; (2) the effects of online behavior on individuals or things; (3) the legitimacy of the efforts of a local sovereign to enforce rules applicable to global phenomena; and (4) the ability of physical location to give notice of which sets of rules apply.” They proposed that national law must be reconciled with self-regulatory processes emerging from the network itself.
The Program on Extremism Policy Paper series combines analysis on extremism-related issues by our researchers and guest contributors with tailored recommendations for policymakers.
Full paper available for download here.
Ryan Calo, a professor of digital law and privacy law at the University of Washington School of Law, said that tech companies need to say, clearly and publicly, “when they will engage in censorship, if at all, at the behest of another nation.”

“At a minimum, Apple and other tech companies should say publicly the conditions under which they will comply with Chinese or other requests to censor content,” Calo said. “The very act of laying out public criteria manages expectations and forces the company to consider its values.”
“I don’t think there’s a chance that major economies like the E.U. are going to accept C.D.A. 230,” said Daphne Keller, the director of intermediary liability at Stanford Law School’s Center for Internet and Society. “So I’m not sure what the net effect is.”
“The key thing about this case is what preventive measures can be imposed on Facebook,” said Martin Husovec, an assistant law professor at Tilburg University’s Institute for Law, Technology and Society in the Netherlands.
“There has been real mission creep with the right to be forgotten,” said Daphne Keller, a lawyer at Stanford University’s Center for Internet and Society. “First it was supposed to be about information found using search engines, but now we see it affecting news reporting.”
After a lengthy legislative process, the GDPR is finally ready. As the most significant overhaul of data privacy laws in Europe in twenty years, it will have a profound impact on Silicon Valley technology companies offering online services in Europe. The recently announced Privacy Shield will likewise affect most US organisations that receive personal information from Europe.