Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
This panel considered issues of national jurisdiction in relation to Internet platforms’ voluntary content removal policies. These policies, typically set forth in Community Guidelines (CGs) or similar documents, prohibit content based on the platforms’ own rules or values—regardless of whether the content violates any law.
The essay below serves as an introduction to the Stanford Center for Internet and Society's Law, Borders, and Speech Conference Proceedings volume. The conference brought together experts from around the world to discuss conflicting national laws governing online speech -- and how courts, Internet platforms, and public interest advocates should respond to increasing demands for these laws to be enforced on the global Internet.
Today, someone asked me about the Internet and human well-being over the next decade. The question was a healthy provocation to look at the big picture. I chose “more helped than harmed” from the very short list of radio-button responses. Here’s my elaboration:
Stanford's Daphne Keller is a preeminent cyberlawyer and one of the world's leading experts on "intermediary liability" -- that is, when an online service should be held responsible for the actions of its users. She brings us a delightful tale of Facebook's inability to moderate content at scale, which is as much a tale of the impossibility (and foolishness) of trying to support 2.3 billion users (who will generate 2,300 one-in-a-million edge cases every day) as it is about a specific failure.
This past week, with some fanfare, Facebook announced its own version of the Supreme Court: a 40-member board that will make final decisions about user posts that Facebook has taken down. The announcement came after extended deliberations that have been described as Facebook’s “constitutional convention.”
The Program on Extremism Policy Paper series combines analysis on extremism-related issues by our researchers and guest contributors with tailored recommendations for policymakers.
"It's so easy to point to the need for internet companies to do more that that becomes a real rallying cry," says Daphne Keller, the director of Intermediary Liability at Stanford Law School's Center for Internet and Society and a former associate general counsel to Google. "In European lawmaking, they don't have very good tech advice on what's really possible."
According to Daphne Keller, a lawyer at the Center for Internet and Society at Stanford University, the Austrian ruling may be "dangerous and short-sighted" because it could embolden other countries to impose local laws everywhere on Facebook.
Daphne Keller, who studies these things over at Stanford Law School's Center for Internet and Society, has both a larger paper and a shorter blog post discussing this, specifically in the context of serious concerns about how the Right To Be Forgotten (RTBF) under the GDPR will be implemented, and how it may stifle freedom of expression across Europe.
However, Daphne Keller, the director of intermediary liability at the Stanford Law School Center for Internet and Society, questions whether machine monitoring is something we should even want to do.
"The idea that we can have an automated machine that can detect what's illegal from what's legal is pretty risky," Keller tells Lynch.
Daphne Keller, Director of Intermediary Liability at Stanford's Center for Internet and Society, told Quartz Facebook's turnaround time was actually quite fast. Keller worked for years as an attorney at Google, and said that having been "on the other side," she witnessed the massive volume of user reports these companies get, and how many of the flags they get are simply wrong or not actionable. "I don't think it's realistic to do anything better."
Twenty years ago, the US Supreme Court’s decision in Reno v. ACLU established the framework for internet free speech and liability that remains in place today. This conference will consider the continuing viability of the Reno vision in the face of multiplying concerns about sex trafficking online, terrorist content, election interference, and other forms of contested content.
The Center for Internet and Society (CIS) is a public interest technology law and policy program at Stanford Law School and a part of the Law, Science and Technology Program at Stanford Law School. CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry.
Over two years have passed since the Court of Justice of the European Union ruled, in the Google Spain case, that the search engine must “de-list” certain search results on request in order to honor the requesters’ data protection rights.
For many years after the European Data Protection Directive was implemented across Europe in 1998, data privacy was seen as an issue that mainly concerned what companies did with personal data behind the scenes.
Come hear CIS Directors Jennifer Granick + Daphne Keller and Resident Fellows Riana Pfefferkorn + Luiz Fernando Marrey Moncau talk about our work, and the assistance CIS provides to students in learning about these issues, selecting courses, identifying job opportunities, and making professional connections.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.