Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified in and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world, and counseled both overall product development and individual content takedown decisions.
Filtering Facebook: Introducing Dolphins in the Net, a New Stanford CIS White Paper
Why Internet Users and EU Policymakers Should Worry about the Advocate General’s Opinion in Glawischnig-Piesczek
White Paper: Dolphins in the Net: Internet Content Filters and the Advocate General’s Glawischnig-Piesczek v. Facebook Ireland Opinion
This discussion, excerpted from my Who Do You Sue article, very briefly reviews the implications of what I call “must-carry” arguments – claims that operators of major Internet platforms should be held to the same First Amendment standards as the government, and prevented from using their Terms of Service or Community Guidelines to prohibit lawful speech.
Lawmakers today are increasingly focused on their options for regulating the content we see on online platforms. I described several ambitious regulatory models for doing that in my recent paper, Who Do You Sue? State and Platform Hybrid Power Over Online Speech. This blog post excerpts that discussion, and sketches out potential legal regimes to address major platforms’ function as de facto gatekeepers of online speech and information.
In a concession to regulators, Google is . . . using “geo-blocking” technology to control what European users can see. Under the new system, Google will not only remove links on, say, google.fr, but it will block users in France from seeing those links on any other Google country site, or google.com itself. Unless they use tools like virtual private networks to disguise their locations, users in those countries will see pruned search results.
These comments were prepared and submitted in response to the U.S. Copyright Office's December 31, 2015 Notice and Request for Public Comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
Submission to the European Commission.
Includes Supplemental response to “Should action taken by hosting service providers remain effective over time ("take down and stay down" principle)?”
International Data Flows: Promoting Digital Trade in the 21st Century: Before the Subcommittee on Courts, Intellectual Property, and the Internet, 114 Cong 133 (2015) (Letter from Daphne Keller, Director of Intermediary Liability, Center for Internet and Society, Stanford Law School)
"Daphne Keller, who studies these things over at Stanford Law School's Center for Internet and Society, has both a larger paper and a shorter blog post discussing this, specifically in the context of serious concerns about how the Right To Be Forgotten (RTBF) under the GDPR will be implemented, and how it may stifle freedom of expression across Europe."
"However, Daphne Keller, the director of intermediary liability at the Stanford Law School Center for Internet and Society, questions whether machine monitoring is something we should even want to do. 'The idea that we can have an automated machine that can detect what's illegal from what's legal is pretty risky,' Keller tells Lynch."
"Daphne Keller, Director of Intermediary Liability at Stanford's Center for Internet and Society, told Quartz Facebook's turnaround time was actually quite fast. Keller worked for years as an attorney at Google, and said that having been 'on the other side,' she witnessed the massive volume of user reports these companies get, and how many of the flags they get are simply wrong or not actionable. 'I don't think it's realistic to do anything better.'"
"'I can't imagine Facebook knowing about [illegal content] and not taking it down,' said Daphne Keller, the Director of Intermediary Liability at the Stanford Center for Internet and Society. More likely than not, they probably aren't aware of these videos unless someone flags them, she said."
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.