Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published in both academic and popular venues; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and advised on both overall product development and individual content takedown decisions.
This is the second of four posts on real-world consequences of the European Court of Human Rights' (ECHR) rulings in Delfi v. Estonia and MTE v. Hungary. Both cases arose from national court rulings that effectively required online news portals to monitor users' speech in comment forums. In the first case, Delfi, the Court condoned a monitoring requirement where the comments at issue involved threats and hate speech.
Last summer, the Grand Chamber of the European Court of Human Rights (ECHR) delivered a serious setback to free expression on the Internet. The Court held, in Delfi v. Estonia, that a government could compel a news site to monitor its users' online comments about articles.* This winter, the Court's lower chamber ruled the other way in MTE v. Hungary.
The probably-really-almost-totally final 2016 General Data Protection Regulation (GDPR) is here! Lawyers around the world have been hunkered down, analyzing its 200-plus pages. In the “Right to Be Forgotten” (RTBF) provisions, not much has changed from prior drafts.
Europe’s pending General Data Protection Regulation (GDPR) threatens free expression and access to information on the Internet. The threat comes from erasure requirements that work in ways the drafters may not have intended -- and that are not necessary to achieve the Regulation’s data protection purposes.
[Stanford's Daphne Keller is a preeminent cyberlawyer and one of the world's leading experts on "intermediary liability" -- that is, when an online service should be held responsible for the actions of its users. She brings us a delightful tale of Facebook's inability to moderate content at scale, which is as much a tale of the impossibility (and foolishness) of trying to support 2.3 billion users (who will generate 2,300 one-in-a-million edge-cases every day) as it is about a specific failure.
This past week, with some fanfare, Facebook announced its own version of the Supreme Court: a 40-member board that will make final decisions about user posts that Facebook has taken down. The announcement came after extended deliberations that have been described as Facebook’s “constitutional convention.”
The Program on Extremism Policy Paper series combines analysis on extremism-related issues by our researchers and guest contributors with tailored recommendations for policymakers.
Full paper available for download here.
"However, Daphne Keller, the director of intermediary liability at the Stanford Law School Center for Internet and Society, questions whether machine monitoring is something we should even want to do.
"The idea that we can have an automated machine that can detect what's illegal from what's legal is pretty risky," Keller tells Lynch."
"Daphne Keller, Director of Intermediary Liability at Stanford's Center for Internet and Society, told Quartz Facebook's turnaround time was actually quite fast. Keller worked for years as an attorney at Google, and said that having been "on the other side," she witnessed the massive volume of user reports these companies get, and how many of the flags they get are simply wrong or not actionable. "I don't think it's realistic to do anything better.""
""I can't imagine Facebook knowing about [illegal content] and not taking it down," said Daphne Keller, the Director of Intermediary Liability at the Stanford Center for Internet and Society. More likely than not, they probably aren't aware of these videos unless someone flags them, she said."
"In May a court allowed a lawsuit to proceed against Model Mayhem, a network that connects models and photographers, for having failed to warn users that rapists have used the site to target victims. In June a judge decided that Yelp, a site for crowdsourced reviews, cannot challenge a court order to remove a defamatory review of a lawyer by a client. Courts and lawmakers are not about to abolish section 230, says Daphne Keller of the Centre for Internet and Society at Stanford Law School, but it is unlikely to survive for decades."
Presented by Bloomberg, the Electronic Frontier Foundation and the First Amendment Coalition.
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
"Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy."
""Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,'" says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.'""
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.