Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and advised on both overall product development and individual content takedown decisions.
This is the second of four posts on real-world consequences of the European Court of Human Rights’ (ECHR) rulings in Delfi v. Estonia and MTE v. Hungary. Both cases arose from national court rulings that effectively required online news portals to monitor users’ speech in comment forums. In the first case, Delfi, the Court condoned a monitoring requirement in a case involving threats and hate speech.
Last summer, the Grand Chamber of the European Court of Human Rights (ECHR) delivered a serious setback to free expression on the Internet. The Court held, in Delfi v. Estonia, that a government could compel a news site to monitor its users’ online comments about articles.* This winter, the Court’s lower chamber ruled the other way in MTE v. Hungary.
The probably-really-almost-totally final 2016 General Data Protection Regulation (GDPR) is here! Lawyers around the world have been hunkered down, analyzing its 200-plus pages. In the “Right to Be Forgotten” (RTBF) provisions, not much has changed from prior drafts.
Europe’s pending General Data Protection Regulation (GDPR) threatens free expression and access to information on the Internet. The threat comes from erasure requirements that work in ways the drafters may not have intended -- and that are not necessary to achieve the Regulation’s data protection purposes.
Included in this PDF are:
- Notice of Motion and Motion for Leave to File Amicus Curiae Brief
- Amicus Curiae Brief of Electronic Frontier Foundation, Center for Democracy and Technology, Daphne Keller, Eric Goldman and Eugene Volokh in Support of Plaintiffs' Motion for Preliminary Injunction
In a concession to regulators, Google is . . . using “geo-blocking” technology to control what European users can see. Under the new system, Google will not only remove links on, say, google.fr, but it will block users in France from seeing those links on any other Google country site, or google.com itself. Unless they use tools like virtual private networks to disguise their locations, users in those countries will see pruned search results.
These comments were prepared and submitted in response to the U.S. Copyright Office's December 31, 2015 Notice and Request for Public Comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
“It’s so easy to point to the need for internet companies to do more that that becomes a real rallying cry,” says Daphne Keller, the director of Intermediary Liability at Stanford Law School’s Center for Internet and Society, and a former associate general counsel to Google. “In European lawmaking, they don’t have very good tech advice on what’s really possible.”
According to Daphne Keller, a lawyer at the Center for Internet and Society at Stanford University, the Austrian ruling may be "dangerous and short-sighted" because it could embolden other countries to impose local laws everywhere on Facebook.
Daphne Keller, who studies these things over at Stanford Law School's Center for Internet and Society, has both a larger paper and a shorter blog post discussing this, specifically in the context of serious concerns about how the Right To Be Forgotten (RTBF) under the GDPR will be implemented, and how it may stifle freedom of expression across Europe.
However, Daphne Keller, the director of intermediary liability at the Stanford Law School Center for Internet and Society, questions whether machine monitoring is something we should even want to do.
"The idea that we can have an automated machine that can detect what's illegal from what's legal is pretty risky," Keller tells Lynch.
Daphne Keller, Director of Intermediary Liability at Stanford’s Center for Internet and Society, told Quartz Facebook’s turnaround time was actually quite fast. Keller worked for years as an attorney at Google, and said that having been “on the other side,” she witnessed the massive volume of user reports these companies get, and how many of the flags they get are simply wrong or not actionable. “I don’t think it’s realistic to do anything better.”
Presented by Bloomberg, the Electronic Frontier Foundation and the First Amendment Coalition.
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
The question of what responsibility Internet platforms should bear for user-posted content they host has been debated around the world, as politicians, regulators, and the broader public navigate policy choices to combat harmful speech that carry implications for freedom of expression, online harms, competition, and innovation.
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?
Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy.
"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.