Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world, and counseled on both overall product development and individual content takedown decisions.
This is the second of four posts on real-world consequences of the European Court of Human Rights' (ECHR) rulings in Delfi v. Estonia and MTE v. Hungary. Both cases arose from national court rulings that effectively required online news portals to monitor users' speech in comment forums. The first ruling, in Delfi, condoned a monitoring requirement in a case involving threats and hate speech.
Last summer, the Grand Chamber of the European Court of Human Rights (ECHR) delivered a serious setback to free expression on the Internet. The Court held, in Delfi v. Estonia, that a government could compel a news site to monitor its users' online comments about articles. This winter, the Court's lower chamber ruled the other way in MTE v. Hungary.
The probably-really-almost-totally final 2016 General Data Protection Regulation (GDPR) is here! Lawyers around the world have been hunkered down, analyzing its 200-plus pages. In the “Right to Be Forgotten” (RTBF) provisions, not much has changed from prior drafts.
Europe’s pending General Data Protection Regulation (GDPR) threatens free expression and access to information on the Internet. The threat comes from erasure requirements that work in ways the drafters may not have intended -- and that are not necessary to achieve the Regulation’s data protection purposes.
Included in this PDF are:
- Notice of Motion and Motion for Leave to File Amicus Curiae Brief
- Amicus Curiae Brief of Electronic Frontier Foundation, Center for Democracy and Technology, Daphne Keller, Eric Goldman and Eugene Volokh in Support of Plaintiffs' Motion for Preliminary Injunction
In a concession to regulators, Google is . . . using “geo-blocking” technology to control what European users can see. Under the new system, Google will not only remove links on, say, google.fr, but it will block users in France from seeing those links on any other Google country site, or google.com itself. Unless they use tools like virtual private networks to disguise their locations, users in those countries will see pruned search results.
These comments were prepared and submitted in response to the U.S. Copyright Office's December 31, 2015 Notice and Request for Public Comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
"Daphne Keller at the Stanford Center for Internet and Society said internet companies doing business in countries with laws restricting speech know they will be expected to comply with the rules. One common means of doing so without deleting lawful speech elsewhere is to offer country-specific versions of services, like YouTube Thailand, said Keller.
"The company can then honor national law on the version of the service that is targeted to, and primarily used in, that country," she said."
"Daphne Keller, an Internet law expert at Stanford Law School and former attorney at Google, said prior court decisions favor Yelp and she would be surprised if the California Supreme Court didn't reverse the ruling.
"It should be a no-brainer for Yelp to win," she said."
"“The place we all go to exercise our freedom of expression and to share opinions is a private platform run by a private company, and they don’t let us say every single thing that’s legal,” says Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society and a former head lawyer for Google’s web search team. “They only let us say the things that their policies permit. There’s good business reasons for that for them, but it’s a strange impact for us as a society sharing speech.”"
"And its odds of winning are high, said Daphne Keller, director of intermediary liability at Stanford University’s Center for Internet and Society, who said many companies have successfully used the CDA as a defense."
"When platforms are made responsible for determining what speech is illegal, those intermediaries tend to over-remove content, out of an abundance of caution, Daphne Keller, the director of intermediary liability at the Stanford Center for Internet and Society, and a former associate general counsel at Google, told BuzzFeed News. “They take down perfectly legal content out of concern that otherwise they themselves could get in trouble,” Keller said.
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
The Center for Internet and Society (CIS) is a public interest technology law and policy program at Stanford Law School and a part of the school's Law, Science and Technology Program.
When you give sites and services information about yourself, where does it go? Who else will get hold of it, and what will they use it for? The recent revelations about Cambridge Analytica's acquisition of data about tens of millions of Facebook users without their knowledge or consent have prompted renewed interest in how data about us gets shared, sold, used, and misused -- well beyond what we ever expected. Join us for a SLATA/CIS lunchtime conversation with three experts from Stanford's Center for Internet and Society as we discuss the legal and policy implications of the Cambridge Analytica scandal and responses from Congress and courts. How can we prevent this from happening again? What new problems might we create through poorly crafted legal responses?
"Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy."
""Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center for Internet and Society to discuss the collision between these two important principles.