Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
This is the second of four posts on real-world consequences of the European Court of Human Rights’ (ECHR) rulings in Delfi v. Estonia and MTE v. Hungary. Both cases arose from national court rulings that effectively required online news portals to monitor users’ speech in comment forums. The first case, Delfi, condoned a monitoring requirement in a case involving threats and hate speech.
Last summer, the Grand Chamber of the European Court of Human Rights (ECHR) delivered a serious setback to free expression on the Internet. The Court held, in Delfi v. Estonia, that a government could compel a news site to monitor its users’ online comments about articles. This winter, the Court’s lower chamber ruled the other way in MTE v. Hungary.
The probably-really-almost-totally final 2016 General Data Protection Regulation (GDPR) is here! Lawyers around the world have been hunkered down, analyzing its 200-plus pages. In the “Right to Be Forgotten” (RTBF) provisions, not much has changed from prior drafts.
Europe’s pending General Data Protection Regulation (GDPR) threatens free expression and access to information on the Internet. The threat comes from erasure requirements that work in ways the drafters may not have intended -- and that are not necessary to achieve the Regulation’s data protection purposes.
This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.
On Tuesday, in a courtroom in Luxembourg, the Court of Justice of the European Union is to consider whether Google must enforce the “right to be forgotten” — which requires search engines to erase search results based on European law — everywhere in the world.
Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.
"In May a court allowed a lawsuit to proceed against Model Mayhem, a network that connects models and photographers, for having failed to warn users that rapists have used the site to target victims. In June a judge decided that Yelp, a site for crowdsourced reviews, cannot challenge a court order to remove a defamatory review of a lawyer by a client. Courts and lawmakers are not about to abolish section 230, says Daphne Keller of the Centre for Internet and Society at Stanford Law School, but it is unlikely to survive for decades."
"Daphne Keller at the Stanford Center for Internet and Society said internet companies doing business in countries with laws restricting speech know they will be expected to comply with the rules. One common means of doing so without deleting lawful speech elsewhere is to offer country-specific versions of services, like YouTube Thailand, said Keller.
"The company can then honor national law on the version of the service that is targeted to, and primarily used in, that country," she said."
"Daphne Keller, an Internet law expert at Stanford Law School and former attorney at Google, said prior court decisions favor Yelp and she would be surprised if the California Supreme Court didn't reverse the ruling.
"It should be a no-brainer for Yelp to win," she said."
"“The place we all go to exercise our freedom of expression and to share opinions is a private platform run by a private company, and they don’t let us say every single thing that’s legal,” says Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society and a former head lawyer for Google’s web search team. “They only let us say the things that their policies permit. There’s good business reasons for that for them, but it’s a strange impact for us as a society sharing speech.”"
"And its odds of winning are high, said Daphne Keller, director of intermediary liability at Stanford University’s Center for Internet and Society, who said many companies have successfully used the CDA as a defense."
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.