Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
A big new law is coming, and a lot of companies doing business online aren’t going to like it. Neither will many advocates of civil liberties for Internet users. Europe’s pending General Data Protection Regulation (GDPR) updates and overhauls EU data protection law – the law that produced this week’s Schrems case and last year’s “Right to Be Forgotten” ruling in the EU.
Today the French data protection regulator, the CNIL, reaffirmed its position that Google must apply European “Right to Be Forgotten” (RTBF) law globally, by removing content from its services in all countries. Europe’s RTBF laws are rooted in citizens' rights to data protection and privacy. They are inconsistent with U.S. and other countries’ free expression laws because they require suppression of information even if that information is true and not causing harm.
Policymakers around the world are showing renewed interest in the rules that govern Internet information flow across national borders.
European courts are beginning to sort through one of the most important follow-up questions to last spring’s “Right to Be Forgotten” ruling in Google v. Costeja: what does the case mean for hosting services? The answer matters for the Twitters, Facebooks, and YouTubes of the world – not to mention European hosting services like DailyMotion, local political discussion forums, and blogs or newspapers with user comment sections.
This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.
On Tuesday, in a courtroom in Luxembourg, the Court of Justice of the European Union is to consider whether Google must enforce the “right to be forgotten” — which requires search engines to erase search results based on European law — everywhere in the world.
Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.
Facebook has come under increased scrutiny in recent months, with the social media giant’s efforts to protect its users’ data called into question.
"Daphne Keller, a former Google lawyer now at Stanford’s Center for Internet and Society, agreed that the “knowingly” language is problematic. “It creates this incentive to bury your head in the sand and not try to find bad content,” she said."
"In a recent paper, Daphne Keller, director of Intermediary Liability at the Stanford Center for Internet and Society, points out that whether and how content hosts—such as social media companies—must honor RTBF requests under the GDPR is unclear.
"Policy experts also question how the bill would actually work. Daphne Keller of the Stanford Center for Internet and Society pointed to the challenges of determining whether an ad buyer is a foreign entity, particularly if buyers rely on outside vendors to purchase ads.
“Nobody knows how to figure out who counts as Russian,” she said. “It seems extremely easy to hide your identity.”"
"Daphne Keller of the Stanford Center for Internet and Society says that the new law could push some platforms and publishers to crack down on a wide variety of speech, to avoid the threat of lawsuits. It would give them “a reason to err on the side of removing internet users’ speech in response to any controversy,” she says, “and in response to false or mistaken allegations, which are often levied against online speech.”"
Stanford CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law, and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, the public commons, diversity, and scientific inquiry.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center for Internet and Society to discuss the collision between these two important principles.