Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking Intermediary Liability litigation and legislation around the world and advised on both overall product development and individual content takedown decisions.
A big new law is coming, and a lot of companies doing business online aren’t going to like it. Neither will many advocates of civil liberties for Internet users. Europe’s pending General Data Protection Regulation (GDPR) updates and overhauls EU data protection law – the law that produced this week’s Schrems case and last year’s “Right to Be Forgotten” ruling in the EU.
Today the French data protection regulator, the CNIL, reaffirmed its position that Google must apply European “Right to Be Forgotten” (RTBF) law globally, by removing content from its services in all countries. Europe’s RTBF laws are rooted in citizens’ rights to data protection and privacy. They are inconsistent with U.S. and other countries’ free expression laws, because they require suppression of information even if that information is true and not causing harm.
Policymakers around the world are showing renewed interest in the rules that govern Internet information flow across national borders.
European courts are beginning to sort through one of the most important follow-up questions to last spring’s “Right To Be Forgotten” ruling in Google v. Costeja: what does the case mean for hosting services? The answer matters for the Twitters, Facebooks and YouTubes of the world – not to mention European hosting services like DailyMotion, local political discussion forums, and blogs or newspapers with user comment sections.
These comments address the issue of transparency under the GDPR, as that topic arises in the context of Internet intermediaries and the “Right to Be Forgotten.” CIS Intermediary Liability Director Daphne Keller filed them in response to a public call for comments from the Article 29 Working Party – the EU-wide umbrella group of data protection regulators established under the 1995 Directive, soon to be succeeded by the European Data Protection Board established under the GDPR.
This Stanford Center for Internet and Society White Paper uses proposed US legislation, SESTA, as a starting point for an overview of Intermediary Liability models and their consequences. It draws on law and experience from both the US and countries that have adopted different models, and recommends specific improvements for SESTA and similar proposed legislation.
"“I think they are really struggling and that’s not surprising, because it’s a very hard problem,” said Daphne Keller, who used to be on Google’s legal team and is now with Stanford University.
"According to Daphne Keller, a director at the Center for Internet and Society at Stanford’s school of law, outing those anonymous defendants might be the only way Miltenberg can get the case heard. It’s likely that Google – which was not named in the suit – and Donegan as the document’s creator will be immunized by federal statute and could get the case dismissed, Keller said.
"It will set governments’ expectations about how they can use their leverage over internet platforms to effectively enforce their own laws globally,” said Daphne Keller, who studies platforms’ legal responsibilities at the Stanford Center for Internet and Society and previously was Google’s associate general counsel."
"“Users are calling on online platforms to provide a moral code,” says Daphne Keller, director of the intermediary liability project at Stanford’s Center for Internet and Society. “But we’ll never agree on what should come down. Whatever the rules, they’ll fail.” Humans and technical filters alike, according to Keller, will continue to make “grievous errors.”"
"We don’t have nearly enough information to see the big picture and know what speech platforms are taking down. For the most part, we only find out when the speakers themselves learn that their posts or accounts have disappeared and choose to call public attention to it. But the idea that platforms’ rules are biased — and that this undermines democracy — isn’t new, and it isn’t unique to conservatives.
Presented by Bloomberg, the Electronic Frontier Foundation and the First Amendment Coalition.
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.