Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work examines legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and advised on both overall product development and individual content takedown decisions.
A big new law is coming, and a lot of companies doing business online aren’t going to like it. Neither will many advocates of civil liberties for Internet users. Europe’s pending General Data Protection Regulation (GDPR) updates and overhauls EU data protection law – the law that produced this week’s Schrems case and last year’s “Right to Be Forgotten” ruling in the EU.
Today the French Data Protection regulator, CNIL, reaffirmed its position that Google must apply European “Right to Be Forgotten” (RTBF) law globally, by removing content from its services in all countries. Europe’s RTBF laws are rooted in citizens' rights to data protection and privacy. They are inconsistent with U.S. and other countries’ free expression laws, because they require suppression of information even if that information is true and not causing harm.
Policymakers around the world are showing renewed interest in the rules that govern Internet information flow across national borders.
European courts are beginning to sort through one of the most important follow-up questions to last spring’s “Right To Be Forgotten” ruling in Google v. Costeja: what does the case mean for hosting services? The answer matters for the Twitters, Facebooks and YouTubes of the world – not to mention European hosting services like DailyMotion, local political discussion forums, and blogs or newspapers with user comment sections.
This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.
On Tuesday, in a courtroom in Luxembourg, the Court of Justice of the European Union is to consider whether Google must enforce the “right to be forgotten” — which requires search engines to erase search results based on European law — everywhere in the world.
Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.
If you paid attention to Mark Zuckerberg’s testimony before Congress last month, you might have gotten the impression that the internet consists entirely of titanic, California-based companies like Twitter, Facebook and Google. Congress is right to call these companies to account for outsize harms like disclosing personal data about many millions of users. But it is very wrong to act as though these companies are representative of the whole internet.
However, Daphne Keller, the director of intermediary liability at the Stanford Law School Center for Internet and Society, questions whether machine monitoring is something we should even want to do.

"The idea that we can have an automated machine that can detect what's illegal from what's legal is pretty risky," Keller tells Lynch.
Daphne Keller, Director of Intermediary Liability at Stanford's Center for Internet and Society, told Quartz Facebook's turnaround time was actually quite fast. Keller worked for years as an attorney at Google, and said that having been "on the other side," she witnessed the massive volume of user reports these companies get, and how many of the flags they get are simply wrong or not actionable. "I don't think it's realistic to do anything better."
"I can't imagine Facebook knowing about [illegal content] and not taking it down," said Daphne Keller, the Director of Intermediary Liability at the Stanford Center for Internet and Society. More likely than not, they probably aren't aware of these videos unless someone flags them, she said.
In May a court allowed a lawsuit to proceed against Model Mayhem, a network that connects models and photographers, for having failed to warn users that rapists have used the site to target victims. In June a judge decided that Yelp, a site for crowdsourced reviews, cannot challenge a court order to remove a defamatory review of a lawyer by a client. Courts and lawmakers are not about to abolish section 230, says Daphne Keller of the Center for Internet and Society at Stanford Law School, but it is unlikely to survive for decades.
Presented by Bloomberg, the Electronic Frontier Foundation and the First Amendment Coalition.
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
In this episode of the Arbiters of Truth series—Lawfare's new podcast series on disinformation in the run-up to the 2020 election—Quinta Jurecic and Evelyn Douek spoke with Daphne Keller, the director of intermediary liability at Stanford's Center for Internet and Society, about the nuts and bolts of content moderation. People often have big ideas for how tech platforms should decide what content to take down and what to keep up, but what kind of moderation is actually possible at scale?
In this episode, Daphne Keller, Director of Intermediary Liability at the Center for Internet and Society at Stanford Law School and former Associate General Counsel for Google, discusses her essay "Who Do You Sue?: State and Platform Hybrid Power Over Online Speech," which is published by the Hoover Institution.
On this segment of “Quality Assurance,” I take a deep dive on platforms and regulating speech. I spoke with Daphne Keller, who is at Stanford Law School’s Center for Internet and Society. The following is an edited transcript of our conversation.
The question of what responsibility Internet platforms should bear for content posted by their users has been the subject of debate around the world, as politicians, regulators, and the broader public navigate policy choices to combat harmful speech that carry implications for freedom of expression, online harms, competition, and innovation.
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?