Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
Europe's new General Data Protection Regulation (GDPR) goes into force today, after two years of preparation. Meanwhile, in the US, a remarkable number of people are suggesting we should adopt something like the GDPR. What does that actually mean, and what policy trade-offs does it entail?
Canada's Office of the Privacy Commissioner has concluded that an existing law, the Personal Information Protection and Electronic Documents Act (PIPEDA), gives individuals legal power to make individual websites take down information. This goes well beyond the rights recognized by the European Court of Justice in its "right to be forgotten" case, and raises important questions.
Should Canada adopt its own version of the “right to be forgotten”? The Office of the Privacy Commissioner of Canada (OPC) recently concluded, in a Draft Position Paper, that such a right actually exists already. According to the OPC, Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) gives individuals legal power to make search engines like Google de-list search results about them, and to make individual websites take down information. In a Comment filed last week, I argued that this interpretation of PIPEDA will create far more problems than it solves.
Attached to this post are PowerPoint slides introducing intermediary liability basics. This particular deck comes from a great CIDE program in Mexico City. It is descended from others I've used over the years teaching at Stanford and Berkeley, presenting at conferences, and training junior lawyers at Google. Ancestral decks that evolved into this one go back to at least 2012. (Which might explain why I struggle with fonts whenever I update them.)
This piece is excerpted from the Law, Borders, and Speech Conference Proceedings Volume, where it appears as an appendix. The terminology it explains is relevant for intermediary liability and content regulation issues generally - not only issues that arise in the jurisdiction or conflict-of-law context. The full conference Proceedings Volume contains other relevant resources, and is Creative Commons licensed.
[Stanford's Daphne Keller is a preeminent cyberlawyer and one of the world's leading experts on "intermediary liability" -- that is, when an online service should be held responsible for the actions of its users. She brings us a delightful tale of Facebook's inability to moderate content at scale, which is as much a tale of the impossibility (and foolishness) of trying to support 2.3 billion users (who will generate 2,300 one-in-a-million edge cases every day) as it is about a specific failure.]
This past week, with some fanfare, Facebook announced its own version of the Supreme Court: a 40-member board that will make final decisions about user posts that Facebook has taken down. The announcement came after extended deliberations that have been described as Facebook’s “constitutional convention.”
The Program on Extremism Policy Paper series combines analysis on extremism-related issues by our researchers and guest contributors with tailored recommendations for policymakers.
Full paper available for download here.
"Daphne Keller, the director of intermediary liability at Stanford Law School's Center for Internet and Society and a former associate general counsel at Google, told CPJ she believes that Twitter should “push back” if a government is asking for something that’s inconsistent with human rights. “But that’s expensive, and hard, and may cause them to lose a bunch of money,” she said."
"Were Section 230 to be abolished, as Benioff wants, it might actually hurt Facebook’s competitors more than it would hurt Facebook — not exactly the outcome Benioff seems to be hoping for. “Just crossing out 230 and leaving the courts to figure it out, I think, would be catastrophic,” Daphne Keller, the director of intermediary liability at Stanford's Center for Internet and Society, told BuzzFeed News. “We would have years and years of uncertainty. And that uncertainty would hurt little companies, who can’t afford to litigate things, worse than big companies.”"
"“I don’t think there’s a chance that major economies like the E.U. are going to accept C.D.A. 230,” said Daphne Keller, the director of intermediary liability at Stanford Law School’s Center for Internet and Society. “So I’m not sure what the net effect is.”"
"“There has been real mission creep with the right to be forgotten,” said Daphne Keller, a lawyer at Stanford University’s Center for Internet and Society. “First it was supposed to be about information found using search engines, but now we see it affecting news reporting.”"
"Hate speech that isn’t an imminent threat is still protected by the Constitution, noted Daphne Keller, a researcher at Stanford’s Center for Internet and Society and a former associate general counsel for Google. “A law can’t just ban it. And Congress can’t just tell platforms to ban it, either — that use of government power would still violate the First Amendment,” she said."
Stanford CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry. Come hear CIS Directors Jennifer Granick + Daphne Keller and Resident Fellows Riana Pfefferkorn + Luiz Fernando Marrey Moncau talk about our work, and the assistance CIS provides to students in learning about these issues, selecting courses, identifying job opportunities, and making professional connections.
After a lengthy legislative process, the GDPR is finally ready. As the most significant overhaul of data privacy laws in Europe in twenty years, it will have a profound impact on Silicon Valley technology companies offering online services in Europe. The recently announced Privacy Shield will affect most US organisations that receive personal information from Europe.
In this episode of the Arbiters of Truth series—Lawfare's new podcast series on disinformation in the run-up to the 2020 election—Quinta Jurecic and Evelyn Douek spoke with Daphne Keller, the director of intermediary liability at Stanford's Center for Internet and Society, about the nuts and bolts of content moderation. People often have big ideas for how tech platforms should decide what content to take down and what to keep up, but what kind of moderation is actually possible at scale?
In this episode, Daphne Keller, Director of Intermediary Liability at the Center for Internet and Society at Stanford Law School and former Associate General Counsel for Google, discusses her essay "Who Do You Sue?: State and Platform Hybrid Power Over Online Speech," which is published by the Hoover Institution.
On this segment of “Quality Assurance,” I take a deep dive on platforms and regulating speech. I spoke with Daphne Keller, who is at Stanford Law School’s Center for Internet and Society. The following is an edited transcript of our conversation.
The question of what responsibility Internet platforms should bear for user-posted content they host has been the subject of debate around the world, as politicians, regulators, and the broader public navigate policy choices to combat harmful speech - choices with implications for freedom of expression, online harms, competition, and innovation.
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?