Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking Intermediary Liability litigation and legislation around the world, and advised on both overall product development and individual content takedown decisions.
Canada's Office of the Privacy Commissioner has concluded that an existing law, the Personal Information Protection and Electronic Documents Act (PIPEDA), gives individuals legal power to make individual websites take down information. This goes well beyond the rights recognized by the European Court of Justice in its “right to be forgotten” case, and raises important questions.
Should Canada adopt its own version of the “right to be forgotten”? The Office of the Privacy Commissioner of Canada (OPC) recently concluded, in a Draft Position Paper, that such a right actually exists already. According to the OPC, Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) gives individuals legal power to make search engines like Google de-list search results about them, and to make individual websites take down information. In a Comment filed last week, I argued that this interpretation of PIPEDA will create far more problems than it solves.
Attached to this post are PowerPoint slides introducing intermediary liability basics. This particular deck comes from a great CIDE program in Mexico City. It is descended from others I’ve used over the years teaching at Stanford and Berkeley, presenting at conferences, and training junior lawyers at Google. Ancestral decks that evolved into this one go back to at least 2012. (Which might explain why I struggle with fonts whenever I update them.)
This piece is excerpted from the Law, Borders, and Speech Conference Proceedings Volume, where it appears as an appendix. The terminology it explains is relevant for Intermediary Liability and content regulation issues generally, not only issues that arise in the jurisdiction or conflict-of-law context. The full conference Proceedings Volume contains other relevant resources, and is Creative Commons licensed.
This panel considered issues of national jurisdiction in relation to Internet platforms’ voluntary content removal policies. These policies, typically set forth in Community Guidelines (CGs) or similar documents, prohibit content based on the platforms’ own rules or values—regardless of whether the content violates any law.
This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.
On Tuesday, in a courtroom in Luxembourg, the Court of Justice of the European Union will consider whether Google must enforce the “right to be forgotten” — which requires search engines to erase search results based on European law — everywhere in the world.
Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.
If you paid attention to Mark Zuckerberg’s testimony before Congress last month, you might have gotten the impression that the internet consists entirely of titanic, California-based companies like Twitter, Facebook and Google. Congress is right to call these companies to account for outsize harms like disclosing personal data about many millions of users. But it is very wrong to act as though these companies are representative of the whole internet.
“It’s really important to understand how much Europe is in the driver’s seat,” says Daphne Keller, director of Intermediary Liability at the Center for Internet and Society, as well as former associate general counsel at Google. “It kind of doesn’t matter what U.S. law says for a lot of things. Europe is extracting agreements by companies — they’re going to enforce those agreements publicly.”
“When lawmakers create new rules that have never been tested by courts – like Australia's new law or the rules proposed in the UK's White Paper – and then tell platforms to enforce them, we can only expect that a broad swathe of perfectly legal speech is going to disappear,” said Daphne Keller, director of intermediary liability at the Stanford Centre for Internet and Society.
The issue highlights the pressure on many internet platforms to attract customers by presenting a critical mass of listings to demonstrate scale, says Daphne Keller, director of intermediary liability at Stanford Law School’s Center for Internet and Society. She added that inactive or false listings don’t produce a good customer experience either. “You don’t want to have a bunch of listings in there that turn out to be dead ends,” Ms. Keller said. A Care.com spokeswoman declined to comment on Ms. Keller’s assessment.
“Its role in enabling a certain kind of technical innovation is unambiguous,” says Daphne Keller at Stanford Law School’s Center for Internet and Society. “It made it possible for investors to get behind companies who were in the business of transmitting so much speech and information that they couldn't possibly assess it all and figure what was legal or illegal.”
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?
Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy.
"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.