Daphne Keller studies the ways that Internet content platforms – and the laws governing them – shape information access and other rights of ordinary Internet users. As the Director of Intermediary Liability at the Stanford Center for Internet and Society, she has written and spoken widely about the Right to Be Forgotten, copyright notice-and-takedown systems, cross-border content removal orders, platforms’ own discretionary content-removal decisions, and more. She has testified on these topics before legislatures, courts, and regulatory bodies around the world. In her previous role as Associate General Counsel at Google, Daphne worked on cases including Viacom, Perfect 10, Equustek, Mosley, and Metropolitan Schools, and was primary counsel for products ranging from Web Search to the Chrome browser. Daphne has taught Internet law at Stanford, Berkeley, and Duke law schools. She is a graduate of Yale Law School and Brown University, and mother to some awesome kids in San Francisco.
Attached to this post are PowerPoint slides introducing intermediary liability basics. This particular deck comes from a great CIDE program in Mexico City. It is descended from others I’ve used over the years teaching at Stanford and Berkeley, presenting at conferences, and training junior lawyers at Google. Ancestral decks that evolved into this one go back to at least 2012. (Which might explain why I struggle with fonts whenever I update them.)
This piece is excerpted from the Law, Borders, and Speech Conference Proceedings Volume, where it appears as an appendix. The terminology it explains is relevant for intermediary liability and content regulation issues generally – not only issues that arise in the jurisdiction or conflict-of-law context. The full conference Proceedings Volume contains other relevant resources, and is Creative Commons licensed.
This panel considered issues of national jurisdiction in relation to Internet platforms’ voluntary content removal policies. These policies, typically set forth in Community Guidelines (CGs) or similar documents, prohibit content based on the platforms’ own rules or values—regardless of whether the content violates any law.
The essay below serves as an introduction to the Stanford Center for Internet and Society's Law, Borders, and Speech Conference Proceedings Volume. The conference brought together experts from around the world to discuss conflicting national laws governing online speech – and how courts, Internet platforms, and public interest advocates should respond to increasing demands for these laws to be enforced on the global Internet.
Today, someone asked me about the Internet and human well-being over the next decade. The question was a healthy provocation to look at the big picture. I chose “more helped than harmed” from the very short list of radio-button responses. Here’s my elaboration:
Public demands for internet platforms to intervene more aggressively in online content are steadily mounting. Calls for companies like YouTube and Facebook to fight problems ranging from “fake news” to virulent misogyny to online radicalization seem to make daily headlines. British prime minister Theresa May echoed the politically prevailing sentiment in Europe when she urged platforms to “go further and faster” in removing prohibited content, including through use of automated filters.
Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.
If you paid attention to Mark Zuckerberg’s testimony before Congress last month, you might have gotten the impression that the internet consists entirely of titanic, California-based companies like Twitter, Facebook and Google. Congress is right to call these companies to account for outsize harms like disclosing personal data about many millions of users. But it is very wrong to act as though these companies are representative of the whole internet.
“Users are calling on online platforms to provide a moral code,” says Daphne Keller, director of the intermediary liability project at Stanford’s Center for Internet and Society. “But we’ll never agree on what should come down. Whatever the rules, they’ll fail.” Humans and technical filters alike, according to Keller, will continue to make “grievous errors.”
Daphne Keller, of the Stanford Center for Internet and Society, said Section 230 was designed to allow platforms like Facebook to do some moderation and make editorial decisions without generally being liable for users’ posts: “They need to be able to make discretionary choices about content.”
The law seemed to be on Facebook’s side, she said, but added that it was an unusual case given the focus on app data access while previous cases have centered on more straightforward censorship claims.
However, a video showing Isis recruitment, for example, can violate the law in one context, but also be legal and important for purposes such as documenting crimes for future prosecution, says Daphne Keller, intermediary liability director at Stanford's Centre for Internet and Society.
“The more we push companies to carry out fast, sloppy content removals, the more mistakes we will see,” Keller says. She thinks lawmakers should “slow down, talk to experts including both security researchers and members of the affected communities, and build on that foundation”.
Stanford's Daphne Keller is one of the world's foremost experts on intermediary liability protections and someone we've mentioned on the website many times in the past (and have had her on the podcast a few times as well). She's just published a fantastic paper presenting lessons from making internet platforms liable for the speech of their users. As she makes clear, she is not arguing that platforms should do no moderation at all.
RSVP is required for this free event.
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
When you give sites and services information about yourself, where does it go? Who else will get hold of it, and what will they use it for? The recent revelations about Cambridge Analytica's acquisition of data about tens of millions of Facebook users without their knowledge or consent have prompted renewed interest in how data about us gets shared, sold, used, and misused -- well beyond what we ever expected. Join us for a SLATA/CIS lunchtime conversation with three experts from Stanford’s Center for Internet and Society as we discuss the legal and policy implications of the Cambridge Analytica scandal and responses from Congress and courts. How can we prevent this from happening again? What new problems might we create through poorly-crafted legal responses?
Vinton G. Cerf is one of the founding fathers of the internet, and on Wednesday, February 28th, he will be on Canada 2020’s stage for an exclusive event.
Tickets are free and open to the public, but available in limited quantities. Click below to secure yours.
Known most for being the co-designer of the TCP/IP protocols and the architecture of the modern Internet, Vint will join us in Ottawa to talk about online citizenship, the right to be forgotten, and state of the modern internet.
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?
Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy.
"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.