Daphne Keller studies the ways that Internet content platforms – and the laws governing them – shape information access and other rights of ordinary Internet users. As the Director of Intermediary Liability at the Stanford Center for Internet and Society, she has written and spoken widely about the Right to Be Forgotten, copyright notice-and-takedown systems, cross-border content removal orders, platforms’ own discretionary content-removal decisions, and more. She has testified on these topics before legislatures, courts, and regulatory bodies around the world. In her previous role as Associate General Counsel at Google, Daphne worked on cases including Viacom, Perfect 10, Equustek, Mosley, and Metropolitan Schools; and was the primary counsel for products ranging from Web Search to the Chrome browser. Daphne has taught Internet law at Stanford, Berkeley, and Duke law schools. She is a graduate of Yale Law School and Brown University, and mother to some awesome kids in San Francisco.
Canada's Office of the Privacy Commissioner has concluded that an existing law, the Personal Information Protection and Electronic Documents Act (PIPEDA), gives individuals legal power to make individual websites take down information. This goes well beyond the rights recognized by the European Court of Justice in its “right to be forgotten” case, and raises important questions.
Should Canada adopt its own version of the “right to be forgotten”? The Office of the Privacy Commissioner of Canada (OPC) recently concluded, in a Draft Position Paper, that such a right actually exists already. According to the OPC, Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) gives individuals legal power to make search engines like Google de-list search results about them, and to make individual websites take down information. In a Comment filed last week, I argued that this interpretation of PIPEDA will create far more problems than it solves.
Attached to this post are PowerPoint slides introducing intermediary liability basics. This particular deck comes from a great CIDE program in Mexico City. It is descended from others I’ve used over the years teaching at Stanford and Berkeley, presenting at conferences, and training junior lawyers at Google. Ancestral decks that evolved into this one go back to at least 2012. (Which might explain why I struggle with fonts whenever I update them.)
This piece is excerpted from the Law, Borders, and Speech Conference Proceedings Volume, where it appears as an appendix. The terminology it explains is relevant for Intermediary Liability and content regulation issues generally, not only issues that arise in the jurisdiction or conflict-of-law context. The full conference Proceedings Volume contains other relevant resources, and is Creative Commons licensed.
This panel considered issues of national jurisdiction in relation to Internet platforms’ voluntary content removal policies. These policies, typically set forth in Community Guidelines (CGs) or similar documents, prohibit content based on the platforms’ own rules or values—regardless of whether the content violates any law.
If you paid attention to Mark Zuckerberg’s testimony before Congress last month, you might have gotten the impression that the internet consists entirely of titanic, California-based companies like Twitter, Facebook and Google. Congress is right to call these companies to account for outsize harms like disclosing personal data about many millions of users. But it is very wrong to act as though these companies are representative of the whole internet.
These comments address the issue of transparency under the GDPR, as that topic arises in the context of Internet intermediaries and the “Right to Be Forgotten.” CIS Intermediary Liability Director Daphne Keller filed them in response to a public call for comments from the Article 29 Working Party – the EU-wide umbrella group of data protection regulators established under the 1995 Directive, soon to be succeeded by the European Data Protection Board established under the GDPR.
""I can't imagine Facebook knowing about [illegal content] and not taking it down," said Daphne Keller, the Director of Intermediary Liability at the Stanford Center for Internet and Society. More likely than not, they probably aren't aware of these videos unless someone flags them, she said."
"In May a court allowed a lawsuit to proceed against Model Mayhem, a network that connects models and photographers, for having failed to warn users that rapists have used the site to target victims. In June a judge decided that Yelp, a site for crowdsourced reviews, cannot challenge a court order to remove a defamatory review of a lawyer by a client. Courts and lawmakers are not about to abolish section 230, says Daphne Keller of the Centre for Internet and Society at Stanford Law School, but it is unlikely to survive for decades."
"Daphne Keller at the Stanford Center for Internet and Society said internet companies doing business in countries with laws restricting speech know they will be expected to comply with the rules. One common means of doing so without deleting lawful speech elsewhere is to offer country-specific versions of services, like YouTube Thailand, said Keller.
"The company can then honor national law on the version of the service that is targeted to, and primarily used in, that country," she said."
"Daphne Keller, an Internet law expert at Stanford Law School and former attorney at Google, said prior court decisions favor Yelp and she would be surprised if the California Supreme Court didn't reverse the ruling.
"It should be a no-brainer for Yelp to win," she said."
"'The place we all go to exercise our freedom of expression and to share opinions is a private platform run by a private company, and they don't let us say every single thing that's legal,' says Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society and a former head lawyer for Google's web search team. 'They only let us say the things that their policies permit. There's good business reasons for that for them, but it's a strange impact for us as a society sharing speech.'"
When you give sites and services information about yourself, where does it go? Who else will get hold of it, and what will they use it for? The recent revelations about Cambridge Analytica's acquisition of data about tens of millions of Facebook users without their knowledge or consent have prompted renewed interest in how data about us gets shared, sold, used, and misused – well beyond what we ever expected. Join us for a SLATA/CIS lunchtime conversation with three experts from Stanford’s Center for Internet and Society as we discuss the legal and policy implications of the Cambridge Analytica scandal and responses from Congress and courts. How can we prevent this from happening again? What new problems might we create through poorly-crafted legal responses?
Vinton G. Cerf is one of the founding fathers of the internet, and on Wednesday, February 28th, he will be on Canada 2020’s stage for an exclusive event.
Tickets are free and open to the public, but available in limited quantities. Click below to secure yours.
Best known as the co-designer of the TCP/IP protocols and the architecture of the modern Internet, Vint will join us in Ottawa to talk about online citizenship, the right to be forgotten, and the state of the modern internet.
Twenty years ago, the US Supreme Court’s decision in Reno v. ACLU established the framework for internet free speech and liability that remains in place today. This conference will consider the continuing viability of the Reno vision in the face of multiplying concerns about sex trafficking online, terrorist content, election interference, and other forms of contested content.
The Center for Internet and Society (CIS) is a public interest technology law and policy program at Stanford Law School and part of the Law, Science and Technology Program. CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry.
Over two years have passed since the Court of Justice of the European Union ruled, in the Google Spain case, that the search engine must “de-list” certain search results on request in order to honor the requesters’ data protection rights.
"Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy."
"'Half the time it's, "Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,"' says Daphne Keller, a law professor at Stanford University. 'And the other half of the time is, "Oh no! Facebook took something down and we wish they hadn't."'"
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.