Daphne Keller is the Director of Intermediary Liability at Stanford’s Center for Internet and Society. Her work focuses on platform regulation and Internet users’ rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work examines legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking intermediary liability litigation and legislation around the world, and counseled on both overall product development and individual content takedown decisions.
Europe’s new General Data Protection Regulation (GDPR) enters into force today, after two years of preparation. Meanwhile, in the US, a remarkable number of people are suggesting we should adopt something like the GDPR. What does that actually mean, and what policy trade-offs does it entail?
Canada’s Office of the Privacy Commissioner has concluded that an existing law, the Personal Information Protection and Electronic Documents Act (PIPEDA), gives individuals legal power to make individual websites take down information. This goes well beyond the rights recognized by the European Court of Justice in its “right to be forgotten” case, and it raises important questions.
Should Canada adopt its own version of the “right to be forgotten”? The Office of the Privacy Commissioner of Canada (OPC) recently concluded, in a Draft Position Paper, that such a right actually exists already. According to the OPC, Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) gives individuals legal power to make search engines like Google de-list search results about them, and to make individual websites take down information. In a Comment filed last week, I argued that this interpretation of PIPEDA will create far more problems than it solves.
Attached to this post are PowerPoint slides introducing intermediary liability basics. This particular deck comes from a great CIDE program in Mexico City. It is descended from others I’ve used over the years teaching at Stanford and Berkeley, presenting at conferences, and training junior lawyers at Google. Ancestral decks that evolved into this one go back to at least 2012. (Which might explain why I struggle with fonts whenever I update them.)
This piece is excerpted from the Law, Borders, and Speech Conference Proceedings Volume, where it appears as an appendix. The terminology it explains is relevant for intermediary liability and content regulation issues generally, not only issues that arise in the jurisdiction or conflict-of-laws context. The full conference Proceedings Volume contains other relevant resources, and is Creative Commons licensed.
This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.
On Tuesday, in a courtroom in Luxembourg, the Court of Justice of the European Union is to consider whether Google must enforce the “right to be forgotten” — which requires search engines to erase search results based on European law — everywhere in the world.
Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.
If you paid attention to Mark Zuckerberg’s testimony before Congress last month, you might have gotten the impression that the internet consists entirely of titanic, California-based companies like Twitter, Facebook and Google. Congress is right to call these companies to account for outsize harms like disclosing personal data about many millions of users. But it is very wrong to act as though these companies are representative of the whole internet.
Daphne Keller, Director of Intermediary Liability at Stanford’s Center for Internet and Society, told Quartz that Facebook’s turnaround time was actually quite fast. Keller worked for years as an attorney at Google, and said that having been “on the other side,” she witnessed the massive volume of user reports these companies receive, and how many of those flags are simply wrong or not actionable. “I don’t think it’s realistic to do anything better.”
""I can't imagine Facebook knowing about [illegal content] and not taking it down," said Daphne Keller, the Director of Intermediary Liability at the Stanford Center for Internet and Society. More likely than not, they probably aren't aware of these videos unless someone flags them, she said."
"In May a court allowed a lawsuit to proceed against Model Mayhem, a network that connects models and photographers, for having failed to warn users that rapists have used the site to target victims. In June a judge decided that Yelp, a site for crowdsourced reviews, cannot challenge a court order to remove a defamatory review of a lawyer by a client. Courts and lawmakers are not about to abolish section 230, says Daphne Keller of the Centre for Internet and Society at Stanford Law School, but it is unlikely to survive for decades."
"Daphne Keller at the Stanford Center for Internet and Society said internet companies doing business in countries with laws restricting speech know they will be expected to comply with the rules. One common means of doing so without deleting lawful speech elsewhere is to offer country-specific versions of services, like YouTube Thailand, said Keller.
"The company can then honor national law on the version of the service that is targeted to, and primarily used in, that country," she said."
"Daphne Keller, an Internet law expert at Stanford Law School and former attorney at Google, said prior court decisions favor Yelp and she would be surprised if the California Supreme Court didn't reverse the ruling.
"It should be a no-brainer for Yelp to win," she said."
After a lengthy legislative process, the GDPR is finally ready. As the most significant overhaul of data privacy laws in Europe in twenty years, it will have a profound impact on Silicon Valley technology companies offering online services in Europe. The recently announced Privacy Shield will affect most US organizations that receive personal information from Europe.
Over 800 attendees registered for the State of the Net Conference (SOTN) in 2015. The conference provides unparalleled opportunities to network and engage on key Internet policy issues. SOTN is the largest Internet policy conference in the U.S., and the only one where over 50 percent of attendees are Congressional staff and government policymakers.
Privacy and free speech aren’t fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we’re joined by Daphne Keller of Stanford’s Center for Internet and Society to discuss the collision between these two important principles.