Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world, and counseled on both overall product development and individual content takedown decisions.
This is one of a series of posts about the pending EU General Data Protection Regulation (GDPR), and its consequences for intermediaries and user speech online. In an earlier introduction and FAQ, I discuss the GDPR’s impact on both data protection law and Internet intermediary liability law.
Most intermediaries operate "Notice and Takedown" systems – tools for people to alert the company when user-generated content violates the law, and for the company to remove that content if necessary.
[Stanford's Daphne Keller is a preeminent cyberlawyer and one of the world's leading experts on "intermediary liability" -- that is, when an online service should be held responsible for the actions of its users. She brings us a delightful tale of Facebook's inability to moderate content at scale, which is as much a tale of the impossibility (and foolishness) of trying to support 2.3 billion users (who will generate 2,300 one-in-a-million edge cases every day) as it is about a specific failure.]
This past week, with some fanfare, Facebook announced its own version of the Supreme Court: a 40-member board that will make final decisions about user posts that Facebook has taken down. The announcement came after extended deliberations that have been described as Facebook’s “constitutional convention.”
The Program on Extremism Policy Paper series combines analysis on extremism-related issues by our researchers and guest contributors with tailored recommendations for policymakers.
Full paper available for download here.
"Policy experts also question how the bill would actually work. Daphne Keller of the Stanford Center for Internet and Society pointed to the challenges of determining whether an ad buyer is a foreign entity, particularly if buyers rely on outside vendors to purchase ads.
“Nobody knows how to figure out who counts as Russian,” she said. “It seems extremely easy to hide your identity.”"
"Daphne Keller of the Stanford Center for Internet and Society says that the new law could push some platforms and publishers to crack down on a wide variety of speech, to avoid the threat of lawsuits. It would give them “a reason to err on the side of removing internet users’ speech in response to any controversy,” she says, “and in response to false or mistaken allegations, which are often levied against online speech.”"
"“When platforms don’t know what to do, the legally over-cautious response is to go way overboard on taking things down just in case they’re illegal,” Daphne Keller, Director of Intermediary Liability at Stanford University’s Center for Internet and Society, told BuzzFeed News. “My worst case scenario legislation would be some vague obligation for platforms to make sure that users don’t do bad things.”"
"“Historically, the place you went to exercise your speech rights was the public square. Now the equivalent is Twitter and YouTube and Facebook,” said Daphne Keller of the Stanford Center for Internet and Society. “As a practical matter, how much you can speak is not in the hands of the constitution but in the hands of these private companies.”"
"“Many people suing for harassment have tried to find exemptions under the CDA,” said Daphne Keller, director of intermediary liability at Stanford University’s Center for Internet and Society, making the point that the platforms usually win."
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
The Center for Internet and Society (CIS) is a public interest technology law and policy program at Stanford Law School and part of the school's Law, Science and Technology Program.
When you give sites and services information about yourself, where does it go? Who else will get hold of it, and what will they use it for? The recent revelations about Cambridge Analytica's acquisition of data about tens of millions of Facebook users without their knowledge or consent have prompted renewed interest in how data about us gets shared, sold, used, and misused -- well beyond what we ever expected. Join us for a SLATA/CIS lunchtime conversation with three experts from Stanford’s Center for Internet and Society as we discuss the legal and policy implications of the Cambridge Analytica scandal and responses from Congress and courts. How can we prevent this from happening again? What new problems might we create through poorly-crafted legal responses?
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.