Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
I have a new article coming out, called Who Do You Sue? State and Platform Hybrid Power over Online Speech. It is about free expression rights on platforms like Facebook or Twitter, which the Supreme Court has called “the modern public square.” One section is about speakers suing platforms. It looks at cases – over thirty so far – where users argue that companies like Facebook or Twitter have violated their free expression rights by taking down legal speech that is prohibited under the platforms’ Community Guidelines.
Two important current trends in Internet law go together in ways that aren’t getting enough attention. They should, though, because the overlap is well on its way to messing up the Internet further.
Are Internet platforms distorting our political discourse by silencing conservatives? If they were, could Congress pass a law forcing them to play fair?
Public demands for internet platforms to intervene more aggressively in online content are steadily mounting. Calls for companies like YouTube and Facebook to fight problems ranging from “fake news” to virulent misogyny to online radicalization seem to make daily headlines. British prime minister Theresa May echoed the politically prevailing sentiment in Europe when she urged platforms to “go further and faster” in removing prohibited content, including through use of automated filters.
Europe's new General Data Protection Regulation (GDPR) goes into force today, after two years of preparation. Meanwhile, in the US, a remarkable number of people are suggesting we should adopt something like the GDPR. What does that actually mean, and what policy trade-offs does it entail?
This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.
On Tuesday, in a courtroom in Luxembourg, the Court of Justice of the European Union is to consider whether Google must enforce the “right to be forgotten” — which requires search engines to erase search results based on European law — everywhere in the world.
Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.
If you paid attention to Mark Zuckerberg’s testimony before Congress last month, you might have gotten the impression that the internet consists entirely of titanic, California-based companies like Twitter, Facebook and Google. Congress is right to call these companies to account for outsize harms like disclosing personal data about many millions of users. But it is very wrong to act as though these companies are representative of the whole internet.
“It’s really important to understand how much Europe is in the driver’s seat,” says Daphne Keller, director of Intermediary Liability at the Center for Internet and Society, as well as former associate general counsel at Google. “It kind of doesn’t matter what U.S. law says for a lot of things. Europe is extracting agreements by companies — they're going to enforce those agreements publicly.”
“When lawmakers create new rules that have never been tested by courts – like Australia's new law or the rules proposed in the UK's White Paper – and then tell platforms to enforce them, we can only expect that a broad swathe of perfectly legal speech is going to disappear,” said Daphne Keller, director of intermediary liability at the Stanford Centre for Internet and Society.
The issue highlights the pressure on many internet platforms to attract customers by presenting a critical mass of listings to demonstrate scale, says Daphne Keller, director of intermediary liability at Stanford Law School’s Center for Internet and Society. She added that inactive or false listings don’t produce a good customer experience either. “You don’t want to have a bunch of listings in there that turn out to be dead ends,” Ms. Keller said. A Care.com spokeswoman declined to comment on Ms. Keller’s assessment.
“Its role in enabling a certain kind of technical innovation is unambiguous,” says Daphne Keller at Stanford Law School’s Center for Internet and Society. “It made it possible for investors to get behind companies who were in the business of transmitting so much speech and information that they couldn't possibly assess it all and figure what was legal or illegal.”
The Center for Internet and Society (CIS) is a public interest technology law and policy program at Stanford Law School and a part of the Law, Science and Technology Program at Stanford Law School.
When you give sites and services information about yourself, where does it go? Who else will get hold of it, and what will they use it for? The recent revelations about Cambridge Analytica's acquisition of data about tens of millions of Facebook users without their knowledge or consent have prompted renewed interest in how data about us gets shared, sold, used, and misused -- well beyond what we ever expected. Join us for a SLATA/CIS lunchtime conversation with three experts from Stanford’s Center for Internet and Society as we discuss the legal and policy implications of the Cambridge Analytica scandal and responses from Congress and courts. How can we prevent this from happening again? What new problems might we create through poorly-crafted legal responses?
Vinton G. Cerf is one of the founding fathers of the internet, and on Wednesday, February 28th, he will be on Canada 2020’s stage for an exclusive event.
Tickets are free and open to the public, but available in limited quantities. Click below to secure yours.
Best known as the co-designer of the TCP/IP protocols and the architecture of the modern Internet, Vint will join us in Ottawa to talk about online citizenship, the right to be forgotten, and the state of the modern internet.
Twenty years ago, the US Supreme Court’s decision in Reno v. ACLU established the framework for internet free speech and liability that remains in place today. This conference will consider the continuing viability of the Reno vision in the face of multiplying concerns about sex trafficking online, terrorist content, election interference, and other forms of contested content.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.