Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking Intermediary Liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
The EU’s proposed Terrorist Content Regulation gives national authorities sweeping new powers over comments, videos, and other content that people share using Internet platforms. Among other things, authorities – who may be police, not courts – can require platforms of all sizes to take content down within one hour. The Regulation also requires even small platforms to build upload filters and attempt to proactively weed out prohibited material.
I have a new article coming out, called Who Do You Sue? State and Platform Hybrid Power over Online Speech. It is about free expression rights on platforms like Facebook or Twitter, which the Supreme Court has called “the modern public square.” One section is about speakers suing platforms. It looks at cases – over thirty so far – where users argue that companies like Facebook or Twitter have violated their free expression rights by taking down legal speech that is prohibited under the platforms’ Community Guidelines.
Two important current trends in Internet law go together in ways that aren’t getting enough attention. They should, though, because the overlap is well on its way to messing up the Internet further.
Are Internet platforms distorting our political discourse by silencing conservatives? If they were, could Congress pass a law forcing them to play fair?
Public demands for Internet platforms to intervene more aggressively in online content are steadily mounting. Calls for companies like YouTube and Facebook to fight problems ranging from "fake news" to virulent misogyny to online radicalization seem to make daily headlines. British Prime Minister Theresa May echoed the politically prevailing sentiment in Europe when she urged platforms to "go further and faster" in removing prohibited content, including through the use of automated filters.
This Stanford Center for Internet and Society White Paper uses proposed US legislation, SESTA, as a starting point for an overview of Intermediary Liability models -- and their consequences. It draws on law and experience from both the US and countries that have adopted different models, and recommends specific improvements for SESTA and similar proposed legislation.
Most observers cheered when the neo-Nazi Daily Stormer was booted from YouTube, CloudFlare, and other platforms around the Internet. At the same time, the site’s disappearance stirred anxiety about Internet companies’ power over online speech. It starkly illustrated how online speech can live or die at the discretion of private companies. The modern public square is in private hands.
Prime Minister Theresa May’s political fortunes may be waning in Britain, but her push to make internet companies police their users’ speech is alive and well. In the aftermath of the recent London attacks, Ms. May called platforms like Google and Facebook breeding grounds for terrorism.
Daphne Keller, a former Google lawyer now at Stanford's Center for Internet and Society, agreed that the "knowingly" language is problematic. "It creates this incentive to bury your head in the sand and not try to find bad content," she said.
In a recent paper, Daphne Keller, Director of Intermediary Liability at the Stanford Center for Internet and Society, points out that whether and how content hosts — such as social media companies — must honor RTBF requests under the GDPR is unclear.
Policy experts also question how the bill would actually work. Daphne Keller of the Stanford Center for Internet and Society pointed to the challenges of determining whether an ad buyer is a foreign entity, particularly if buyers rely on outside vendors to purchase ads.
"Nobody knows how to figure out who counts as Russian," she said. "It seems extremely easy to hide your identity."
Daphne Keller of the Stanford Center for Internet and Society says that the new law could push some platforms and publishers to crack down on a wide variety of speech, to avoid the threat of lawsuits. It would give them "a reason to err on the side of removing internet users' speech in response to any controversy," she says, "and in response to false or mistaken allegations, which are often levied against online speech."
"When platforms don't know what to do, the legally over-cautious response is to go way overboard on taking things down just in case they're illegal," Daphne Keller, Director of Intermediary Liability at Stanford University's Center for Internet and Society, told BuzzFeed News. "My worst-case-scenario legislation would be some vague obligation for platforms to make sure that users don't do bad things."
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center for Internet and Society to discuss the collision between these two important principles.