Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published in both academic and popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking Intermediary Liability litigation and legislation around the world, and advised on both overall product development and individual content takedown decisions.
The EU’s proposed Terrorist Content Regulation gives national authorities sweeping new powers over comments, videos, and other content that people share using Internet platforms. Among other things, authorities – who may be police, not courts – can require platforms of all sizes to take content down within one hour. The Regulation also requires even small platforms to build upload filters and attempt to proactively weed out prohibited material.
I have a new article coming out, called Who Do You Sue? State and Platform Hybrid Power over Online Speech. It is about free expression rights on platforms like Facebook or Twitter, which the Supreme Court has called “the modern public square.” One section is about speakers suing platforms. It looks at cases – over thirty so far – where users argue that companies like Facebook or Twitter have violated their free expression rights by taking down legal speech that is prohibited under the platforms’ Community Guidelines.
Two important current trends in Internet law go together in ways that aren’t getting enough attention. They should, though, because the overlap is well on its way to messing up the Internet further.
Are Internet platforms distorting our political discourse by silencing conservatives? If they were, could Congress pass a law forcing them to play fair?
Public demands for internet platforms to intervene more aggressively in online content are steadily mounting. Calls for companies like YouTube and Facebook to fight problems ranging from “fake news” to virulent misogyny to online radicalization seem to make daily headlines. British Prime Minister Theresa May echoed the politically prevailing sentiment in Europe when she urged platforms to “go further and faster” in removing prohibited content, including through the use of automated filters.
These comments address the issue of transparency under the GDPR, as that topic arises in the context of Internet intermediaries and the “Right to Be Forgotten.” CIS Intermediary Liability Director Daphne Keller filed them in response to a public call for comments from the Article 29 Working Party – the EU-wide umbrella group of data protection regulators established under the 1995 Directive, soon to be succeeded by the European Data Protection Board established under the GDPR.
This Stanford Center for Internet and Society White Paper uses proposed US legislation, SESTA, as a starting point for an overview of Intermediary Liability models – and their consequences. It draws on law and experience from both the US and countries that have adopted different models, and recommends specific improvements for SESTA and similar proposed legislation.
"Any effort to regulate social media companies and search engines would run up against a bevy of constitutional free speech questions. Legally, Trump doesn’t have any authority to change how Google displays search results, said Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society.
“By dictating what a private company does with search results, it legally would be like dictating what The Chronicle does with news reporting,” Keller said. “It would be a First Amendment violation.”"
"“There should be a whole gradation of how this [software] should work,” Daphne Keller, the director of the Stanford Center for Internet and Society (and mother of two), told Quartz. “We should be able to choose something in between, that is a good balance [between safety and surveillance], rather than forcing kids to divulge all their data without any control.”"
"Daphne Keller, of the Stanford Center for Internet and Society, said Section 230 was designed to allow platforms like Facebook to do some moderation and make editorial decisions without generally being liable for users’ posts: “They need to be able to make discretionary choices about content.”
The law seemed to be on Facebook’s side, she said, but added that it was an unusual case given the focus on app data access while previous cases have centered on more straightforward censorship claims."
"However, a video for example showing Isis recruitment can violate the law in one context, but also be legal and important for purposes such as documenting crimes for future prosecution, says Daphne Keller, intermediary liability director at Stanford's Centre for Internet and Society.
“The more we push companies to carry out fast, sloppy content removals, the more mistakes we will see,” Keller says. She thinks lawmakers should “slow down, talk to experts including both security researchers and members of the affected communities, and build on that foundation”."
"Stanford's Daphne Keller is one of the world's foremost experts on intermediary liability protections and someone we've mentioned on the website many times in the past (and have had her on the podcast a few times as well). She's just published a fantastic paper presenting lessons from making internet platforms liable for the speech of its users. As she makes clear, she is not arguing that platforms should do no moderation at all.
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
The Center for Internet and Society (CIS) is a public interest technology law and policy program at Stanford Law School and a part of the school's Law, Science and Technology Program.
When you give sites and services information about yourself, where does it go? Who else will get hold of it, and what will they use it for? The recent revelations about Cambridge Analytica's acquisition of data about tens of millions of Facebook users without their knowledge or consent have prompted renewed interest in how data about us gets shared, sold, used, and misused – well beyond what we ever expected. Join us for a SLATA/CIS lunchtime conversation with three experts from Stanford’s Center for Internet and Society as we discuss the legal and policy implications of the Cambridge Analytica scandal and responses from Congress and courts. How can we prevent this from happening again? What new problems might we create through poorly crafted legal responses?
The question of what responsibility Internet platforms should bear for content posted by their users has been the subject of debate around the world, as politicians, regulators, and the broader public navigate policy choices to combat harmful speech, choices with implications for freedom of expression, online harms, competition, and innovation.
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states, terrorists, and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?
"Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy."
""Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.