Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world, and counseled on both overall product development and individual content takedown decisions.
The EU’s proposed Terrorist Content Regulation gives national authorities sweeping new powers over comments, videos, and other content that people share using Internet platforms. Among other things, authorities – who may be police, not courts – can require platforms of all sizes to take content down within one hour. The Regulation also requires even small platforms to build upload filters and attempt to proactively weed out prohibited material.
I have a new article coming out, called Who Do You Sue? State and Platform Hybrid Power over Online Speech. It is about free expression rights on platforms like Facebook or Twitter, which the Supreme Court has called “the modern public square.” One section is about speakers suing platforms. It looks at cases – over thirty so far – where users argue that companies like Facebook or Twitter have violated their free expression rights by taking down legal speech that is prohibited under the platforms’ Community Guidelines.
Two important current trends in Internet law go together in ways that aren’t getting enough attention. They should, though, because the overlap is well on its way to messing up the Internet further.
Are Internet platforms distorting our political discourse by silencing conservatives? If they were, could Congress pass a law forcing them to play fair?
Public demands for internet platforms to intervene more aggressively in online content are steadily mounting. Calls for companies like YouTube and Facebook to fight problems ranging from “fake news” to virulent misogyny to online radicalization seem to make daily headlines. British prime minister Theresa May echoed the politically prevailing sentiment in Europe when she urged platforms to “go further and faster” in removing prohibited content, including through use of automated filters.
Included in this PDF are:
- Notice of Motion and Motion for Leave to File Amicus Curiae Brief
- Amicus Curiae Brief of Electronic Frontier Foundation, Center for Democracy and Technology, Daphne Keller, Eric Goldman and Eugene Volokh in Support of Plaintiffs' Motion for Preliminary Injunction
In a concession to regulators, Google is . . . using “geo-blocking” technology to control what European users can see. Under the new system, Google will not only remove links on, say, google.fr, but it will block users in France from seeing those links on any other Google country site, or google.com itself. Unless they use tools like virtual private networks to disguise their locations, users in those countries will see pruned search results.
These comments were prepared and submitted in response to the U.S. Copyright Office's December 31, 2015 Notice and Request for Public Comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
“I don’t think there’s a chance that major economies like the E.U. are going to accept C.D.A. 230,” said Daphne Keller, the director of intermediary liability at Stanford Law School’s Center for Internet and Society. “So I’m not sure what the net effect is.”
“There has been real mission creep with the right to be forgotten,” said Daphne Keller, a lawyer at Stanford University’s Center for Internet and Society. “First it was supposed to be about information found using search engines, but now we see it affecting news reporting.”
Hate speech that isn’t an imminent threat is still protected by the Constitution, noted Daphne Keller, a researcher at Stanford’s Center for Internet and Society and a former associate general counsel for Google. “A law can’t just ban it. And Congress can’t just tell platforms to ban it, either — that use of government power would still violate the First Amendment,” she said.
“It’s really important to understand how much Europe is in the driver’s seat,” says Daphne Keller, director of Intermediary Liability at the Center for Internet and Society, as well as former associate general counsel at Google. “It kind of doesn’t matter what U.S. law says for a lot of things. Europe is extracting agreements by companies — they’re going to enforce those agreements publicly.”
Over 800 attendees registered at the State of the Net Conference (SOTN) in 2015. The conference provides unparalleled opportunities to network and engage on key Internet policy issues. SOTN is the largest Internet policy conference in the U.S. and the only one with over 50 percent Congressional staff and government policymakers in attendance.
Stanford CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry.
The question of what responsibility Internet platforms should bear for content posted by their users has been the subject of debate around the world, as politicians, regulators, and the broader public navigate policy choices to combat harmful speech that carry implications for freedom of expression, online harms, competition, and innovation.
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states, terrorists, and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?
Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy.
"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,'" says Daphne Keller, a law professor at Stanford University. "And the other half of the time it's, 'Oh no! Facebook took something down and we wish they hadn't.'"
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.