Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press, testified and participated in legislative processes, and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
A big new law is coming, and a lot of companies doing business online aren’t going to like it. Neither will many advocates of civil liberties for Internet users. Europe’s pending General Data Protection Regulation (GDPR) updates and overhauls EU data protection law – the law that produced this week’s Schrems case and last year’s “Right to Be Forgotten” ruling in the EU.
Today the French Data Protection regulator, CNIL, reaffirmed its position that Google must apply European “Right to Be Forgotten” (RTBF) law globally, by removing content from its services in all countries. Europe’s RTBF laws are rooted in citizens' rights to data protection and privacy. They are inconsistent with U.S. and other countries’ free expression laws, because they require suppression of information even if that information is true and not causing harm.
Policymakers around the world are showing renewed interest in the rules that govern Internet information flow across national borders.
European courts are beginning to sort through one of the most important follow-up questions to last spring’s “Right To Be Forgotten” ruling in Google v. Costeja: what does the case mean for hosting services? The answer matters for the Twitters, Facebooks and YouTubes of the world – not to mention European hosting services like DailyMotion, local political discussion forums, and blogs or newspapers with user comment sections.
Included in this PDF are:
- Notice of Motion and Motion for Leave to File Amicus Curiae Brief
- Amicus Curiae Brief of Electronic Frontier Foundation, Center for Democracy and Technology, Daphne Keller, Eric Goldman and Eugene Volokh in Support of Plaintiffs' Motion for Preliminary Injunction
In a concession to regulators, Google is . . . using “geo-blocking” technology to control what European users can see. Under the new system, Google will not only remove links on, say, google.fr, but it will block users in France from seeing those links on any other Google country site, or google.com itself. Unless they use tools like virtual private networks to disguise their locations, users in those countries will see pruned search results.
These comments were prepared and submitted in response to the U.S. Copyright Office's December 31, 2015 Notice and Request for Public Comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
"Daphne Keller at the Stanford Center for Internet and Society said internet companies doing business in countries with laws restricting speech know they will be expected to comply with the rules. One common means of doing so without deleting lawful speech elsewhere is to offer country-specific versions of services, like YouTube Thailand, said Keller.
"The company can then honor national law on the version of the service that is targeted to, and primarily used in, that country," she said."
"Daphne Keller, an Internet law expert at Stanford Law School and former attorney at Google, said prior court decisions favor Yelp and she would be surprised if the California Supreme Court didn't reverse the ruling.
"It should be a no-brainer for Yelp to win," she said."
"“The place we all go to exercise our freedom of expression and to share opinions is a private platform run by a private company, and they don’t let us say every single thing that’s legal,” says Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society and a former head lawyer for Google’s web search team. “They only let us say the things that their policies permit. There’s good business reasons for that for them, but it’s a strange impact for us as a society sharing speech.”"
"And its odds of winning are high, said Daphne Keller, director of intermediary liability at Stanford University’s Center for Internet and Society, who said many companies have successfully used the CDA as a defense."
"When platforms are made responsible for determining what speech is illegal, those intermediaries tend to over-remove content, out of an abundance of caution, Daphne Keller, the director of intermediary liability at the Stanford Center for Internet and Society, and a former associate general counsel at Google, told BuzzFeed News. “They take down perfectly legal content out of concern that otherwise they themselves could get in trouble,” Keller said."
Over 800 attendees registered at the State of the Net Conference (SOTN) in 2015. The conference provides unparalleled opportunities to network and engage on key Internet policy issues. SOTN is the largest Internet policy conference in the U.S. and the only one with over 50 percent Congressional staff and government policymakers in attendance.
Stanford CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry.
In this episode of the Arbiters of Truth series—Lawfare's new podcast series on disinformation in the run-up to the 2020 election—Quinta Jurecic and Evelyn Douek spoke with Daphne Keller, the director of intermediary liability at Stanford's Center for Internet and Society, about the nuts and bolts of content moderation. People often have big ideas for how tech platforms should decide what content to take down and what to keep up, but what kind of moderation is actually possible at scale?
In this episode, Daphne Keller, Director of Intermediary Liability at the Center for Internet and Society at Stanford Law School and former Associate General Counsel for Google, discusses her essay "Who Do You Sue?: State and Platform Hybrid Power Over Online Speech," which is published by the Hoover Institution.
On this segment of “Quality Assurance,” I take a deep dive on platforms and regulating speech. I spoke with Daphne Keller, who is at Stanford Law School’s Center for Internet and Society. The following is an edited transcript of our conversation.
The question of what responsibility Internet platforms should bear for the content their users post has been the subject of debate around the world, as politicians, regulators, and the broader public navigate policy choices that aim to combat harmful speech but carry implications for freedom of expression, online harms, competition, and innovation.
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?