Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled both overall product development and individual content takedown decisions.
Most people I talk to think that Facebook, Twitter, and other social media companies should take down ugly-but-legal user speech. Platforms are generally applauded for taking down racist posts from the White Nationalist demonstrators in Charlottesville, for example. I see plenty of disagreement about exactly what user-generated content should come down: breastfeeding images? Passages from Lolita? Passages from Mein Kampf? But few really oppose the basic predicate of these removals: that private companies can and should be arbiters of permissible speech on their platforms.*
Alarm bells are sounding around the Internet about proposed changes to one of the US's core intermediary liability laws, Communications Decency Act Section 230 (CDA 230). CDA 230 broadly immunizes Internet platforms against legal claims based on speech posted by their users. It has been credited as a key protection for both online expression and Internet innovation in the US. CDA 230 immunities have limits, though. Platforms are not protected from intellectual property claims (mostly handled under the DMCA) or federal criminal claims.
In its Equustek ruling in June, the Canadian Supreme Court held that Google must delete search results for users everywhere in the world, based on Canadian law. Google has now filed suit in the US, asking the court to confirm that the order can’t be enforced here. Here’s my take on that claim.
The Canadian Supreme Court this morning issued its long-awaited ruling in Equustek. The court upheld an order compelling Google to remove search results for specified websites, not just in Canada, but everywhere in the world.
Most observers cheered when the neo-Nazi Daily Stormer was booted from YouTube, CloudFlare, and other platforms around the Internet. At the same time, the site’s disappearance stirred anxiety about Internet companies’ power over online speech. It starkly illustrated how online speech can live or die at the discretion of private companies. The modern public square is in private hands.
Prime Minister Theresa May’s political fortunes may be waning in Britain, but her push to make internet companies police their users’ speech is alive and well. In the aftermath of the recent London attacks, Ms. May called platforms like Google and Facebook breeding grounds for terrorism.
These comments were prepared and submitted in response to the U.S. Copyright Office's November 8, 2016 Notice of Inquiry requesting additional public comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
Forthcoming in the Berkeley Technology Law Journal
“I don’t think there’s a chance that major economies like the E.U. are going to accept C.D.A. 230,” said Daphne Keller, the director of intermediary liability at Stanford Law School’s Center for Internet and Society. “So I’m not sure what the net effect is.”
“There has been real mission creep with the right to be forgotten,” said Daphne Keller, a lawyer at Stanford University’s Center for Internet and Society. “First it was supposed to be about information found using search engines, but now we see it affecting news reporting.”
Hate speech that isn’t an imminent threat is still protected by the Constitution, noted Daphne Keller, a researcher at Stanford’s Center for Internet and Society and a former associate general counsel for Google. “A law can’t just ban it. And Congress can’t just tell platforms to ban it, either — that use of government power would still violate the First Amendment,” she said.
“It’s really important to understand how much Europe is in the driver’s seat,” says Daphne Keller, director of Intermediary Liability at the Center for Internet and Society, as well as former associate general counsel at Google. “It kind of doesn’t matter what U.S. law says for a lot of things. Europe is extracting agreements by companies — they're going to enforce those agreements publicly.”
Twenty years ago, the US Supreme Court’s decision in Reno v. ACLU established the framework for internet free speech and liability that remains in place today. This conference will consider the continuing viability of the Reno vision in the face of multiplying concerns about sex trafficking online, terrorist content, election interference, and other forms of contested content.
The Center for Internet and Society (CIS) is a public interest technology law and policy program at Stanford Law School and part of the Law, Science and Technology Program at Stanford Law School. CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry.
Over two years have passed since the Court of Justice of the European Union ruled, in the Google Spain case, that the search engine must “de-list” certain search results on request in order to honor the requesters’ data protection rights.
For many years after the European Data Protection Directive was implemented across Europe in 1998, data privacy was seen as an issue that mainly concerned what companies did with personal data behind the scenes.
Come hear CIS Directors Jennifer Granick + Daphne Keller and Resident Fellows Riana Pfefferkorn + Luiz Fernando Marrey Moncau talk about our work, and the assistance CIS provides to students in learning about these issues, selecting courses, identifying job opportunities, and making professional connections.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.