Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
A big new law is coming, and a lot of companies doing business online aren’t going to like it. Neither will many advocates of civil liberties for Internet users. Europe’s pending General Data Protection Regulation (GDPR) updates and overhauls EU data protection law – the law that produced this week’s Schrems case and last year’s “Right to Be Forgotten” ruling in the EU.
Today the French Data Protection regulator, CNIL, reaffirmed its position that Google must apply European “Right to Be Forgotten” (RTBF) law globally, by removing content from its services in all countries. Europe’s RTBF laws are rooted in citizens' rights to data protection and privacy. They are inconsistent with U.S. and other countries’ free expression laws, because they require suppression of information even if that information is true and not causing harm.
Policymakers around the world are showing renewed interest in the rules that govern Internet information flow across national borders.
European courts are beginning to sort through one of the most important follow-up questions to last spring’s “Right To Be Forgotten” ruling in Google v. Costeja: what does the case mean for hosting services? The answer matters for the Twitters, Facebooks and YouTubes of the world – not to mention European hosting services like DailyMotion, local political discussion forums, and blogs or newspapers with user comment sections.
This Stanford Center for Internet and Society white paper uses proposed U.S. legislation, SESTA, as a starting point for an overview of intermediary liability models – and their consequences. It draws on law and experience from both the U.S. and countries that have adopted different models, and recommends specific improvements for SESTA and similar proposed legislation.
Most observers cheered when the neo-Nazi Daily Stormer was booted from YouTube, CloudFlare, and other platforms around the Internet. At the same time, the site’s disappearance stirred anxiety about Internet companies’ power over online speech. It starkly illustrated how online speech can live or die at the discretion of private companies. The modern public square is in private hands.
Prime Minister Theresa May’s political fortunes may be waning in Britain, but her push to make internet companies police their users’ speech is alive and well. In the aftermath of the recent London attacks, Ms. May called platforms like Google and Facebook breeding grounds for terrorism.
"“Historically, the place you went to exercise your speech rights was the public square. Now the equivalent is Twitter and YouTube and Facebook,” said Daphne Keller of the Stanford Center for Internet and Society. “In a practical matter, how much you can speak is not in the hands of the constitution but in the hands of these private companies.”"
"“Many people suing for harassment have tried to find exemptions under the CDA,” said Daphne Keller, director of intermediary liability at Stanford University’s Center for Internet and Society, making the point that the platforms usually win."
"“This part of the Charlottesville story makes people think about who controls speech on the Internet,” says Daphne Keller of Stanford Law School’s Center for Internet and Society. “We don’t have 1st Amendment rights to stop private companies from shutting down our speech, and most of the Internet is run by private companies. Most of us want some intermediaries to play that role — when we go on Twitter, we don’t want to be barraged with obscenities and on Facebook we don’t want to see racism.
"That doesn’t mean these companies aren’t feeling the pressure from advertisers and users who fear that pages belonging to alt-right publications like the Daily Stormer could incite violence, said Daphne Keller, Director of Intermediary Liability at Stanford Law School’s Center for Internet and Society.
""The number of net intermediaries acting as gatekeepers has increased," since GoDaddy booted Daily Stormer, said Daphne Keller, who studies platforms' legal responsibilities at the Stanford Center for Internet and Society. "Suddenly the domain registrars are sitting in judgment on content and speech," joining the usual players around free speech such as Google, Facebook and Twitter."
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.