Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published in both academic and popular press, testified and participated in legislative processes, and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking Intermediary Liability litigation and legislation around the world, and counseled on both overall product development and individual content takedown decisions.
Most people I talk to think that Facebook, Twitter, and other social media companies should take down ugly-but-legal user speech. Platforms are generally applauded for taking down racist posts from the White Nationalist demonstrators in Charlottesville, for example. I see plenty of disagreement about exactly what user-generated content should come down -- breastfeeding images? Passages from Lolita? Passages from Mein Kampf? But few really oppose the basic predicate of these removals: that private companies can and should be arbiters of permissible speech on their platforms.*
Alarm bells are sounding around the Internet about proposed changes to one of the US’s core Intermediary Liability laws, Communications Decency Act Section 230 (CDA 230). CDA 230 broadly immunizes Internet platforms against legal claims based on speech posted by their users. It has been credited as a key protection for both online expression and Internet innovation in the US. CDA 230 immunities have limits, though. Platforms are not protected from intellectual property claims (mostly handled under the DMCA) or federal criminal claims.
In its Equustek ruling in June, the Canadian Supreme Court held that Google must delete search results for users everywhere in the world, based on Canadian law. Google has now filed suit in the US, asking the court to confirm that the order can’t be enforced here. Here’s my take on that claim.
The Canadian Supreme Court this morning issued its long-awaited ruling in Equustek. The court upheld an order compelling Google to remove search results for specified websites, not just in Canada, but everywhere in the world.
These comments address the issue of transparency under the GDPR, as that topic arises in the context of Internet intermediaries and the “Right to Be Forgotten.” CIS Intermediary Liability Director Daphne Keller filed them in response to a public call for comments from the Article 29 Working Party – the EU-wide umbrella group of data protection regulators established under the 1995 Directive, soon to be succeeded by the European Data Protection Board established under the GDPR.
This Stanford Center for Internet and Society White Paper uses proposed US legislation, SESTA, as a starting point for an overview of Intermediary Liability models -- and their consequences. It draws on law and experience from both the US and countries that have adopted different models, and recommends specific improvements for SESTA and similar proposed legislation.
“The place we all go to exercise our freedom of expression and to share opinions is a private platform run by a private company, and they don’t let us say every single thing that’s legal,” says Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society and a former head lawyer for Google’s web search team. “They only let us say the things that their policies permit. There’s good business reasons for that for them, but it’s a strange impact for us as a society sharing speech.”
And its odds of winning are high, said Daphne Keller, director of intermediary liability at Stanford University’s Center for Internet and Society, who said many companies have successfully used the CDA as a defense.
When platforms are made responsible for determining what speech is illegal, those intermediaries tend to over-remove content, out of an abundance of caution, Daphne Keller, the director of intermediary liability at the Stanford Center for Internet and Society, and a former associate general counsel at Google, told BuzzFeed News. “They take down perfectly legal content out of concern that otherwise they themselves could get in trouble,” Keller said.
"Other countries will look at this and say, 'This looks like a good idea, let's see what leverage I have to get similar agreements,'" said Daphne Keller, former associate general counsel at Google and director of intermediary liability at the Stanford Center for Internet and Society.
"Anybody with an interest in getting certain types of content removed is going to find this interesting."
Daphne Keller, the director of intermediary liability at the Stanford Center for Internet and Society, recognises that the current systems in place for flagged content are slow, and says it would be “sensible” for companies to prioritise live video over older content to some degree.
After a lengthy legislative process, the GDPR is finally ready. As the most significant overhaul of data privacy laws in Europe in twenty years, it will have a profound impact on Silicon Valley technology companies offering online services in Europe. The recently announced Privacy Shield will affect most US organisations that receive personal information from Europe.
Over 800 attendees registered at the State of the Net Conference (SOTN) in 2015. The conference provides unparalleled opportunities to network and engage on key Internet policy issues. SOTN is the largest Internet policy conference in the U.S. and the only one with over 50 percent Congressional staff and government policymakers in attendance.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.