Daphne Keller is the Director of Intermediary Liability at Stanford’s Center for Internet and Society. Her work focuses on platform regulation and Internet users’ rights. She has published in both academic venues and the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking Intermediary Liability litigation and legislation around the world, and advised on both overall product development and individual content takedown decisions.
Most people I talk to think that Facebook, Twitter, and other social media companies should take down ugly-but-legal user speech. Platforms are generally applauded for taking down racist posts from the White Nationalist demonstrators in Charlottesville, for example. I see plenty of disagreement about exactly what user-generated content should come down: breastfeeding images? Passages from Lolita? Passages from Mein Kampf? But few really oppose the basic premise of these removals: that private companies can and should be arbiters of permissible speech on their platforms.
Alarm bells are sounding around the Internet about proposed changes to one of the US’s core Intermediary Liability laws, Communications Decency Act Section 230 (CDA 230). CDA 230 broadly immunizes Internet platforms against legal claims based on speech posted by their users. It has been credited as a key protection for both online expression and Internet innovation in the US. CDA 230 immunities have limits, though: platforms are not protected from intellectual property claims (mostly handled under the DMCA) or from federal criminal law.
In its Equustek ruling in June, the Supreme Court of Canada held that Google must delete search results for users everywhere in the world, based on Canadian law. Google has now filed suit in the US, asking the court to confirm that the Canadian order can’t be enforced here. Here’s my take on that claim.
The Supreme Court of Canada this morning issued its long-awaited ruling in Equustek. The court upheld an order compelling Google to remove search results for specified websites, not just in Canada, but everywhere in the world.
These comments address the issue of transparency under the GDPR, as that topic arises in the context of Internet intermediaries and the “Right to Be Forgotten.” CIS Intermediary Liability Director Daphne Keller filed them in response to a public call for comments from the Article 29 Working Party – the EU-wide umbrella group of data protection regulators established under the 1995 Data Protection Directive, soon to be succeeded by the European Data Protection Board established under the GDPR.
This Stanford Center for Internet and Society White Paper uses proposed US legislation, SESTA, as a starting point for an overview of Intermediary Liability models and their consequences. It draws on law and experience from the US and from countries that have adopted different models, and recommends specific improvements to SESTA and similar proposed legislation.
"It will set governments’ expectations about how they can use their leverage over internet platforms to effectively enforce their own laws globally,” said Daphne Keller, who studies platforms’ legal responsibilities at the Stanford Center for Internet and Society and previously was Google’s associate general counsel."
"“Users are calling on online platforms to provide a moral code,” says Daphne Keller, director of the intermediary liability project at Stanford’s Center for Internet and Society. “But we’ll never agree on what should come down. Whatever the rules, they’ll fail.” Humans and technical filters alike, according to Keller, will continue to make “grievous errors.”"
"Any effort to regulate social media companies and search engines would run up against a bevy of constitutional free speech questions. Legally, Trump doesn’t have any authority to change how Google displays search results, said Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society.
“By dictating what a private company does with search results, it legally would be like dictating what The Chronicle does with news reporting,” Keller said. “It would be a First Amendment violation.”"
"We don’t have nearly enough information to see the big picture and know what speech platforms are taking down. For the most part, we only find out when the speakers themselves learn that their posts or accounts have disappeared and choose to call public attention to it. But the idea that platforms’ rules are biased — and that this undermines democracy — isn’t new, and it isn’t unique to conservatives.
"“There should be a whole gradation of how this [software] should work,” Daphne Keller, the director of the Stanford Center for Internet and Society (and mother of two), told Quartz. “We should be able to choose something in between, that is a good balance [between safety and surveillance], rather than forcing kids to divulge all their data without any control.”"
Presented by Bloomberg, the Electronic Frontier Foundation, and the First Amendment Coalition.
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Privacy and free speech aren’t fundamentally opposed, but they do have a tendency to come into conflict, and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we’re joined by Daphne Keller of Stanford’s Center for Internet and Society to discuss the collision between these two important principles.