Daphne Keller is the Director of Intermediary Liability at Stanford’s Center for Internet and Society. Her work focuses on platform regulation and Internet users’ rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users’ free expression rights when state and private power intersect, particularly through platforms’ enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company’s search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
The EU’s new General Data Protection Regulation (GDPR) will come into effect in the spring of 2018, bringing with it a newly codified version of the “Right to Be Forgotten” (RTBF). Depending on how the new law is interpreted, this right could prove broader than the “right to be de-listed” established in 2014’s Google Spain case. It could put even more decisions about the balance between privacy and free expression in the hands of private Internet platforms like Google.
The Internet is full of trolls. So it’s no surprise that notice and takedown systems for online speech attract their fair share of them – people insisting that criticism of their scientific research, videos of police brutality, and other legitimate online speech should be removed from Internet platforms.
The French DPA's claim that Google should de-list search results globally to comply with "Right to Be Forgotten" laws is inconsistent with the CJEU's ruling in the Google Spain case.
This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.
On Tuesday, in a courtroom in Luxembourg, the Court of Justice of the European Union will consider whether Google must enforce the “right to be forgotten,” which requires search engines to erase certain search results under European law, everywhere in the world.
Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.
If you paid attention to Mark Zuckerberg’s testimony before Congress last month, you might have gotten the impression that the internet consists entirely of titanic, California-based companies like Twitter, Facebook and Google. Congress is right to call these companies to account for outsize harms like disclosing personal data about many millions of users. But it is very wrong to act as though these companies are representative of the whole internet.
"Any effort to regulate social media companies and search engines would run up against a bevy of constitutional free speech questions. Legally, Trump doesn’t have any authority to change how Google displays search results, said Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society.
“By dictating what a private company does with search results, it legally would be like dictating what The Chronicle does with news reporting,” Keller said. “It would be a First Amendment violation.”"
"“There should be a whole gradation of how this [software] should work,” Daphne Keller, the director of the Stanford Center for Internet and Society (and mother of two), told Quartz. “We should be able to choose something in between, that is a good balance [between safety and surveillance], rather than forcing kids to divulge all their data without any control.”"
"Daphne Keller, of the Stanford Center for Internet and Society, said Section 230 was designed to allow platforms like Facebook to do some moderation and make editorial decisions without generally being liable for users’ posts: “They need to be able to make discretionary choices about content.”
The law seemed to be on Facebook’s side, she said, but added that it was an unusual case given the focus on app data access while previous cases have centered on more straightforward censorship claims."
"However, a video for example showing Isis recruitment can violate the law in one context, but also be legal and important for purposes such as documenting crimes for future prosecution, says Daphne Keller, intermediary liability director at Stanford's Centre for Internet and Society.
“The more we push companies to carry out fast, sloppy content removals, the more mistakes we will see,” Keller says. She thinks lawmakers should “slow down, talk to experts including both security researchers and members of the affected communities, and build on that foundation”."
"Stanford's Daphne Keller is one of the world's foremost experts on intermediary liability protections and someone we've mentioned on the website many times in the past (and have had her on the podcast a few times as well). She's just published a fantastic paper presenting lessons from making internet platforms liable for the speech of its users. As she makes clear, she is not arguing that platforms should do no moderation at all.
Presented by Bloomberg, the Electronic Frontier Foundation and the First Amendment Coalition.
Lunch: 1:00 pm
Program: 1:30 pm – 3:00 pm
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center for Internet and Society to discuss the collision between these two important principles.