Daphne Keller studies the ways that Internet content platforms – and the laws governing them – shape information access and other rights of ordinary Internet users. As the Director of Intermediary Liability at the Stanford Center for Internet and Society, she has written and spoken widely about the Right to Be Forgotten, copyright notice-and-takedown systems, cross-border content removal orders, platforms’ own discretionary content-removal decisions, and more. She has testified on these topics before legislatures, courts, and regulatory bodies around the world. In her previous role as Associate General Counsel at Google, Daphne worked on cases including Viacom, Perfect 10, Equustek, Mosley, and Metropolitan Schools, and was the primary counsel for products ranging from Web Search to the Chrome browser. Daphne has taught Internet law at Stanford, Berkeley, and Duke law schools. She is a graduate of Yale Law School and Brown University, and mother to some awesome kids in San Francisco.
This is one of a series of posts about the pending EU General Data Protection Regulation (GDPR), and its consequences for intermediaries and user speech online.
Most intermediaries offer legal “Notice and Takedown” systems – tools for people to alert the company if user-generated content violates the law, and for the company to remove that content if necessary.
A big new law is coming, and a lot of companies doing business online aren’t going to like it. Neither will many advocates of civil liberties for Internet users. Europe’s pending General Data Protection Regulation (GDPR) updates and overhauls EU data protection law – the law that produced this week’s Schrems case and last year’s “Right to Be Forgotten” ruling in the EU.
Today the French Data Protection regulator, CNIL, reaffirmed its position that Google must apply European “Right to Be Forgotten” (RTBF) law globally, by removing content from its services in all countries. Europe’s RTBF laws are rooted in citizens' rights to data protection and privacy. They are inconsistent with U.S. and other countries’ free expression laws, because they require suppression of information even if that information is true and not causing harm.
Policymakers around the world are showing renewed interest in the rules that govern Internet information flow across national borders.
SESTA, the Stop Enabling Sex Traffickers Act, would overhaul US intermediary liability law and potentially expose hundreds of thousands of US platforms to new civil and criminal claims. Its exact legal consequences are uncertain, because the bill is so badly drafted that no one can agree on its meaning. But SESTA’s confusing language and poor policy choices, combined with platforms’ natural incentive to avoid legal risk, make its likely practical consequences all too clear.
Most observers cheered when the neo-Nazi Daily Stormer was booted from YouTube, CloudFlare, and other platforms around the Internet. At the same time, the site’s disappearance stirred anxiety about Internet companies’ power over online speech. It starkly illustrated how online speech can live or die at the discretion of private companies. The modern public square is in private hands.
Prime Minister Theresa May’s political fortunes may be waning in Britain, but her push to make internet companies police their users’ speech is alive and well. In the aftermath of the recent London attacks, Ms. May called platforms like Google and Facebook breeding grounds for terrorism.
These comments were prepared and submitted in response to the U.S. Copyright Office's November 8, 2016 Notice of Inquiry requesting additional public comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
"Daphne Keller of the Stanford Center for Internet and Society says that the new law could push some platforms and publishers to crack down on a wide variety of speech, to avoid the threat of lawsuits. It would give them “a reason to err on the side of removing internet users’ speech in response to any controversy,” she says, “and in response to false or mistaken allegations, which are often levied against online speech.”"
"“When platforms don’t know what to do, the legally over-cautious response is to go way overboard on taking things down just in case they’re illegal,” Daphne Keller, Director of Intermediary Liability at Stanford University’s Center for Internet and Society, told BuzzFeed News. “My worst case scenario legislation would be some vague obligation for platforms to make sure that users don’t do bad things.”"
"“Historically, the place you went to exercise your speech rights was the public square. Now the equivalent is Twitter and YouTube and Facebook,” said Daphne Keller of the Stanford Center for Internet and Society. “In a practical matter, how much you can speak is not in the hands of the constitution but in the hands of these private companies.”"
"“Many people suing for harassment have tried to find exemptions under the CDA,” said Daphne Keller, director of intermediary liability at Stanford University’s Center for Internet and Society, making the point that the platforms usually win."
"“This part of the Charlottesville story makes people think about who controls speech on the Internet,” says Daphne Keller of Stanford Law School’s Center for Internet and Society. “We don’t have 1st Amendment rights to stop private companies from shutting down our speech, and most of the Internet is run by private companies. Most of us want some intermediaries to play that role — when we go on Twitter, we don’t want to be barraged with obscenities and on Facebook we don’t want to see racism.”"
Twenty years ago, the US Supreme Court’s decision in Reno v. ACLU established the framework for internet free speech and liability that remains in place today. This conference will consider the continuing viability of the Reno vision in the face of multiplying concerns about sex trafficking online, terrorist content, election interference, and other forms of contested content.
The Center for Internet and Society (CIS) is a public interest technology law and policy program at Stanford Law School and a part of Law, Science and Technology Program at Stanford Law School. CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry.
Over two years have passed since the Court of Justice of the European Union ruled, in the Google Spain case, that the search engine must “de-list” certain search results on request in order to honor the requesters’ data protection rights.
For many years after the European Data Protection Directive was implemented across Europe in 1998, data privacy was seen as an issue that mainly concerned what companies did with personal data behind the scenes.
Come hear CIS Directors Jennifer Granick + Daphne Keller and Resident Fellows Riana Pfefferkorn + Luiz Fernando Marrey Moncau talk about our work, and the assistance CIS provides to students in learning about these issues, selecting courses, identifying job opportunities, and making professional connections.
"“Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,'” says Daphne Keller, a law professor at Stanford University. “And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.'”"
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.