Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015, Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and counseled on both overall product development and individual content takedown decisions.
This panel considered issues of national jurisdiction in relation to Internet platforms’ voluntary content removal policies. These policies, typically set forth in Community Guidelines (CGs) or similar documents, prohibit content based on the platforms’ own rules or values—regardless of whether the content violates any law.
The essay below serves as an introduction to the Stanford Center for Internet and Society's Law, Borders, and Speech Conference Proceedings Volume. The conference brought together experts from around the world to discuss conflicting national laws governing online speech, and how courts, Internet platforms, and public interest advocates should respond to increasing demands for these laws to be enforced on the global Internet.
Today, someone asked me about the Internet and human well-being over the next decade. The question was a healthy provocation to look at the big picture. I chose “more helped than harmed” from the very short list of radio-button responses. Here’s my elaboration:
Most observers cheered when the neo-Nazi Daily Stormer was booted from YouTube, CloudFlare, and other platforms around the Internet. At the same time, the site’s disappearance stirred anxiety about Internet companies’ power over online speech. It starkly illustrated how online speech can live or die at the discretion of private companies. The modern public square is in private hands.
Prime Minister Theresa May’s political fortunes may be waning in Britain, but her push to make internet companies police their users’ speech is alive and well. In the aftermath of the recent London attacks, Ms. May called platforms like Google and Facebook breeding grounds for terrorism.
These comments were prepared and submitted in response to the U.S. Copyright Office's November 8, 2016 Notice of Inquiry requesting additional public comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
Forthcoming in the Berkeley Technology Law Journal
“When lawmakers create new rules that have never been tested by courts – like Australia's new law or the rules proposed in the UK's White Paper – and then tell platforms to enforce them, we can only expect that a broad swathe of perfectly legal speech is going to disappear,” said Daphne Keller, director of intermediary liability at the Stanford Centre for Internet and Society.
The issue highlights the pressure on many internet platforms to attract customers by presenting a critical mass of listings to demonstrate scale, says Daphne Keller, director of intermediary liability at Stanford Law School’s Center for Internet and Society. She added that inactive or false listings don’t produce a good customer experience either. “You don’t want to have a bunch of listings in there that turn out to be dead ends,” Ms. Keller said. A Care.com spokeswoman declined to comment on Ms. Keller’s assessment.
“Its role in enabling a certain kind of technical innovation is unambiguous,” says Daphne Keller at Stanford Law School’s Center for Internet and Society. “It made it possible for investors to get behind companies who were in the business of transmitting so much speech and information that they couldn't possibly assess it all and figure what was legal or illegal.”
"The bottom line of the case is that its legal merits barely matter, because the point is political theater," Daphne Keller, the director of intermediary liability at the Stanford Center for Internet and Society, told The Hill.
"As theater, I suspect it will be quite successful."
Ultimately, the use case for purely AI-driven content moderation is fairly narrow, says Daphne Keller, the director of intermediary liability at the Stanford Center for Internet and Society, because nuanced decisions are too complex to outsource to machines.
“If context does not matter at all, you can give it to a machine,” she told me. “But, if context does matter, which is the case for most things that are about newsworthy events, nobody has a piece of software that can replace humans.”
Presented by Bloomberg, the Electronic Frontier Foundation and the First Amendment Coalition.
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center for Internet and Society to discuss the collision between these two important principles.