Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world and advised on both overall product development and individual content takedown decisions.
The EU’s new General Data Protection Regulation (GDPR) will come into effect in the spring of 2018, bringing with it a newly codified version of the “Right to Be Forgotten” (RTBF). Depending how the new law is interpreted, this right could prove broader than the “right to be de-listed” established in 2014’s Google Spain case. It could put even more decisions about the balance between privacy and free expression in the hands of private Internet platforms like Google.
The Internet is full of trolls. So it’s no surprise that notice and takedown systems for online speech attract their fair share of them – people insisting that criticism of their scientific research, videos of police brutality, and other legitimate online speech should be removed from Internet platforms.
The French DPA's claim that Google should de-list search results globally to comply with "Right to Be Forgotten" laws is inconsistent with the CJEU's ruling in the Google Spain case.
Most observers cheered when the neo-Nazi Daily Stormer was booted from YouTube, CloudFlare, and other platforms around the Internet. At the same time, the site’s disappearance stirred anxiety about Internet companies’ power over online speech. It starkly illustrated how online speech can live or die at the discretion of private companies. The modern public square is in private hands.
Prime Minister Theresa May’s political fortunes may be waning in Britain, but her push to make internet companies police their users’ speech is alive and well. In the aftermath of the recent London attacks, Ms. May called platforms like Google and Facebook breeding grounds for terrorism.
These comments were prepared and submitted in response to the U.S. Copyright Office's November 8, 2016 Notice of Inquiry requesting additional public comment on the impact and effectiveness of the DMCA safe harbor provisions in Section 512 of Title 17.
Forthcoming in the Berkeley Technology Law Journal
"Other countries will look at this and say, 'This looks like a good idea, let's see what leverage I have to get similar agreements,'" said Daphne Keller, former associate general counsel at Google and director of intermediary liability at the Stanford Center for Internet and Society. "Anybody with an interest in getting certain types of content removed is going to find this interesting."
Daphne Keller, the director of intermediary liability at the Stanford Center for Internet and Society, recognises that the current systems in place for flagged content are slow, and says it would be "sensible" for companies to prioritise live video over older content to some degree.
The regulation continues to put a heavy onus on Internet companies, which are threatened with fines if they do not comply immediately with takedown requests. "The law still sets out a notice and takedown process that strongly encourages Internet intermediaries to delete challenged content, even if the challenge is legally groundless," Daphne Keller, director of Intermediary Liability at Stanford Law School's Center for Internet and Society, warned last December.
If Google rejects a request for removal of a link, the requester can appeal to his or her country's regulators or the courts, Keller says. But "there's no role for the publisher, who put the speech up in the first place and is being silenced," to protest.
As we wrote in our last post, Daphne Keller at Stanford's Center for Internet and Society is writing a series of blog posts raising concerns about how the new rules clash with basic concepts of free speech. She's now written one about the immensely troubling setup of the "notice and takedown" rules included in the General Data Protection Regulation (GDPR).
Presented by Bloomberg, the Electronic Frontier Foundation and the First Amendment Coalition.
Lunch: 1:00 pm
Program: 1:30 pm - 3:00 pm
Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy.
"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,'" says Daphne Keller, a law professor at Stanford University. "And the other half of the time it's, 'Oh no! Facebook took something down and we wish they hadn't.'"
Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.
Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center For Internet And Society to discuss the collision between these two important principles.