Daphne Keller
Daphne Keller studies the ways that Internet content platforms – and the laws governing them – shape information access and other rights of ordinary Internet users.
Whether and when communications platforms like Google, Twitter and Facebook are liable for their users’ online activities is one of the key factors that affects innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users’ undesirable online content in order to suppress dissent, hate speech, privacy violations and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users’ online activity are a form of censorship-by-proxy, and thereby imperil both free expression and innovation, even as governments seek to resolve very real policy problems.
In the United States, the core doctrines of Section 230 of the Communications Decency Act and Section 512 of the Digital Millennium Copyright Act have allowed user-generated content on these online intermediary platforms to flourish. But immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.
To contribute to this important policy debate, CIS studies international approaches to intermediary obligations concerning users' copyright infringement, defamation, hate speech, and other unlawful activity, including liability rules, immunities, and safe harbors; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that will protect innovation, freedom of expression, privacy, and other user rights.
Joan Barata is an international expert in freedom of expression, freedom of information, and media regulation. As a scholar, he has conducted extensive research and spoken widely in these areas, working and collaborating with various universities and academic centers from Asia to Africa and the Americas, authoring papers, articles, and books, and addressing specialized parliamentary committees.
Giancarlo F. Frosio is a Non-Residential Fellow at the Center for Internet and Society at Stanford Law School. Previously he was the Intermediary Liability Fellow with Stanford CIS. He is also a Senior Lecturer and Researcher at the Center for International Intellectual Property Studies (CEIPI) at Strasbourg University. Giancarlo also serves as Affiliate Faculty at Harvard CopyrightX and Faculty Associate of the Nexa Research Center for Internet and Society in Turin.
Luiz Fernando Marrey Moncau is a Non-Residential Fellow at the Stanford Center for Internet and Society. He was previously the Intermediary Liability Fellow at Stanford CIS. He was also head of the Center for Technology and Society (CTS) at the law school of the Getulio Vargas Foundation in Rio de Janeiro (FGV DIREITO RIO), where he coordinated and conducted research on freedom of expression, intellectual property, Internet regulation, consumer rights and telecommunications regulation.
By Giancarlo Frosio on November 21, 2013 at 3:16 pm
The long-standing saga of Max Mosley's sexual images has recently offered European decision makers a new opportunity to strike a balance between freedom of expression and the right to privacy, in light of the ubiquitous and unstoppable distribution of information propelled by the power of Internet search engines. When courts are confronted with novel questions, finding adequate solutions may be extremely challenging. But once again, European courts seem to prefer to sideline freedom of expression in favor of protecting other fundamental rights.
By Giancarlo Frosio on November 19, 2013 at 3:23 pm
The first meeting of the Stanford Intermediary Liability Lab (SILLab) will take place on Thursday, November 21 at 4pm in room Neukom 104 at Stanford Law School.
By Jennifer Granick on October 26, 2013 at 9:35 am
Over at Just Security, I have a new post looking at the legal issues and new amici briefs in the Lavabit case.
By Giancarlo Frosio on October 25, 2013 at 4:26 pm
A recent decision of the European Court of Human Rights (ECHR) may considerably expand web portals' liability for hosting users' comments. In Delfi AS v. Estonia, the ECHR found Delfi, one of the largest news portals on the Internet in Estonia, liable for anonymous defamatory comments posted by its users.
This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.
Hollywood writers could not have scripted it better. Merely a month before the implementation date of the General Data Protection Regulation (GDPR) in May this year, a data protection scandal roils the world. A whistleblower reveals the leakage of personal data from Facebook through Cambridge Analytica to malevolent actors aiming to influence the U.S. presidential elections. What could possibly better illustrate the crucial role of GDPR in an age where data drives not only marketing and online commerce but also fateful issues for democracy and world peace?
Prevention of terrorism is undeniably an important and legitimate aim in many countries of the world. In recent years, the European Union (EU) institutions, and the European Commission (EC) in particular, have shown growing concern about the potential use of online intermediary platforms to disseminate illegal content, particularly content of a terrorist nature, on the assumption that such content can increase the danger of new terrorist attacks being committed on European soil.
Margaret E. Roberts is an assistant professor at the University of California at San Diego and the author of “Censored: Distraction and Diversion Inside China’s Great Firewall,” a book about the new techniques that authoritarian governments like China are using to censor content. I asked her questions about her book.
“Article 13 creates more or less limitless liability with extraordinarily narrow exemptions,” says Annemarie Bridy, an academic intellectual property and technology lawyer at the University of Idaho. “The result will be that a few platforms will be positioned in terms of resources to operate with the related risk and expense. The rest will either stop hosting user-generated content, which would be a shame, or continue to do it until they get hit with an existentially threatening lawsuit, and fold.”
"The ProPublica tool, much like ours, was aimed to provide users with valuable information about political ads to help shed more light on a process that has historically been very secretive," Marshall Erwin, Mozilla's head of trust and security, told Mashable over email. "Major tech companies need to provide more transparency into political advertising, and support researchers and other organizations, like Mozilla, working in good faith to strengthen our democratic processes."
This raises the question of whether Congress could draft a law narrow enough to help victims of deepfakes without such unintended consequences. As a cautionary tale, Annemarie Bridy, a law professor at the University of Idaho, points to the misuse of the copyright takedown system, in which companies and individuals have acted in bad faith to remove legitimate criticism and other legal content.
Still, given what’s at stake with pornographic deep fake videos, Bridy says, it could be worth drafting a new law.
Some cyberlaw experts fear a ruling against Grindr will put the creativity of the internet as we know it at risk. They say that requiring platforms to more closely monitor users would give an advantage to tech giants like Facebook, Twitter, and Google while hindering smaller startups with niche audiences, including Grindr. It would be more expensive to start new businesses online because of the cost of hiring watchdogs, said Jennifer Granick, surveillance and cybersecurity counsel at the American Civil Liberties Union.
RSVP is required for this free event.
For more information and to RSVP visit: https://firstamendmentcoalition.org/what-happened-to-the-golden-age-of-f...
January 15, 2019
The latest in the EU's string of internet regulatory efforts has a new target: terrorist propaganda. Just as with past regulations, the proposed rules seem onerous and insane, creating huge liability for internet platforms that fail to do the impossible.
June 26, 2018
Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?
April 24, 2018
The European Union is getting ready to enact sweeping new digital privacy laws. Facebook says it's going to comply. Is what's good for Europe good for the U.S.?
On the legal challenges of the right to be forgotten
March 20, 2018
"Daphne Keller, a specialist in corporate liability and responsibility at Stanford Law School's Center for Internet and Society, says Facebook could face private lawsuits over privacy."