The Center for Internet and Society at Stanford Law School is a leader in the study of the law and policy around the Internet and other emerging technologies.
Whether and when communications platforms like Google, Twitter and Facebook are liable for their users’ online activities is one of the key factors that affect innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users’ undesirable online content in order to suppress dissent, hate speech, privacy violations and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users’ online activity are a form of censorship-by-proxy, and they imperil both free expression and innovation even as governments seek to resolve very real policy problems.
In the United States, the core doctrines of section 230 of the Communications Decency Act and section 512 of the Digital Millennium Copyright Act have allowed user-generated content on these online intermediary platforms to flourish. But these immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.
To contribute to this important policy debate, CIS studies international approaches to intermediary liability, immunity, and safe harbor for users’ copyright infringement, defamation, hate speech, and other online activity; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that protect innovation, freedom of expression, privacy and other user rights.
The Supreme Court is about to review a constitutional challenge to two unprecedented and very complicated laws regulating social media. The laws were enacted by Texas and Florida to counter alleged “censorship” and anti-conservative bias by major Internet platforms like Facebook and YouTube. Both laws contain “must-carry” rules that restrict platforms’ ability to moderate content under their preferred editorial policies, and “transparency” rules, including requirements that platforms notify users when their posts have been moderated. Read more: FAQs about the NetChoice Cases at the Supreme Court, Part 1
One of the least appreciated transparency measures in the EU’s new Digital Services Act (DSA) is the requirement for platforms to send the Commission information about each individual content moderation action, and for the Commission to make that information available in a public database. Draft technical specifications for submissions to the database are out now. They are in real need of improvement, but it is not clear there is time for that to happen. Read more: Rushing to Launch the EU’s Platform Database Experiment