The Center for Internet and Society at Stanford Law School is a leader in the study of the law and policy around the Internet and other emerging technologies.
Whether and when communications platforms like Google, Twitter and Facebook are liable for their users’ online activities is one of the key factors that affect innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users’ undesirable online content in order to suppress dissent, hate speech, privacy violations and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users’ online activity are a form of censorship-by-proxy, and thereby imperil both free expression and innovation, even as governments seek to resolve very real policy problems.
In the United States, the core doctrines of section 230 of the Communications Decency Act and section 512 of the Digital Millennium Copyright Act have allowed user-generated content to flourish on these online intermediary platforms. But immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.
To contribute to this important policy debate, CIS studies international approaches to intermediaries’ obligations, liabilities, immunities, and safe harbors concerning users’ copyright infringement, defamation, hate speech, and other unlawful content; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that will protect innovation, freedom of expression, privacy, and other user rights.
On July 2, the Senate Judiciary Committee held a full-committee hearing at which it made significant changes to the pending EARN IT Act bill, S.3398, about which I’ve written extensively on the CIS blog. Read more: “The EARN IT Act Threatens Our Online Freedoms. New Amendments Don’t Fix It.”
Policymakers in Europe and around the world are currently pursuing two reasonable-sounding goals for platform regulation. First, they want platforms to abide by a “duty of care,” going beyond today’s notice-and-takedown-based legal models to more proactively weed out illegal content posted by users. Second, they want to preserve existing immunities, with platforms generally not facing liability for content they aren’t aware of. Read more: “Systemic Duties of Care and Intermediary Liability.”
For further insights on managing misinformation, we should look to the ways in which humans form identity through imitation, purge enmity through scapegoating, and often lack the ability to internally generate a clear sense of their preferences or to make choices that align with them.
One of the mechanisms worth analyzing is the human tendency to assign trajectories to immediate observations and, similarly, to be attracted to “trend stories” that wager predictions. This tendency contributes to misinformation problems because it assigns undue weight both to the ability of the predictor and to the probability that the prediction will come to pass.
I prefer to think, though, that rightness demands we protect the right of humans to so choose, even if it means they reject truth for fantasy. And even if free choice is tinged with a bit of illusion, one created by subconscious beliefs that shape our thinking, and thus our actions, without our immediate awareness.
Generating shared perspectives is an important component of this response. Misinformation flourishes in environments where shared perspectives are weak. Art can help illustrate, in ways that argument and evidence cannot, shared qualities of experience and perspective. Read more: “Tool Without A Handle: Tools, Trends, Technology.”