The Center for Internet and Society at Stanford Law School is a leader in the study of the law and policy around the Internet and other emerging technologies.
On August 7, I had the pleasure of doing a "fireside chat" with my friend and Section 230 expert Cathy Gellis at this year's virtual DEF CON Crypto & Privacy Village. Cathy gave a primer on Section 230, and then we had a discussion about the EARN IT Act bill, the LAED Act bill, and the threats they pose to online speech, privacy, security, and encryption. You can watch the video here. Read more about EARN IT Act Talk at the DEF CON Crypto & Privacy Village
I want to share some musings I had about what criminal punishment means right now in America. I don’t really write about the basics of criminal law and procedure much – it’s not my focus, and I’m not well-read in it, so please excuse my fumbling discussion of the following concepts. Read more about What Does Retribution Mean Now? Thoughts on COVID-19, Prison, and Schadenfreude
The Ninth Circuit Court of Appeals has rebuffed my attempt to unseal information about the Department of Justice's unsuccessful secret effort to force Facebook to change the encryption of its Messenger app so that it could wiretap criminal suspects' voice calls. Read more about We Won't Find Out How the DOJ Tried to Force Facebook to Change Its Encryption
On July 2, the Senate Judiciary Committee held a full-committee hearing at which it made significant changes to the pending EARN IT Act bill, S.3398, about which I’ve written extensively on the CIS blog. Read more about The EARN IT Act Threatens Our Online Freedoms. New Amendments Don’t Fix It.
On Tuesday, June 23, Senators Graham (R-SC), Cotton (R-AR), and Blackburn (R-TN) introduced a bill that is a full-frontal nuclear assault on encryption in the United States. You can find the bill text here. Read more about There’s Now an Even Worse Anti-Encryption Bill Than EARN IT. That Doesn’t Make the EARN IT Bill OK.
In a previous post, I described the growing calls for what I called a “systemic duty of care” (SDOC) in platform regulation. I suggested that SDOC requirements would create difficult questions in ordinary intermediary liability litigation. By encouraging or requiring platforms to review user content or exercise more control over it, SDOC laws would change courts’ reasoning in cases about YouTube’s liability for a defamatory video, for example. Read more about Broad Consequences of a Systemic Duty of Care for Platforms
Policymakers in Europe and around the world are currently pursuing two reasonable-sounding goals for platform regulation. First, they want platforms to abide by a “duty of care,” going beyond today’s notice-and-takedown based legal models to more proactively weed out illegal content posted by users. Second, they want to preserve existing immunities, with platforms generally not facing liability for content they aren’t aware of. Read more about Systemic Duties of Care and Intermediary Liability
For further insights on managing misinformation, we should look to the ways in which humans form identity through imitation, purge enmity through scapegoating, and often lack the ability to internally generate a clear sense of preferences or make choices that align with them.
One of the mechanisms worth analyzing is the human tendency to assign trajectories to immediate observations and, similarly, to be drawn to "trend stories" that wager predictions. This tendency contributes to misinformation problems because it assigns undue weight both to the ability of the predictor and to the probability that the prediction will come to pass.
I prefer to think, though, that rightness demands we protect the right of humans to so choose, even if it means they reject truth for fantasy. And even if free choice is imbued with a bit of illusion, one created by subconscious beliefs that control our thinking, and thus our actions, without our immediate awareness.
Generating shared perspectives is an important component of this response. Misinformation flourishes in environments where shared perspectives are weak. Art can help illustrate, in ways that argument and evidence cannot, shared qualities of experience and perspective. Read more about Tool Without A Handle: Tools, Trends, Technology