Remember That Time We Saved the Internet?

My Twitter feed tells me that today is the fifth anniversary of the day the Internet “went dark” in protest of the Stop Online Piracy Act (SOPA) and the Protect Intellectual Property Act (PIPA). For anyone who needs a reminder, SOPA and PIPA were pieces of copyright legislation touted by their proponents as necessary to prevent online piracy and to protect U.S. jobs in the film, television, and music industries.

Tool Without a Handle: "Trustworthy Tools"

“‘What is truth?’ said jesting Pilate, who did not stay for an answer.” – Francis Bacon, “Of Truth,” Essays, Civil and Moral (1625).

This blog previously dealt with one flavor of “fake news”: provocative fictions that can prompt panic and violence. In this post, I’ll take up the related issue of propaganda. My conclusion: propaganda achieves its harmful effects through the meaning readers assign to its content. Responses to propaganda should therefore focus on the process by which readers assign meaning, and on how that process leads to anxiety or anger and, in turn, to harmful action.

Should Government Agencies Know Precisely Where You Get Picked Up and Dropped Off?

I submitted comments this week to the New York City Taxi and Limousine Commission as the Director of Privacy at the Center for Internet and Society (CIS). The emergence of new transportation networks and platforms certainly presents privacy challenges, and the private companies in these emerging markets have certainly had their share of privacy missteps.

Tool Without a Handle: "A Dust Cloud of Nonsense"

In this blog, I address one type of “fake news” that causes tangible harm: provocative fictions that can prompt panic and violence. The “PizzaGate” events are a case in point: fictional accusations that a restaurant was being used for child abuse prompted an assault with a deadly weapon. In this context, President Obama recently referred to the “dust cloud” of false information online. The metaphor is apt: a dust cloud a) obscures; b) interferes with intended functionality; c) appears to come from no single origin; and d) can harm life and property.

As with other problematic uses of Internet tools, technology and innovation are often responsive to public concerns, and self-regulation can best integrate liberty and safety interests in responding to “fake news” concerns. That’s not to say government action against certain types of “fake news” is completely out of the question. Government action can also take the form of enabling private action, such as civil suits for defamation.

Further, responses to fictional provocations should look for opportunities to reconnect with shared beliefs. Of course “fake news” is false, but what should be done with that fact? Too often, human psychology conflates the receipt of corrective facts with a challenge to one’s motivations, or with an attack on one’s moral agenda. A “mic drop” moment feels good, but by definition it ends dialogue. As with effective responses to violent extremism online, effective responses to “fake news” will recognize this and offer corrections accordingly.

CEIPI Opinion on an EU Proposal for a Neighboring Right for Press Publishers Online

At CEIPI, I have co-authored with Christophe Geiger and Oleksandr Bulayenko a position paper discussing the proposed introduction into EU law of neighboring rights for press publishers covering the digital uses of their publications. The proposal is included in the European Commission’s Draft Directive on copyright in the Digital Single Market of September 14, 2016. A summary of the paper follows:

Subscribe to Stanford CIS Blog