Chuck Cosson is Senior Corporate Counsel, Privacy, at T-Mobile USA, based in Bellevue, WA. At T-Mobile, Chuck oversees privacy compliance programs and provides privacy guidance on mobile Internet, location services and other business issues. Chuck spent 7 years at Microsoft leading the company’s public policy work on human rights, free expression, and child online safety. He has also worked in Washington, D.C. on telecommunications policy and regulation. His engagement with Stanford will focus on the role of metaphor as a guide for contemporary privacy law - a conception of the Internet not as a “place you go” but as a “tool you use.”
In this blog, I address one type of “fake news”: content that causes tangible harm, provocative fictions that can prompt panic and violence. The “PizzaGate” events are a case in point: fictional accusations that a restaurant was being used for child abuse prompted an assault with a deadly weapon. In this context, President Obama recently referred to the “dust cloud” of false information online. The metaphor is apt: a dust cloud a) obscures; b) interferes with intended functionality; c) appears to come from no single origin; and d) can be harmful to life and property.
As with other problematic uses of Internet tools, technology and innovation are often responsive to public concerns, and self-regulation can best integrate liberty and safety interests in responding to “fake news” concerns. That’s not to say government action against certain types of “fake news” is completely out of the question. Government action can also take indirect form, enabling private remedies such as civil suits for defamation.
Further, though, responses to fictional provocations should look for opportunities to reconnect to shared beliefs. Of course “fake news” is false, but what to do with that fact? Too often, human psychology leads us to receive corrective facts as a challenge to our motivations, or as an attack on our moral agenda. A “mic drop” moment feels really good, but by definition it ends dialogue. As with effective responses to violent extremism online, effective responses to “fake news” will recognize this and offer corrections accordingly.
This post continues my thoughts on qualities of digital tools that have helped make political and artistic expression more subjective, accessible and fluid. In the previous post, we looked at the searchability of text. In this post I examine the impact of mobility: the ability afforded by digital tools to access vast troves of information, to communicate, to record, and to create from virtually anywhere on the planet.
There are at least three significant capabilities of digital technologies that have been shaped by portability: mobile commerce, access to news and information, and visual communications. Each of these capabilities accelerated significantly with the development of the “smartphone” – in particular the Apple iPhone in 2007 – but they were inherent in mobile technologies from their inception. Below, I discuss each capability in turn and identify some of its impacts.
This blog has, to date, primarily focused on the qualities of networked information technologies and regulatory responses to them – in particular qualities that raise issues of privacy and free expression. This installment of “Tool Without a Handle” looks at the qualities that render these tools influential on artistic and political discourse. In this first part, I will look at one particular quality: searchable text.
The searchability of digital content may contribute to confirmation bias and the “filter bubble” phenomenon. A pre-conceived preference for certain news or information can easily be bolstered with references found via targeted searches. At the same time, a distinct value of search technology is that it has enabled a vast number of entertainment and creative works to find audiences. A library of programming the size of Netflix or YouTube is simply not navigable with the rotary dials that graced the television set of my childhood.
This blog continues the analysis of how to respond to terrorist activity (including recruitment and planning of attacks) that uses networked information technology, in particular social media. As noted earlier, I think there are three promising avenues to investigate:
1) Countering misinformation
2) Active recruitment to alternative missions
3) Areas beyond communication - e.g., algorithmic adjustments by social media platforms
While information technologies, and the business platforms that deploy them, have a central role, the core of the best responses to violent extremism may turn out to be not tools, but people.
Much consideration has been given to the role of tools in recruitment to extremist violence, the desirability of restricting the use of tools for those purposes, the collateral effects of such restrictions, and the opportunity to use tools for alternative narratives.
This blog concludes that in some cases, restrictions on such uses can be desirable. At the same time, though, with few exceptions, the choice of such restrictions should be left to the private sector and carried out in a way that advances liberal principles. Moreover, there is unlikely to be a solely technological solution to the problem of radicalization or its products, including the planning of terrorist attacks. Ultimately, it may be people, rather than tools, that are the most effective resource for curtailing extremist violence.