Chuck Cosson is Senior Corporate Counsel, Privacy, at T-Mobile USA, based in Bellevue, WA. At T-Mobile, Chuck oversees privacy compliance programs and provides privacy guidance on mobile Internet, location services and other business issues. Chuck spent 7 years at Microsoft leading the company’s public policy work on human rights, free expression, and child online safety. He has also worked in Washington, D.C. on telecommunications policy and regulation. His engagement with Stanford will focus on the role of metaphor as a guide for contemporary privacy law - a conception of the Internet not as a “place you go” but as a “tool you use.”
This blog post picks up (finally) on the topic of regulation – in particular, cases where an issue is so universally understood as worthy of regulation that variation in regulatory approaches becomes undesirable. One example of tool use that merits sanction is the non-consensual public distribution of private, sexually explicit images, particularly of children. Questions remain, though, as to whether regulation should apply only to direct actors or also to intermediaries, and what specific requirements should apply.
In this post, I suggest some core criteria that should be present whenever any regulation of tool providers is considered: 1) strong social consensus that there are concrete and significant harms to be addressed; 2) strong consensus that obligations should apply equally across all intermediaries and online providers; and 3) strong consensus that the regulation is appropriately tailored and enforceable as a technical and practical matter. Read more » about Tool Without a Handle: "Justified Regulation"
In my last post, I explored how law influences use of information technology through both rules and a concomitant degree of social consensus that a particular behavior creates undue risk or otherwise warrants a response. In this post, I’ll explore this point further in the context of two areas: legal obligations regarding data security, and attempts to regulate the use of “cookies.”
My point, though, is the same in both cases: effectively regulating the use of information technology tools requires a degree of social consensus on critical points. Whether a moral or social obligation exists is only one of several considerations. This, in turn, points to an advantage of thinking of information technology as tools, rather than as a “space” or as something analogous to a sovereignty unto itself. Read more » about Tool Without a Handle: “Getting A Grip”
Every tool can be misused, and potential harms from such misuse are often checked through force of law. Information technology is no different. To consider regulating information technology through use of the tool metaphor, we need to first consider the dynamics of information technology regulation in general. This blog post addresses key aspects of information technology regulation: how regulation is at once incapable of fully addressing the potential harms that flow from misuse of information technology, but powerful nonetheless in shaping how information technology tools are built and used. Read more » about Tool Without A Handle: The Dark Side
"Tools for Civic Purposes"
It's somewhat "old hat" to note that networked information technology creates tremendous potential for social and civic good. The corporate communications departments of leading technology companies roll out examples of this on a regular basis (I should know, as I was part of these efforts...). What is interesting, though, is that the conversation about "technology for good" invariably uses the metaphor of "technology as tool" to make its case. To the extent that conversations about the socially beneficial uses of information technology multiply, they will displace the "cyberspace" metaphor in favor of a focus on networked information technology as tool. Read more » about Tools for Civic Purposes
In this post, I apply the “tool” metaphor to two common concerns with respect to online services and privacy: profanity in discussion fora and the publication of “creepshots” - public photos (and associated comments) of women taken without the consent of the subject. These two examples help demonstrate the point that thinking of online services as “tools” better generates innovative responses to social concerns, including those where privacy and free expression interests collide. “Tools” call for solutions that change how the service works, while thinking of services as “spaces” calls for… Read more » about Tool Without a Handle: “Kittens, Cities, and Creepshots”