
Privacy Preferences Project

Today’s Internet users release vast amounts of personal content online, often without the means to communicate their privacy preferences or to limit third-party uses of this content. The result is more expression, but also more potential for privacy harms and abuse. We seek to create a technological solution that empowers users to express and enforce their privacy preferences over the content they share by reinforcing and leveraging social norms.

A prominent entrepreneur has predicted that “next year, people will share twice as much information as they share this year, and [the] next year, they will be sharing twice as much as they did the year before.” For sharing to grow at this pace, it is critical to provide a safe environment where users can express their privacy preferences in a manner others will respect. Without these protections, either sharing will decrease or abuses of privacy will run rampant. A tool for embedding preferences is a great first step toward achieving a balance that avoids both of these harmful results.


The Problem: Users want to retain a degree of control over the content they create and upload.

➢ Internet users now disclose vast amounts of personal information, including pictures, videos, text – even their present location or state of mind.

➢ Once disclosed, however, users largely surrender control over this information.

➢ Without more control, users may face the adverse consequences of unwanted scrutiny and eventually become more reluctant to generate and share content.


The Solution: A tool for users to express and exercise privacy preferences over uploaded content.

➢ We will develop a tool that allows users to express their intentions by tagging any uploaded content with an icon that immediately conveys privacy preferences to third parties.

➢ Following the Creative Commons model, this tool will provide immediate visual feedback to third parties about the content owner’s preferences and link to a website that provides more detailed guidance about how the content may be used or shared. (A sketch of what such a tag might look like follows this list.)
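
To make this concrete, here is a minimal sketch in Python of how such a tag might be generated. Everything in it is illustrative, not a finalized design: the preference names, the example.org icon and deed URLs, and the “privacy-pref” link relation are all hypothetical placeholders for whatever vocabulary the project ultimately adopts.

    # Sketch of a CC-style privacy-preference badge generator.
    # All names and URLs below are hypothetical placeholders:
    # "no-commercial-use", the example.org icon and deed URLs, and
    # the "privacy-pref" link relation are assumptions, not a spec.

    PREFERENCES = {
        "no-commercial-use": {
            "icon": "https://example.org/icons/nc.png",
            "deed": "https://example.org/prefs/no-commercial-use",
        },
        "no-redistribution": {
            "icon": "https://example.org/icons/nr.png",
            "deed": "https://example.org/prefs/no-redistribution",
        },
    }

    def badge_html(preference: str) -> str:
        """Return an HTML snippet to attach alongside uploaded content:
        a visible icon linking to a page that explains the expressed
        privacy preference in detail, on the Creative Commons model."""
        p = PREFERENCES[preference]
        return (
            f'<a href="{p["deed"]}" rel="privacy-pref">'
            f'<img src="{p["icon"]}" alt="Privacy preference: {preference}">'
            f"</a>"
        )

    if __name__ == "__main__":
        print(badge_html("no-commercial-use"))

As with Creative Commons badges, the icon gives third parties immediate visual notice, while the linked “deed” page carries the detailed, human-readable guidance.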


Why It Will Work: Social norms in online communities, as well as existing principles of law, promote neighborly respect for expressed privacy preferences.

➢ Websites have long been able to signal their intention to keep certain pages out of search results by publishing simple, machine-readable directives. Search engines overwhelmingly respect these preferences. The example of “robots.txt” (illustrated after this list) shows that online actors respect preferences that are clearly articulated and easily observed.

➢ A common metadata system could create a widely accepted social norm regarding privacy preferences. Commercial and individual users alike will be far more hesitant to abuse user privacy preferences when such preferences appear clearly alongside the relevant content.

➢ Established principles of law, including contract and tort doctrines, generally support the right of consumers to exercise control over their privacy, thereby reinforcing the community norms that define socially acceptable behavior.
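
The robots.txt convention referenced above is simple enough to demonstrate directly. The sketch below uses Python’s standard urllib.robotparser module to parse a two-line exclusion file and shows how a compliant crawler consults it before fetching; the example.com URLs are placeholders.

    # How the robots.txt convention works: a site publishes plain-text
    # directives, and well-behaved crawlers check them before fetching.
    # Python's standard library ships a parser for this format.
    from urllib import robotparser

    # Directives as they would appear at https://example.com/robots.txt
    # (example.com is a placeholder domain).
    lines = [
        "User-agent: *",
        "Disallow: /private/",
    ]

    rp = robotparser.RobotFileParser()
    rp.parse(lines)

    # A compliant crawler consults the parsed preferences before fetching.
    print(rp.can_fetch("*", "https://example.com/private/diary.html"))  # False
    print(rp.can_fetch("*", "https://example.com/public/post.html"))    # True

Notably, nothing in this mechanism technically prevents fetching the disallowed page; compliance rests on a widely shared norm, which is precisely the dynamic our tool aims to reproduce for user-generated content.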

Neighborliness, lock-in, and free speech values

Enhancing privacy will promote voluntary content sharing; because that is our goal, our model must be deployed in a way that does not facilitate lock-in to particular websites or environments. Much user-generated content is already available without signing in, and content flows easily across communities and hosts. This is as it should be, and the privacy tool we have in mind ought to be equally mobile. To ensure widespread and full use, moreover, it should be available to all consumers of user-generated content at the click of a button, regardless of whether they are members or authenticated users of the particular environment in which it is deployed.

Likewise, mandatory technical enforcement of expressed privacy preferences, while sometimes desirable, would be unnecessary and undermine the social sensitivity of the tool we have in mind. Simple neighborliness requires that we honor each other’s privacy preferences until or unless they conflict with stronger interests or implicate free speech values. When this occurs, of course, it should be possible to override another’s preferences. Mere expression of privacy preferences accommodates this delicate social dynamic, whereas automatic enforcement of preferences would disrupt it. Worse, a “kill switch” for content could ultimately threaten voluntary sharing, and severely chill speech online.

We believe in the social force of a plea for privacy, and we believe the norm of neighborliness will usually be enough. The best way to correct for the erosion of privacy that results when content of a personal nature is shared online is not to deploy gate-keeping measures and an inflexible hierarchy that privileges certain speakers, subjects, or expressed preferences. It is to let simple social signals exert their own force.