Privacy settings and other technological controls used to protect privacy have come under justifiable criticism lately. Danielle Citron recently blogged at Concurring Opinions about an important new study by Columbia's Michelle Madejski, Maritza Johnson and Steve Bellovin, which found that Facebook's default privacy settings fail to capture real-world expectations. The United Kingdom Government has also indicated that browser settings alone cannot be used by Web users to consent to being tracked online under a new EU law, reasoning that those settings are not flexible enough to reflect a user's true privacy preferences. The general consensus seems to be that most privacy settings simply aren't very good at protecting the information we actually consider private in a given context. I think some skepticism regarding privacy controls is warranted, particularly in light of the current technology. However, I'd like to show some support for privacy controls, or, rather, for the promise of privacy controls. My hope is that courts and lawmakers do not completely sour on recognizing privacy controls as a legitimate way to protect an Internet user's privacy.
First, it is important to remember that nuanced privacy controls are still in their relative infancy. While courts and lawmakers are right to take a cautious approach when dealing with the legal significance of technology, they should acknowledge that it is far too early to dismiss privacy controls entirely as ineffective or insignificant. With every failure and innovation, privacy controls become more efficient, accurate and productive. They are also likely to become easier to use and, as a result, to be adopted by ever more Internet users. As Internet users' contextual expectations become more salient to designers, new privacy controls can evolve to accommodate their needs.
Recent research demonstrates how privacy controls could be improved. Ryan Calo's "Against Notice Skepticism" details different ways user experience itself could be used to provide notice to users without using (or in addition to) text or symbols. This same "visceral notice" could be incorporated into privacy controls to better inform users of the consequences of selecting a certain privacy setting. The more we interact with websites, the more tailored our experience can become. For example, Facebook's "Top News" option employs an algorithm that relies upon things like your previous use of the site. This same kind of data could be employed to create adaptive privacy controls, which would react to the environment and to the network configurations within it. Adaptive privacy controls could better reflect a user's expectations because they can change according to the various contexts in which information is disclosed. (Indeed, Madejski, Johnson and Bellovin make similar recommendations in their study.)
Privacy controls also serve multiple functions beyond simply limiting access to information. They represent a tool for users to develop their relationships with websites, which have historically been one-sided. In my recent article "Website Design as Contract," I highlight research demonstrating that privacy settings can be a significant part of a user's legal relationship with a website. These controls are already woven into the agreement between many websites and users. Users often rely on these settings as promises made by the website, as evidenced by several lawsuits.
Finally, privacy controls can help save us from ourselves. While current privacy controls might sometimes act as a blunt tool that protects too much information, there can be an unintended benefit to bringing a tank to a knife fight. As Danielle and others have observed, individuals consistently underestimate privacy risks. We're horrible at accurately gauging how the disclosure of information can harm us. Yet, in spite of this deficiency, which has also been called "optimism bias," a growing majority of Internet users utilize privacy settings when they are available. Campaigns like the Electronic Frontier Foundation's "HTTPS Now" will continue to bolster the use of privacy and security controls. Simply by utilizing whatever controls are available, Internet users can help offset optimism bias by protecting more information than they anticipated. While excessive protection does not address the importance of context in disclosure, it could still be a benefit in certain situations.
Of course, privacy controls are a problem when they deceive users by providing less protection than what users expect. Among other reasons, this is why Madejski, Johnson and Bellovin's research, and the related work of others, is important. I think that all of this underscores the importance of privacy controls and the need to get them right, rather than a need to shy away from controls as a significant legal aspect of user privacy.