“Tool Without A Handle”: Justified Regulation (Part 2 – Privacy)

This blog moves from discussing cases of justified and widely supported regulation to examining current topics that are more complex and more hotly contested. A currently popular topic in privacy is the potential for information flows to distort the ordinary balance of economic power,[1] and the concomitant impact on individuals.  It’s my contention that a better understanding of this topic can come from framing the issue using the “tool” metaphor, coupled with a more granular understanding of the privacy values at stake.

Current discussions about privacy, “big data,” and economic power have illustrated the need for better clarity about privacy issues, harms and violations.  In some cases, privacy issues have been framed as a debate over whether the core issue is individual control of personal information or corporate / government control.[2]  In some cases, the topic is framed as a civil rights issue.[3]  In other views, it is a “due process” issue.[4]  Some notable analyses of privacy issues include observations on a “taxonomy” of privacy,[5] delineation of “subjective” and “objective” privacy harms,[6] and distinctions between “tangible,” “intangible,” and “abstract” privacy harms.[7]

This blog draws a more basic distinction – between “privacy” questions on one hand, and “fairness” questions on the other.  I believe the “privacy” conversation is not well served when we fail to carefully distinguish “privacy” and “fairness” issues.  Moreover, for much of current privacy law and policy, the debate is not really about privacy so much as it is about “fairness.”[8]

Rules and beliefs about “privacy” (individual rights against publicity or surveillance) address a different and separate set of concerns than “fairness” (including civil rights, consumer protection, and economic opportunity). Too often, these considerations are lumped together. As the introductory letter to the 2012 White House Report on Consumer Data Privacy put it,

Justice Brandeis taught us that privacy is the “right to be let alone,” but we also know that privacy is about much more than just solitude or secrecy. Citizens who feel protected from misuse of their personal information feel free to engage in commerce, to participate in the political process, or to seek needed health care. This is why we have laws that protect financial privacy and health privacy, and that protect consumers against unfair and deceptive uses of their information. [9]

Put differently, “privacy as solitude” is about the freedom to choose whether or not certain information linked (or linkable) to you is shared or even made public.  “Privacy as fairness,” in contrast, concerns personal data that is already volunteered in some fashion or context, and aims to set rules about the relative rights of the data subject and of public and private sector parties who collect and use that data.  Both privacy and fairness are important, including for the reasons the White House report notes, but they ultimately derive from different sets of values.

Here is a quick chart illustrating differences between “solitude” and “fairness” topics[10].  Please note that by “topics” I mean “topics of discussion”; I do not imply here whether any of these should (or should not) be classified as privacy “violations” or “harms,” or be subject to regulation.

| “Solitude” topics | “Fairness” topics |
| --- | --- |
| Public or private interception of search queries sent to a search engine (no intent for the data to be public or provided to that party) | Search engine company creating consumer profiles using your browsing history (search queries were voluntarily sent to the service provider to generate search results; the issue is the fairness of other uses) |
| “Phishing” or other deceptive acquisition of personal financial or other data (the subject’s consent to make data public is obtained under false pretenses) | Use of accurate data in credit decisions (the issue is fairness in use, not secrecy in the collection of personal financial data) |
| Right to anonymous political speech (breaking anonymity discloses personal data without consent) | Cyberbullying, threats, and defamation (voluntary participation in public life met with unfair responses) |

One reason to recommend distinguishing “privacy” from “fairness” is that there is broad social consensus about “solitude” concerns – as discussed in my prior blog post.[11]  However, debates about economic fairness and power imbalances are often informed by different (and irreconcilable) political philosophies.  Such debates are an important part of democracy, but less likely to lead to a broad and strong social consensus.[12]

Additionally (or as a result), the “fairness” issues are relatively more complicated.  As FTC Commissioner Julie Brill recently noted, “[w]hether and how consumer profiles based on big data are used to discriminate or treat consumers unfairly involves many subtle and difficult questions.”[13]  So it’s helpful to start by asking what “fairness” principles are actually well-settled, and to see how any of them might illuminate concerns that are more contested.  Among the well-settled rules of “privacy as fairness” are:

·       Right of individuals to have decisions about issuing credit or offering employment based on accurate data, and to seek redress if decisions are based on incorrect data.  For example, the rights protected by the Fair Credit Reporting Act.[14]

·       Right of individuals to some level of notice of data collection and use. Examples include the Privacy Act of 1974 (vis-à-vis the federal government),[15] the FTC Act,[16] California’s Online Privacy Protection Act (requiring privacy notices),[17] and widely accepted “fair information privacy practices.”[18]

·       Right to have choices with respect to certain uses of data.  For example, laws governing financial, health, and telephone record data require, in certain circumstances, some form of notice and choice before data is shared with third parties for novel purposes.[19]

·       Rights to security and accountability.  Individuals who entrust personal data to others may have remedies if data is handled carelessly or in a manner inconsistent with promises made.[20]

Current issues can then be understood as questions about refining these well-settled principles.  For example:

●      What types of decisions are so important that law should regulate data accuracy? 

●      How far should the right of individuals to control use of data about them extend?

●      How do we balance fairness interests, including fairness to those who make beneficial use of data?

(N.B. - these are only examples; I take no position on any given issue in any current regulatory inquiry). 

This approach – starting from concepts of “fairness” and then determining how to apply them – may reduce the extent to which debates about “privacy principles” obscure the issues, and obscure the motives behind a given position. While there is broad consensus in support of “fairness,” there is less consensus as to what “fairness” entails in a given situation. Each “well-settled” rule of “privacy as fairness” is the application of a principle, not a principle itself.  Thus policy analysis of “fairness” issues should focus more on facts, and on cost-benefit analysis, than on an attempt to reason from first principles (which is better suited to issues of “privacy as solitude”).

[1]http://www.technologyreview.com/featuredstory/520426/the-real-privacy-problem.  These issues are often collected under the heading of “big data.” In turn, “big data” can be understood as a set of concerns (and opportunities) created by the fact that technical innovation has enabled the collection and analysis of larger data sets, through more complex algorithms, than in decades past.  This has reduced the relative obscurity of some personal data and has the potential to increase the impact of data analysis on individuals.  I eschew the term “big data” in this blog, however, because I find the core principles of personal privacy and basic fairness persist no matter how large the data set, and both the issues and associated rules existed well before today’s innovations.  

[3]See, e.g., Foreword: Wade Henderson, President & CEO, The Leadership Conference on Civil and Human Rights, “Civil Rights, Big Data, and an Algorithmic Future:  A Report by Robinson + Yu” (“Whether we use the language of big data or civil rights, we’re looking at many of the same questions.”), online at: http://bigdata.fairness.io

[4]Kate Crawford and Jason Schultz, “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms,” Boston College Law Review, Vol. 55, No. 93, 2014, online at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2325784

[5]Daniel Solove notes, in formulating a “taxonomy” of privacy, that “attempts to conceptualize privacy go astray because they attempt to find a common denominator in all things we deem as implicating ‘privacy.’”  See http://www.concurringopinions.com/archives/2006/03/a_taxonomy_of_p.html.

[6]Ryan Calo, “The Boundaries of Privacy Harm,” 86 Indiana Law Journal 1131 (2011), online at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1641487.  Calo articulates distinctions between “subjective” privacy harms (apprehension of an invasion of solitude) and “objective” privacy harms (e.g., identity theft – a demonstrable invasion of privacy).  Generally speaking, both of these categories deal with unwanted or unknown uses of personal information, rather than appropriate use of personal information legitimately acquired.

[7]See Jules Polonetsky, Omer Tene, Joseph Jerome, “Benefit-Risk Analysis for Big Data Projects” http://www.futureofprivacy.org/wp-content/uploads/FPF_DataBenefitAnalysis_FINAL.pdf (describing a “spectrum of privacy challenges”).

[8]See, e.g., UK Information Commissioner’s Office report, “Big Data and Data Protection,” online at: http://ico.org.uk/news/latest_news/2014/~/media/documents/library/Data_Protection/Practical_application/big-data-and-data-protection.pdf (p.14:  “The first question for organisations to consider when using personal data for big data analytics is whether the processing is fair.”); see also NIST Privacy Engineering Discussion Deck http://www.nist.gov/itl/csd/upload/nist_privacy_engr_objectives_risk_model_discussion_deck.pdf (describing as “privacy harms” considerations such as economic power imbalances, stigmatization, and discrimination).

[9]http://www.whitehouse.gov/sites/default/files/privacy-final.pdf; see Samuel Warren and Louis Brandeis, “The Right to Privacy,” 4 Harvard L.R. 193 (Dec. 15, 1890); online at http://groups.csail.mit.edu/mac/classes/6.805/articles/privacy/Privacy_brand_warr2.html

[10]Another way to visualize the distinction is to think of “solitude” concerns as involving a vertical relationship – between the data subject and any other party, concerning solely whether data is public (or publicized) – and “fairness” concerns as involving horizontal relationships – between the data subject and many other parties, each with competing claims to data use, or even ownership.

[11]http://cyberlaw.stanford.edu/blog/2014/09/tool-without-handle-justified-regulation. This consensus extends so far as to create not only regulation of commercial activity but also constitutional protections (the Fourth Amendment) and civil tort liability for invasion of privacy, or even for public disclosure of private facts where highly offensive to a reasonable person.  See Restatement of the Law, Second, Torts, § 652; http://www.dmlp.org/legal-guide/publication-private-facts.

[12]Debate over the rightness of power versus fairness, for example, goes back at least to Thucydides, Hobbes, and Machiavelli (and their respective critics).  One blog post would surely fall short of settling this discussion.

[13]“Privacy in the Age of Omniscience: Approaches in the United States and Europe,” U.S. Federal Trade Commissioner Julie Brill, Mentor Group Vienna Forum, September 11, 2014, online at: http://www.ftc.gov/system/files/documents/public_statements/581751/140911mentorgroup.pdf; these questions were examined by an FTC workshop, with a variety of written contributions.  See http://www.ftc.gov/news-events/events-calendar/2014/09/big-data-tool-inclusion-or-exclusion; workshop papers from Future of Privacy Forum https://s3.amazonaws.com/s3.documentcloud.org/documents/1293324/big-data-a-tool-for-fighting-discrimination-and.pdf and Robinson + Yu http://bigdata.fairness.io/

[14]15 U.S.C. § 1681 et seq.

[15]5 U.S.C. § 552a.

[16]15 U.S.C. § 45 (prohibiting unfair and deceptive acts, which can include inadequate disclosure of data collection practices).  See, e.g., http://www.ftc.gov/system/files/documents/cases/140508snapchatcmpt.pdf.

[19]See 45 C.F.R. § 164.528 (health data); 15 U.S.C. § 6802 (financial data); 47 U.S.C. § 222(c)(1) (telephone records).
