On Reverse Engineering Privacy Law

Michael Birnhack, a professor at Tel Aviv University Faculty of Law, is one of the leading thinkers about privacy and data protection today (for some of his previous work see here and here and here; he’s also written a deep, thoughtful, innovative book in Hebrew about the theory of privacy, see here). In a new article, Reverse Engineering Informational Privacy Law, about to be published in the Yale Journal of Law & Technology, Birnhack sets out to unearth the technological underpinnings of the EU Data Protection Directive (DPD). The DPD, enacted in 1995 and currently undergoing a thorough review, is surely the world’s most influential legal instrument concerning data privacy. Proponents have heralded it as “technology neutral” – a recipe for longevity in a world marked by rapid technological change. Birnhack unveils the DPD’s highly technology-specific fundamentals, thereby casting doubt on its continued relevance.

The first part of Birnhack’s article analyzes what technological neutrality of a legal framework means and why it is sought after. He posits that the idea behind it is simple: “the law should not name, specify or describe a particular technology, but rather speak in broader terms that can encompass more than one technology and hopefully, would cover future technologies that are not yet known at the time of legislation.” One big advantage is flexibility: the law can apply to a broad set of technologies, including some invented after its enactment; consider the continued viability of the tech-neutral Fourth Amendment versus the obviously archaic nature of the tech-specific ECPA. Another advantage is the promotion of innovation; tech-specific legislation can lock in a particular technology, thereby stifling innovation.

Birnhack continues by creating a typology of tech-related legislation. He examines factors such as whether the law regulates technology as a means or as an end; whether it actively promotes, passively permits or directly restricts technology; at which level of abstraction it relates to technology; and who is put in charge of regulating it. Throughout the discussion, Birnhack’s broad, rich expertise in everything law and technology is evident; his examples range from copyright and patent law to nuclear non-proliferation.

The article in its entirety is well worth reading, but I’ll focus here on just its last part, where Birnhack sets out to “reverse engineer” the DPD, revealing its hidden technological assumptions. Whether the DPD is tech-neutral is a very practical concern these days, as the EU Parliament and Council contemplate a comprehensive reform proposal submitted in January by the European Commission. The reform proposal, for the most part, relies on the same principles as the DPD, on the assumption that those principles have withstood the test of time. Consider the opinion of the Article 29 Working Party, the group of EU regulators administering the DPD: “Directive 95/46/EC has stood well the influx of these technological developments because it holds principles and uses concepts that are not only sound but also technologically neutral. Such principles and concepts remain equally relevant, valid and applicable in today's networked world.”

To test this claim, Birnhack analyzes the key constructs of the DPD. First, he looks at the most basic building block of all – the definition of “personal data” (aka PII in the US). For many years, the European concept of “personal data”, which is content-neutral and based on the identifiability of individual “data subjects”, seemed like a success. The definition – “any information relating to an identified or identifiable individual” – proved adaptable to a digital reality in which the aggregation of innocuous, insensitive facts could combine to produce an undeniable privacy impact. Unlike the US sector-specific approach, which protects certain categories of information – health (HIPAA), financials (GLBA), credit history (FCRA), video rentals (VPPA), children (COPPA) – the European model triggers privacy protections whenever any type of data concerning an “identified or identifiable individual” is implicated.

Of course, if identifiability subjects data to the privacy framework, then lack of identifiability extricates data from those same obligations. Anonymization, or de-identification, was thus perceived as a silver bullet, allowing organizations to “have their cake and eat it too” – that is, to retain, repurpose and analyze information while at the same time preserving individuals’ privacy.

Alas, over the past decade it has become increasingly clear that in a world of big data collection, storage and analysis, de-identification is strained by shrewd re-identification techniques applied by clever adversaries. Today, examples of re-identification of apparently de-identified data abound. A couple of years ago, Paul Ohm drew on the computer science literature to “blow the whistle” on de-identification.
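
To make the re-identification point concrete, here is a minimal sketch of the classic linkage attack – the technique behind Latanya Sweeney’s well-known re-identification of “anonymized” hospital records using a voter list, and one of the attacks surveyed in the computer science literature Ohm draws on. All records, names and column choices below are hypothetical illustrations.

```python
# A minimal sketch of a linkage attack on "de-identified" data.
# All records and column names here are hypothetical.
import pandas as pd

# "De-identified" medical records: direct identifiers removed,
# but quasi-identifiers (ZIP code, birth date, sex) retained.
medical = pd.DataFrame({
    "zip":        ["02138", "02139", "02141"],
    "birth_date": ["1945-07-31", "1962-03-14", "1971-11-02"],
    "sex":        ["F", "M", "F"],
    "diagnosis":  ["hypertension", "diabetes", "asthma"],
})

# A public dataset (e.g. a voter registration list) linking the
# same quasi-identifiers to names.
voters = pd.DataFrame({
    "name":       ["Alice Smith", "Bob Jones", "Carol White"],
    "zip":        ["02138", "02139", "02141"],
    "birth_date": ["1945-07-31", "1962-03-14", "1971-11-02"],
    "sex":        ["F", "M", "F"],
})

# Joining on the quasi-identifiers re-attaches names to diagnoses:
# no single column identifies anyone, but the combination does.
reidentified = medical.merge(voters, on=["zip", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])
```

This is the dynamic that makes the identifiability criterion unstable: whether data are “personal” depends on what other datasets an adversary can reach, not on any intrinsic property of the data themselves.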

Consequently, the scope of the DPD becomes either overbroad, potentially encompassing every bit and byte of information, even information ostensibly not about individuals; or overly narrow, excluding de-identified information that can be re-identified with relative ease. Indeed, it now appears that the DPD’s concept is outmoded, ill-suited to a big data reality. As Birnhack writes: “the definition of personal data is rooted within a digital technological paradigm, for good or for bad. The good part is that it is more advanced than the previous, analogue, content-based definition; the bad part is that the concept of non-identification is about to collapse, if it has not already collapsed.”

The definition of personal data has become deficient not only in its treatment of de-identification but also in its view of personal data as a static concept referring to “an individual”. This notion of data, sometimes referred to as “microdata”, fails to account for the fact that data ostensibly not about “an individual” – such as the social graph or stylometry (the analysis of writing style) – may have a profound privacy impact. Clearly, new thinking is needed with respect to the definition of personal data. Unfortunately, more advanced notions that have gained credence in the scientific community, such as differential privacy and privacy-enhancing technologies, have largely been left out of the debate.
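
To give a flavor of what such new thinking looks like, here is a minimal sketch of differential privacy’s canonical building block, the Laplace mechanism: rather than releasing an exact query answer over a dataset, the curator releases the answer plus noise calibrated to how much one person’s record can change it. The dataset, query and epsilon value below are illustrative assumptions, not anything prescribed by the DPD or by Birnhack.

```python
# A minimal sketch of the Laplace mechanism from differential privacy.
# The dataset, query and epsilon below are illustrative assumptions.
import numpy as np

def laplace_count(data, predicate, epsilon):
    """Release a counting query with epsilon-differential privacy.

    A counting query changes by at most 1 when one person's record is
    added or removed, so its sensitivity is 1 and the noise scale is
    sensitivity / epsilon.
    """
    true_count = sum(1 for record in data if predicate(record))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: how many people in the dataset are over 40?
ages = [23, 45, 67, 34, 52, 41, 29]
noisy = laplace_count(ages, lambda age: age > 40, epsilon=0.5)
print(f"Noisy count: {noisy:.1f}")  # true answer is 4, plus noise
```

Notably, the guarantee attaches to the release mechanism rather than to any identifiability judgment about the underlying data – which is precisely why this line of work sidesteps the definitional problem described above.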

An additional fundamental concept of the DPD is that of data “processing”, defined to mean “any operation or set of operations which is performed upon personal data … such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction”. Here, Birnhack points out the clear linearity of the concept: data are collected by an organization, flow through its systems (storage, retrieval, use…) and are finally put to rest (erasure or destruction). As Birnhack explains, while apparently tech-neutral, “the linear sequence assumes a particular technological environment.”
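
To make the linearity assumption visible, one can caricature the Directive’s implicit data lifecycle as an ordered pipeline with a single entry point (collection) and a single exit point (erasure). This is only an illustrative sketch of the assumption Birnhack identifies, not anything from the Directive’s text or his article; the stage names loosely paraphrase the definition quoted above.

```python
# An illustrative model of the linear lifecycle the DPD assumes:
# one controller, one record, one ordered sequence of operations.
LINEAR_LIFECYCLE = [
    "collection",
    "recording",
    "organization",
    "storage",
    "use",
    "disclosure",
    "erasure",
]

def process(record):
    """Walk a record through the assumed linear sequence, start to finish."""
    for stage in LINEAR_LIFECYCLE:
        print(f"{stage}: {record}")
    # The model has no loops, forks or third-party re-entry points.
    # Today's data flows look more like a graph: many actors, many
    # copies, and no guaranteed terminal erasure.

process("data subject's record")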

The DPD’s vision of data “processing” is – how shall I put it – very “1970s”. In those days, an active “data controller” would collect data from a passive individual and then store, use or transfer it until its ultimate deletion. Today, with the explosion of peer-produced content on social networking services and the introduction of layer upon layer of service providers into the data value chain, this linear model has, in many contexts, become obsolete. Privacy risks are now posed by an indefinite number of geographically dispersed actors, not least individuals themselves, who voluntarily share their own information and that of their friends and relatives. Moreover, in many contexts, such as mobile applications, behavioral advertising or social networking services, it is not necessarily the controller but rather an intermediary or platform provider that wields control over information.

An additional fundamental concept of the DPD, not addressed in Birnhack’s paper, is that of location. The DPD views data transfers as discrete point-to-point transactions between two data controllers. This view of data as “residing” in a jurisdiction no longer fits the ephemeral, geographically indeterminate nature of cloud storage and transfers. For many years, transborder data flow regulation has caused much consternation to businesses on both sides of the Atlantic, while generating formidable legal fees. Unfortunately, this does not seem about to change.

(Re-posted with permission from Concurring Opinions blog. See original post here).
