Stanford CIS

"Legally Defensible" Security and the (Surprisingly Lacking) Right to Privacy

By Larry Downes

I participated last week in a Techdirt webinar titled “What IT needs to know about Law.” (You can read Dennis Yang’s summary here, or follow his link to watch the full one-hour discussion. Free registration required.)

The key message of The Laws of Disruption is that IT and other executives need to know a great deal about law—and more all the time. And Techdirt does an admirable job of reporting the latest breakdowns between innovation and regulation on a daily basis. So I was happy to participate.

Not surprisingly, there were far too many topics to cover in a single seminar, so we decided to focus narrowly on just one: potential legal liability when data security is breached, whether through negligence (a lost laptop) or the criminal act of a third party (a hacking attack).

We were fortunate to have as the main presenter David Navetta, founding partner of The Information Law Group, who had recently written an excellent article on what he calls “legally defensible security” practices.

I started the seminar off with some context, pointing out that one of the biggest surprises for companies in the Internet age is the discovery that, having posted a website on the World Wide Web, they are suddenly and often inappropriately subject to the laws and jurisdiction of governments around the world. (How wide is the web? World.)

In the case of security breaches, for example, a company may be required under state law to disclose the incident to affected third parties (customers, employees, etc.). At the other extreme, executives of the company handling the data may be criminally liable if the breach involved personally identifiable information of citizens of the European Union (e.g., the infamous Google Video case in Italy earlier this year, which is pending appeal). Individuals and companies affected by a breach may sue the company under a variety of common law claims, including breach of contract (perhaps the violation of a stated privacy policy) or simple negligence.

The move to cloud computing amplifies and accelerates the potential nightmares. In the cloud model, data and processing are subcontracted over the network to a potentially wide array of providers who offer economies of scale, application or functional expertise, scalable hardware, or proprietary software. Data is everywhere, and its disclosure can occur in an exploding number of inadvertent ways. If a security breach occurs in the course of any given transaction, just untangling which parties handled the data—let alone who let it slip out—could be a logistical (and litigation) nightmare.

Not all security breaches involve private or personal information, but it’s not surprising that the most notable breakdowns in security (or at least the most vividly reported) are those that expose consumer or citizen data, sometimes for millions of affected parties. (Some of the most egregious losses have involved government computers left unsecured, with sensitive citizen data unencrypted on the hard drive.) Consumer computing activity has surpassed corporate computing and is growing much faster. Privacy and security are topics that are increasingly hard to disentangle.

Which is not to say that the bungling of data affecting millions of users necessarily translates to legal consequences for the company that held the information. Under current law, even the most irresponsible behavior by a data handler does not necessarily result in liability.

For one thing, U.S. law does not require companies to spare no expense in protecting data. As David Navetta points out, courts may find that, despite a breach, the precautions a company took were nonetheless economically sensible: justified given the likelihood of a breach and the potential consequences that followed. Adherence to ISO or other industry standards on data security may be sufficient to insulate a company from liability—though not always. (Courts sometimes find that industry standards are too lax.)

For the most part, tort law still follows the classic negligence formula of the beatified American jurist Learned Hand, who explained that the duty of courts was to encourage behavior by defendants that made economic sense. If courts found liability any time a breach occurred, data handlers would be incentivized to spend inefficient amounts of money on protection, leading to a net social loss. (The classic cases involved sparks from locomotives causing fire damage to crops; perfect avoidance of the damage, the courts ruled, would cost too much relative to the harm caused and the probability of it occurring.)
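Hand’s calculus is conventionally summarized as B < PL: a defendant is negligent only if the burden of precautions (B) is less than the probability of the loss (P) multiplied by the gravity of that loss (L). A minimal sketch of the comparison, with figures invented purely for illustration:

```python
def negligent(burden: float, probability: float, loss: float) -> bool:
    """Hand formula: liability attaches only when the cost of the
    untaken precaution (B) is less than the expected harm (P * L)."""
    return burden < probability * loss

# Hypothetical breach: suppose encrypting the laptops would have cost
# $50,000, a theft exposing the data had a 1% annual probability, and
# the harm from exposure would be $10,000,000. Expected harm P*L is
# $100,000, so skipping the $50,000 precaution fails the Hand test.
print(negligent(burden=50_000, probability=0.01, loss=10_000_000))   # True
print(negligent(burden=200_000, probability=0.01, loss=10_000_000))  # False
```

On these invented numbers, the first defendant is negligent (the cheap precaution was worth taking), while the second is not: spending $200,000 to avert $100,000 of expected harm is exactly the inefficient overspending the rule is meant to discourage.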

That, at least, is the common law regime that applies in the U.S. The E.U., under laws enacted in support of its 1995 Privacy Directive, follows a different rule, one that comes closer to product liability law, where any failure leads to per se liability for the manufacturer, or indeed for any company in the chain of sales to a consumer.

A case last week from the Ninth Circuit Court of Appeals, however, reminds us that a finding of liability doesn’t necessarily lead to an award of damages. In Ruiz v. Gap, a job applicant whose personal information was lost when two laptop computers were stolen from a Gap vendor that was processing applications sued Gap, claiming to represent a class of applicants who were victims of the loss.

For more, see "The Privacy and Security Totentanz."

Published in: Blog, Privacy, Notice by Design