Stanford CIS

Securing Privacy: Challenges for the Chief Privacy Officer

By Stanford Center for Internet and Society

Defining the Legal Standard for Information Security
Thomas Smedinghoff

He wants to look at the practical question surrounding security: what does the law require you to do?

Laws and regulations fall into four different categories:
- “Just make it happen” – require that you guarantee the security of the data
- “Do this specific thing” – specify certain security mechanisms and methods
- “Do what is reasonable” – require that you use “reasonable” or “commercially reasonable” security
- “Follow this process” – this is where regulations are heading – setting up best practices for businesses

Developing law of information security:

“Companies have a legal obligation to provide security.” This is extending to third parties and is becoming an upper management responsibility, not that of coders, etc.

What do you have to do? The emerging focus on “process”:

- Asset assessment of what needs to be protected
- Risk assessment of the vulnerabilities of those assets
- Assessing the burden

So what are we protecting? We have many different types of information – financial, health-care, trade secret information… What are the foreseeable threats to this information? What is the likelihood that those threats will materialize? What is the potential damage that each of them would cause? This is now being codified in regulations and consent decrees. The focus is increasingly on process.
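The threat-weighing step described above – enumerate assets, list foreseeable threats, and weigh likelihood against potential damage – can be sketched as a simple expected-harm ranking. All asset names, threats, and numbers below are illustrative, not drawn from the talk.

```python
# Hypothetical sketch of the risk-assessment process: for each asset,
# list foreseeable threats with an estimated likelihood and damage,
# then rank threats by expected harm (likelihood * damage).

assets = {
    "customer_financial_records": [
        # (threat, likelihood 0-1, damage in arbitrary cost units)
        ("database breach", 0.05, 1_000_000),
        ("insider misuse", 0.10, 200_000),
    ],
    "trade_secrets": [
        ("laptop theft", 0.15, 500_000),
    ],
}

def rank_risks(assets):
    """Return (asset, threat, expected harm) tuples, worst first."""
    scored = [
        (asset, threat, likelihood * damage)
        for asset, threats in assets.items()
        for threat, likelihood, damage in threats
    ]
    return sorted(scored, key=lambda item: item[2], reverse=True)

for asset, threat, harm in rank_risks(assets):
    print(f"{asset}: {threat} -> expected harm {harm:,.0f}")
```

The point of the ranking is the one the panel makes: there is no fixed answer, only a defensible process for deciding where the burden of protection is justified.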

Essentially, what we’re doing is a tort-type analysis. Industry standards might be a minimum; “best practices” may not be sufficient.

Most statutes and regulations focus on physical, administrative, and technical security measures. (making sure that your employees don’t post their passwords on their CRTs…)

Continual reevaluation required. “Security is never done.” It is a continual process of analyzing and reassessing as your business changes, and adjusting what it is that you should be doing. Many of the consent decrees that the FTC is entering into require audits, for example.

The bottom line: there is no fixed answer to what it is you need to do. Companies must go through the analysis process. Answering “how do I know when I’ve done enough?” is very, very difficult.
--
Can we get to the point where companies can certify that they’ve done what’s legally required to secure privacy?

Check out bakernet.com/ecommerce for more information about these statutes.

16h45
Graham filling in for Jake for a few minutes

Andrew Charlesworth – University of Bristol

He is:
Not making another attempt to explain why EU information privacy laws are better than US info privacy laws.

Not talking about existing difficulties in complying with and enforcing EU info privacy law.

Not talking about difficulties in complying with and enforcing UK info privacy law.

Not talking about ongoing PET/PIT debate, nor suggesting that P3P is bad.

16h51
Jake back

He is:
Arguing that we should move away from the type of command and control regulation we have at the moment.
These types of regulation will not work in the future.

Some kind of de-centralized regulation is required. But whatever we do, we need to have a better idea of what we’re going to protect.

The public perception of privacy changes very, very rapidly. 1970s literature seems as outdated as 1920s literature. Guess what—we’re in a different era.

If state centered regulation fails, then decentered regulation HAS to work. If it doesn’t, this will pose major difficulties with the implementation of new technologies.

Episodic laws: We make a regulation. People forget about it and go back to their old practices. There is no attempt to engage with what privacy interests are at issue or what “privacy as a human right” means. We draw largely from our past regulatory tool-kit. There is no attempt to innovate. Such laws are “impotent in the face of radical technological change.” We simply do not have the tools to deal with that change.

We need to totally reevaluate:
- how regulators regulate
- how regulators perceive their role
- how the regulated react

Sometimes privacy regulation should be destructive. If it causes certain organizations major distress, that should not keep us from implementing the privacy measures (e.g., Do Not Call databases).

We need to involve the regulated in the regulation a lot more. We need to mix our regulation. If this is done properly, we don’t need to add incrementally to our laws with new technology. Our law should be flexible enough to handle emerging tech.

The EU’s Data Protection Directive is no good. There was little dialogue in its creation and little desire to put privacy into context. It was developed pre-Internet.

Classic arguments: It is not transparent. It is not accountable.

Federated Identity Management will be very difficult for the EU to handle.

Mobile commerce will spread across borders; it will distance data providers from users; it will make it ambiguous who controls information. You can’t apply command and control: you’ll have to accept the demise of centrally controlled data industry regulation.

Alex Fowler
PricewaterhouseCoopers

Embedding Meaningful Privacy Measures Into Business

Privacy laws are only one of the many issues on the table. There are competing interests, which means privacy has to be put into context.

Organizations are becoming decentralized. Products/services stand as their own businesses, and this makes it difficult to understand how privacy should be handled throughout the company.

Return on investment to working on privacy is also a consideration.

There are two schools of thought about privacy. 1) It is a threat to business, a cost center, and not going to add shareholder value. 2) The new school: it looks at privacy as an opportunity, a new way of doing business. This gets back to branding, marketing, and fostering long-term customer relationships. Both of these perspectives are at play with each client. Organizations are at odds with themselves.

All of the money spent on compliance comes down to the quality of the experience a person has when he or she is asked to sign a privacy disclosure.

Organizations will continually be in a process of policy reinvention. There will be a disconnect between the incentive to do the right thing and the added gain from using a piece of private information.

Typical thinking: “seek as much data as possible and then try to commodify it later”

Organizations are very confused: they think that a dollar spent on information security is a dollar spent addressing privacy.

Organizations also think that taking care of privacy online will take care of it offline.

There are three types of people with respect to privacy: privacy pragmatists, privacy fundamentalists, and the unconcerned. Organizations cannot ignore any one of these groups lest they risk losing sales.

…If there are better ways to design your data management infrastructure to aggregate, anonymize, and weed out information, you’re on the right track…
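The “aggregate, anonymize, and weed out” idea can be illustrated with a minimal sketch: drop direct identifiers and replace a stable key with a one-way hash so records can still be linked without exposing the raw identifier. The field names and the specific approach here are hypothetical, not anything the speaker prescribed.

```python
# Illustrative anonymization pass over a single record.
import hashlib

def anonymize(record, drop=("name", "email", "phone")):
    """Drop direct identifiers and hash the user id for linkage."""
    out = {k: v for k, v in record.items() if k not in drop}
    # A real system would use a keyed hash (HMAC with a secret) so the
    # ids cannot be recovered by a dictionary attack; plain SHA-256 is
    # shown only to keep the sketch self-contained.
    if "user_id" in out:
        out["user_id"] = hashlib.sha256(
            str(out["user_id"]).encode()
        ).hexdigest()[:16]
    return out

record = {"name": "Jane Doe", "email": "jane@example.com",
          "user_id": 42, "zip": "94305", "purchase": "book"}
print(anonymize(record))
```

Note that even this leaves quasi-identifiers (ZIP code, purchase history) in place; weeding those out or aggregating them is where the real design work lies.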

Jon Sobel
Folger Levin & Kahn LLP

Covering the issue from the perspective of someone who is in a business.

The way that organizations collect information is often a result of legacy issues. It is very convoluted. You must go back and unwind all of that. There is not one person who knows what all the data is, where it is coming in, and how it is being handled. Most issues arise because of mistakes.

So what’s the mindset that organizations have? We are currently in a contract/market regime… but there must be some sort of market failure. There is a belief among many that something is not working, that something is happening with their data…

There is a whole range of expectations about data that is very context driven. We should decide what people really care about. It is early enough yet that we don’t really know the answers to many of these issues.

So what’s it like from the inside: internal challenges cannot be overemphasized. The imperative for most people in business is to make money, to help the organization succeed. BUT – it is also in the company’s self interest to do the right thing because of their brand. Smart business people really do understand that they should not blow privacy.

There is always a tradeoff in contracts between thoroughness on one hand and readability on the other. At one point, Yahoo! hired an English professor with a law degree to ensure that their privacy policy was readable to a seventh grader.
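The “readable to a seventh grader” target can be checked mechanically with a readability formula such as Flesch-Kincaid grade level. The sketch below is a crude approximation (the syllable counter is a heuristic, not a real linguistic tool) and is not something the speaker described Yahoo! using.

```python
# Rough Flesch-Kincaid grade-level check for a privacy-policy draft.
import re

def count_syllables(word):
    """Heuristic syllable count: vowel groups, minus a silent final e."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def fk_grade(text):
    """Flesch-Kincaid grade: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)

policy = "We collect your email. We use it to send you updates."
print(f"grade level: {fk_grade(policy):.1f}")
```

A draft scoring well above 7 signals the thoroughness side of the tradeoff is winning.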

Are the ordinary notions of consent applicable to the Internet? Regulators might say that companies did not seek adequate consent. Regulators might want consent-plus.

There are also a number of different regulators who can fine a company/regulate it.

A plea: Having spent much time trying to represent Yahoo! in changing times, it is very helpful to be open and for people to acknowledge what they do/don’t know. As we work through these things, it will go much better if we simply recognize that this is difficult stuff and we haven’t gotten it figured out yet, but, as always, we’ll figure out some good answers.

QUESTIONS:

Not really a question, but an observation: people are willing to try potentially privacy-injuring technologies if they are paid money or given some other form of incentive to do so.

Is there any hard evidence that people would be loyal to brands that protect privacy?

Panel’s Answer: There are a number of marketing studies that have advanced that argument, but we’re not at the stage to say that such studies are conclusive… Several very solid Internet companies believe that they must fix privacy issues immediately.
