Stanford CIS

PCLOB: Defining Privacy Interests, 2/2

By Richard Forno

Panel 2 of today's PCLOB public hearing explored privacy interests in a counterterrorism context, with an emphasis on technology's impact on those interests. The discussion was fairly fluid and free-flowing, but a few salient points emerged that are worth commenting on.

Georgia Tech's Annie Anton believes that practical anonymization is good for society and that mandated backdoors to encryption are not. (Rick here: I agree. The so-called "Golden Key" is a bad idea and simply tells a would-be adversary to "go dark," thereby making tracking them even more difficult. Unfortunately, the FBI is currently revisiting this defunct 1990s idea.) She strongly supports the use of robust encryption by default in products and services and the inclusion of privacy considerations in threat-modeling practices. In something of a rarity for Washington, she recommends improving the "state of practice" in cryptanalysis and code-breaking techniques rather than placing society at risk by mandating exploitable, lower-quality cryptographic security for all.

Alvaro Bedoya of Georgetown's Center on Privacy & Technology echoed Ed Felten's earlier comments by urging the government to resist abandoning proposed restrictions on the collection of data in favor of restrictions merely on its use. Here again, the concern was that once data is collected, there is no guarantee of how it might, can, or will be used in the future, regardless of whatever government policy is in place at the time of collection. Citing the historical example of census data being used to identify and detain Japanese-American citizens during World War II, Bedoya warned that "today's discrimination is yesterday's national security measure." He also reiterated the need for competent, knowledgeable oversight, noting that many members of Congress lack staff who are "properly cleared" to assist on these issues, leaving the elected official at a significant knowledge disadvantage when analyzing or voting on them.

Microsoft's chief privacy officer Michael Hintze told the Board that customers "won't use their products if they don't trust them" and went on to describe Microsoft's privacy-by-design approach to necessary data collection and product development. In terms of surveillance reforms, he emphatically called for an end to bulk collection, expressed a desire that governments not hack or otherwise routinely target data centers or Internet connections, recommended adversarial advocates during FISC hearings, and stressed that any discussion of privacy and surveillance policy and practice must be global in scope to be effective and to obtain meaningful stakeholder buy-in over the long term.

As a technologist, Hadi Nahari, Chief Security Architect at NVIDIA, shared many of these views. He particularly emphasized that while the Internet of Things is the next big concern for privacy advocates, the underlying technologies, designs, and protocols of the Internet itself make security and privacy difficult to implement. During Q&A, he endorsed the idea of an organic, user-level data-revocation capability whereby "pressing a button" could instantly terminate a company's collection, retention, and/or use of a user's data on demand. Nahari suggested that in some cases such a capability could help citizens control how their information is collected and used by government entities, but acknowledged that in the government sphere it has legitimate limitations.

The second panel successfully built upon the issues raised earlier in the morning, highlighting the role of technology in both facilitating and hindering solutions to the security-versus-privacy problem.

In other news, as we break for lunch, I notice that "NSA Surveillance Van 13" seems to have driven off.

(I'm unable to get back in time for Panel #4, unfortunately.)