Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review and popular national publications such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
For those who don't know it, Surprisingly Free has hosted many excellent guests, so I recommend exploring the website. If you're interested in law and technology podcasts, I also highly recommend CIS's own Hearsay Culture.
Privacy settings and other technological controls used to protect privacy have drawn justified criticism lately. Danielle Citron recently blogged at Concurring Opinions about an important new study by Columbia’s Michelle Madejski, Maritza Johnson, and Steve Bellovin, which found that Facebook’s default privacy settings fail to capture real-world expectations. The United Kingdom Government recently indicated that browser settings alone cannot be used by Web users to give consent to being tracked online under a new EU law, reasoning that these settings are not flexible enough to reflect a user's true privacy preferences. The general consensus seems to be that most privacy settings simply aren't very good at protecting the information we actually consider private in a given context. Some skepticism regarding privacy controls is warranted, particularly in light of current technology. However, I'd like to show some support for privacy controls, or, rather, the promise of privacy controls. My hope is that courts and lawmakers do not completely sour on recognizing privacy controls as a legitimate way to protect an Internet user's privacy.
In the past few weeks, several potential employers and schools have reportedly asked applicants or students for access to their Facebook profiles. These reports are starting to feel like a trend. I think these requests are problematic not just for the Facebook user, but also for the employer or administrator asking for access. In short, anyone asking for access to Facebook profiles and/or login credentials is asking users to betray the trust of their network and subjecting all parties involved to the potential deactivation of their Facebook accounts.
Website scraping, which is the bulk extraction of website information by software, is becoming an increasingly visible activity. The Lovely-Faces controversy shows how scraped information can disrupt a sense of privacy when re-published in a different context. The Lovely-Faces website, deemed “a social experiment” by its creators, re-contextualizes names, locations, and photos scraped from publicly accessible Facebook pages in a mock dating website.
The Michigan Law Review recently published “The Fight to Frame Privacy,” Woodrow Hartzog's book review of Daniel Solove’s “Nothing to Hide: The False Tradeoff Between Privacy and Security.”
Read the full review here: http://www.michiganlawreview.org/articles/the-fight-to-frame-privacy
The concept of implied confidentiality has deep legal roots, but it has been largely ignored by the law in online-related disputes. A closer look reveals that implied confidentiality has not been developed enough to be consistently applied in environments that often lack obvious physical or contextual cues of confidence, such as the Internet. This absence is significant because implied confidentiality could be one of the missing pieces that help users, courts, and lawmakers meaningfully address the vexing privacy problems inherent in the use of the social web.
Disclosing personal information online often feels like losing control over one’s data forever, but this loss is not inevitable. This essay proposes a “chain-link confidentiality” approach to protecting online privacy. One of the most difficult challenges to guarding privacy in the digital age is the protection of information once it is exposed to other people. A chain-link confidentiality regime would contractually link the disclosure of personal information to obligations to protect that information as the information moves downstream.
On the Internet, obscure information has a minimal risk of being discovered or understood by unintended recipients. Empirical research demonstrates that Internet users rely on obscurity perhaps more than anything else to protect their privacy. Yet, online obscurity has been largely ignored by courts and lawmakers. In this article, we argue that obscurity is a critical component of online privacy, but it has not been embraced by courts and lawmakers because it has never been adequately defined or conceptualized.
"Hearing witness Woodrow Hartzog, associate professor at Cumberland School of Law, said he would like to see minimal pre-emption of state rules and would like the FTC to be given rulemaking authority in association with legislation."
"If a national law preempts strong state laws, "hard won consumer protections will be lost," added Woodrow Hartzog, a law professor focused on data privacy issues at Samford University."
"Privacy law expert Woodrow Hartzog, however, will push back on any effort to have a federal law override state rules. "Our critical data protection infrastructure will be weakened if federal legislation scales back protection, consolidates regulatory authority, and sets specific rules in stone," he said in written testimony."
"Woodrow N. Hartzog, a law professor who studies privacy, said that without knowing more about how private companies use the Healthcare.gov data, it’s hard to tell how worried users of the site should be. “Are third-party recipients of this information allowed to share with other people?” he asked. “Are they under an obligation to keep from trying to re-identify that information” (that is, from trying to link data to people’s real identities)? “Without transparency,” he said, “it’s really difficult to know actually how concerned we should be about this.”"
"Online black markets are likely to continue to be created and shut down. Yet this trial has also reminded us of the limits of technology. When the Internet was in its infancy, many thought online activity was also beyond the reach of the law. We've seen time and time again this is just not true. Bitcoin is a very powerful and interesting technology, but it is important not to overestimate innovation. It's equally important not to underestimate how our offline actions can make us vulnerable online."
Solutions to many pressing economic and societal challenges lie in better understanding data. New tools for analyzing disparate information sets, called Big Data, have revolutionized our ability to find signals amongst the noise. Big Data techniques hold promise for breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Yet privacy advocates are concerned that the same advances will upend the power relationships between government, business, and individuals, and lead to prosecutorial abuse, racial or other profiling, discrimination, redlining, overcriminalization, and other restrictions on freedom.
‘Read Me’ Or Just Tap ‘I Agree’
There’s a huge group of people at work behind our screens. They’re called behaviour architects, persuasive designers or user-experience specialists and the power they have is massive.
That urge to keep swiping through your Twitter feed? That’s design. The way we all click ‘I Agree’ to the terms and conditions? That’s design. Swiping right or left on Tinder? Well, that’s design too.
We live in an online world of someone else’s making and most of us never even give it a second thought. And actually, that’s design as well.
Speaking before the audience at the recent IAPP Data Protection Congress in Brussels, keynoter Woody Hartzog made a challenging assertion: "Control is the wrong goal for privacy by design, perhaps the wrong goal for data protection in general." But isn't control a central tenet of good privacy? It sure is. But it shouldn't be, the author of "Privacy’s Blueprint: The Battle to Control the Design of New Technologies" argued. While everyone emphasizes "control" of personal data as core to privacy, too much zeal for control dilutes efforts to design information tech correctly.
Design is one of the most important but overlooked factors that determines people’s privacy. Social media apps, surveillance technologies, and the Internet of Things are all built in ways that make it hard to guard personal information. And the law says this is okay because it is up to users to protect themselves ― even when the odds are deliberately stacked against them.
Our modern privacy frameworks, with their emphasis on gaining informed consent from consumers in order to use their data, are broken models. That's according to Woodrow Hartzog, a law professor at Northeastern University in Boston. In this episode of The Privacy Advisor Podcast, Hartzog discusses the ways that, given such models, technologies are designed at the engineering level to undermine user privacy.
Recently, 50 million Facebook users had their personal information extracted and used for political and commercial purposes. In the wake of this scandal, we’ve all become much more aware of how our use of social media clashes with our desire for privacy. Are technical fixes and awareness enough, or is it time for Facebook and other online services to be regulated? Our guest Woodrow Hartzog, a professor of law and computer science at Northeastern University, discusses the battle over our personal information and its future.