Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review and popular national publications such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
I have just uploaded a new essay about online privacy to SSRN that will appear in Volume 46 of the Georgia Law Review. The essay, titled "Chain-Link Confidentiality," asserts that personal information shared online can be better protected if we require our confidants to make sure that their confidants are watching out for us. This strategy could help us retain control over our personal information as it moves downstream. Your comments are warmly welcomed.
Last week, the Supreme Court issued its opinion in United States v. Jones, in which the Justices held that the government's installation of a GPS device on a target's vehicle, and its use of that device to monitor the vehicle's movements, constituted a Fourth Amendment search. The decision was surprisingly unanimous on this point, though concurring opinions by Justices Sotomayor and Alito potentially amplify the significance of the opinion by proposing alternate approaches to the larger problem of ubiquitous surveillance technologies and privacy in public. Given the majority opinion's narrow focus on the attachment of the device to the car, the larger issue of privacy in public remains unsettled.
Others have done an exemplary job of commenting on the decision. The dominant themes emerging from the opinion and the commentary on it seem to be the (re?)injection of the concept of trespass into Fourth Amendment doctrine, signs that the third-party doctrine may be withering, and recognition that Fourth Amendment and privacy doctrine will soon enough be useless if they do not adequately protect against ever-evolving surveillance methods and technologies.
I'd like to focus on an aspect of the decision that has not shown up much in the analysis of the case, likely because it was never explicitly mentioned in the text. Although the word obscurity does not appear anywhere in United States v. Jones, I think the decision, particularly Justice Sotomayor's concurring opinion, supports the idea that the obscurity of our personal information is worth protecting.
Co-authored with Daniel Solove.
Third-party data service providers, especially providers of cloud computing services, present unique and difficult privacy and data security challenges. While many companies that directly collect data from consumers are bound by the promises they make to individuals in their privacy policies, cloud service providers are usually not a part of this arrangement. It is not entirely clear what, if any, obligations cloud service providers have to protect the data of individuals with whom they have no contractual relationship.
“A lot of what we’re getting at in the Carpenter case,” said Woodrow Hartzog, professor of law and computer science at Northeastern, “is a growing sense of discontent from the judges over the seemingly simplistic rules we crafted years ago about when and how the government can surveil and collect information about us in light of all these powerful information technologies.”
Every day we use countless digital devices and web services to shop, track our fitness, chat with friends, play games, check in at stores and restaurants, you name it. While these activities are becoming increasingly essential in our digital society, they also can put our personal information at risk, says professor Woodrow Hartzog, whose research focuses on privacy, data protection, robotics, and automated technologies.
"People always complain that this is a slap on the wrist compared to Europe," says Woodrow Hartzog, a professor of law and computer science at Northeastern University School of Law. "But there's only so much the FTC can do."
According to university law professors Woodrow Hartzog and Danielle Citron, “It is the first such complaint by the FTC that involved bots designed to actively deceive consumers.” It’s one thing to create a Twitter chatbot that acquires hundreds of followers who might not know it isn’t a real person. It’s quite another to maliciously program a bot to commit a crime.
Solutions to many pressing economic and societal challenges lie in better understanding data. New tools for analyzing disparate information sets, called Big Data, have revolutionized our ability to find signals amongst the noise. Big Data techniques hold promise for breakthroughs including better health care, a cleaner environment, safer cities, and more effective marketing. Yet privacy advocates are concerned that the same advances will upend the power relationships between government, business, and individuals, and lead to prosecutorial abuse, racial or other profiling, discrimination, redlining, overcriminalization, and other restrictions on freedom.
‘Read Me’ Or Just Tap ‘I Agree’
There’s a huge group of people at work behind our screens. They’re called behaviour architects, persuasive designers or user-experience specialists, and the power they have is massive.
That urge to keep swiping through your Twitter feed? That’s design. The way we all click ‘I Agree’ to the terms and conditions? That’s design. Swiping right or left on Tinder? Well, that’s design too.
We live in an online world of someone else’s making and most of us never even give it a second thought. And actually, that’s design as well.
Speaking before the audience at the recent IAPP Data Protection Congress in Brussels, keynoter Woody Hartzog made a challenging assertion: "Control is the wrong goal for privacy by design, perhaps the wrong goal for data protection in general." But isn't control a central tenet of good privacy? It sure is. But it shouldn't be, the author of "Privacy’s Blueprint: The Battle to Control the Design of New Technologies" argued. While everyone emphasizes "control" of personal data as core to privacy, too much zeal for control dilutes efforts to design information tech correctly.
Design is one of the most important but overlooked factors that determines people’s privacy. Social media apps, surveillance technologies, and the Internet of Things are all built in ways that make it hard to guard personal information. And the law says this is okay because it is up to users to protect themselves ― even when the odds are deliberately stacked against them.
Our modern privacy frameworks, with their emphasis on gaining informed consent from consumers in order to use their data, are broken models. That's according to Woodrow Hartzog, a law professor at Northeastern University in Boston. In this episode of The Privacy Advisor Podcast, Hartzog discusses the ways that, given such models, technologies are designed at the engineering level to undermine user privacy.
Recently, 50 million Facebook users had their personal information extracted and used for political and commercial purposes. In the wake of this scandal, we’ve all become much more aware of how our use of social media clashes with our desire for privacy. Are technical fixes and awareness enough, or is it time for Facebook and other online services to be regulated? Our guest Woodrow Hartzog is a professor of law and computer science at Northeastern University and discusses the battle over, and the future of, our personal information.