Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review and popular national publications such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
I have just uploaded a new essay about online privacy to SSRN that will appear in Volume 46 of the Georgia Law Review. The essay, titled "Chain-Link Confidentiality," asserts that personal information that is shared online can be better protected if we require our confidants to make sure that their confidants are watching out for us. This strategy could help us retain control over our personal information as it moves downstream. Your comments are warmly welcome.
Last week, the Supreme Court issued its opinion in United States v. Jones, in which the Justices held that the government's installation of a GPS device on a target's vehicle, and its use of that device to monitor the vehicle's movements, constituted a Fourth Amendment search. The decision was surprisingly unanimous on this point, though concurring opinions by Justices Sotomayor and Alito potentially amplify the significance of the opinion by proposing alternate approaches to the larger problem of ubiquitous surveillance technologies and privacy in public. Given the majority opinion's narrow focus on the attachment of the device to the car, the larger issue of privacy in public remains unsettled.
Others have done an exemplary job of commenting on the decision. The dominant themes emerging from the opinion and the commentary on it seem to be the (re?)injection of the concept of trespass into Fourth Amendment doctrine, signs of a potential withering of the third-party doctrine, and a recognition that Fourth Amendment and privacy doctrine will soon enough be useless if they do not adequately protect against ever-evolving surveillance methods and technologies.
I'd like to focus on an aspect of the decision that has not shown up much in the analysis of the case, likely because it was never explicitly mentioned in the text. Although the word "obscurity" does not appear anywhere in United States v. Jones, I think the decision, particularly Justice Sotomayor's concurring opinion, supports the idea that the obscurity of our personal information is worth protecting.
To hear some in industry and government tell it, the answer to our modern privacy dilemma is simple: give users more control. There is seemingly no privacy-relevant arena, from social media to big data to biometrics, that cannot be remedied with a heaping dose of personal control. Facebook founder and CEO Mark Zuckerberg said, “What people want isn’t complete privacy. It isn’t that they want secrecy.
On December 14, 2016, the Federal Trade Commission settled a complaint against the company running the adult dating site Ashley Madison over the 2015 data breach that exposed the personal data of more than 36 million users and highlighted the site’s unfair and deceptive practices.
"Every day we use countless digital devices and web services to shop, track our fitness, chat with friends, play games, check in at stores and restaurants, you name it. While these activities are becoming increasingly essential in our digital society, they can also put our personal information at risk," says professor Woodrow Hartzog, whose research focuses on privacy, data protection, robotics, and automated technologies.
"People always complain that this is a slap on the wrist compared to Europe," says Woodrow Hartzog, a professor of law and computer science at Northeastern University School of Law. "But there's only so much the FTC can do."
According to law professors Woodrow Hartzog and Danielle Citron, “It is the first such complaint by the FTC that involved bots designed to actively deceive consumers.” It’s one thing to create a Twitter chatbot that acquires hundreds of followers who might not know it isn’t a real person. It’s quite another to maliciously program a bot to commit a crime.
On Feb. 16, eight privacy and security law professors—Kenneth A. Bamberger, Woodrow Hartzog, Chris Jay Hoofnagle, William McGeveran, Deirdre K. Mulligan, Paul Ohm, Daniel J. Solove and Peter Swire—filed a brief in support of the FTC.
The Tech/Law Colloquium speaker for September 19, 2017 will be Woodrow Hartzog, a professor of law and computer science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Talk: Privacy’s Blueprint: The Battle to Control the Design of New Technologies
Robots are starting to look suspiciously familiar. Increasingly sophisticated robots designed to resemble us are striking up more and more symbiotic relationships with humans, at home as our companions and at our workplaces as colleagues.
Human-robot interactions will continue to evolve as robotic technology transforms the way we see our creations and the way they react to us. But as machines cease acting like machines and become more integrated into our lives, how will we feel about them? And, dare we ask, how will they feel about us?
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
2013 PRIVACY PAPERS FOR POLICY MAKERS
The Future of Privacy Forum
Co-chairs Jules Polonetsky and Christopher Wolf
in conjunction with Congresswoman Sheila Jackson Lee invite you to
“Privacy Papers for Policy Makers”
A discussion of leading privacy research
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog are listed as participants for We Robot 2014. Robotics is becoming a transformative technology. We Robot 2014 builds on existing scholarship exploring the role of robotics to examine how the increasing sophistication of robots and their widespread deployment everywhere from the home, to hospitals, to public spaces, and even to the battlefield disrupt existing legal regimes or require rethinking of various policy issues. If you are on the front lines of robot theory, design, or development, we hope to see you.
Watch the full video at the Energy & Commerce Committee website.
Woodrow Hartzog, Associate Professor Cumberland School of Law
See more at: http://energycommerce.house.gov/hearing/what-are-elements-sound-data-bre...
CIS Affiliate Scholar David Levine interviews Prof. Ryan Calo of University of Washington School of Law and Woodrow Hartzog of Cumberland School of Law on robotics law.
Listen to the full radio show (in German) at Deutschlandradio.
"On the other hand, even algorithms can make mistakes. They are, after all, written by humans. And legal texts in particular are difficult to translate into a formalized language. They are, says Woodrow Hartzog, simply not made to be automated. And they are not made to be enforced one hundred percent."