Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications, such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review, and in popular national publications such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
For those who don't know it, Surprisingly Free has hosted many excellent guests, so I recommend exploring the website. If you're interested in law and technology podcasts, I also highly recommend CIS's own Hearsay Culture.
Privacy settings and other technological controls used to protect privacy have been justifiably criticized lately. Danielle Citron recently blogged at Concurring Opinions about an important new study by Columbia’s Michelle Madejski, Maritza Johnson, and Steve Bellovin, which found that Facebook’s default privacy settings fail to capture real-world expectations. The United Kingdom Government has recently indicated that browser settings alone cannot be used by Web users to give consent to being tracked online under a new EU law, reasoning that those settings are not flexible enough to reflect a user’s true privacy preferences. The general consensus seems to be that most privacy settings simply aren’t very good at protecting the information we actually consider private in a given context. Some skepticism regarding privacy controls is warranted, particularly in light of the current technology. However, I’d like to show some support for privacy controls, or, rather, for the promise of privacy controls. My hope is that courts and lawmakers do not completely sour on recognizing privacy controls as a legitimate way to protect an Internet user’s privacy.
In the past few weeks, several potential employers and schools have reportedly asked applicants or students for access to their Facebook profiles. These reports are starting to feel like a trend. I think these requests are problematic not just for the Facebook user, but also for the employer or administrator asking for access. In short, anyone asking for access to Facebook profiles or login credentials is asking users to betray the trust of their network and subjecting all parties involved to the potential deactivation of their Facebook accounts.
Website scraping, which is the bulk extraction of website information by software, is becoming an increasingly visible activity. The Lovely-Faces controversy shows how scraped information can disrupt a sense of privacy when re-published in a different context. The Lovely-Faces website, deemed “a social experiment” by its creators, re-contextualizes names, locations, and photos scraped from publicly accessible Facebook pages in a mock dating website.
We are constantly exposed in public. Yet most of our actions will fade into obscurity. Do you, for example, remember the faces of strangers who stood in line with you the last time you bought medicine at a drugstore? Probably not. Thanks to limited memory and norms against staring, they probably don’t remember yours either.
Since the dawn of the Internet, American regulators and companies have pursued two goals to protect our privacy: that people should be in control of their data and that companies should be transparent about what they do with our data. We can see these goals detailed in the privacy policies and terms of service that we “agree” to as well as companies’ increasingly complicated systems of privacy dashboards, permissions and sharing controls.
“The future of human flourishing depends upon facial recognition technology being banned,” wrote Woodrow Hartzog, a professor of law and computer science at Northeastern, and Evan Selinger, a professor of philosophy at the Rochester Institute of Technology, last year. “Otherwise, people won’t know what it’s like to be in public without being automatically identified, profiled, and potentially exploited.”
In the near future, robots might be able to manipulate our emotions, everyone will be rated on everything, and we won’t be able to trust our own eyes, Woodrow Hartzog said.
“I promise I will end on a positive note,” the Northeastern professor said Thursday, eliciting chuckles from a roomful of people gathered at the university’s Charlotte campus to hear him discuss online privacy.
“You’ve really got a rock-and-a-hard-place situation happening here,” said Woody Hartzog, a professor of law and computer science at Northeastern University. “Facial recognition can be incredibly harmful when it’s inaccurate and incredibly oppressive the more accurate it gets.”
At the Senate hearing, Northeastern University Professor of Law and Computer Science Woodrow Hartzog said he teaches his students to expect to deal with a patchwork of state laws on a variety of issues, so the argument that state regulation stifles innovation may not hold: many industries already treat that patchwork as a legal reality.
“In the United States, we have a tradition of dealing with a patchwork of 50 state laws, [and] while there are virtues to consistency, it’s not the obstacle that would strike me as the first thing we have to surmount if we’re going to get privacy right,” said Woodrow Hartzog, the lone privacy advocate testifying before the committee, and a professor of law and computer science at Northeastern University.
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
2013 PRIVACY PAPERS FOR POLICY MAKERS
The Future of Privacy Forum
Co-chairs Jules Polonetsky and Christopher Wolf
in conjunction with Congresswoman Sheila Jackson Lee invite you to
“Privacy Papers for Policy Makers”
A discussion of leading privacy research
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog are listed as participants for We Robot 2014. Robotics is becoming a transformative technology. We Robot 2014 builds on existing scholarship exploring the role of robotics to examine how the increasing sophistication of robots and their widespread deployment everywhere from the home, to hospitals, to public spaces, and even to the battlefield disrupts existing legal regimes or requires rethinking of various policy issues. If you are on the front lines of robot theory, design, or development, we hope to see you.
For more information and to register please visit: http://www.siliconflatirons.com/events.php?id=1381
What harms are privacy laws designed to prevent? How are people injured when corporations, governments, or other individuals collect, disclose, or use information about them in ways that defy expectations, prior agreements, formal rules, or settled norms? How has technology changed the nature of privacy harm?
DARC is a multidisciplinary conference about Unmanned Aerial Vehicles (UAVs) and drones—with an emphasis on civilian applications.
Attendees will take part in a far-ranging exploration of these technologies and see firsthand the latest advancements in aerial robotics. In addition to looking at the cultural impact, legal challenges, and business potential, we’ll also examine specific applications for drones including: agriculture, policing, wildlife conservation, weather, mapping, logistics, and more.
Speaking before the audience at the recent IAPP Data Protection Congress in Brussels, keynoter Woody Hartzog made a challenging assertion: "Control is the wrong goal for privacy by design, perhaps the wrong goal for data protection in general." But isn't control a central tenet of good privacy? It sure is. But it shouldn't be, the author of "Privacy’s Blueprint: The Battle to Control the Design of New Technologies" argued. While everyone emphasizes "control" of personal data as core to privacy, too much zeal for control dilutes efforts to design information tech correctly.
Design is one of the most important but overlooked factors that determines people’s privacy. Social media apps, surveillance technologies, and the Internet of Things are all built in ways that make it hard to guard personal information. And the law says this is okay because it is up to users to protect themselves ― even when the odds are deliberately stacked against them.
Our modern privacy frameworks, with their emphasis on gaining informed consent from consumers in order to use their data, are broken models. That's according to Woodrow Hartzog, a law professor at Northeastern University in Boston. In this episode of The Privacy Advisor Podcast, Hartzog discusses the ways that, given such models, technologies are designed at the engineering level to undermine user privacy.
Recently, 50 million Facebook users had their personal information extracted and used for political and commercial purposes. In the wake of this scandal, we’ve all become much more aware of how our use of social media clashes with our desire for privacy. Are technical fixes and awareness enough, or is it time for Facebook and other online services to be regulated? Our guest, Woodrow Hartzog, is a professor of law and computer science at Northeastern University; he discusses the battle over, and the future of, our personal information.
Woodrow Hartzog, a professor at Northeastern University Law School, discusses Facebook CEO Mark Zuckerberg’s agreement to appear before the House Energy and Commerce Committee about the company’s data usage policies. He speaks with Bloomberg’s June Grasso.