Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review, and in popular national outlets such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
For those who don't know it, Surprisingly Free has hosted many excellent guests, so I recommend exploring the website. If you're interested in law and technology podcasts, I also highly recommend CIS's own Hearsay Culture.
Privacy settings and other technological controls used to protect privacy have been justifiably criticized lately. Danielle Citron recently blogged at Concurring Opinions about an important new study by Columbia’s Michelle Madejski, Maritza Johnson, and Steve Bellovin, which found that Facebook’s default privacy settings fail to capture real-world expectations. The United Kingdom government recently indicated that browser settings alone cannot be used by Web users to give consent to being tracked online under a new EU law, reasoning that those settings are not flexible enough to reflect a user’s true privacy preferences. The general consensus seems to be that most privacy settings simply aren’t very good at protecting the information we actually consider private in a given context. I think some skepticism regarding privacy controls is warranted, particularly in light of current technology. Still, I’d like to offer some support for privacy controls, or rather for the promise of privacy controls. My hope is that courts and lawmakers do not completely sour on recognizing privacy controls as a legitimate way to protect an Internet user’s privacy.
In the past few weeks, several potential employers and schools have reportedly asked applicants or students for access to their Facebook profiles. These reports are starting to feel like a trend. I think these requests are problematic not just for the Facebook user, but also for the employer or administrator asking for access. In short, anyone asking for access to Facebook profiles or login credentials is asking users to betray the trust of their network and exposing all parties involved to the potential deactivation of their Facebook accounts.
Website scraping, which is the bulk extraction of website information by software, is becoming an increasingly visible activity. The Lovely-Faces controversy shows how scraped information can disrupt a sense of privacy when re-published in a different context. The Lovely-Faces website, deemed “a social experiment” by its creators, re-contextualizes names, locations, and photos scraped from publicly accessible Facebook pages in a mock dating website.
Co-authored with Evan Selinger.
Until recently, concerns over facial recognition technologies were largely theoretical. Only a few companies could create databases of names and faces large enough to identify significant portions of the population by sight. These companies had little motivation to widely exploit this technology in invasive ways.
Co-authored with Evan Selinger.
Some people argue that the Digital Age has eviscerated obscurity. They say shifts in the technological and economic landscapes have forever changed society.
Their argument is that a tipping point has occurred; it’s now too late to stop others from collecting, aggregating, and analyzing nearly every aspect of our data trail, and profiting from a steady stream of intrusive privacy invasions.
Social media is always updating to give people more. More features like video and picture sharing. More freedom to use third-party apps. More capacity to store more data and make more connections. More platforms so we can use one service while loading another.
Paradoxically, the future of social media is also about providing less. Sometimes the best social media design will constrain invasive and harmful practices. If we want online social interaction to be safe and sustainable, we should embrace the limitations.
Co-authored by Danielle Citron and Woodrow Hartzog.
Revenge-pornography websites are a reminder that preying on the vulnerable has long been big business. And while various laws protect people against scam artists, extortionists, manipulators, and other unscrupulous enterprises, the law has not been able to keep up with all malicious businesses.
Cross-posted from Wired.
If you’re a Snapchat user, you should know something: The “Snappening” is not your fault.
"Hartzog said a 50-state regulatory patchwork isn’t unworkable “because it’s what we’ve been dealing with all along.”"
"Woodrow Hartzog, a professor of law and computer science at Northeastern University, said in his written testimony for the Senate hearing that this model doesn’t work at any large scale.
“The problem with notice and choice models is that they create incentives for companies to both hide the risks in their data practices through manipulative design, vague abstractions, and excessive and complex words while at the same time shifting risk by engineering a system meant to expedite the transfer of rights and relinquishment of protections,” he said."
"Privacy lawyers say the collection of health data by nonhealth entities is legal in most U.S. states, provided there is sufficient disclosure in an app’s and Facebook’s terms of service. The Federal Trade Commission has taken an interest in cases in which data sharing deviates widely from what users might expect, particularly if any explanation was hard for users to find, said Woodrow Hartzog, a professor of law and computer science at Northeastern University."
"“This is the first piece of legislation that I’ve seen that really takes facial recognition technology as seriously as it is warranted and treats it as uniquely dangerous,” says Woodrow Hartzog, professor of law and computer science at Northeastern University."
"Woodrow Hartzog, a Northeastern University professor of law and computer science, examines dark patterns in his book "Privacy’s Blueprint" (Harvard University Press, 2018). “In the aggregate,” he says, the practice “amounts to this collective machine that is trying to extract every ounce of data and value from us.""
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
2013 PRIVACY PAPERS FOR POLICY MAKERS
The Future of Privacy Forum
Co-chairs Jules Polonetsky and Christopher Wolf
in conjunction with Congresswoman Sheila Jackson Lee invite you to
“Privacy Papers for Policy Makers”
A discussion of leading privacy research
CIS Affiliate Scholars Peter Asaro, Ryan Calo, and Woodrow Hartzog are listed as participants for We Robot 2014. Robotics is becoming a transformative technology. We Robot 2014 builds on existing scholarship to examine how the increasing sophistication of robots, and their widespread deployment everywhere from the home to hospitals, public spaces, and even the battlefield, disrupts existing legal regimes or requires rethinking various policy issues. If you are on the front lines of robot theory, design, or development, we hope to see you.
For more information and to register please visit: http://www.siliconflatirons.com/events.php?id=1381
What harms are privacy laws designed to prevent? How are people injured when corporations, governments, or other individuals collect, disclose, or use information about them in ways that defy expectations, prior agreements, formal rules, or settled norms? How has technology changed the nature of privacy harm?
DARC is a multidisciplinary conference about Unmanned Aerial Vehicles (UAVs) and drones—with an emphasis on civilian applications.
Attendees will take part in a far-ranging exploration of these technologies and see firsthand the latest advancements in aerial robotics. In addition to looking at the cultural impact, legal challenges, and business potential, we’ll also examine specific applications for drones including: agriculture, policing, wildlife conservation, weather, mapping, logistics, and more.
Listen to the full radio show (in German) at Deutschlandradio.
"On the other hand, even algorithms can make mistakes. They are, after all, written by humans. And legal texts can be difficult to translate into a formalized language. They are, says Woodrow Hartzog, simply not made to be automated. And they are not made to be enforced one hundred percent."