Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review and popular national publications such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
For those who don't know it, Surprisingly Free has hosted many excellent guests, so I recommend exploring the website. If you're interested in law and technology podcasts, I also highly recommend CIS's own Hearsay Culture.
Privacy settings and other technological controls used to protect privacy have been justifiably criticized lately. Danielle Citron recently blogged at Concurring Opinions about an important new study conducted by Columbia’s Michelle Madejski, Maritza Johnson, and Steve Bellovin, which found that Facebook’s default privacy settings fail to capture real-world expectations. The United Kingdom Government has recently indicated that, under a new EU law, browser settings alone cannot be used by Web users to give consent to being tracked online. The Government's rationale for this decision was that these browser settings are not flexible enough to reflect a user's true privacy preferences. The general consensus seems to be that most privacy settings simply aren't very good at protecting the information we actually consider private in a given context. I think some skepticism regarding privacy controls is warranted, particularly in light of the current technology. However, I'd like to show some support for privacy controls, or, rather, for the promise of privacy controls. My hope is that courts and lawmakers do not completely sour on recognizing privacy controls as a legitimate way to protect an Internet user's privacy.
In the past few weeks, several potential employers and schools have reportedly asked applicants or students for access to their Facebook profiles. These reports are starting to feel like a trend. I think these requests are problematic not just for the Facebook user, but also for the employer or administrator asking for access. In short, anyone asking for access to Facebook profiles and/or login credentials is asking users to betray the trust of their network and subjecting all parties involved to the potential deactivation of their Facebook accounts.
Website scraping, which is the bulk extraction of website information by software, is becoming an increasingly visible activity. The Lovely-Faces controversy shows how scraped information can disrupt a sense of privacy when re-published in a different context. The Lovely-Faces website, deemed “a social experiment” by its creators, re-contextualizes names, locations, and photos scraped from publicly accessible Facebook pages in a mock dating website.
Amazon, the company synonymous with online shopping, is supplying facial recognition technology to government and law enforcement agencies over its web services platform. Branded Rekognition, the technology is every bit as dystopian as it sounds.
Imagine a technology that is potently, uniquely dangerous — something so inherently toxic that it deserves to be completely rejected, banned, and stigmatized. Something so pernicious that regulation cannot adequately protect citizens from its effects. That technology is already here. It is facial recognition technology, and its dangers are so great that it must be rejected entirely.
The user agreement has become a potent symbol of our asymmetric relationship with technology firms. For most of us, it’s our first interaction with a given company. We sign up and are asked to read the dreaded user agreement — a process that we know signifies some complex and inconveniently detrimental implications of using the service, but one that we choose to ignore.
The revelation that Cambridge Analytica was involved in the extraction of data involving over 50 million Facebook users has raised more than a few questions about just what went wrong and who is to blame.
"Perhaps, or perhaps not," said Woodrow Hartzog, who teaches law and computer science at Northeastern University. "The idea that this is simply neutral technology that can be used for good or evil, and that Amazon shouldn't be responsible, I think is purely wrong," he said.
"It's not unreasonable to say that if you build a product that is capable of harm, then you should be responsible for the design choices you make for enabling the harm," he said, "and when you release it out into the world, you're doing so in a safe and sustainable way."
But Albert Gidari, consulting director of privacy at the Stanford Center for Internet and Society, said it's not unusual to see a tech company without a CPO.
"While there have been some very public mistakes, like many tech companies, [Uber] seems to have learned, albeit the hard way, to invest in a serious privacy and security infrastructure," Gidari said. "It is important for the CPO to be in the 'C' suite, and Uber has made a serious hire with Ruby Zefo and Simon Hania."
"Some companies may realize it’s better to just extend GDPR protections to all their customers, period, rather than have one policy for European citizens and one policy for the rest of the world," says Richard Forno, a cybersecurity researcher and the Assistant Director of UMBC's Center for Cybersecurity.
Northeastern professor Woodrow Hartzog, whose new book, Privacy’s Blueprint, was published last month, calls the law a “watershed moment,” saying it’s built on the notion that privacy is a fundamental right. He said that while the law applies directly to Europeans, companies that have customers all over the world—like Facebook, Google, Twitter, and many of your favorite apps—are updating their terms for everyone, including Americans.
Part of the Cyber Insecurity series.
Probe the difficult questions that we will need to address as human-robot relationships evolve in the coming decades. Explore the nuances of our future and prepare for the complex problems that will arise as our lives become more A.I.-dependent.
Adults 18+ Only.
This program is free thanks to the generosity of the Lowell Institute.
Ranging across consumer protection, data aggregation, digital networks, high-tech devices, and surveillance, this panel brings together top privacy and surveillance experts to discuss how the Trump administration has shaped, and will continue to shape, our privacy in these and other areas.
- ELIZABETH JOH Professor of Law, UC Davis School of Law
- AHMED GHAPPOUR Associate Professor of Law, Boston University School of Law
- ANDREA MATWYSHYN Professor of Law, Northeastern University School of Law
The Tech/Law Colloquium speaker for September 19, 2017 will be Woodrow Hartzog, a professor of law and computer science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Talk: Privacy’s Blueprint: The Battle to Control the Design of New Technologies
Robots are starting to look suspiciously familiar. Increasingly sophisticated robots designed to resemble us are striking up more and more symbiotic relationships with humans, at home as our companions and at our workplaces as colleagues.
Human-robot interactions will continue to evolve as robotic technology transforms the way we see our creations and the way they react to us. But as machines cease acting like machines and become more integrated into our lives, how will we feel about them? And, dare we ask, how will they feel about us?
Recently, 50 million Facebook users had their personal information extracted and used for political and commercial purposes. In the wake of this scandal, we’ve all become much more aware of how our use of social media clashes with our desire for privacy. Are technical fixes and awareness enough, or is it time for Facebook and other online services to be regulated? Our guest Woodrow Hartzog is a professor of law and computer science at Northeastern University, and he discusses the battle over our personal information and its future.
Woodrow Hartzog, a professor at Northeastern University Law School, discusses Facebook CEO Mark Zuckerberg’s agreement to appear before the House Energy and Commerce Committee about the company’s data usage policies. He speaks with Bloomberg’s June Grasso.
Sharing passwords with a partner can be tricky. NPR's Lulu Garcia-Navarro talks with tech experts Nancy Baym and Woodrow Hartzog while Becky McDougal from Malden, Mass. shares her experience.
Watch the full video at the Energy & Commerce Committee website.
Woodrow Hartzog, Associate Professor Cumberland School of Law
See more at: http://energycommerce.house.gov/hearing/what-are-elements-sound-data-bre...