Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications, including the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review, and in popular national publications such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
For those who don't know it, Surprisingly Free has hosted many excellent guests, so I recommend exploring the website. If you're interested in law and technology podcasts, I also highly recommend CIS's own Hearsay Culture.
Privacy settings and other technological controls used to protect privacy have been justifiably criticized lately. Danielle Citron recently blogged at Concurring Opinions about an important new study conducted by Columbia’s Michelle Madejski, Maritza Johnson, and Steve Bellovin that found that Facebook’s default privacy settings fail to capture real-world expectations. The United Kingdom Government has recently indicated that browser settings alone cannot be used by Web users to give consent to being tracked online under a new EU law. The Government's rationale for this decision was that these browser settings were not flexible enough to reflect a user's true privacy preferences. The general consensus seems to be that most privacy settings simply aren't very good at protecting the actual information we consider private in a given context. I think some skepticism regarding privacy controls is warranted, particularly in light of the current state of the technology. However, I'd like to show some support for privacy controls, or, rather, the promise of privacy controls. My hope is that courts and lawmakers do not completely sour on recognizing privacy controls as a legitimate way to protect an Internet user's privacy.
In the past few weeks, several potential employers and schools have reportedly asked applicants or students for access to their Facebook profiles. These reports are starting to feel like a trend. I think these requests are problematic not just for the Facebook user, but also for the employer or administrator asking for access. In short, anyone asking for access to Facebook profiles and/or login credentials is asking users to betray the trust of their network and subjecting all parties involved to the potential deactivation of their Facebook accounts.
Website scraping, which is the bulk extraction of website information by software, is becoming an increasingly visible activity. The Lovely-Faces controversy shows how scraped information can disrupt a sense of privacy when re-published in a different context. The Lovely-Faces website, deemed “a social experiment” by its creators, re-contextualizes names, locations, and photos scraped from publicly accessible Facebook pages in a mock dating website.
In February, a South Korean woman was sleeping on the floor when her robot vacuum ate her hair, forcing her to call for emergency help. It may not be the dystopian future that Stephen Hawking warned us about – where intelligent devices “spell the end of the human race” – but it does highlight one of the unexpected dangers of inviting robots into our home.
Co-authored with Evan Selinger.
Co-authored with Evan Selinger.
Until recently, concerns over facial recognition technologies were largely theoretical. Only a few companies could create databases of names and faces large enough to identify significant portions of the population by sight. These companies had little motivation to widely exploit this technology in invasive ways.
Co-authored with Evan Selinger.
Some people argue that the Digital Age has eviscerated obscurity. They say shifts in the technological and economic landscapes have forever changed society.
Their argument is that a tipping point has occurred; it’s now too late to stop others from collecting, aggregating, and analyzing nearly every aspect of our data trail, and profiting from a steady stream of intrusive privacy invasions.
"The FTC was never created as a pure data protection authority, but it's stepped in to fill the void," said Woodrow Hartzog, a law and computer science professor at Northeastern University. "Even after all the FTC has done, it's still very limited in substantive authority and in terms of resources."
This may be the first true large-scale reckoning for the information age, a 21st-century problem screaming for an immediate answer. Woodrow Hartzog, a professor of law and computer science at Northeastern and an affiliate scholar at The Center for Internet and Society at Stanford Law School, has a suggestion.
“There hasn’t been a real vivid example of how information is extracted on a massive scale and then weaponized against you,” said Woodrow Hartzog, a professor of law and computer science at Northeastern University. “To rob people of agency in a really important, core area of identity, which is political expression, ideology—the idea that we’ve lost control of so much, to lose this as well is just difficult to swallow.”
“You have to proceed on the assumption that this information has been extracted from you,” Woodrow Hartzog, author of Privacy’s Blueprint: The Battle to Control the Design of New Technologies, told me on the phone. “Cambridge Analytica used an information extraction technique that was well-known to technologists for years. The implications of this debacle is about crystallizing the threat about how dangerous the information ecosystem is.”
Solutions to many pressing economic and societal challenges lie in better understanding data. New tools for analyzing disparate information sets, called Big Data, have revolutionized our ability to find signals among the noise. Big Data techniques hold promise for breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Yet privacy advocates are concerned that the same advances will upend the power relationships between government, business, and individuals, and lead to prosecutorial abuse, racial or other profiling, discrimination, redlining, overcriminalization, and other restrictions on freedom.
Woodrow Hartzog, a professor at Northeastern University Law School, discusses Facebook CEO Mark Zuckerberg’s agreement to appear before the House Energy and Commerce Committee about the company’s data usage policies. He speaks with Bloomberg’s June Grasso.
Sharing passwords with a partner can be tricky. NPR's Lulu Garcia-Navarro talks with tech experts Nancy Baym and Woodrow Hartzog, while Becky McDougal of Malden, Mass., shares her experience.
Watch the full video at the Energy & Commerce Committee website.
Woodrow Hartzog, Associate Professor, Cumberland School of Law
See more at: http://energycommerce.house.gov/hearing/what-are-elements-sound-data-bre...