Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review and popular national publications such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
According to NPR, more than 300 teenagers broke into former NFL player Brian Holloway’s vacation home, causing massive damage and showcasing their exploits on social media. In response, Holloway created a website, helpmesave300.com, that collects the alleged culprits’ social media posts. He claims this repository has enabled teens to be identified, and that the growing list of names is “being turned over to the sheriffs (sic) department to assist them to verify and identify the facts.”
Online stalking, harassment, and invasions of privacy can be incredibly destructive. Yet very little empirical data exists regarding these incidents. This paucity of data hinders educational, support, research, and policy efforts. Without My Consent, a non-profit organization seeking to combat online invasions of privacy, is conducting research to better understand the experiences of online harassment. If you are 18 or older and have experienced harassment on the Internet, please consider taking their survey.
The New Republic recently published a piece by Jeffrey Rosen titled “The Delete Squad: Google, Twitter, Facebook, and the New Global Battle Over the Future of Free Speech.” In it, Rosen provides an interesting account of how the content policies of many major websites were developed and how influential those policies are for online expression.
We are constantly exposed in public. Yet most of our actions will fade into obscurity. Do you, for example, remember the faces of strangers who stood in line with you the last time you bought medicine at a drugstore? Probably not. Thanks to limited memory and norms against staring, they probably don’t remember yours either.
Since the dawn of the Internet, American regulators and companies have pursued two goals to protect our privacy: that people should be in control of their data and that companies should be transparent about what they do with our data. We can see these goals detailed in the privacy policies and terms of service that we “agree” to as well as companies’ increasingly complicated systems of privacy dashboards, permissions and sharing controls.
""They created a platform where sharing was mindlessly easy and interacting with each other required almost no forethought at all," said Woodrow Hartzog, a law and computer science professor at Northeastern University. "As a result, there was massive sharing, including gushing of personal information that put lots of people at risk.""
To make sense of this world, and to try to sift through the new emerging definitions of privacy, I turned to Woodrow Hartzog. In recent years, Hartzog has emerged as an important thinker on matters of design, privacy, and power relationships between users and tech companies. A professor of law and computer science at Northeastern University, Hartzog has written for the mainstream press about these issues, sometimes in collaboration with his colleague Daniel Solove.
"“Facial recognition is probably the most menacing, dangerous surveillance technology ever invented,” Woodrow Hartzog, a professor of law and computer science at Northeastern University, told me in an email. “We should all be extremely skeptical of having it deployed in any wearable technology, particularly in contexts [where] the surveilled are so vulnerable, such as in many contexts involving law enforcement.”"
"Perhaps, or perhaps not, said Woodrow Hartzog, who teaches law and computer science at Northeastern University. "The idea that this is simply neutral technology that can be used for good or evil and Amazon shouldn't be responsible, I think is purely wrong," he said.
"It's not unreasonable to say if you build a product that is capable of harm than you should be responsible for the design choices you make for enabling the harm," he said, "and when you release it out into the world, you're doing so in a safe and sustainable way.""
"But Albert Gidari, consulting director of privacy at the Stanford Center for Internet and Society, said it's not unusual to see a tech company without a CPO.
"While there have been some very public mistakes, like many tech companies, [Uber] seems to have learned, albeit the hard way, to invest in a serious privacy and security infrastructure," Gidari said. "It is important for the CPO to be in the "C" suite, and Uber has made a serious hire with Ruby Zefo and Simon Hania.""
Part of the Cyber Insecurity series.
Probe the difficult questions that we will need to address as human-robot relationships evolve in the coming decades. Explore the nuances of our future and prepare for the complex problems that will arise as our lives become more A.I. dependent.
Adults 18+ Only.
This program is free thanks to the generosity of the Lowell Institute.
Ranging across consumer protection, data aggregation, digital networks, high-tech devices, and surveillance, this panel brings together top privacy and surveillance experts to discuss how the Trump administration has shaped, and will continue to shape, our privacy in these and other areas.
- ELIZABETH JOH Professor of Law, UC Davis School of Law
- AHMED GHAPPOUR Associate Professor of Law, Boston University School of Law
- ANDREA MATWYSHYN Professor of Law, Northeastern University School of Law
The Tech/Law Colloquium speaker for September 19, 2017 will be Woodrow Hartzog, a professor of law and computer science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Talk: Privacy’s Blueprint: The Battle to Control the Design of New Technologies
Robots are starting to look suspiciously familiar. Increasingly sophisticated robots designed to resemble us are striking up more and more symbiotic relationships with humans, at home as our companions and at our workplaces as colleagues.
Human-robot interactions will continue to evolve as robotic technology transforms the way we see our creations and the way they react to us. But as machines cease acting like machines and become more integrated into our lives, how will we feel about them? And, dare we ask, how will they feel about us?
Sharing passwords with a partner can be tricky. NPR's Lulu Garcia-Navarro talks with tech experts Nancy Baym and Woodrow Hartzog, while Becky McDougal from Malden, Mass., shares her experience.
Watch the full video at the Energy & Commerce Committee website.
Woodrow Hartzog, Associate Professor Cumberland School of Law
See more at: http://energycommerce.house.gov/hearing/what-are-elements-sound-data-bre...
CIS Affiliate Scholar David Levine interviews Prof. Ryan Calo of University of Washington School of Law and Woodrow Hartzog of Cumberland School of Law on robotics law.
Listen to the full radio show (in German) at Deutschlandradio.
"On the other hand: even algorithms can make mistakes. You will eventually written by humans. And just legal texts can be difficult in a formalized language to translate. They are, says Woodraw Hartzog, just not made for it to be automated. And they are not made to be enforced to one hundred percent."