Professor Hartzog is a Professor of Law and Computer Science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. He holds a joint appointment with the School of Law and the College of Computer and Information Science. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.
Professor Hartzog’s work has been published in numerous scholarly publications such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review and popular national publications such as The Guardian, Wired, BBC, CNN, Bloomberg, New Scientist, Slate, The Atlantic, and The Nation. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press. He has testified twice before Congress on data protection issues.
Professor Hartzog has served as a Visiting Professor at Notre Dame Law School and the University of Maine School of Law. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center. He holds a PhD in mass communication from the University of North Carolina at Chapel Hill, an LLM in intellectual property from the George Washington University Law School, and a JD from Samford University.
According to NPR, 300-plus teenagers broke into former NFL player Brian Holloway’s vacation home, causing massive damage and showcasing their exploits on social media. In response, Holloway created a website, helpmesave300.com, that collects the alleged culprits’ social media posts. He claims this repository has enabled teens to be identified, and that the growing list of names is “being turned over to the sheriffs (sic) department to assist them to verify and identify the facts.”
Online stalking, harassment, and invasions of privacy can be incredibly destructive. Yet very little empirical data exists regarding these incidents. This paucity of data hinders educational, support, research, and policy efforts. Without My Consent, a non-profit organization seeking to combat online invasions of privacy, is conducting research to better understand the experiences of online harassment. If you are 18 or older and have experienced harassment on the Internet, please consider taking their survey.
The New Republic recently published a piece by Jeffrey Rosen titled “The Delete Squad: Google, Twitter, Facebook, and the New Global Battle Over the Future of Free Speech.” In it, Rosen provides an interesting account of how the content policies of many major websites were developed and how influential those policies are for online expression.
Excited teenagers – in other words, normal teenagers – have never been famous for consistently wise decisions, nor should they be. Trial and error is a critical part of growing up.
But the emergence and widespread uptake of social media has further complicated the ability of teenagers to put past issues behind them. What used to remain only in fading memories increasingly lingers in code on computer servers in the cloud.
When Facebook Inc. recently lifted its restriction on public posts by teenagers, some privacy scholars applauded the move as a win for parents -- offering them a chance to teach their children about digital accountability. They may be overstating the case, however. If information and communication technologies aren’t designed to help users -- especially younger ones -- guard their information, appeals to good judgment and discipline won’t go very far.
Big Data in Small Hands by Woodrow Hartzog & Evan Selinger
“Big data” can be defined as a problem-solving philosophy that leverages massive datasets and algorithmic analysis to extract “hidden information and surprising correlations.” Not only does big data pose a threat to traditional notions of privacy, but it also compromises socially shared information. This point remains underappreciated because our so-called public disclosures are not nearly as public as courts and policymakers have argued—at least, not yet. That is subject to change once big data becomes user friendly.
Design-based solutions to confront technological privacy threats are becoming popular with regulators. However, these promising solutions have left the full potential of design untapped. With respect to online communication technologies, design-based solutions for privacy remain incomplete because they have yet to successfully address the trickiest aspect of the Internet — social interaction. This Article posits that privacy-protection strategies such as “Privacy by Design” face unique challenges with regard to social software and social technology due to their interactional nature.
“Some companies may realize it’s better to just extend GDPR protections to all their customers, period, rather than one policy for European citizens and one policy for the rest of the world,” says Richard Forno, a cyber security researcher and the Assistant Director of UMBC’s Center for Cybersecurity.
Northeastern professor Woodrow Hartzog, whose new book, Privacy’s Blueprint, published last month, calls the law a “watershed moment,” saying it’s built on the notion that privacy is a fundamental right. He said that while the law applies directly to Europeans, companies that have customers all over the world—like Facebook, Google, Twitter and many of your favorite apps—are updating their terms for everyone, including Americans.
Deceptive design nudges, tricks and goads you into sharing more than you might intend to online, Professor Hartzog argues in his new book, Privacy's Blueprint: The Battle to Control the Design of New Technologies.
And when you think you're in control of your own data, you rarely are.
"If you want to know when social media companies are trying to manipulate you into disclosing information or engaging more, the answer is always," he said.
"Once this sort of behavior becomes normalized, it becomes harder to push back on both as a consumer and as a matter of policy," said Woodrow Hartzog, a professor of law and computer science at Northeastern University.
Solutions to many pressing economic and societal challenges lie in better understanding data. New tools for analyzing disparate information sets, called Big Data, have revolutionized our ability to find signals amongst the noise. Big Data techniques hold promise for breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Yet privacy advocates are concerned that the same advances will upend the power relationships between government, business, and individuals, and lead to prosecutorial abuse, racial or other profiling, discrimination, redlining, overcriminalization, and other restrictions on freedom.
‘Read Me’ Or Just Tap ‘I Agree’
There’s a huge group of people at work behind our screens. They’re called behaviour architects, persuasive designers or user-experience specialists and the power they have is massive.
That urge to keep swiping through your Twitter feed? That’s design. The way we all click ‘I Agree’ to the terms and conditions? That’s design. Swiping right or left on Tinder? Well, that’s design too.
We live in an online world of someone else’s making and most of us never even give it a second thought. And actually, that’s design as well.
Speaking before the audience at the recent IAPP Data Protection Congress in Brussels, keynoter Woody Hartzog made a challenging assertion: "Control is the wrong goal for privacy by design, perhaps the wrong goal for data protection in general." But isn't control a central tenet of good privacy? It sure is. But it shouldn't be, the author of "Privacy’s Blueprint: The Battle to Control the Design of New Technologies" argued. While everyone emphasizes "control" of personal data as core to privacy, too much zeal for control dilutes efforts to design information tech correctly.
Design is one of the most important but overlooked factors that determines people’s privacy. Social media apps, surveillance technologies, and the Internet of Things are all built in ways that make it hard to guard personal information. And the law says this is okay because it is up to users to protect themselves ― even when the odds are deliberately stacked against them.
Our modern privacy frameworks, with their emphasis on gaining informed consent from consumers in order to use their data, are broken models. That's according to Woodrow Hartzog, a law professor at Northeastern University in Boston. In this episode of The Privacy Advisor Podcast, Hartzog discusses the ways that, given such models, technologies are designed at the engineering level to undermine user privacy.
Recently 50 million Facebook users had their personal information extracted and used for political and commercial purposes. In the wake of this scandal, we’ve all become much more aware of how our use of social media clashes with our desire for privacy. Are technical fixes and awareness enough, or is it time for Facebook and other online services to be regulated? Our guest Woodrow Hartzog is a professor of law and computer science at Northeastern University and discusses the battle and future of our personal information.