Woodrow Hartzog is an Assistant Professor at the Cumberland School of Law at Samford University. His research focuses on privacy, human-computer interaction, online communication, and electronic agreements. He holds a Ph.D. in mass communication from the University of North Carolina at Chapel Hill, an LL.M. in intellectual property from the George Washington University Law School, and a J.D. from Samford University. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center.
I have just uploaded a new essay about online privacy to SSRN that will appear in Volume 46 of the Georgia Law Review. The essay, titled "Chain-Link Confidentiality," asserts that personal information that is shared online can be better protected if we require our confidants to make sure that their confidants are watching out for us. This strategy could help us retain control over our personal information as it moves downstream. Your comments are warmly welcome.
Last week, the Supreme Court issued its opinion in United States v. Jones, in which the Justices held that the government's installation of a GPS device on a target's vehicle, and its use of that device to monitor the vehicle's movements, constituted a Fourth Amendment search. The decision was surprisingly unanimous on this point, though concurring opinions by Justices Sotomayor and Alito potentially amplify the significance of the opinion by proposing alternate approaches to the larger problem of ubiquitous surveillance technologies and privacy in public. Given the majority opinion's narrow focus on the attachment of the device to the car, the larger issue of privacy in public remains unsettled.
Others have done an exemplary job of commenting on the decision. The dominant themes arising from the decision and its analysis seem to be the (re?)injection of the concept of trespass into Fourth Amendment doctrine, signs of potential withering of the third party doctrine, and recognition that Fourth Amendment and privacy doctrine will soon enough be useless if they do not adequately protect against ever-evolving surveillance methods and technologies.
I'd like to focus on an aspect of the decision that has not shown up much in the analysis of the case, likely because it was never explicitly mentioned in the text. Although the word obscurity does not appear anywhere in United States v. Jones, I think the decision, particularly Justice Sotomayor's concurring opinion, supports the idea that the obscurity of our personal information is worth protecting.
Are you an OkCupid user? Would you consider the data on your profile public—fair game for anyone to download and share with the rest of the world?
Few things represent the age of social media better than posting a selfie. We share these ubiquitous self-portraits with such an urgency you’d think we’d cease to exist if we stopped producing them at a rapid and ongoing rate. Think about taking a trip to a gorgeous location. If you exercise “selfie-control” and don’t post a picture of yourself at a place like the beach, did the exquisite voyage really happen?
For some crimes the entire law enforcement process can now be automated. No humans are needed to detect the crime, identify the perpetrator, or impose punishment. While automated systems are cheap and efficient, governments and citizens must look beyond these obvious savings as manual labor is replaced by robots and computers.
“Using location data this way is dangerous,” said Woodrow Hartzog, a law professor at Samford University, via email. “People need to keep their visits to places like doctor’s offices, rehab, and support centers discreet. Once Facebook users realize that the ‘People You May Know’ are the ‘People That Go To the Same Places You Do,’ this feature will inevitably start outing people’s intimate information without their knowledge.”
Back in 2000, ING Direct Canada – the digital bank that became Tangerine Bank – piloted a “biometric” mouse that would scan users’ fingerprints to help bypass the need for passwords.
“Installing the mouse involved 16 different registry changes,” says Charaka Kithulegoda, Tangerine’s chief information officer, referring to changes to computer settings. “We said, ‘The tech works great, the concept works, but the experience is awful.’”
When you have a conversation with a chatbot, it’s clear that you’re talking to software, not a human. The conversation feels stiff. But some bots are adept at shooting the breeze, a skill that can make it hard to know you’re conversing with code. “Disclosure is going to be really important here,” says Woodrow Hartzog, a law professor at Samford University. “Problems can come up when people think they’re dealing with humans, but really they’re dealing with bots.”
Just because someone might be able to use their ear at checkout doesn’t mean it’s necessarily going to happen anytime soon, though. “Biometrics are tricky,” Woodrow Hartzog, an Associate Professor of Law at Samford University, told WIRED. “They can be great because they are really secure. It’s hard to fake someone’s ear, eye, gait, or other things that make an individual uniquely identifiable. But if a biometric is compromised, you’re done. You can’t get another ear.”
The central issue may come down to what Christine Rosen, senior editor of the New Atlantis, called “the Stepford Wife problem,” which she described as the probability that we’ll end up with emotional attachments to our robots. But Woodrow Hartzog, a law professor at Samford University and the owner of a Roomba nicknamed Rocko, argued that there’s nothing wrong with developing an emotional attachment to a robot.
Robots are starting to look suspiciously familiar. Increasingly sophisticated robots designed to resemble us are striking up more and more symbiotic relationships with humans, at home as our companions and at our workplaces as colleagues.
Human-robot interactions will continue to evolve as robotic technology transforms the way we see our creations and the way they react to us. But as machines cease acting like machines and become more integrated into our lives, how will we feel about them? And, dare we ask, how will they feel about us?
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
2013 PRIVACY PAPERS FOR POLICY MAKERS
The Future of Privacy Forum
Co-chairs Jules Polonetsky and Christopher Wolf
in conjunction with Congresswoman Sheila Jackson Lee invite you to
“Privacy Papers for Policy Makers”
A discussion of leading privacy research
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog are listed as participants for We Robot 2014. Robotics is becoming a transformative technology. We Robot 2014 builds on existing scholarship exploring the role of robotics to examine how the increasing sophistication of robots and their widespread deployment everywhere from the home, to hospitals, to public spaces, and even to the battlefield disrupts existing legal regimes or requires rethinking of various policy issues. If you are on the front lines of robot theory, design, or development, we hope to see you.
For more information and to register please visit: http://www.siliconflatirons.com/events.php?id=1381
What harms are privacy laws designed to prevent? How are people injured when corporations, governments, or other individuals collect, disclose, or use information about them in ways that defy expectations, prior agreements, formal rules, or settled norms? How has technology changed the nature of privacy harm?
Watch the full video at the Energy & Commerce Committee website.
Woodrow Hartzog, Associate Professor Cumberland School of Law
See more at: http://energycommerce.house.gov/hearing/what-are-elements-sound-data-bre...
CIS Affiliate Scholar David Levine interviews Prof. Ryan Calo of University of Washington School of Law and Woodrow Hartzog of Cumberland School of Law on robotics law.
Listen to the full radio show (in German) at Deutschlandradio.
"On the other hand, even algorithms can make mistakes. They are, after all, written by humans. And legal texts in particular can be difficult to translate into a formalized language. They are, says Woodrow Hartzog, simply not made to be automated. And they are not made to be enforced one hundred percent."