Woodrow Hartzog is an Assistant Professor at the Cumberland School of Law at Samford University. His research focuses on privacy, human-computer interaction, online communication, and electronic agreements. He holds a Ph.D. in mass communication from the University of North Carolina at Chapel Hill, an LL.M. in intellectual property from the George Washington University Law School, and a J.D. from Samford University. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center.
According to NPR, 300-plus teenagers broke into former NFL player Brian Holloway’s vacation home, causing massive damage and showcasing their exploits on social media. In response, Holloway created a website, helpmesave300.com, that collects the alleged culprits’ social media posts. He claims this repository has enabled teens to be identified, and that the growing list of names is “being turned over to the sheriffs (sic) department to assist them to verify and identify the facts.”
Online stalking, harassment, and invasions of privacy can be incredibly destructive. Yet very little empirical data exists regarding these incidents. This paucity of data hinders educational, support, research, and policy efforts. Without My Consent, a non-profit organization seeking to combat online invasions of privacy, is conducting research to better understand the experiences of online harassment. If you are 18 or older and have experienced harassment on the Internet, please consider taking their survey.
The New Republic recently published a piece by Jeffrey Rosen titled “The Delete Squad: Google, Twitter, Facebook, and the New Global Battle Over the Future of Free Speech.” In it, Rosen provides an interesting account of how the content policies of many major websites were developed and how influential those policies are for online expression.
Are you an OkCupid user? Would you consider the data on your profile public—fair game for anyone to download and share with the rest of the world?
Few things represent the age of social media better than posting a selfie. We share these ubiquitous self-portraits with such urgency that you’d think we’d cease to exist if we stopped producing them at a rapid and ongoing rate. Think about taking a trip to a gorgeous location. If you exercise “selfie-control” and don’t post a picture of yourself at a place like the beach, did the exquisite voyage really happen?
For some crimes the entire law enforcement process can now be automated. No humans are needed to detect the crime, identify the perpetrator, or impose punishment. While automated systems are cheap and efficient, governments and citizens must look beyond these obvious savings as manual labor is replaced by robots and computers.
Woodrow Hartzog, a law professor at Samford University who specializes in privacy law, says the history of computer crime law shows that vague language can lead to unintended consequences as technology evolves. “Even slight vagaries or miscalculations can result in dramatic expansions of power,” he says, citing language in the Computer Fraud and Abuse Act, passed in 1986, that has created “an incredible amount of confusion” over what constitutes a crime.
“The way privacy law largely works for consumers in the United States is through what regulators call ‘notice and choice,’” said Samford University law professor Woodrow Hartzog by email. “That means that so long as users were put on notice of an app’s data practices and made the choice to continue using the app in light of that notice, then the app’s data practices are presumptively permissible.”
“Using location data this way is dangerous,” said Woodrow Hartzog, a law professor at Samford University, via email. “People need to keep their visits to places like doctor’s offices, rehab, and support centers discreet. Once Facebook users realize that the ‘People You May Know’ are the ‘People That Go To the Same Places You Do,’ this feature will inevitably start outing people’s intimate information without their knowledge.”
Back in 2000, ING Direct Canada – the digital bank that became Tangerine Bank – piloted a “biometric” mouse that would scan users’ fingerprints to help bypass the need for passwords.
“Installing the mouse involved 16 different registry changes,” says Charaka Kithulegoda, Tangerine’s chief information officer, referring to changes to computer settings. “We said, ‘The tech works great, the concept works, but the experience is awful.’”
Robots are starting to look suspiciously familiar. Increasingly sophisticated robots designed to resemble us are striking up more and more symbiotic relationships with humans, at home as our companions and at our workplaces as colleagues.
Human-robot interactions will continue to evolve as robotic technology transforms the way we see our creations and the way they react to us. But as machines cease acting like machines and become more integrated into our lives, how will we feel about them? And, dare we ask, how will they feel about us?
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
2013 PRIVACY PAPERS FOR POLICY MAKERS
The Future of Privacy Forum
Co-chairs Jules Polonetsky and Christopher Wolf
in conjunction with Congresswoman Sheila Jackson Lee invite you to
“Privacy Papers for Policy Makers”
A discussion of leading privacy research
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog are listed as participants for We Robot 2014. Robotics is becoming a transformative technology. We Robot 2014 builds on existing scholarship to examine how the increasing sophistication of robots and their widespread deployment everywhere from the home to hospitals, public spaces, and even the battlefield disrupts existing legal regimes or requires rethinking of various policy issues. If you are on the front lines of robot theory, design, or development, we hope to see you.
For more information and to register please visit: http://www.siliconflatirons.com/events.php?id=1381
What harms are privacy laws designed to prevent? How are people injured when corporations, governments, or other individuals collect, disclose, or use information about them in ways that defy expectations, prior agreements, formal rules, or settled norms? How has technology changed the nature of privacy harm?
Watch the full video at the Energy & Commerce Committee website.
Woodrow Hartzog, Associate Professor Cumberland School of Law
See more at: http://energycommerce.house.gov/hearing/what-are-elements-sound-data-bre...
CIS Affiliate Scholar David Levine interviews Prof. Ryan Calo of University of Washington School of Law and Woodrow Hartzog of Cumberland School of Law on robotics law.
Listen to the full radio show (in German) at Deutschlandradio.
On the other hand, even algorithms can make mistakes. They are, after all, written by humans. And legal texts can be difficult to translate into a formalized language. They are, says Woodrow Hartzog, simply not made to be automated. And they are not made to be enforced one hundred percent.