Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. A nationally recognized expert in law and emerging technology, Ryan's work has appeared in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. Ryan serves on several advisory committees, including the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
UPDATE: Facebook explains the security procedure here. Apparently they only use photos if you have not set up another verification means. Also, I have confirmation that the photo identification is not being done for a secondary purpose.
I recently tried to sign on to Facebook from a coffee shop. I was told that I had to pass a security screening because of the "strange location." Fair enough. The actual test, however, was surprising: a multiple-choice exam in which I had to identify who was in a given picture.
A couple of things. First, some of the pictures were embarrassing. I doubt the person who uploaded them thought they would be used to screen for improper access. Think about it. Facebook is showing random private photos to people because it suspects they may not be the account holder. The photos must be private because they form the basis of a security screening.
I attended a fascinating thesis defense today on the subject of human-robot interaction by Stanford PhD candidate Victoria Groom. HRI experiments apparently tend to focus on human encounters with robots; few studies test the psychology behind robot operation. Groom’s work explores how we feel about the tasks we perform through robots. One of the more interesting questions she and her colleagues ask is: to what extent do we feel like it’s really us performing the task? The question is important where, as in the military, people work through robots to carry out morally charged tasks. And the answer might have repercussions for how we think about evaluation and punishment.
I started a new blog around robotics programming and scholarship at Stanford Law School. Some of us here believe that robotics is a transformative technology on par with the Internet. (We're not alone: the "roadmap for U.S. robotics" prepared for Congress by a coalition of robotics labs and research institutes is called "From Internet to Robotics.") I've said before and I'll say again: the age of Internet exceptionalism is over. We can now do "digital" things in the real world. The chief importance (and danger) of the Internet is the imaginative possibilities it opens up. Robotics is how we will prove the slogan Chris Anderson came up with in a slightly different context: "Atoms are the new bits." Please stay tuned.
Thanks to Elaine Adolfo for the image.
The response to WhatApp.org has been wonderful, thanks! We now have over 20 registered and approved experts from a wide variety of sectors, including privacy compliance, law, and computer science. Many (many) people have signed up, left comments, edited wikis, or suggested apps to review for privacy, security, and openness. (We're going to run out of apps to review, so please do "add an app" if you get a chance!) If you have comments or questions, please email firstname.lastname@example.org. It's a work in progress and we need your help. Thanks again, especially to the Rose Foundation for their generous support.
In a recent whitepaper, Brookings Institution senior fellow Benjamin Wittes and law student Jodie Liu turn the standard privacy argument on its head: as they see it, many supposed threats to our privacy actually benefit it.
The Federal Aviation Administration announced its proposal this morning for what rules should govern small unmanned aerial systems, meaning drones 55 pounds or lighter. We do not know how long it will take for the rules to go into effect. When they do, the new rules will permit vastly more drone use in the United States, bringing us closer into line with other countries where drones can be commercially operated today.
We are not ready for driverless cars because our public officials lack the expertise to evaluate the safety of this new class of automobiles.
It is always fun, and sometimes worrying, to see imagination come to life. I was on a panel last year at UC Berkeley on robotics and law. We talked about some of the conundrums robots and artificial intelligence might pose for law and policy, the subject of my forthcoming work Robotics and the Lessons of Cyberlaw. One hypothetical involved a shopping "bot" that randomly purchases items on the Internet.
"Ryan Calo, assistant professor of law at the University of Washington and a privacy expert, told me that he’s concerned with how facial recognition technology could judge the mental state of exiting passengers. “What I worry about with biometrics is the capacity to tell things like: Is this person nervous? Are they lying? … I worry about too closely studying human subjects at the borders, in or out,” he says."
"Law professor Ryan Calo believes that robots are soon going to constitute a more abrupt departure from the technologies that preceded them than did the Internet from personal computers and telephones. Robotic technology is changing so fast, with such significant implications, that he believes the federal government is ill equipped to regulate the society we'll soon be living in. Hence his Friday pitch to an Aspen Ideas Festival crowd: a new federal agency to regulate robots."
"Over at Slate, business reporter Jordan Weissmann assesses the bigger picture and offers advice for law school fence-sitters: Apply to law school now.
The argument advanced by Mr. Weissmann is one that’s slowly gaining currency among legal education observers.
University of Washington law professor Ryan Calo expressed similar optimism in an article for Forbes last fall."
"Ryan Calo, an assistant professor at the University of Washington School of Law who specializes in robotics and drones, told me that the worry about drones colliding in the air, or people being hit by them, will start to ease as drones become smarter.
“The next generation of drones, which are truly autonomous and can navigate using sensors and code, rather than people controlling them, will be much safer than the drones we’re seeing today,” Mr. Calo said in a phone interview."
"Law has to keep up with new technologies, and Ryan Calo has his eye on robot legalities, particularly with respect to policy and ethics.
For example, Calo was quoted in this New York Times piece titled "When Driverless Cars Break The Law." Spoiler alert: it's complicated. "Criminal law is going to be looking for a guilty mind, a particular mental state — should this person have known better? If you’re not driving the car, it’s going to be difficult," he said."
CIS Affiliate Scholar Ryan Calo will be part of a panel titled "Understanding the Implications of Open Data".
How can open data promote trust in government without creating a transparent citizenry?
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
Date/Time: Wednesday, March 25, 12:00 p.m.
Location: Microsoft Corporation, Redmond, WA
A Brave New Era? Or, Back to the Future? Are we in 1934? 1993? Or, 2015? The FCC’s order on the open internet – What did the FCC really do and what will it mean for internet service providers, online music and video companies, e-commerce companies, transit providers and consumers?
There are a million ways people might use drones in the future, from deliveries and police work to journalism. But in this episode, we’re going to talk about consumer drones — something that you or I might use for ourselves. What does the world look like when everybody with a smart phone also has a drone?
"“We don’t need to get to this crazy world in which robots are trying to take over in order for there to be really difficult, interesting, complex legal questions,” says Ryan Calo, professor of law at the University of Washington. “That’s happening right now.”
Here’s a sample:
“How do we make sure these drones are not recording things that they shouldn’t," Calo says, "and those things aren’t winding up ... on Amazon servers, or somehow getting out to the public or to law enforcement?"
"What will Amazon’s drone highway in the sky look like?
Probably not a drone highway. Amazon unveiled a proposal in which low-level airspace would be carved out for drones: 200 to 400 feet would be reserved for high-speed transit drones. Below, there would be space for low-speed local drone traffic, and above would be a no-fly buffer zone to keep drones out of manned-vehicle airspace, aka flight paths.
Robots have been used in factories around the world for decades, often carrying out dangerous or highly repetitive operations. However, the city of Dongguan, China, has become home to the first fully automated factory, where the workforce is made up entirely of robots. Changying Precision Technology will employ only a small number of human staff, who will monitor operations of the machinery, but all processes are completed by robotic equipment.
Is this a sign of things to come? Newsday spoke to Ryan Calo, a professor with the University of Washington Tech Policy Lab.