Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. A nationally recognized expert in law and emerging technology, Ryan's work has appeared in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. Ryan serves on the advisory boards of several organizations, including the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
Over Christmas, I received a series 530 Roomba, the robotic vacuum cleaner from iRobot. It cleans the floor really well. But that is all it does. This year at the Consumer Electronics Show, iRobot revealed the prototype AVA. It is, essentially, an open robotic platform. Think of it as an iPad with a body. It has no dedicated purpose and, importantly, it has an API and will run software made by third-party developers.
Yes, apps for robots. This is a wonderful development, one that I predicted in a forthcoming essay in Maryland Law Review. As iRobot founder Colin Angle points out, "If you think of the thousands of apps out there: Which iPad apps would be more cool if they moved?" More importantly, would you not be more inclined to buy a personal robot that came with thousands of programs, with more on the way?
UPDATE: The New York Times published most of the rest of my comments on Bits Blog. Thanks!
I was quoted in a cover story in today's New York Times as saying, essentially, that law enforcement was "just trying to do their job" in pushing for greater subpoena power. This particular remark was an aside, made if anything to soften the impression that I was overly critical of the government. For instance, I lamented that consumers do not understand the state of electronic privacy law and spoke about the dangers of dragnet or otherwise excessive surveillance. (Presumably I am one of the unnamed "[e]lectronic privacy and civil rights advocates" who worry that "because the WikiLeaks court order gained such widespread attention, it could have a chilling effect on people’s speech on the Internet.")
I did not mean to imply that we should not push back against government and in fact praised Google and Twitter for having done so. I did offer that the government's purpose in pushing for greater surveillance power was not to erode civil liberties for its own sake, but in order to protect Americans by detecting and punishing crimes. But the gist of my remarks was that we need more protection, not less. Some of my talking points appear below for context.
Affiliate scholar Marvin Ammori offers eight good reasons why the United States should not prosecute Wikileaks founder Julian Assange. I mostly agree with Ammori’s analysis and write to emphasize one point: an Assange trial, regardless of outcome, would help the government gloss over one of the worst security breaches in modern history. And the First Amendment could supply this distraction’s brightest fireworks.
The website Wikileaks recently published hundreds of thousands of confidential State Department cables. These communications apparently reveal the details of conversations with, and personal impressions and assessments of, foreign leaders and diplomats. Many fear that the leak will undermine international relations in profound and unknowable ways. One of the unintended consequences of the leak, however, may be to strengthen the case for a national consumer privacy law.
UPDATE: As told to Jules Polonetsky over at The Future of Privacy Forum, Capital One was engaging in "totally random" rate changes that were not related to browser type. On the other hand, according to the Wall Street Journal, Capital One was at one point using [x+1] data to calibrate what credit card offers to show.
The other day, I suggested that the facts of the Clementi suicide may perfectly illustrate why no actual transfer of information is necessary for someone to suffer a severe subjective privacy harm. (Thanks to TechDirt and PogoWasRight for the write ups.)
Just now I learned about an allegation against Capital One that the company offered someone a different lending rate on the basis of what browser he used (Chrome vs. Firefox). A similar allegation was made against Amazon, which apparently used cookies for a time to calibrate the price of DVDs.
Here you have a clear objective privacy harm: your information (browser type) is being used adversely in a tangible and unexpected way. It matters not at all whether a human being sees the information or whether a company knows "who you are." Neither personally identifying information, nor the revelation of information to a person, is necessary for there to be a privacy harm.
NO: It Is the Way to Kill Innovation
By Ryan Calo
The year is 1910. Orville and Wilbur Wright are testing their plane and happen to fly hundreds of feet over a stretch of land you own. Could you sue them?
Technically, you could. In 1910, your property rights extended ad coelum et ad inferos—up to heaven and down to hell. Anyone who flew over your property without permission was trespassing.
I am a law professor who writes about robotics. I’m also a big Paolo Bacigalupi fan, particularly his breakout novel The Windup Girl involving an artificial girl. So for me, “Mika Model” was not entirely new territory. For all my familiarity with its themes, however, Bacigalupi’s story revealed an important connection in robotics law that had never before occurred to me.
"In 2012, Ryan Calo and Michael Froomkin -- law professors at the Universities of Washington and Miami respectively -- sensed that robots were at approximately the stage of the internet circa 1988, and began to think about how to preemptively create good policy about them. Where, they asked, were the legal conflicts going to be? What new laws will be needed, what existing laws can be adapted, what metaphors will apply?
"Ryan Calo, a professor at the University of Washington who’s a leading expert on the intersection of robots and law, said making law enforcement agencies draft policies about how and when they can use robots and drones forces them to think through scenarios in advance. “If you want to put a Taser on a drone and tase a mentally ill person,” Calo said, “or if you want to follow someone around with a drone, that’s where you need to have a process in place that you’ve properly vetted with the leadership.”"
"Driverless cars may end up being a form of public transport rather than vehicles you own, says Ryan Calo at Stanford University, California. That is happening in the UK and Singapore, where government-provided driverless “pods” are being launched.
That would go down poorly in the US, however. “The idea that the government would take over driverless cars and treat them as a public good would get absolutely nowhere here,” says Calo."
"Ryan Calo, an assistant professor at the University of Washington who specializes in law and robotics, says that although the idea of drones confronting humans is unusual, he doesn’t foresee significant objections to the idea. “The beauty of this is that it would be in an environment where people shouldn’t be going,” he says."
The University of Washington School of Law is delighted to announce a public workshop on the law and policy of artificial intelligence, co-hosted by the White House and UW’s Tech Policy Lab. The event places leading artificial intelligence experts from academia and industry in conversation with government officials interested in developing a wise and effective policy framework for this increasingly important technology. The event is free and open to the public but requires registration.
CIS Affiliate Scholar Ryan Calo will be part of a panel titled "Understanding the Implications of Open Data".
How can open data promote trust in government without creating a transparent citizenry?
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
"What will Amazon’s drone highway in the sky look like?
Probably not a drone highway. Amazon unveiled a proposal in which low-level airspace would be carved out for drones: 200 to 400 feet would be reserved for high-speed transit drones. Below, there would be space for low-speed local drone traffic, and above would be a no-fly buffer zone to keep drones out of manned-vehicle airspace, aka flight paths.
Robots have been used in factories around the world for decades, often carrying out dangerous or highly repetitive operations. However, the city of Dongguan, China, has become home to the first fully automated factory - where the workforce is made up entirely of robots. Changying Precision Technology will employ only a small number of human staff, who will monitor operations of the machinery, but all processes are completed by robotic equipment.
Is this a sign of things to come? Newsday spoke to Ryan Calo, a professor with the University of Washington Tech Policy Lab.
CIS Affiliate Scholar Ryan Calo appears in the Good Morning America segment "Popularity of Drones Raises Privacy Concerns"; many have reported drones with cameras invading their privacy.
Ryan Calo, Assistant Law Professor at the University of Washington and an affiliate scholar at the Stanford Center for Internet and Society, talks about testing Google’s driverless cars.
Listen to the full show at Marketplace Tech.