Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. Ryan is a nationally recognized expert in law and emerging technology whose work has appeared in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. Ryan serves on several advisory boards, including those of the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
Judge Richard Posner took the occasion of the Boston bombing to remind us of his view that privacy should lose out to other values. Privacy, argues Judge Posner, is largely about concealing truths “that, if known, would make it more difficult for us to achieve our personal goals.” For instance: privacy helps the victims of domestic violence achieve their personal goal of living free from fear; it helps the elderly achieve their personal goal of staying off marketing “sucker lists”; and it helps children achieve their personal goal of avoiding sexual predators online.
As if we don’t have enough to worry about, now there’s spyware for your brain. Or, there could be. Researchers at Oxford, Geneva, and Berkeley have created a proof of concept for using commercially available brain-computer interfaces to discover private facts about today's gamers.
I’ve blogged on these pages before about the claim, popularized by Larry Lessig, that “code is law.” During the Concurring Opinions symposium on Jonathan Zittrain’s 2010 book The Future of The Internet (And How To Stop It), I cataloged the senses in which architecture or “code” is said to constitute a form of regulation. “Primary” architecture refers to altering a physical or digital environment to stop conduct before it happens. Speed bumps are a classic example. “Secondary” architecture instead alters an environment in order to make conduct harder to get away with—for instance, by installing a traffic light camera or forcing a communications network to build an entry point for law enforcement.
I have yet to sit down and read Evgeny Morozov’s new book, To Save Everything, Click Here: The Folly of Technological Solutionism. I certainly found his last book very thought-provoking. But I did get a chance to read an op-ed Morozov recently wrote in the Wall Street Journal with the provocative title “Is Smart Making Us Dumb?” The piece draws a distinction between mobile and other devices that are “good smart” and ones that are “bad smart.” Good smart devices “leave us in complete control of the situation and seek to enhance our decision-making by providing more information.” Morozov offers the example of a teapot that relays the state of the energy grid. Bad smart devices, by contrast, “make certain choices and behaviors impossible,” a theme Lawrence Lessig, Jonathan Zittrain, and others famously develop under the rubric of “code.”
I wrote a new essay entitled “Code, Nudge, or Notice?” that might interest CIS readers. The essay compares side-by-side three ways that the government tries to influence citizen behavior short of making it illegal. It uses contemporary examples, like the graphic warnings the FDA wants to put on cigarettes, to make the point that it is sometimes hard to sort regulations into neat categories like “architecture,” “libertarian paternalism,” or “mandatory disclosure” (code, nudge, or notice). Instead, I argue that regulators should focus on the more fundamental difference between helping people and hindering them. Along the way, I make the point that all of forensics may be a kind of “code” that turns an ordinary location into a crime scene—sort of like putting a traffic camera up at an intersection only after someone runs the red light. Thoughts warmly welcome. Here is the abstract:
I am a law professor who writes about robotics. I’m also a big Paolo Bacigalupi fan, particularly his breakout novel The Windup Girl involving an artificial girl. So for me, “Mika Model” was not entirely new territory. For all my familiarity with its themes, however, Bacigalupi’s story revealed an important connection in robotics law that had never before occurred to me.
Over the last year, the FBI has had harsh words for Apple, accusing the tech giant of endangering human lives and aiding criminals by turning on encryption by default on the iPhone. When Google announced it would add the feature to Android, meaning that smartphone users would need to unlock their phones for police to be able to go through them, government officials and law enforcement representatives similarly freaked out.
Privacy law scholars tend to be skeptical of markets. Markets “unravel” privacy by penalizing consumers who prefer it, degrade privacy by treating it as just another commodity to be traded, and otherwise interfere with the values or processes that privacy exists to preserve.
"Ryan Calo, assistant professor of law at the University of Washington and a privacy expert, told me that he’s concerned with how facial recognition technology could judge the mental state of exiting passengers. “What I worry about with biometrics is the capacity to tell things like: Is this person nervous? Are they lying? … I worry about too closely studying human subjects at the borders, in or out,” he says."
"Law professor Ryan Calo believes that robots are soon going to constitute a more abrupt departure from the technologies that preceded them than did the Internet from personal computers and telephones. Robotic technology is changing so fast, with such significant implications, that he believes the federal government is ill equipped to regulate the society we'll soon be living in. Hence his Friday pitch to an Aspen Ideas Festival crowd: a new federal agency to regulate robots."
"Over at Slate, business reporter Jordan Weissmann assesses the bigger picture and offers advice for law school fence-sitters: Apply to law school now.
The argument advanced by Mr. Weissmann is one that’s slowly gaining currency among legal education observers.
University of Washington law professor Ryan Calo expressed similar optimism in an article for Forbes last fall."
"Ryan Calo, an assistant professor at the University of Washington School of Law who specializes in robotics and drones, told me that the worry about drones colliding in the air, or people being hit by them, will start to ease as drones become smarter.
“The next generation of drones, which are truly autonomous and can navigate using sensors and code, rather than people controlling them, will be much safer than the drones we’re seeing today,” Mr. Calo said in a phone interview."
"Law has to keep up with new technologies, and Ryan Calo has his eye on robot legalities, particularly with respect to policy and ethics.
For example, Calo was quoted in this New York Times piece titled "When Driverless Cars Break The Law." Spoiler alert: it's complicated. "Criminal law is going to be looking for a guilty mind, a particular mental state — should this person have known better? If you’re not driving the car, it’s going to be difficult," he said."
CIS Affiliate Scholar Ryan Calo will be part of a panel titled "Understanding the Implications of Open Data".
How can open data promote trust in government without creating a transparent citizenry?
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
Date/Time: Wednesday, March 25, 12:00 p.m.
Location: Microsoft Corporation, Redmond, WA
A Brave New Era? Or, Back to the Future? Are we in 1934? 1993? Or, 2015? The FCC’s order on the open internet – What did the FCC really do and what will it mean for internet service providers, online music and video companies, e-commerce companies, transit providers and consumers?
There are a million ways people might use drones in the future, from deliveries and police work to journalism. But in this episode, we’re going to talk about consumer drones — something that you or I might use for ourselves. What does the world look like when everybody with a smart phone also has a drone?
"“We don’t need to get to this crazy world in which robots are trying to take over in order for there to be really difficult, interesting, complex legal questions,” says Ryan Calo, professor of law at the University of Washington. “That’s happening right now.”
Here’s a sample:
“How do we make sure these drones are not recording things that they shouldn’t," Calo says, "and those things aren’t winding up ... on Amazon servers, or somehow getting out to the public or to law enforcement?"
"What will Amazon’s drone highway in the sky look like?
Probably not a drone highway. Amazon unveiled a proposal in which low-level airspace would be carved out for drones: 200 to 400 feet would be reserved for high-speed transit drones. Below that, there would be space for low-speed local drone traffic, and above would be a no-fly buffer zone to keep drones out of manned-vehicle airspace, aka flight paths.
Robots have been used in factories around the world for decades, often carrying out dangerous or highly repetitive operations. However, the city of Dongguan, China, has become home to the first fully automated factory, where the workforce is made up entirely of robots. Changying Precision Technology will employ only a small number of human staff, who will monitor operations of the machinery, but all processes are completed by robotic equipment.
Is this a sign of things to come? Newsday spoke to Ryan Calo, a professor with the University of Washington Tech Policy Lab.