Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. A nationally recognized expert in law and emerging technology, Ryan has had his work featured in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. He serves on the advisory boards of several organizations, including the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
Is it lawful for a car to drive itself? In the absence of any law to the contrary, it may well be. A new bill working its way through the Nevada state legislature would remove any doubt in that state. A.B. 511 directs the Nevada Department of Transportation to authorize autonomous vehicle testing in certain geographic areas of Nevada. Should vehicles meet Nevada DOT standards, they would be permitted to "operate on a highway." The bill defines not only "autonomous vehicle" but "artificial intelligence" as well. AI is "the use of computers and related equipment to enable a machine to duplicate or mimic the behavior of human beings." An autonomous vehicle uses "artificial intelligence, sensors, and [GPS] coordinates to drive itself." To be clear: autonomous vehicles are not yet the law of the land in Nevada. The bill must pass through two committees and receive a hearing before it can be voted on and become law. Some preliminary thoughts on the bill in its present form follow.
Requiring notice is an extraordinarily popular way to regulate. In online privacy, for instance, giving notice about their practices is among the only affirmative obligations websites face. The strategy is also one of the most heavily criticized. Not only does no one read privacy policies, skeptics rightly point out, but many consumers believe that the mere existence of a policy guarantees certain base-level protections that may or may not exist.
Should we give up on notice? My recent draft paper argues: maybe not. We should explore two possibilities, at any rate, before we do. The first is that regulators may sometimes select the wrong form of notice for the job. Today most website "terms" say that the company "may disclose data pursuant to lawful requests." That does very little to further user understanding or action. But perhaps a different form of notice could.
As an alternative, I argue for a concept I've been calling "visceral" privacy notice. Rather than telling people at length what your privacy practices may be, you show them what they really are. Facebook took a step in this direction today, joining Google and Yahoo! in what I hope is an emerging best practice.
The intuition that privacy and innovation are somehow opposed is surprisingly common. It is true that overzealous or reactionary appeals to privacy can cut off interesting ventures. (For instance, some believe Steamtunnels would have evolved into a social network in 1999 had it not been shut down by Stanford University over privacy and copyright concerns.) But privacy generally supports innovation, and vice versa.
NO: It Is the Way to Kill Innovation
By Ryan Calo
The year is 1910. Orville and Wilbur Wright are testing their plane and happen to fly hundreds of feet over a stretch of land you own. Could you sue them?
Technically, you could. In 1910, your property rights extended ad coelum et ad inferos—up to heaven and down to hell. Anyone who flew over your property without permission was trespassing.
I am a law professor who writes about robotics. I’m also a big Paolo Bacigalupi fan, particularly his breakout novel The Windup Girl involving an artificial girl. So for me, “Mika Model” was not entirely new territory. For all my familiarity with its themes, however, Bacigalupi’s story revealed an important connection in robotics law that had never before occurred to me.
But for others self-certification is "a very big leap," said Ryan Calo, a law professor at the University of Washington, who is arguing for independent audits. "I'm worried by the idea of a company saying, 'We're good.'"
One thing missing from the regs: any driving test to pass before letting the robot fly solo. Instead, companies will "self-certify" their vehicles. "That's like me going to the DMV and saying, believe me, I'm an excellent driver," says Ryan Calo, who studies robotics law at the University of Washington School of Law. "It makes me a little nervous, honestly." He would rather see a common requirement, or at least have a third party check the cars out before they hit the public streets.
California is not the first jurisdiction to pass rules governing the deployment of fully automated vehicles. Michigan has a law contemplating driverless fleets, and Florida has a law that its drafter says covers this, too. "But this would make California the most consciously permissive jurisdiction in the world," says Ryan Calo, a professor at the University of Washington who teaches a course on robot law. "I question the wisdom of self-certification, especially with players that are not as sophisticated. I think it would be wiser to have third parties audit the technology."
In Rosenblat and Calo's view, government agencies like the Federal Trade Commission need to step up more actively and investigate possible abuses by peer-to-peer platform operators. Earlier this year, Uber agreed to pay $20 million to the agency, which charged that the company's advertising had misled recruits about how much income they could expect to earn as drivers. Still, the authors would prefer to see the FTC dig deeper, prying into the platforms' digital back-ends rather than relying on publicly posted documentation.
Ryan Calo, a law professor at the University of Washington who focuses on emerging technologies, said that evidence from devices like pacemakers shouldn't even be admissible in court. As with DNA evidence before it, Calo said, the risk of using such data to wrongly implicate someone in a crime is just too high.
"There's a tendency to believe that because something is recorded by a machine it is gospel," Calo said.
U.S. Sen. John Thune (R-S.D.), chairman of the Senate Committee on Commerce, Science, and Transportation, will convene a hearing on Wednesday, November 16, 2016, at 3:00 p.m. entitled "Exploring Augmented Reality." The hearing will examine the emergence, benefits, and implications of augmented reality technologies. Unlike virtual reality, which creates a wholly simulated environment, augmented reality attempts to superimpose images and visual data on the physical world in an intuitive way.
• Mr. Brian Blau, Research Vice President, Gartner
The University of Washington School of Law is delighted to announce a public workshop on the law and policy of artificial intelligence, co-hosted by the White House and UW's Tech Policy Lab. The event places leading artificial intelligence experts from academia and industry in conversation with government officials interested in developing a wise and effective policy framework for this increasingly important technology. The event is free and open to the public but requires registration.
CIS Affiliate Scholar Ryan Calo will be part of a panel titled "Understanding the Implications of Open Data".
How can open data promote trust in government without creating a transparent citizenry?
Simon Jack reports from Seattle on robots at work, from the Boeing factory where robots make planes to a clothes shop where a robot helps him buy a new pair of jeans. Plus, Ryan Calo, professor of law at the University of Washington, grapples with the question of whom to blame when robots go wrong, and whether there is such a thing as robot rights.
There are a million ways people might use drones in the future, from deliveries and police work to journalism. But in this episode, we’re going to talk about consumer drones — something that you or I might use for ourselves. What does the world look like when everybody with a smart phone also has a drone?
"We don't need to get to this crazy world in which robots are trying to take over in order for there to be really difficult, interesting, complex legal questions," says Ryan Calo, professor of law at the University of Washington. "That's happening right now."
Here’s a sample:
"How do we make sure these drones are not recording things that they shouldn't," Calo says, "and those things aren't winding up ... on Amazon servers, or somehow getting out to the public or to law enforcement?"