Stanford CIS

Robot Rules

Ryan Calo, a residential fellow at the Center for Internet & Society, is quoted on robotics and liability issues. Richard Acello of the ABA Journal filed this story:

Robots may now be confined to sweeping living rooms and working assembly lines, but futurists and attorneys agree they are destined to take on a much greater role soon. Bill Gates has compared the development of robots to the earliest personal computers in the 1970s.

However, unlike that stationary pile of microchips on your desktop, robots have the potential to act in the real world. Attorneys and legal scholars are now puzzling over how harmful actions of robots will be assigned liability, and particularly how robotic maneuvers will fit into traditional legal concepts of responsibility and agency.

“Robots present a unique situation,” says Ryan Calo, a fellow at the Stanford Law School’s Center for Internet and Society. “Like the computer, it runs on software, but it can touch you. It doesn’t have a particular purpose like a lawn mower or a toaster; it’s more like a platform that you can program to do all kinds of different things. And it can act on the world, so it has legal repercussions. It might be very difficult to ascertain where the liability lies when a robot causes some physical harm.”

One possible avenue would be to view the robot as an agent of its owner, who would be presumed liable for the robot’s actions, but Calo says it’s not so simple. He has blogged about robotics and the law, and led a panel on the subject last year.

“Let’s say you rent a robot from the hospital to take care of Granny, and the neighborhood kids hack into the robot and it menaces Granny and she falls down the stairs. Who’s liable?” he asks. Possibilities include the hospital (which released it), the manufacturer (it’s easy to hack into), the neighborhood kids, or the consumer who failed to do something easy like update the software.

...

“Society tolerates Microsoft Word eating your thesis, but it won’t tolerate a robot running into somebody,” Calo says. “If you look at cases where computers have caused physical injury, then you could recover—for example, if the computer gave a cancer patient too much radiation.”

Calo favors limited immunity for the robot industry, similar to section 230 of the federal Communications Decency Act, which gives “interactive computer services” immunity for information put on their sites.

“There is a real danger if there is complete legal uncertainty and you have a couple of bad incidents involving sympathetic plaintiffs,” Calo says. “That could put a chill on robot development.”

...

Dan Siciliano, faculty director of the corporate governance center at Stanford University, says robot law will most likely develop as an extension of traditional legal concepts. He expects plaintiffs' lawyers will "name everybody" in a putative suit for damage caused by robots. "You will see liability work its way up the chain," he says.

Published in: Press, Robotics