Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. A nationally recognized expert in law and emerging technology, Ryan's work has appeared in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. Ryan serves on several advisory committees, including the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
Please visit the Center for Internet and Society's new wiki (cyberlaw.stanford.edu/wiki) and contribute to our privacy enhancing technology (PET) database.
Stanford Law School student Seth Gilmore got us started on a PET wiki. As the name suggests, PETs are technologies or techniques that assist users in protecting their information from abuse. They include software allowing for anonymous surfing, plug-ins that reveal who is tracking you online, and improvements in browser security. Microsoft and the Office of the Information and Privacy Commissioner of Ontario, Canada cosponsor an award for PETs, and there is a call for papers (due March 2, 2009) for an upcoming PET conference in Seattle.
David Cancel has just created a wonderful privacy-enhancing technology for Firefox, up there with Adblock Plus in my view. In a simple and straightforward way, Ghostery reveals who is tracking your views of a page on the Internet via a common but under-examined method: web bugs.
As David explains, "[w]eb bugs are used to track your behavior on the web in order to help the sites you visit to understand their own audiences and to allow advertisers to target ads at you." To expand a little, web bugs are tiny (generally one-pixel) images on a web page that tell a host or third party when and by whom they are being loaded, which in turn reveals that the page itself has been loaded. David's elegant plug-in "scans the web pages you visit to find web bugs" and displays their owners in the upper right-hand corner of the page. Ghostery is easy to install, use, and shut off.
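Ghostery's actual implementation isn't described here, but the core idea, scanning a page's markup for tiny tracking images, can be sketched in a few lines of Python using the standard-library HTML parser. The sample page and the `tracker.example.com` domain are invented for illustration.

```python
from html.parser import HTMLParser

class WebBugFinder(HTMLParser):
    """Flags <img> tags that look like web bugs: 1x1 images,
    typically served from a third-party tracking domain."""
    def __init__(self):
        super().__init__()
        self.bugs = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        # A one-pixel image is the classic web-bug signature.
        if a.get("width") == "1" and a.get("height") == "1":
            self.bugs.append(a.get("src"))

page = """
<html><body>
  <img src="logo.png" width="200" height="50">
  <img src="http://tracker.example.com/pixel.gif" width="1" height="1">
</body></html>
"""

finder = WebBugFinder()
finder.feed(page)
print(finder.bugs)  # only the tracking pixel's URL survives the filter
```

A real detector would also check CSS-hidden images and known tracker domain lists, but the size heuristic alone already catches the canonical case.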
Social networks have gotten a lot of play in recent years. What about social devices? I've been thinking about whether and how the nature of computer interfaces is changing—specifically, becoming less passive and more “social.”
My conversations with academics in Stanford's Department of Communication, and the research they've guided me toward, lead me to believe that we are once again at the edge of a shift in the way we communicate. For a variety of reasons, PCs and other computers in cars, mobile devices, etc., are making increased use of voice-driven, natural-language interfaces or avatars, moving computing away from the traditional mode of passive information processing toward a more social, "person to person" interaction.
Some quick examples. Google's VP of Search recently gave an interview at Le Web in which she said that Google was exploring a more conversational interface that would allow users to ask Google questions out loud, as though conversing with a person. Although it has met with (comic) resistance in the past, a trail of Microsoft patents going back ten years shows how serious the company is about developing a social interface, complete with voice, expressions, and gestures. As much as twenty-five percent of Microsoft's research efforts reportedly involve artificial intelligence. Even the U.S. government has gotten into this game: the U.S. Army’s virtual recruiter, SGT Star, responds to questions out loud, changes moods, makes jokes, etc. According to developer statistics, SGT Star has responded to over two million questions since his debut in 2006.
Electronic books are a little like flying cars: always just about to catch on. Today the New York Times asks, “Could book lovers finally be willing to switch from pages to pixels?” In an interesting piece in the Times's technology section, Brad Stone and Motoko Rich interview publishers in an attempt to size this market, concluding that the era of e-books may (finally) have arrived.
Lots could be said on this topic, much of it praise and much of it lament. But we've been discussing a particular angle here at CIS: whether this page-to-pixel migration might have serious repercussions for reader privacy.
A recent Computerworld blog post shows how tone-deaf we can be about the implications of new technology. A group of car dealers in Oregon apparently attached GPS devices to cars sold to customers with poor credit so as to be able to track them down more easily in the event of repossession.
NO: It Is the Way to Kill Innovation
By Ryan Calo
The year is 1910. Orville and Wilbur Wright are testing their plane and happen to fly hundreds of feet over a stretch of land you own. Could you sue them?
Technically, you could. In 1910, your property rights extended ad coelum et ad inferos—up to heaven and down to hell. Anyone who flew over your property without permission was trespassing.
I am a law professor who writes about robotics. I’m also a big Paolo Bacigalupi fan, particularly his breakout novel The Windup Girl involving an artificial girl. So for me, “Mika Model” was not entirely new territory. For all my familiarity with its themes, however, Bacigalupi’s story revealed an important connection in robotics law that had never before occurred to me.
According to University of Washington law professor Ryan Calo, the situation doesn’t give rise to any new legal issues, but is unsettling for a different reason: We are okay with cops using lethal force in a justified situation, but we expect them to do so in a familiar way—with firearms. The use of an improvised robot bomb is unsettling in the same way as if the cops had used a knife or dropped an anvil on the shooter.
“It is essentially a jury-rigged version of a drone strike,” Ryan Calo, a University of Washington School of Law professor specializing in cyber and robotic law, told me. “If they would have been justified in throwing a grenade, then they’re likely justified in doing this, which was quite frankly a creative thing.”
“No court would find a legal problem here,” said Ryan Calo, a professor at the University of Washington law school. “When someone is an ongoing lethal danger, there isn’t an obligation on the part of officers to put themselves in harm’s way.”
Today we learned the Dallas police used a bomb disposal robot to deliver and detonate an explosion, killing the suspect in last night's shooting. It was surprising, and shocking, and left me with a lot of questions. But, really, there's only one question: is this ethical?
The way the robot was used in the Dallas case is likely legally no different from sending an officer in to shoot a hostile suspect, according to University of Washington law professor Ryan Calo.
Still, the Dallas Police Department's decision to use the unit in this way could have a major effect on how the public views the increasing integration of robots into daily life, he said.
U.S. Sen. John Thune (R-S.D.), chairman of the Senate Committee on Commerce, Science, and Transportation, will convene a hearing on Wednesday, November 16, 2016, at 3:00 p.m. entitled “Exploring Augmented Reality.” The hearing will examine the emergence, benefits, and implications of augmented reality technologies. Unlike virtual reality, which creates a wholly simulated environment, augmented reality attempts to superimpose images and visual data on the physical world in an intuitive way.
• Mr. Brian Blau, Research Vice President, Gartner
The University of Washington School of Law is delighted to announce a public workshop on the law and policy of artificial intelligence, co-hosted by the White House and UW’s Tech Policy Lab. The event places leading artificial intelligence experts from academia and industry in conversation with government officials interested in developing a wise and effective policy framework for this increasingly important technology. The event is free and open to the public but requires registration.
CIS Affiliate Scholar Ryan Calo will be part of a panel titled "Understanding the Implications of Open Data".
How can open data promote trust in government without creating a transparent citizenry?
Simon Jack reports from Seattle on robots at work, from the Boeing factory where robots make planes to a clothes shop where a robot helps him buy a new pair of jeans. Plus, Ryan Calo, professor of law at the University of Washington, grapples with the question of whom to blame when robots go wrong, and whether there is such a thing as robot rights.
There are a million ways people might use drones in the future, from deliveries and police work to journalism. But in this episode, we’re going to talk about consumer drones — something that you or I might use for ourselves. What does the world look like when everybody with a smartphone also has a drone?
“We don’t need to get to this crazy world in which robots are trying to take over in order for there to be really difficult, interesting, complex legal questions,” says Ryan Calo, professor of law at the University of Washington. “That’s happening right now.”
Here’s a sample:
“How do we make sure these drones are not recording things that they shouldn’t," Calo says, "and those things aren’t winding up ... on Amazon servers, or somehow getting out to the public or to law enforcement?"