Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. A nationally recognized expert in law and emerging technology, Ryan's work has appeared in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. Ryan serves on several advisory boards, including those of the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
Please visit the Center for Internet and Society's new wiki (cyberlaw.stanford.edu/wiki) and contribute to our privacy enhancing technology (PET) database.
Stanford Law School student Seth Gilmore got us started on a PET wiki. As the name suggests, PETs are technologies or techniques that assist users in protecting their information from abuse. They include software allowing for anonymous surfing, plug-ins that reveal who is tracking you online, and improvements in browser security. Microsoft and the Office of the Information and Privacy Commissioner of Ontario, Canada cosponsor an award for PETs, and there is a call for papers (due March 2, 2009) for an upcoming PET conference in Seattle.
David Cancel just created a wonderful privacy-enhancing technology for Firefox, one I'd rank with Adblock Plus. In a simple and straightforward way, Ghostery reveals who is tracking your visits to a page via a common but under-examined method: web bugs.
As David explains, "[w]eb bugs are used to track your behavior on the web in order to help the sites you visit to understand their own audiences and to allow advertisers to target ads at you." To expand a little, web bugs are tiny (generally one-pixel) pictures on a web page that tell a host or third party when and by whom they are being loaded, which in turn reveals that the page itself has been loaded. David's elegant plug-in "scans the web pages you visit to find web bugs" and displays their owners in the upper right-hand corner of the page. Ghostery is easy to install, use, and shut off.
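To make the mechanics concrete, here is a toy sketch (not Ghostery's actual code, and the hostnames are made up) of how a scanner might flag likely web bugs: one-pixel images served from a host other than the page you are reading.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class WebBugFinder(HTMLParser):
    """Toy scanner: flags 1x1 images loaded from third-party hosts."""

    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.bugs = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        # A web bug is typically a 1x1 image...
        tiny = a.get("width") == "1" and a.get("height") == "1"
        # ...served from a domain other than the page's own.
        host = urlparse(a.get("src", "")).netloc
        if tiny and host and host != self.page_host:
            self.bugs.append(host)  # record the third-party tracker

    # Treat self-closing <img ... /> tags the same way.
    handle_startendtag = handle_starttag

# Hypothetical page markup containing a tracking pixel:
page = '<img src="https://tracker.example.net/pixel.gif" width="1" height="1">'
finder = WebBugFinder("blog.example.com")
finder.feed(page)
print(finder.bugs)  # ['tracker.example.net']
```

When the image loads, the tracker's server sees the request (and typically a cookie and referring page), which is how a one-pixel picture reveals that you viewed the page; a plug-in need only spot the telltale markup.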
Social networks have gotten a lot of play in recent years. What about social devices? I've been thinking about whether, and how, the nature of computer interfaces is changing: specifically, becoming less passive and more "social."
My conversations with academics in Stanford's Department of Communication, and the research they've guided me toward, lead me to believe that we are once again at the edge of a shift in the way we communicate. For a variety of reasons, PCs and other computers in cars, mobile devices, etc., are making increased use of voice-driven, natural language interfaces or avatars, moving computing away from the traditional mode of passive information processing toward a more social, "person to person" interaction.
Some quick examples. Google's VP of Search gave a recent interview at Le Web during which she said that Google was exploring a more conversational interface that would allow users to actually ask Google questions out loud as though conversing with a person. Although it has met with (comic) resistance in the past, a trail of Microsoft patents going back ten years shows how serious the company is about developing a social interface, complete with voice, expressions, and gestures. As much as twenty-five percent of Microsoft's research efforts reportedly involve artificial intelligence. Even the U.S. government has gotten into this game: the U.S. Army’s virtual recruiter, SGT Star, responds to questions out loud, changes moods, makes jokes, etc. According to developer statistics, SGT Star has responded to over two million questions since his debut in 2006.
Electronic books are a little like flying cars: always just about to catch on. Today the New York Times asks, "Could book lovers finally be willing to switch from pages to pixels?" In an interesting piece in the paper's Technology section, Brad Stone and Motoko Rich interview publishers in an attempt to size this market, concluding that the era of e-books may (finally) have arrived.
Lots could be said on this topic, in praise and in lament. But we've been discussing a particular angle here at CIS: whether this page-to-pixel migration might have serious repercussions for reader privacy.
A recent Computerworld blog post shows how tone-deaf we can be about the implications of new technology. A group of car dealers in Oregon apparently attached GPS devices to cars sold to customers with poor credit, making the vehicles easier to track down in the event of repossession.
NO: It Is the Way to Kill Innovation
By Ryan Calo
The year is 1910. Orville and Wilbur Wright are testing their plane and happen to fly hundreds of feet over a stretch of land you own. Could you sue them?
Technically, you could. In 1910, your property rights extended ad coelum et ad inferos—up to heaven and down to hell. Anyone who flew over your property without permission was trespassing.
I am a law professor who writes about robotics. I’m also a big Paolo Bacigalupi fan, particularly his breakout novel The Windup Girl involving an artificial girl. So for me, “Mika Model” was not entirely new territory. For all my familiarity with its themes, however, Bacigalupi’s story revealed an important connection in robotics law that had never before occurred to me.
"I learned about the case of [Ferguson v. Bombardier Services Corp] from Robots in American Law, a study published by University of Washington assistant law professor Ryan Calo. I spoke to Calo about the latest study.
“The pilot got sued by the survivors of the victims,” Calo said today. “That’s one very clear-cut case where a person was bad at monitoring and taking control from a rudimentary autopilot and it resulted in an accident, for which a human got blamed.”"
"According to Ryan Calo, a law professor at the University of Washington specializing in robotics law, the NHTSA's decision reflects the principle of 'assumption of risk': that the affected party knowingly accepted the dangers of the activity concerned, meaning that the manufacturer was not at fault."
"How do you draw the line between prosecuting a robot that does harm and its creator? Who bears the burden of the crime or wrongdoing?
I recently got the chance to respond to a short story by a science fiction writer I admire. The author, Paolo Bacigalupi, imagines a detective investigating the “murder” of a man by his artificial companion. The robot insists it killed its owner intentionally in retaliation for abuse and demands a lawyer.
"yan Calo, a professor at the UW School of Law who specializes in privacy, robotics and cyberlaw issues, says the Bentonville Police Department’s fishing expedition is “unlikely to yield anything.” The reason is that the Echo sends information up to Amazon’s cloud only when it hears a wake word, usually “Alexa” or “Echo.”
U.S. Sen. John Thune (R-S.D.), chairman of the Senate Committee on Commerce, Science, and Transportation, will convene a hearing on Wednesday, November 16, 2016, at 3:00 p.m. entitled “Exploring Augmented Reality.” The hearing will examine the emergence, benefits, and implications of augmented reality technologies. Unlike virtual reality, which creates a wholly simulated environment, augmented reality superimposes images and visual data on the physical world in an intuitive way.
• Mr. Brian Blau, Research Vice President, Gartner
The University of Washington School of Law is delighted to announce a public workshop on the law and policy of artificial intelligence, co-hosted by the White House and UW’s Tech Policy Lab. The event places leading artificial intelligence experts from academia and industry in conversation with government officials interested in developing a wise and effective policy framework for this increasingly important technology. The event is free and open to the public but requires registration.
CIS Affiliate Scholar Ryan Calo will be part of a panel titled "Understanding the Implications of Open Data".
How can open data promote trust in government without creating a transparent citizenry?
Simon Jack reports from Seattle on robots at work, from the Boeing factory where robots help build planes to a clothes shop where a robot helps him buy a new pair of jeans. Plus Ryan Calo, professor of law at the University of Washington, grapples with the question of whom to blame when robots go wrong, and whether there is such a thing as robot rights.
There are a million ways people might use drones in the future, from deliveries and police work to journalism. But in this episode, we’re going to talk about consumer drones — something that you or I might use for ourselves. What does the world look like when everybody with a smart phone also has a drone?
"“We don’t need to get to this crazy world in which robots are trying to take over in order for there to be really difficult, interesting complex legal questions,” says Ryan Calo, professor of law at the University of Washington, “That’s happening right now.”
Here’s a sample:
“How do we make sure these drones are not recording things that they shouldn’t,” Calo says, “and those things aren’t winding up ... on Amazon servers, or somehow getting out to the public or to law enforcement?”