Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. A nationally recognized expert in law and emerging technology, Ryan's work has appeared in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. Ryan serves on several advisory boards, including those of the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
UPDATE (Dec. 14, 2008): A user has created a Facebook Group against My Buddies. Meanwhile, as Beth says below, My Buddies has mutated into My Friends...
I recently received a series of notifications on Facebook alerting me that friends of mine had answered various personal questions about me. One notification claimed that a high school friend had just answered a specific yes/no question about my sexual orientation. Clicking on the link labeled “What did she say?”, I was invited to join My Buddies – a new Facebook application with an icon identical to the default running man on AIM, implying a connection to AOL that I doubt exists.
The United Press International reports that "[n]ewly released documents in Florida's Caylee Anthony case show ominous search words entered on the family computer prior to the child's disappearance." Some thoughts:
1. I've yet to see an investigation wherein the search terms at issue came from the service provider (e.g., Google or Yahoo!). Rather, they appear to be taken from the defendant's computer pursuant to a warrant.
2. I think the introduction of search terms into evidence presents a real danger in the context of inchoate crimes such as attempted murder. Searches can be snapshots of a person's mind, but no more than that. The concern is that a jury will see concrete intentions in Internet searches and not require a showing of a firm will to go through with the crime.
3. As Search Engine Watch points out, searches can lead to convictions in another way -- by allowing citizens to make connections and report them to the police. In one case, a Florida woman reported a man for practicing medicine without a license after an Internet search revealed that his license had been revoked.
4. Why is it always Florida?
A Washington Post tech blogger reports that President-elect Barack Obama has named a team to guide technology policy for the administration: Julius Genachowski (former chief counsel to FCC Chairman Reed Hundt, former senior executive at IAC), Sonal Shah (head of global development at Google.org) and Blair Levin (telecommunications policy analyst and consultant). Previous tech policy appointments include Susan Crawford (Michigan Law School) and Kevin Werbach (Wharton School), recently named to Obama’s FCC transition team, with more appointments to follow. Obama appears to be striking the right balance between academics, policy wonks, and practitioners. He has hired former insiders who also appear to have the right Silicon Valley values of innovation and openness.
I don’t think I’ve ever seen such a commitment to push back against third-party requests in a public legal document before. And the 23andMe panelist's commitment, though oral, was at least as strong.
Consumer Genomics: Law and Policy
November 10, 2008 from 5:00 pm - 6:30 pm
Stanford Law School, Room 190
With a credit card and a saliva sample, consumers can now unlock the secrets carried in their DNA. Consumer genomics offers direct access to one's genetic code, plus interpretations of health risks, family lineage, opportunities for social networking, and more. But how should consumer genomics be regulated? Join us for a panel discussion with Stephen Moore (General Counsel, Navigenics), Anne Wojcicki (Co-founder, 23andMe), and Alexis Madrigal (Wired), moderated by bioscience and law expert Hank Greely (Stanford Law School). Open to the public.
Brought to you by the Stanford Law School Center for Law and the Biosciences and co-sponsored by the Center for Internet and Society.
NO: It Is the Way to Kill Innovation
By Ryan Calo
The year is 1910. Orville and Wilbur Wright are testing their plane and happen to fly hundreds of feet over a stretch of land you own. Could you sue them?
Technically, you could. In 1910, your property rights extended ad coelum et ad inferos—up to heaven and down to hell. Anyone who flew over your property without permission was trespassing.
I am a law professor who writes about robotics. I’m also a big Paolo Bacigalupi fan, particularly his breakout novel The Windup Girl involving an artificial girl. So for me, “Mika Model” was not entirely new territory. For all my familiarity with its themes, however, Bacigalupi’s story revealed an important connection in robotics law that had never before occurred to me.
Over the last year, the FBI has had harsh words for Apple, accusing the tech giant of endangering human lives and aiding criminals by turning on encryption by default on the iPhone. When Google announced it would add the feature to Android, meaning that smartphone users would need to unlock their phones for police to be able to go through them, government officials and law enforcement representatives similarly freaked out.
“[Social networks] can censor more or less anything they want, and they also have incredible abilities to leave up as much as they want to leave up,” said Ryan Calo, professor of law at the University of Washington and co-director of the school’s Tech Policy Lab.
Shouting down web-based terrorist recruiting cells, that’s a good thing, said Ryan Calo, professor of law at the University of Washington and co-director of the Tech Policy Lab.
Porn on Twitter, maybe not such a good thing, he said. It could be offensive to religious Muslims (or Christians or Jews), the overwhelming majority of whom are not terrorists and want nothing to do with sexually explicit images.
“Chatbots may be able to get us to say more about ourselves than an ordinary website,” says Ryan Calo, co-director of the Tech Policy Lab at the University of Washington.
Even the stereo, which was affected by Lexus's update, can create an unsafe situation, robotics law expert Ryan Calo told the Monitor. For example, buggy software might cause the radio to blare suddenly, startling the driver and causing an accident.
Tesla recently introduced a software update to control the whole vehicle, Dr. Calo tells the Monitor, although he says Lexus' update is technically not critical to safety.
The result, he says, is that "the line between control-critical and entertainment systems is not perfectly clean."
“The public should have an accurate mental model of what we mean when we say artificial intelligence,” says Ryan Calo, who teaches law at the University of Washington. Calo spoke last week at the first of four workshops the White House is hosting this summer to examine how to address an increasingly AI-powered world.
The University of Washington School of Law is delighted to announce a public workshop on the law and policy of artificial intelligence, co-hosted by the White House and UW’s Tech Policy Lab. The event places leading artificial intelligence experts from academia and industry in conversation with government officials interested in developing a wise and effective policy framework for this increasingly important technology. The event is free and open to the public but requires registration.
CIS Affiliate Scholar Ryan Calo will be part of a panel titled "Understanding the Implications of Open Data".
How can open data promote trust in government without creating a transparent citizenry?
CIS Affiliate Scholars Peter Asaro, Ryan Calo and Woodrow Hartzog will all be participating in this two-day conference.
Registration is open for We Robot 2015 and we have a great program planned:
Friday, April 10
Registration and Breakfast
Welcome Remarks: Dean Kellye Testy, University of Washington School of Law
Introductory Remarks: Ryan Calo, Program Committee Chair
The University of Washington School of Law is delighted to announce a public workshop on the law and policy of artificial intelligence, co-hosted by the White House and UW’s Tech Policy Lab. The event places leading artificial intelligence experts from academia and industry in conversation with government officials interested in developing a wise and effective policy framework for this increasingly important technology.
Simon Jack reports from Seattle on robots at work, from the Boeing factory where robots make planes to a clothes shop where a robot helps him buy a new pair of jeans. Plus Ryan Calo, professor of law at the University of Washington, grapples with the question of who to blame when robots go wrong, and whether there is such a thing as robot rights.
There are a million ways people might use drones in the future, from deliveries and police work to journalism. But in this episode, we’re going to talk about consumer drones — something that you or I might use for ourselves. What does the world look like when everybody with a smart phone also has a drone?
“We don’t need to get to this crazy world in which robots are trying to take over in order for there to be really difficult, interesting, complex legal questions,” says Ryan Calo, professor of law at the University of Washington. “That’s happening right now.”
Here’s a sample:
“How do we make sure these drones are not recording things that they shouldn’t," Calo says, "and those things aren’t winding up ... on Amazon servers, or somehow getting out to the public or to law enforcement?"