Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. A nationally recognized expert in law and emerging technology, Ryan's work has appeared in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. Ryan serves on several advisory committees, including those of the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
UPDATE (Dec. 14, 2008): A user has created a Facebook Group against My Buddies. Meanwhile, as Beth says below, My Buddies has mutated into My Friends...
I recently received a series of notifications on Facebook alerting me that friends of mine had answered various personal questions about me. One notification claimed that a high school friend had just answered a specific yes/no question about my sexual orientation. Clicking on the link labeled “What did she say?”, I was invited to join My Buddies – a new Facebook application with an icon identical to the default running man on AIM, implying a connection to AOL that I doubt exists.
United Press International reports that "[n]ewly released documents in Florida's Caylee Anthony case show ominous search words entered on the family computer prior to the child's disappearance." Some thoughts:
1. I've yet to see an investigation wherein the search terms at issue came from the service provider (e.g., Google or Yahoo!). Rather, they appear to be taken from the defendant's computer pursuant to a warrant.
2. I think the introduction of search terms into evidence presents a real danger in the context of inchoate crimes such as attempted murder. Searches can be snapshots of a person's mind, but no more than that. The concern is that a jury will see concrete intentions in Internet searches and not require a showing of a firm will to go through with the crime.
3. As Search Engine Watch points out, searches can lead to convictions in another way -- by allowing citizens to make connections and report them to the police. In one case, a Florida woman reported a man for practicing medicine without a license after an Internet search revealed that his license had been revoked.
4. Why is it always Florida?
A Washington Post tech blogger reports that President-elect Barack Obama has named a team to guide technology policy for the administration: Julius Genachowski (former chief counsel to FCC Chairman Reed Hundt, former senior executive at IAC), Sonal Shah (head of global development at Google.org) and Blair Levin (telecommunications policy analyst and consultant). Previous appointments around tech policy include Susan Crawford (Michigan Law School) and Kevin Werbach (Wharton School), recently named to Obama’s FCC transition team, with more appointments to follow. Obama appears to be striking the right balance between academics, policy wonks, and practitioners. He has hired former insiders who also appear to have the right Silicon Valley values of innovation and openness.
I don’t think I’ve ever seen such a commitment to push back against third-party requests in a public legal document before. And the 23andMe panelist's commitment, though oral, was at least as strong.
Consumer Genomics: Law and Policy
November 10, 2008 from 5:00 pm - 6:30 pm
Stanford Law School, Room 190
With a credit card and a saliva sample, consumers can now unlock the secrets carried in their DNA. Consumer genomics offers direct access to one's genetic code, plus interpretations of health risks, family lineage, opportunities for social networking, and more. But how should consumer genomics be regulated? Join us for a panel discussion with Stephen Moore (General Counsel, Navigenics), Anne Wojcicki (Co-founder, 23andMe), and Alexis Madrigal (Wired), moderated by bioscience and law expert Hank Greely (Stanford Law School). Open to the public.
Brought to you by the Stanford Law School Center for Law and the Biosciences and co-sponsored by the Center for Internet and Society.
I am a law professor who writes about robotics. I’m also a big Paolo Bacigalupi fan, particularly his breakout novel The Windup Girl involving an artificial girl. So for me, “Mika Model” was not entirely new territory. For all my familiarity with its themes, however, Bacigalupi’s story revealed an important connection in robotics law that had never before occurred to me.
Over the last year, the FBI has had harsh words for Apple, accusing the tech giant of endangering human lives and aiding criminals by turning on encryption by default on the iPhone. When Google announced it would add the feature to Android, meaning that smartphone users would need to unlock their phones for police to be able to go through them, government officials and law enforcement representatives similarly freaked out.
Privacy law scholars tend to be skeptical of markets. Markets “unravel” privacy by penalizing consumers who prefer it, degrade privacy by treating it as just another commodity to be traded, and otherwise interfere with the values or processes that privacy exists to preserve.
Assistant Professor of Law Ryan Calo developed his answer from a law and policy perspective. Under current law, inadequate security in household robots and cars can be treated as a legal failing, enabling federal agencies to sue or fine the companies responsible.
"Right now they're focusing on how to protect people and airplanes," Ryan Calo, drone expert and law professor at the University of Washington, told NBC News. "They haven't even thought about privacy much, let alone animals."
There are two possible lines for making the determination, Ryan Calo, a law professor at the University of Washington, suggested to The Huffington Post: weight and use.
"The Federal Aviation Administration (FAA) makes a big deal about commercial use or use by law enforcement," he said. "I'm much more comfortable with the requirement that corporations or startups or police or firefighters have to register them, so there's some accountability for using public airways for commercial or civic use."
Which brings me to University of Washington School of Law assistant professor Ryan Calo's recent article. He argues that tech giants Apple and Google, who have implemented reasonably effortless versions of end-to-end encryption into some of their communication products, may be our best hope of resisting government surveillance.
"There's a bit of a disconnect between what people say and what they do," says Ryan Calo, a law professor at the University of Washington who has studied digital market manipulation. He says the paradox is complex and theories that explain it vary. "Maybe they don't really care? Maybe they just don't know?" Calo adds.
Keynote Lecture, Reilly 30th Anniversary Conference
Ryan Calo, UW School of Law
The Past, Present, and Future of Robotic Regulation
Robots have been with us for some time, largely hidden away from daily life. Today robots are leaving the factory and the battlefield and entering our hospitals, hotels, highways, and skies. This talk addresses how the law has addressed robots in the past, how the law is addressing drones, driverless cars, and other robots today, and how law and legal institutions might address this transformative technology going forward.
Roundtable with experts Professor Ronald C. Arkin, Professor Ryan Calo, Dr. Kate Darling, Professor Illah Nourbakhsh, and Professor Noel Sharkey
Moderated by Professor Jennifer Urban
Friday, July 11, 3:30 pm
Boalt Hall Goldberg Room
Robots are quickly moving out of controlled environments into public spaces and homes, and researchers are developing artificial intelligence systems that will allow robots to make decisions autonomously. How should society plan for this transition?
Humans and Machines — Drones, Phones, and Robotic Friends: Where is Emergent Technology Taking Us? On June 27 at 8:30 p.m. with speakers Mary “Missy” Cummings, Ryan Calo, Ken Goldberg and moderator David Kirkpatrick.
High tech is transforming fast: robotics now reaches from operating theaters to rescue missions, smarter phones manage our lives, and flying technologies put cameras (and weapons) in the air, if not everywhere. How will the shifting balance of law, ethics, and relationships between humans and machines change us?
2013 PRIVACY PAPERS FOR POLICY MAKERS
The Future of Privacy Forum
Co-chairs Jules Polonetsky and Christopher Wolf
in conjunction with Congresswoman Sheila Jackson Lee invite you to
“Privacy Papers for Policy Makers”
A discussion of leading privacy research
There are a million ways people might use drones in the future, from deliveries and police work to journalism. But in this episode, we’re going to talk about consumer drones — something that you or I might use for ourselves. What does the world look like when everybody with a smartphone also has a drone?
“We don’t need to get to this crazy world in which robots are trying to take over in order for there to be really difficult, interesting complex legal questions,” says Ryan Calo, professor of law at the University of Washington. “That’s happening right now.”
Here’s a sample:
“How do we make sure these drones are not recording things that they shouldn’t," Calo says, "and those things aren’t winding up … on Amazon servers, or somehow getting out to the public or to law enforcement?"
What will Amazon’s drone highway in the sky look like?
Probably not a drone highway. Amazon unveiled a proposal in which low-level airspace would be carved out for drones: 200 to 400 feet would be reserved for high-speed transit drones. Below that, there would be space for low-speed local drone traffic, and above it a no-fly buffer zone would keep drones out of manned-vehicle airspace, a.k.a. flight paths.
Robots have been used in factories around the world for decades, often carrying out dangerous or highly repetitive operations. However, the city of Dongguan, China, has become home to the first fully automated factory, where the workforce is made up entirely of robots. Changying Precision Technology will employ only a small number of human staff to monitor the machinery; all processes are carried out by robotic equipment.
Is this a sign of things to come? Newsday spoke to Ryan Calo, a professor with the University of Washington Tech Policy Lab.