Ryan Calo is an assistant professor at the University of Washington School of Law and a former research director at CIS. A nationally recognized expert in law and emerging technology, Ryan's work has appeared in the New York Times, the Wall Street Journal, NPR, Wired Magazine, and other news outlets. Ryan serves on several advisory boards, including those of the Electronic Frontier Foundation, the Electronic Privacy Information Center, and the Future of Privacy Forum. He co-chairs the American Bar Association Committee on Robotics and Artificial Intelligence and serves on the program committee of National Robotics Week.
Ann Bartow once criticized Daniel Solove for not providing enough “dead bodies” in his discussion of privacy. I tend to disagree that such proof is necessary. But privacy has seen a dead body recently—that of Rutgers University student Tyler Clementi.
The narrative around Clementi’s tragic suicide continues to shift. The press originally reported that Clementi killed himself after his roommate invited the entire campus to view footage of Clementi having sex with another man. The Associated Press is now reporting that, according to the roommate’s defense attorney, no one but he and his friend ever saw the video.
The question of whether the defendants recorded or broadcast the webcam footage is highly relevant to whether there has been a privacy violation. Yet it is hardly relevant at all to the question of whether there has been a privacy harm.
I don’t know that generativity is a theory, strictly speaking. It’s more of a quality. (Specifically, five qualities.) The attendant theory, as I read it, is that technology exhibits these particular, highly desirable qualities as a function of specific incentives. These incentives are themselves susceptible to various forces—including, it turns out, consumer demand and citizen fear.
The law is in a position to influence this dynamic. Thus, for instance, Comcast might have a business incentive to slow down peer-to-peer traffic and refrain only because of FCC policy. Or, as Barbara van Schewick demonstrates inter alia in Internet Architecture and Innovation, a potential investor may lack the incentive to fund a start-up if there is a risk that the product will be blocked.
Similarly, online platforms like Facebook or Yahoo! might not facilitate communication to the same degree in the absence of Section 230 immunity for fear that they will be held responsible for the thousand flowers they let bloom. I agree with Eric Goldman’s recent essay in this regard: it is no coincidence that the big Internet players generally hail from these United States.
Prohibition wasn’t working. President Hoover assembled the Wickersham Commission to investigate why. The Commission concluded that despite a historic enforcement effort—including the police abuses that made the Wickersham Commission famous—the government could not stop everyone from drinking. Many people, especially in certain city neighborhoods, simply would not comply. The Commission did not recommend repeal at the time, but by 1931 it was just around the corner.
Five years later an American doctor working in a chemical plant made a startling discovery. Several workers began complaining that alcohol was making them sick, causing most to stop drinking it entirely—“involuntary abstainers,” as the doctor, E.E. Williams, later put it. It turns out they were in contact with a chemical called disulfiram used in the production of rubber. Disulfiram is well-tolerated and water-soluble. Today, it is marketed as the popular anti-alcoholism drug Antabuse.
Were disulfiram discovered just a few years earlier, would federal law enforcement have dumped it into key parts of the Chicago or Los Angeles water supply to stamp out drinking for good? Probably not. It simply would not have occurred to them. No one was regulating by architecture then. To dramatize this point: when New York City decided twenty years later to end a string of garbage can thefts by bolting the cans to the sidewalk, the decision made the front page of the New York Times. The headline read: “City Bolts Trash Baskets To Walks To End Long Wave Of Thefts.”
In an important but less discussed chapter in The Future of the Internet, Jonathan Zittrain explores our growing taste and capacity for “perfect enforcement.”
Readers are likely familiar with the cyberlaw mantra that “code is law.” What’s striking is that since Lawrence Lessig published Code in 1999, relatively little has been written about the dangers of regulation by architecture, particularly outside of the context of intellectual property. Many legal scholars—Neal Katyal, Elizabeth Joh, Edward Cheng—have instead argued for more regulation by architecture on the basis that it is less discriminatory or more effective.
My new paper explores what is unique about privacy harm. How does privacy harm differ from other injury? And what do we gain by defining its boundaries and core properties? You can download the paper here; abstract after the jump. Your thoughts warmly welcome.
ACM Computers, Freedom, and Privacy is in its 20th year. This year was exciting to me in that robots entered the mix. My panel on the topic featured forecaster and essayist Paul Saffo, EFF's Brad Templeton, and philosopher Patrick Lin, and was moderated by Wired's Gary Wolf. You can find a video recording of our panel here. I also spoke to the Dr. Katherine Albrecht Radio Show, which was broadcasting live from the conference. Click here to listen.
Over the last year, the FBI has had harsh words for Apple, accusing the tech giant of endangering human lives and aiding criminals by turning on encryption by default on the iPhone. When Google announced it would add the feature to Android, meaning that smartphone users would need to unlock their phones for police to be able to go through them, government officials and law enforcement representatives similarly freaked out.
Privacy law scholars tend to be skeptical of markets. Markets “unravel” privacy by penalizing consumers who prefer it, degrade privacy by treating it as just another commodity to be traded, and otherwise interfere with the values or processes that privacy exists to preserve.
In a recent white paper, Brookings Institution senior fellow Benjamin Wittes and law student Jodie Liu turn the standard privacy argument on its head: as they see it, many supposed threats to our privacy actually benefit it.
Recent headlines declaring “Robot Kills Man in Germany” are examples of growing news coverage about the impact of robots on society. This is the subject of a new law review article by a University of Washington faculty member.
Headlines rang out across the internet yesterday that a robot killed someone in Germany. Beneath the sensationalist surface, there was a tragic truth: an industrial robot at a Volkswagen plant in Germany had indeed killed a 22-year-old worker who was setting it up.
"That’s all well and good, but what happens if they still get out of line? We all learn societal standards from a very young age, and we still commit crimes. What we need is a legal framework through which AIs (or their creators) can be held liable, says Ryan Calo, a cyberlaw expert at the University of Washington.
“The law today is not well positioned to deal [with these kinds of scenarios],” said Calo. “They break our standard legal models.”
"Data about reproductive health is very sensitive, but there are situations where maybe you want someone to know that," said Harlan Yu, principal at Upturn, a technology consulting firm. "You might want your doctor or researchers to know that. But in other situations you might not want drug companies or insurance companies to have that information."
U.S. Sen. John Thune (R-S.D.), chairman of the Senate Committee on Commerce, Science, and Transportation, will convene a hearing on Wednesday, November 16, 2016, at 3:00 p.m. entitled “Exploring Augmented Reality.” The hearing will examine the emergence, benefits, and implications of augmented reality technologies. Unlike virtual reality, which creates a wholly simulated reality, augmented reality attempts to superimpose images and visual data on the physical world in an intuitive way.
• Mr. Brian Blau, Research Vice President, Gartner
The University of Washington School of Law is delighted to announce a public workshop on the law and policy of artificial intelligence, co-hosted by the White House and UW’s Tech Policy Lab. The event places leading artificial intelligence experts from academia and industry in conversation with government officials interested in developing a wise and effective policy framework for this increasingly important technology. The event is free and open to the public but requires registration.
CIS Affiliate Scholar Ryan Calo will be part of a panel titled "Understanding the Implications of Open Data".
How can open data promote trust in government without creating a transparent citizenry?
Simon Jack reports from Seattle on robots at work. From the Boeing factory where robots make planes to a clothes shop where a robot helps him buy a new pair of jeans. Plus Ryan Calo, professor of law at the University of Washington, grapples with the question of who to blame when robots go wrong, and whether there is such a thing as robot rights.
There are a million ways people might use drones in the future, from deliveries and police work to journalism. But in this episode, we’re going to talk about consumer drones — something that you or I might use for ourselves. What does the world look like when everybody with a smartphone also has a drone?
“We don’t need to get to this crazy world in which robots are trying to take over in order for there to be really difficult, interesting complex legal questions,” says Ryan Calo, professor of law at the University of Washington. “That’s happening right now.”
Here’s a sample:
“How do we make sure these drones are not recording things that they shouldn’t," Calo says, "and those things aren’t winding up ... on Amazon servers, or somehow getting out to the public or to law enforcement?"