Can You Sue A Robot For Defamation?

Author(s): 
Publication Type: 
Other Writing
Publication Date: 
March 17, 2014

Cross-posted from Forbes.

Life moves pretty fast.  Especially for journalists. When an earthquake aftershock shakes America’s second-largest city, news outlets scramble to be the first to cover the story.  Today the news itself made news when various outlets picked up on a curious byline over at the Los Angeles Times: “this post was created by an algorithm written by the author.”

The rise of algorithmically generated content is a great example of a growing reliance on “emergence.”  Steven Johnson, in his book by this title, sees the essence of emergence as the movement from low-level rules to behavior of apparently high sophistication.  Johnson gives a number of examples, from insects to software programs.  As I see it, the text of the earthquake story likewise “emerged” from a set of simple rules and inputs; the “author” in question at the Los Angeles Times, Ken Schwencke, did not simply write the story in advance and cut and paste it.
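To make the idea concrete, here is a minimal sketch of template-driven story generation in the spirit of the Times’ earthquake bot. This is not Schwencke’s actual code; the template wording, field names, and sample values are invented for illustration.

```python
# Hypothetical sketch: a story "emerges" from a fixed template plus
# structured data from a quake feed. All names and values are invented.

TEMPLATE = (
    "A magnitude {mag} earthquake struck {dist} miles from {place} "
    "at {time}, according to the USGS. The quake occurred at a depth "
    "of {depth} miles."
)

def write_story(event):
    """Fill the template from one structured feed record."""
    return TEMPLATE.format(
        mag=event["magnitude"],
        dist=event["distance_miles"],
        place=event["nearest_city"],
        time=event["local_time"],
        depth=event["depth_miles"],
    )

event = {
    "magnitude": 4.4,
    "distance_miles": 6,
    "nearest_city": "Westwood, California",
    "local_time": "6:25 a.m.",
    "depth_miles": 5.0,
}
print(write_story(event))
```

The “author” writes the rules once; each published story is then a mechanical function of whatever data arrives.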

I imagine Schwencke had a pretty good sense of what story the algorithm would come up with were there an earthquake.  This is not always the case.  Even simple algorithms can create wildly unforeseeable and unwanted results.  Thus, for instance, a bidding war between two algorithms led to a $23.6 million book listing on Amazon.  And who can forget the sudden “flash crash” of the market caused by high-speed trading algorithms in 2010?
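The Amazon episode reportedly arose from two sellers’ repricing rules reacting to each other. A toy model shows how quickly such a feedback loop runs away; the multipliers loosely echo those reported in that incident, but the code is purely illustrative.

```python
# Toy model of two interacting repricing algorithms. Each round, seller A
# prices just under seller B, and seller B prices well above seller A.
# Neither rule is unreasonable alone; together they compound without bound.

def run_bidding_war(price_a, price_b, rounds):
    """Simulate the two repricing rules for a number of rounds."""
    for _ in range(rounds):
        price_a = 0.9983 * price_b    # A: slightly undercut B
        price_b = 1.270589 * price_a  # B: mark up relative to A
    return price_a, price_b

# Starting from an ordinary price, the listing grows by roughly 27%
# per round, reaching millions of dollars within a few dozen rounds.
a, b = run_bidding_war(17.0, 17.0, 60)
print(f"${a:,.2f} vs ${b:,.2f}")
```

The point is that the runaway outcome is not written anywhere in either rule; it emerges from their interaction.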

I explore the challenges emergence can pose for law in my draft article Robotics and the New Cyberlaw.  I hope you read it and let me know what you think.  I’ll give you one example: Imagine that Schwencke’s algorithm covered arrests instead of earthquakes and his program “created” a story suggesting a politician had been arrested when in fact she had not been.  Can the politician sue Schwencke for defamation?  Recall that, in order to overcome the First Amendment, the politician would have to show “actual malice” on the part of the defendant, which is missing here.  Are we then left with a victim but no perpetrator?

If this seems far-fetched, recall that Stephen Colbert’s algorithm @RealHumanPraise—which combines the names of Fox News anchors and shows with movie reviews on Rotten Tomatoes—periodically refers to Sarah Palin as “a party girl for the ages” or has her “wandering the nighttime streets trying to find her lover.”  To the initiated, this is obviously satire.  But one could readily imagine an autonomously generated statement that, were it said by a human, would be libel per se.

Various legal scholars are beginning to look at the consequences of machine speech and finance.  I think the consequences of emergence will prove far broader.  Whether in the context of breaking news or military response, the prospect of robots and software capable of replicating human behavior with inhuman speed will be irresistible.  The law—which so often relies on a perpetrator who intends an injury (or a tortfeasor who foresees one)—will just have to adapt.

Disclosure: this post was written by a human.