The Italian Job: The Real Reason the Google Convictions are Bad Precedent

I was pleased to be interviewed last night on BBC America World News (live!) about the convictions of three senior Google executives by an Italian court for privacy violations. The case involved a video uploaded to Google Videos (before the acquisition of YouTube) that showed the bullying of a person with disabilities. (See "Larger Threat is Seen in Google Case" by the New York Times' Rachel Donadio for the details.)

Internet commentators were up in arms about the conviction, which can't possibly be reconciled with European law or common sense. The convictions won't survive appeal, and the government knows that as well as anyone. The prosecutors neither want nor intend to win this case. If they did, it would mean the end of the Internet in Italy, if nothing else. Still, the case is worth worrying about, for reasons I'll make clear in a moment.

But let's consider the merits of the case first. Prosecutors bring criminal actions because they want to change behavior—behavior of the defendant and, more important given the limited resources of the government, others like him. What behavior did the government want to change here?

The video was posted by a third party. Within a few months, the Italian government notified Google of its belief that the video violated the privacy rights of the bullying victim, and Google took it down. Google also cooperated in helping the government identify who had posted it, which in turn led to the bullies themselves.

The only thing the company did not do was screen the video before posting it. The Google executives convicted in absentia had no personal involvement with the video. They were prosecuted for what the company did not do, and for what they did not do personally.

So if the prosecution stands, it leads to a new rule for third-party content: to avoid criminal liability, company executives must personally ensure that no hosted content violates the rights of any third party.

In the future, the only thing employees of Internet hosting services of all kinds could do to avoid criminal prosecution would be to pre-screen all user content before putting it on their websites. And pre-screen it for what? Any possible violation of any possible right. So not only would they have to review the content with an eye toward the laws of every possible jurisdiction, but they would also need to obtain releases from everyone involved, and to ensure those releases were legally binding. For starters.

It's unlikely that such filtering could be done in an automated fashion. It is true that YouTube, for example, filters user postings for copyright violations, but that is only because the copyright holders give them reference files that can be compared. The only instruction this conviction communicates to service providers is "don't violate any rights." You can't filter for that!

The prosecutor’s position in this case is that criminal liability is strict—that is, that it attaches even to third parties who do nothing beyond hosting the content.

If that were the rule, there would of course be no Internet as we know it. No company could possibly afford to take that level of precaution, particularly not for a service that is largely or entirely free to users. The alternative is to risk prison for any and all employees of the company.

(The Google execs got sentences of six months in prison each, but they won't serve them no matter how the case comes out. In Italy, sentences of less than three years are automatically suspended.)

And of course that isn’t the rule. Both the U.S. and the E.U. wisely grant immunity to services that simply host user content, whether it’s videos, photos, blogs, websites, ads, reviews, or comments. That immunity has been settled law in the U.S. since 1996 and the E.U. since 2000. Without that immunity, we simply wouldn't have--for better or worse--YouTube, Flickr, MySpace, Twitter, Facebook, Craigslist, eBay, blogs, user reviews, comments on articles or other postings, feedback, etc.

Once a hosting service becomes aware of a possible infringement of rights, to preserve immunity most jurisdictions require a reasonable investigation and (assuming there is merit to the complaint), removal of the offending content. That, for example, is the "notice and takedown" regime in the U.S. for content that violates copyright.

For more, see "The Italian Job" on my website.


Let's get this straight. I despise Google. They're a monopoly and run their engine like a tyrant. They care nothing for customer service and leave legitimate businesses in the cold.
That said, they shouldn't be responsible for filtering every piece of content they index or throw up on one of their web properties. It's just not possible.
And even if it were, it's not their job. The person who posts the content is responsible, not the one hosting it! This case should have been thrown out.

I agree that Google does not have any duty to carry out a preemptive review of the content uploaded to Google Video. And I agree that in Italy, as in other places around the world, there is a growing desire for censorship and government control of information.
But again, the question discussed in this trial is probably a different one. Of course, we won't know for sure until we can read the motivation (the court's written reasoning), since Google insisted on a closed-door trial (Google!!!, not some obscure Italian censorial force!).
The question that may have been discussed is whether Google, after being informed from the outside (by users or authorities) that some content was creating problems, reacted with diligence or not. And in that case, I'm sorry, five minutes versus one month would make a huge difference!
"It will be fundamental to read the motivation of the sentence, because if the premise was that Google didn't remove the video promptly, thus adopting omissive behavior, then it would be a correct application of the law in force, and the sentence would not impose any preemptive censorial intervention..."
Stefano Rodotà, former president of the Italian Privacy Authority and of the European Group on Data Protection (and former Stanford Law School visiting professor)
So, let's wait for the motivation before reaching our definitive conclusions!

The "motivation of the sentence" is irrelevant. Note the following from EFF's Danny O'Brien:
"Europe has, in theory at least, at the EU level, strong protections for Internet intermediaries in its E-Commerce Directive: Article 14 of that directive provides that hosting providers are not responsible for the content they host, as long as they are not informed of its illegal character, and they act promptly when informed of it. Article 15 clarifies that hosts do not need to monitor hosted content for potentially illegal content.
This judgement guts both these principles. The court dismissed the allegation of criminal defamation but upheld a charge of illegally handling personal data on the basis that a video is personal data, and that under EU data protection law, Google needed prior authority before distributing that personal data.
This interpretation of the law means that Google is co-responsible for the legality of content containing the images of persons -- before anyone has complained about the content. That effectively means to comply with the decision, any intermediary working within Italy must now pre-screen every piece of video with anyone who appears within it, or risk prosecution. As the judgement stands, it also presents such a wide definition of personal data that it might effectively require that all hosts pre-screen all content be it video, text, audio or data."

First of all, we still don't know the reasons upon which the judge grounded his sentence (they will be published shortly). So we know what his conclusion is, but nothing about the arguments and the reasoning process.
That said, let me point out that the trial is NOT about Google (or its executives) having an "ex ante" duty to "ensure that no hosted content violates the rights of any third party."
The trial is about Google's negligence in promptly removing the content: Google says the video was removed immediately; the prosecutors say it came down more than a month after the company was informed.
Both Google and the prosecutors presented evidence at trial, but we know nothing about it, since Google asked (and the court agreed) to have an absolutely closed-door trial. Again, we have to wait for the judge to disclose the complete motivation of his decision.
One last thing: it is true that the EU grants immunity to sites that SIMPLY host user content. But even if we consider the YouTube service to be simple "hosting" within the meaning of EU Directive 2000/31/EC (and I am not 100% sure about this), the same directive does not automatically grant immunity from EU privacy law: Article 1.5(b) states that "This Directive shall not apply to [...] questions relating to information society services covered by Directives 95/46/EC and 97/66/EC," that is, the EU directives on privacy and data protection.
Giuseppe Contissa

Whether it's a month or five minutes, there's no practical way to reconcile a rule of "negligence" with content hosting. Absent a complaint or other notice (including from other users of the service), service providers have no way of knowing that content they are hosting violates any law unless they manually review all of it.
And even if armies of employees (or here, senior executives) look at every piece of drivel that users post, what exactly are they reviewing for? In this case, to know that privacy rights were violated by posting the video (to Google Video, by the way, not YouTube) would require access to outside materials--releases and so on--which would, therefore, be required to be submitted and authenticated with the posted content.
The underlying criminal behavior depicted in the video is not the subject of the prosecution, nor could it be. Assuming the crime is obvious, how do the executives ensure that it's "real" and not a staged or dramatized video or, to get to the heart of what really seems to bother the government here, a kind of news report?
