The French Data Protection Authority, CNIL, is currently before a French court, arguing that Google needs to do more to comply with “Right to Be Forgotten” or “Right to Be Delisted” (RTBD) laws. The EU’s highest court, the CJEU, defined the search engine’s obligations in the 2014 Google Spain v. Costeja case, ruling that Google must comply with requests to remove links from the results it displays when people search for the requester by name. The Court did not say whether this remedy applied to search results outside of the EU. CNIL’s position is that if Google’s search results violate Data Protection rights under French law, then Google must prevent users everywhere in the world from seeing them.
CNIL’s position has been widely criticized for threatening speech and information rights in other countries, including in a New York Times opinion piece that Bruce Brown and I wrote. (I used to work on these issues at Google, and Bruce defends journalists for a living, so our position surprised no one.) As a matter of black-letter jurisdictional law, some experts have also questioned CNIL’s extraterritorial authority under Article 4 of the EU’s Data Protection Directive. Much of the legal debate has centered on these jurisdictional questions, and on CNIL’s argument that French law governs Google’s worldwide index because it constitutes a single “processing” under Data Protection law.
In this blog post I will argue that several other express provisions of the Data Protection Directive support national, rather than global, application of French RTBD law. The French court should consider these in rejecting CNIL’s bid to regulate expression and information rights around the world.
1. Data Protection Directive Articles 12 and 14 require only partial, not complete, erasure.
The current CNIL v. Google case is about the series of decisions that search engines, DPAs, or courts must make when assessing RTBD requests. According to CNIL, only one decision matters: whether the requester’s Data Protection rights trump the public’s interest in finding information through Google searches. Once that decision is made, CNIL maintains, the legal analysis ends. Google must remove that web page from search results for the requester’s name, and do so everywhere in the world, in order to provide “effective and complete” protection to data subjects.
But that is not where the CJEU ended its analysis. The CJEU assessed proportional interests at a second point, in order to tailor a remedy that balanced Mr. Costeja’s rights against those of other Internet users. Mr. Costeja and the Spanish DPA wanted Google to “prevent indexing of the information relating to him personally,” so that it would “not be known to internet users.” (Par. 20) In other words, they wanted the data completely erased from Google’s index. The CJEU responded that Google only needed to prevent some aspects of indexing, by removing data “from the list of results displayed following a search made on the basis of a person’s name[.]” (Rul. Par. 3) The Court let Google continue processing that same personal data – text about Mr. Costeja from a newspaper page – on its servers. It also let Google link users to the data, and show it to them in snippets of text from the page, when they searched for terms other than the plaintiff’s name. The Court applied the same “effective and complete” standard advanced by CNIL, and found that it mandated partial – not total – erasure.
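To make the distinction concrete, here is a minimal sketch of the difference between the complete erasure Mr. Costeja requested and the name-query delisting the CJEU ordered. It is illustrative only: the data structures and names are my own assumptions, not a description of how Google’s index actually works.

```python
# Illustrative sketch only: a toy index showing the difference between
# complete erasure and the CJEU's partial, name-query-scoped remedy.
# All names and structures are assumptions, not Google's actual systems.

from dataclasses import dataclass, field


@dataclass
class ToyIndex:
    pages: dict[str, str] = field(default_factory=dict)  # url -> page text
    name_delistings: set[tuple[str, str]] = field(default_factory=set)

    def erase_completely(self, url: str) -> None:
        # What Mr. Costeja and the Spanish DPA asked for: the page
        # disappears from results for every query.
        self.pages.pop(url, None)

    def delist_for_name(self, name: str, url: str) -> None:
        # What the CJEU ordered: suppress the link only in results
        # for searches on the data subject's name.
        self.name_delistings.add((name.lower(), url))

    def search(self, query: str) -> list[str]:
        hits = [url for url, text in self.pages.items()
                if query.lower() in text.lower()]
        return [url for url in hits
                if (query.lower(), url) not in self.name_delistings]


index = ToyIndex(pages={
    "news.example/auction": "Mario Costeja Gonzalez property auction notice",
})
index.delist_for_name("Mario Costeja Gonzalez", "news.example/auction")

print(index.search("Mario Costeja Gonzalez"))  # [] - delisted for the name
print(index.search("property auction"))        # page remains findable
```

The sketch is only meant to show that “erasure or blocking” admits of degrees: the page text stays on the servers, and stays findable, except through searches on the name.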
The practical and policy reasons for this limited scope of erasure are clear. The Court says that search results create a new and unique Privacy harm by aggregating information from separate web pages to create a “more or less detailed profile” of an individual. (Par. 80) Its remedy, which only requires erasure of these “profile” results, is tailored to address this specific harm. As commentators including CNIL have pointed out, letting Google continue to process the same data and provide it in search results for other queries also strikes a balance between the plaintiff’s rights and those of other Internet users seeking information online.
The doctrinal, black letter law foundation for the Court’s limited erasure remedy is less clear. That foundation matters a lot for the resolution of the current CNIL case against Google. For example, the CJEU could have meant that search results for a person’s name are one instance of “processing,” subject to separate legal obligations from the processing that Google does for other search operations. In that case, CNIL’s “single processing” argument for applying the same legal analysis to every component of web search would fail. Or the Court could have meant that Google is a data controller as to search results for a person’s name, but not as to other indexed data. Both of those interpretations are possible, but the Court doesn’t talk about them.
What the Court does say is that the limited-scope erasure is mandated by Data Protection Directive Articles 12 and 14. These provisions spell out what data controllers must do when a data subject asks them to stop processing her data. Article 12 requires “erasure or blocking,” and Article 14 requires controllers to honor “objections” to processing. Google Spain says these two Articles “are to be interpreted as meaning that… the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages published by third parties and containing information relating to that person[.]” (Rul. Par. 3)
A look at these two Articles shows why they support, and even mandate, the kind of tailored, proportionate remedy the CJEU provided – and why they do not support the CNIL’s global, all-or-nothing approach. Here is the language:
· Article 12(b) gives data subjects “the right to obtain from the controller . . . as appropriate the rectification, erasure or blocking of data the processing of which does not comply with the provisions of this Directive[.]” The phrase “as appropriate” makes clear that the scope of erasure is flexible, and depends on a nuanced and proportional determination about what is “appropriate.” The CJEU’s ruling effectively tells us that complete erasure was not “appropriate,” and that Google could continue some aspects of its processing while still “comply[ing] with the provisions of this Directive[.]”
· Article 14(a) gives data subjects “the right… in the cases referred to in Article 7 (e) and (f), to object at any time on compelling legitimate grounds… to the processing of data relating to him…. Where there is a justified objection, the processing instigated by the controller may no longer involve those data[.]” Here, too, the Directive tells controllers and courts to apply balance and proportionality in tailoring the scope of compliance with an objection. Where a controller processes data in multiple ways, it should honor objections only as to some of them – those for which the objection is “justified” and rests on “compelling legitimate grounds.” In Google Spain, that meant Google should cease processing only for searches on Mr. Costeja’s name.
The CJEU balanced interests under Articles 12 and 14 to define a tailored and proportionate technical scope of erasure within Google’s search operations. The same approach should govern the geographic scope of erasure in the current CNIL case. The CJEU was not asked about this potential limitation, and did not undertake to answer it or balance the relevant rights and interests. These include the rights of Internet users around the world to seek and impart information under their own countries’ legal interpretations of universal human rights – interpretations that are as legitimate, and as entitled to respect, as the ones advanced by French or EU lawmakers. These also include sovereignty interests of other governments both within and outside of the EU – the interests traditionally recognized under the legal doctrine of comity. Articles 12 and 14 require, and the Google Spain judgment confirms, that courts must pause to consider the balance of rights and interests, and tailor the scope of erasure obligations accordingly.
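The same tailoring logic extends naturally to geography. Here is a purely hypothetical sketch, with invented names and country codes, of a delisting record that carries a jurisdictional scope, so that the balancing determines where a link is suppressed as well as whether it is:

```python
# Illustrative sketch only: a delisting record with a geographic scope,
# so the Article 12/14 balance can tailor *where* a link is suppressed,
# not just *whether* it is. Names and country codes are invented.

from dataclasses import dataclass


@dataclass(frozen=True)
class ScopedDelisting:
    name: str
    url: str
    countries: frozenset[str]  # jurisdictions where the delisting applies

    def must_suppress(self, query: str, url: str, user_country: str) -> bool:
        return (query.lower() == self.name.lower()
                and url == self.url
                and user_country in self.countries)


# National application: a French-law delisting scoped to France.
d = ScopedDelisting("Jean Dupont", "news.example/article",
                    countries=frozenset({"FR"}))

print(d.must_suppress("Jean Dupont", "news.example/article", "FR"))  # True
print(d.must_suppress("Jean Dupont", "news.example/article", "US"))  # False
# CNIL's position would, in effect, set `countries` to every country in
# the world, skipping the balancing step for each other jurisdiction.
```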
2. Data Protection Directive Article 9 recognizes that the same processing may be protected by Free Expression laws in one country but not in another.
The other problem with CNIL’s reasoning is simpler. I assume it has been addressed elsewhere in more detail. The Data Protection Directive, in Article 9, explicitly makes Free Expression protections a matter for Member State law. The practical result of this has been a very wide array of national laws – meaning that the same RTBD request may require compliance in one EU country, but not in another.
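A toy model may help show what divergent national balancing looks like in practice. The functions and thresholds below are invented placeholders, not statements of French or any other Member State’s law; the point is only that one input can lawfully produce two different outputs:

```python
# Illustrative sketch only: Article 9 leaves the Privacy / Free Expression
# balance to Member State law, so the same RTBD request can succeed in one
# country and fail in another. These balancing functions and thresholds
# are invented placeholders, not any country's actual law.

from typing import Callable

Request = dict  # e.g. {"name": ..., "url": ..., "public_interest": ...}

def state_a_balance(req: Request) -> bool:
    return req["public_interest"] < 0.7   # more privacy-protective

def state_b_balance(req: Request) -> bool:
    return req["public_interest"] < 0.3   # more expression-protective

MEMBER_STATE_LAW: dict[str, Callable[[Request], bool]] = {
    "State A": state_a_balance,
    "State B": state_b_balance,
}

req = {"name": "A. Example", "url": "news.example/story",
       "public_interest": 0.5}

for state, must_delist in MEMBER_STATE_LAW.items():
    print(state, must_delist(req))   # State A: True; State B: False
```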
This diversity of laws is not a bug – it is a feature. The Data Protection Directive is designed to generate different outcomes in different parts of the EU when Free Expression is at issue. The point is not a race to the courthouse, with the fastest-moving DPAs or courts displacing Free Expression laws in other EU countries. Rather, the point is to accept divergent outcomes within the range of permissible national approaches to balancing Privacy and Free Expression rights. CNIL’s approach upends this system, encouraging forum shopping and displacing the authority reserved to other nations. The French court should reject it.