In a recent blog post, I discussed the role of EU Member State laws in defining and enforcing the “Right to Be Forgotten” (RTBF) under the EU’s new General Data Protection Regulation (GDPR). I address these GDPR provisions in more detail in my forthcoming article. Because the RTBF may in the future be applied to hosting services like Facebook or Dailymotion, I discuss potential consequences for those services as well as for search engines.
This post lists key GDPR articles that are relevant for this kind of Member State legislation. It doesn’t attempt to analyze the scope of Member State authority, beyond noting where the GDPR states that some authority exists.
Article 85 – general protections for expression and information rights
Article 85 states that Member States “shall by law reconcile the right to the protection of personal data pursuant to this Regulation with the right to freedom of expression and information[.]” One way the laws can do so in the RTBF context is through substantive guidance: identifying factors that weigh for or against erasure and de-listing when platforms assess RTBF requests. Another way is through procedural protections: identifying what legal checks and balances can protect legitimate online expression, in the face of platforms’ strong incentives to simply honor all RTBF requests, even legally groundless ones. As a practical matter, procedural constraints may be just as important as substantive ones in protecting Internet users’ rights under privately operated “notice-and-takedown” systems.
Article 23 – restrictions on specified GDPR articles
Article 23 gives Member States authority to “restrict by way of a legislative measure the scope of the obligations and rights” when “such a restriction respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure[.]” Authorized purposes for national laws of this sort include “the protection of the data subject or the rights and freedoms of others.” The expression and information rights of Internet users affected by RTBF requests are important “rights and freedoms of others” affected by the GDPR. These rights are very directly affected by GDPR articles 17, 18, and 21, which will be the most important rules governing platforms’ decisions to erase or block public access to information.
Article 17 – right to erasure
Article 17, along with Article 21, provides the basis for RTBF requirements. They are analogues of Articles 12 and 14 of the 1995 Data Protection Directive – the articles the CJEU applied in Google Spain and more recently in Manni. GDPR Article 17.3 specifies that controllers need not honor erasure requests that conflict with expression and information rights. Important issues that may be governed by Article 17 include potential RTBF obligations for Internet platforms like Facebook; the scope of free expression exceptions; and procedural protections for online speakers wrongly targeted by “Right to Be Forgotten” requests. These are discussed in depth in my article.
Article 21 – right to object
The Article 21 “objection” right is the other core basis for RTBF claims. Controllers must honor objections unless they have “compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject[.]” As the CJEU recently emphasized regarding the current objection right, national legislation is intended to provide exceptions that balance out this right in particular cases. (Manni par. 47)
Article 18 – right to restriction of processing
Article 18 sets forth a newly detailed right of data subjects: the right to require data controllers to “restrict” or suspend processing of personal data. In the RTBF context this seems to let claimants take information offline before a platform actually assesses the merits of their claim. Data subjects may request this if they allege that data is inaccurate (Art. 18.1(a)) or otherwise not legitimately processed (Art. 18.1(d)). However, controllers may reject these requests “for the protection of the rights of another natural or legal person.” Lawmakers could avert considerable mischief by telling platforms that this exception covers RTBF requests to remove public online information – in effect, that in order to protect the rights of other Internet users, platforms should not take information or links off the Internet before carefully assessing whether the RTBF claim against them is legitimate.
Some other GDPR provisions that Member States may modify under their Article 23 powers are relevant to RTBF claims in more subtle ways.
Article 12 – communications between controller and data subject
Article 12 provides that where requests made under certain articles (including the erasure, restriction, and objection provisions in Arts. 17, 18, and 21) are “manifestly unfounded or excessive,” a controller may “refuse to act on the request.” In this case, the controller shall “bear the burden of demonstrating the manifestly unfounded or excessive character of the request.” (Art. 12.5) Legal clarification about what requests may be deemed “manifestly unfounded or excessive” could help protect Internet users against over-reaching RTBF demands.
Articles 14 and 15 – disclosing the source of personal data
These articles should not have much bearing on RTBF requests, and it is unlikely that their drafters had RTBF in mind. But they create a disturbing possibility for motivated parties to intimidate online speakers or infringe on those individuals’ data protection rights. They require controllers to tell data subjects “from which source the personal data [about them] originate” and “any available information as to their source[.]” (Arts. 14.2(f) and 15.1(g)) Applied to public information platforms, this would seem to mean disclosing personal information about online speakers, including anonymous ones. For example, the data indexed by Google originates with individual webmasters or authors. These people may be Google account-holders, in which case Google’s “available information” about them may be very extensive – and private. Should Google have to disclose such information under Articles 14 and 15? Similarly, personal data contained in tweets or Facebook posts comes from users of those platforms – if the platforms were deemed controllers, these speakers’ information could seemingly also be subject to disclosure.
Article 83 – fines assessed by Data Protection Authorities (DPAs)
The GDPR permits high administrative fines, but only where those fines will be “effective, proportionate and dissuasive.” Article 83.8 specifies that DPAs assessing fines “shall be subject to appropriate procedural safeguards in accordance with Union and Member State law, including effective judicial remedy and due process.” This could provide a foundation for Member States to insist on procedural safeguards that protect both data subjects’ privacy rights and other Internet users’ information and expression rights.