The Final Draft of Europe's "Right to Be Forgotten" Law

The probably-really-almost-totally final 2016 General Data Protection Regulation (GDPR) is here!  Lawyers around the world have been hunkered down, analyzing its 200-plus pages. In the “Right to Be Forgotten” (RTBF) provisions, not much has changed from prior drafts. The law still sets out a notice and takedown process that strongly encourages Internet intermediaries to delete challenged content, even if the challenge is legally groundless.  The problems I identified in earlier drafts could have been avoided with simple changes – putting procedural checks on invalid erasure requests, while giving effect to valid ones.  Those changes would not have diminished any gains for online privacy rights under the GDPR, or affected Internet users’ ability to delete data collected by companies and held in back-end logs, accounts, or profiling systems.  The opportunity to make those targeted changes has now passed.

The silver lining is that the final GDPR text is riddled with ambiguous passages on key points.  Happy holidays, data protection lawyers – it’s the gift of lifetime employment!  These ambiguities will move debates that should have been resolved in the lawmaking process into a new phase, centered on advocacy before regulators and courts.  For RTBF issues, there are enough important ambiguities to keep the public discussion going for a long time.  The right interpretations can help lawmakers protect online privacy and data protection rights without doing unnecessary and disproportionate harm to free expression and information access.

In overview, here is how the notice and takedown provisions landed in the final draft.  For the quick-and-dirty version, just read the first sentence of each bullet.

  • Data controllers, including at least some Internet intermediaries, must erase content based on “right to be forgotten” (RTBF) requests. This is the thrust of Articles 17 and 19.  The much-debated RTBF moniker is in the title of Article 17: ‘Right to erasure (“right to be forgotten”).’  The RTBF provision directly applies only to data controllers, meaning the entities that “determine the purposes and means” of processing personal data.  For hosts and other Internet intermediaries, the next big question will be whether they count as controllers, with RTBF removal obligations for content uploaded by their users.  (Example: If I post about my cousin on Facebook, can my cousin compel Facebook to take it down?)

    What changed in the final draft: Mostly just the title, which had varied from draft to draft. (I have not run text comparisons on all drafts; this is based on memory and a review yesterday.)


  • The GDPR doesn’t tell us whether hosting platforms like Facebook or Twitter are controllers with RTBF erasure obligations. We know that search engines are controllers and thus have RTBF obligations – that was a key holding in the Google Spain/Costeja case.  The GDPR doesn’t tell us what other Internet intermediaries will fall in that category.  Realistically, I find it hard to imagine data protection authorities (DPAs) excusing major social networks from erasure obligations in the long run.  But there will be a lot of arguing first.

    There are some strong arguments against RTBF obligations for hosts – for example, that they cannot be controllers because they only process content at the direction of a user, who is herself the controller.  There are also some widely accepted legal arguments that will, if they prevail, lead to more complicated answers.  Following one of them, RTBF would apply to hosts that are too “active” in managing user content, but not to “passive” hosts.  Following another argument, hosts would have to erase some content, but not nearly as much as the content that search engines must de-index.   (Example: Google may have to remove search results pointing to the Facebook page where I posted about my cousin, but Facebook still won’t have to remove the post from its platform.)

    What changed in the final draft: Not much.  There is some inconclusive new language about social networking in Recital 15.


  • Intermediaries that do not honor RTBF requests risk crippling fines.  There are no legal consequences for “over-removing” content targeted by invalid RTBF requests. This part of the GDPR kept changing in drafts, but fines for RTBF violations are now set at the greater of “20 000 000 EUR, or in case of an undertaking, up to 4% of the total worlwide (sic) annual turnover.”  Art. 79.3a.  (A quick arithmetic sketch of that ceiling follows the note below.)  This boosts intermediaries’ well-documented existing incentives to simply honor all removal requests.  Honoring most or all RTBF requests would be a problem, given that both Bing and Google report that at least 50% of the RTBF requests they receive are invalid under EU law.

    What changed in the final draft: These numbers are new. Fines varied a lot in early drafts, and reportedly were hotly debated and lobbied until the 11th hour.
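
    To make the scale concrete, here is a minimal arithmetic sketch of that fine ceiling. The function name and the EUR 10 billion turnover figure are my own illustration, not anything in the Regulation:

```python
# Hypothetical sketch of the Art. 79.3a ceiling: the greater of EUR 20
# million or 4% of an undertaking's total worldwide annual turnover.
def max_rtbf_fine_eur(annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Illustrative example: an undertaking with EUR 10 billion in worldwide
# turnover faces a ceiling of EUR 400 million, far above the floor.
print(max_rtbf_fine_eur(10_000_000_000))  # -> 400000000.0
```

    Note that for any undertaking with worldwide turnover above EUR 500 million, the 4% prong exceeds the EUR 20 million floor, which is why the percentage, not the flat number, is what large intermediaries will care about.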


  • We still don’t know the answer to the €20 million question: Do intermediary liability laws under eCommerce Directive Articles 12-15 apply to RTBF erasure requests?   Existing rules under the eCommerce Directive tell Internet companies how to handle removal requests for other legal claims, like defamation.  Those rules have real flaws, but they at least build in some protections against legally groundless or abusive attempts to silence online expression.  There is no reason to use a whole new process for RTBF claims, so the answer to the question should be yes: eCommerce procedural rules for notice and takedown apply to RTBF erasures.  That would mean, among other things, that intermediaries don’t have to take down content until they know the removal request states a valid claim.

    The GDPR’s plain language seems to support this answer, but has a loophole that will fuel argument for years.  Both GDPR Recital 17 and Article 2.3 say the GDPR is “without prejudice” to “the liability rules of intermediary service providers in Articles 12 to 15” – the eCommerce rules that govern notice and takedown. The problem is, many data protection experts say that the eCommerce “liability rules” are irrelevant, because the GDPR doesn’t technically hold intermediaries liable for the speech of a third party.   Following this argument, the “without prejudice” language has no practical consequence.  As long as this question is unresolved, intermediaries can’t be certain whether they can use existing eCommerce removal systems, or whether they must develop new tools to implement the troubling new removal process prescribed by the GDPR.  Putting faith in the simpler interpretation of Article 2.3, and assuming it excuses an intermediary from following the specific rules described in the GDPR, is an expensive gamble.


  • If eCommerce rules do not apply, then RTBF removals must follow specified steps and processes, which encourage erasure without adequate protections for Internet users’ expression and information rights. A controller that receives an RTBF erasure request must follow specific steps.  Some are enumerated in scattered GDPR sections.  Others are implicit, based on regulators’ interpretation of similar language in the old Directive.  Those interpretations warrant fresh consideration and debate, though, as the GDPR expands their impact on Internet information access.


    The GDPR’s erasure process generally makes sense for traditional, pre-Costeja erasure requests – like when a user wants to delete behavioral tracking or profiling information, or content that she herself uploaded.  But applying that same process to erase a third party’s online expression is a problem, because it has almost no effective checks on over-reaching or malicious removal requests.

o   Intermediaries must immediately take content offline when they receive an erasure request, and keep it offline until they figure out whether the request is legally valid.  (Art. 17a.1.a and c)  For example, if someone claims that online information is inaccurate, the intermediary must “restrict” the information by taking it offline “for a period enabling the controller to verify the accuracy of the data.” (Art. 17a.1.a)  That kind of factual investigation of user-generated content is not something intermediaries are likely to attempt.  Information restricted under this remove-then-verify standard is unlikely ever to be reinstated.  (I sketch this sequence in code after this list.)  There are powerful policy arguments, as well as legal arguments based on fundamental rights, against this presumption of guilt for online speakers.  Those arguments should prevail.  But Article 17a won’t make it easy.

What changed: Prior language on “restriction” was confusing and scattered.  The final text in 17a cleans it up in ways that eliminate arguable exceptions or defenses from older drafts, but those were never strong anyway.

o   Intermediaries must decide what to erase without any meaningful guidance about how to weigh the rights at issue.  Hopefully regulators will consult with stakeholders, including civil society groups and experts on free expression and information, to develop concrete guidelines – like the very thoughtful ones they previously published for search engines.

o   The person whose online expression is erased is not notified or given an opportunity for defense.  The GDPR does not spell this out, but leaves intact the language that regulators already interpreted to preclude notice in the search engine context.  Under that interpretation, however, intermediaries can sometimes consult with the publisher before erasing the content.

o   The intermediary seemingly must disclose the online speaker’s identity to the person requesting erasure.  I think DPAs or courts will find ways to interpret this part out of existence.  It’s seriously inconsistent with the GDPR’s pro-privacy goals.  Applying it to RTBF content removals was probably a drafting mistake.  The GDPR language on this is fuzzy enough that smart data protection lawyers should find a way around it, if they want to.  It says that the “data subject shall have the right to obtain from the controller” certain information, including “where the personal data are not collected from the data subject, any available information as to their source.”  Art. 15.1(g).  There is similar language at Article 14a(2)(g).

o   There is no specified form for requesting erasure, and no required information that data subjects must include to explain why erasure is justified.  This, too, can and hopefully will be remedied in guidance from regulators.

o   Most removals are to be processed within a one-month turnaround time.  (Art. 12.2)  That seems reasonable in most cases.  For companies experiencing rapid growth that don’t yet have legal teams dedicated to such things, it may lead to rushed and inaccurate removal decisions.
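
Here is a minimal sketch of the remove-then-verify sequence that Article 17a.1 seems to require, as I read it, using the “inaccuracy” example from above. Every name and data structure in it is my own illustration; the GDPR prescribes obligations, not an API:

```python
# Hypothetical sketch of the Art. 17a.1 remove-then-verify sequence.
# All names here are illustrative, not from the GDPR.
from typing import Optional

def handle_erasure_request(content: dict, claim: str) -> None:
    # Step 1 (Art. 17a.1.a): "restrict" immediately. The content goes
    # offline before anyone assesses whether the claim is valid.
    content["online"] = False

    # Step 2: verify. For an inaccuracy claim this means the kind of
    # factual investigation intermediaries are unlikely to attempt, so
    # the result is often simply unknown (None).
    claim_is_valid: Optional[bool] = investigate_accuracy(content, claim)

    # Step 3: erase if the claim holds up; reinstate only if it is shown
    # to be groundless. Absent any investigation, the content just stays
    # offline: the "presumption of guilt" problem described above.
    if claim_is_valid is True:
        content.clear()           # erase
    elif claim_is_valid is False:
        content["online"] = True  # reinstate

def investigate_accuracy(content: dict, claim: str) -> Optional[bool]:
    # Placeholder: fact-checking user-generated content rarely happens
    # at scale, so verification tends never to conclude.
    return None

post = {"online": True, "text": "A post about my cousin"}
handle_erasure_request(post, "the data are inaccurate")
print(post["online"])  # False: restricted, and likely never reinstated
```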

  • The GDPR says data protection must be balanced with free expression, and that RTBF requests can be denied on free expression grounds.  (Art. 17.3, R. 3a)  But it provides no guidance on what those free expression grounds are. Instead, it leaves Member States to enact specific free expression protections sometime in the future.  (Art. 80)  There are a number of problems with this approach.  For one, some countries still haven’t put in place the free expression protections they were supposed to enact twenty years ago under the last Directive. So don’t expect them to create harmonized and adequate protection for Internet users’ speech anytime soon.

    What changed in the final draft: The core text in Article 80 has changed a little, but not in ways relevant to intermediaries.  Protections for research and archival uses appear to have become more robust there and throughout the GDPR, though.  My hat is off to whoever got that done.


  • GDPR obligations apply to a huge number of non-EU companies – even more than in some earlier drafts.  The GDPR’s “territorial scope” section applies the GDPR to companies outside the EU if they process data about people within the EU in connection with “offering goods or services” or “monitoring” user behavior.  Art. 3.2.  Recitals to the GDPR put some useful limits on the scope of the “offering” basis for jurisdiction over foreign companies. (R. 20)  But “monitoring” appears to cover a broad swath of tracking, profiling, and customization commonly done by Internet companies around the world. (R. 21, Art. 4.3aa)  So unless they block or differentiate services for EU users, those companies need to think hard about their obligations under the Regulation overall – not just for RTBF.  (A toy sketch of the Article 3.2 test follows below.)

    What changed in the final draft: A lot more companies are covered. Unlike some prior drafts of Article 3.2, the final language covers data processors outside the EU, not just controllers.  That extends the GDPR to many more companies.  The new ones – non-EU data processors – have fewer direct obligations under the law, but will probably need to change their contracts with controller companies and assume new duties under those contracts.  This is above and beyond the changes already underway to address data transfer issues because of the Schrems case.  I have not tracked processor duties closely, but expect plenty of good analysis on this from corporate law firms with big data protection practices, like Hunton & Williams or Wilson Sonsini.

    Recitals in the final draft also evince real frustration with claims that EU law does not reach the foreign corporate parents of subsidiaries established in the EU, saying that the “legal form of such arrangements, whether through a branch or a subsidiary with a legal personality, is not the determining factor” for jurisdiction under Article 3.1. (R. 19, 28)
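
    For companies doing a first triage, the Article 3.2 trigger described above reduces to a simple either-or test. Here is a toy sketch; every name in it is my own illustration, not statutory language:

```python
# Hypothetical sketch of the Art. 3.2 territorial-scope test for a
# company with no EU establishment. Names are illustrative only.
def gdpr_applies_to_non_eu_company(
    offers_goods_or_services_to_eu_data_subjects: bool,  # "offering" prong
    monitors_behaviour_of_eu_data_subjects: bool,        # "monitoring" prong
) -> bool:
    # Either prong alone brings the company into scope.
    return (offers_goods_or_services_to_eu_data_subjects
            or monitors_behaviour_of_eu_data_subjects)

# Example: a non-EU analytics firm that profiles EU users is covered
# under the "monitoring" prong, even if it sells them nothing.
print(gdpr_applies_to_non_eu_company(False, True))  # -> True
```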
