Solving Data Protection Problems with eCommerce Directive Tools

Cross-posted to the Internet Policy Review News & Comments and Inforrm blogs.

This is one of a series of posts about the pending EU General Data Protection Regulation (GDPR), and its consequences for intermediaries and user speech online.  In an earlier introduction and FAQ, I discuss the GDPR’s impact on both data protection law and Internet intermediary liability law.  Developments culminating in the GDPR have put these two very different fields on a collision course – but they lack a common vocabulary and are in many cases animated by different goals.  Laws addressing concerns in either field without consideration for the concerns of the other can do real harm to users’ rights to privacy, freedom of expression, and freedom to access information online.

Disclosure: I previously worked on "Right to Be Forgotten" issues as Associate General Counsel at Google.


In my previous two posts about the GDPR, I identified serious problems with its notice and takedown process, and resulting threats to Internet users’ rights to free expression and access to information. The legal framework of intermediary liability provides a lens for identifying these problems.  It also offers a set of ready-made tools to address them.  Lawmakers could and should take advantage of these tools to improve the GDPR. 

The cleanest and simplest solution to the GDPR’s notice-and-takedown problems comes from existing law under the EU’s eCommerce Directive.  That body of law could govern removal of user content by intermediaries, leaving intact the GDPR’s current provisions for deleting back-end data companies collect and store about user behavior. More ambitiously, GDPR drafters could craft a new and better process. European lawmakers could take either approach without undermining other important data protection goals or provisions of the Regulation.


Existing EU legal and policy resources could vastly improve the GDPR’s notice-and-takedown process.

1. Applying existing eCommerce Directive law directly to the GDPR

Existing law under the eCommerce Directive provides the most obvious and simple way to sweep away the problems created by the GDPR’s current takedown process.  The GDPR could state clearly that its erasure obligations, for intermediaries processing third-party content, are subject to Articles 12-15 of the eCommerce Directive.[i]  Those articles cover enumerated activities such as hosting or caching user-generated content.  Unlike the current GDPR, they protect online expression by requiring intermediaries to remove unlawful content only once they know about it, typically following notice and takedown processes.

Internet content removal laws under Member State legislation or case law implementing the eCommerce Directive were designed for precisely the situation an Internet intermediary encounters when faced with a “Right to Be Forgotten” erasure request.  Their purpose is to balance rights of aggrieved parties seeking removal of online content or links on the one hand, and the rights of other Internet users to share or access information on the other.  Of course, existing laws are far from perfect.  The widespread over-removal documented in academic studies illustrates the need for improvement in those laws as well.  But they are worlds better than the GDPR’s current provisions, and they bring to bear the right set of considerations about rights and responsibilities of multiple parties on the Internet.

Invoking the eCommerce Directive within the GDPR is also a clean solution as a drafting matter. With a few new sentences, the GDPR could eliminate the thicket of ill-suited rules for intermediaries, without changing removal processes that work for back-end content collected by Internet companies about users.[ii] As I will discuss below, this can be done without any change to the substantive privacy protections defined by the GDPR in its “Right to Be Forgotten” provisions and elsewhere.


2. Crafting new GDPR removal processes consistent with intermediary liability principles

Of course, another option is to craft a new process that incorporates proportional protections for online expression.  This would be challenging in the time remaining, but if GDPR drafters wanted expert input about notice and takedown, they would not have to look far.  The European Commission has developed considerable internal expertise in precisely this area in recent years.  As part of the 2012 Notice and Action Initiative, the Commission conducted a lengthy public consultation.  The resulting staff working document provides a thorough and nuanced review of notice and takedown law and practice in Europe, and discusses concerns raised by stakeholders including free expression advocates.  The Commission is delving into the topic again through the Digital Single Market project.

Europe also has a number of well-established civil society organizations that have thought hard about the nuts-and-bolts procedural aspects of content removal.  Article 19 has a concrete and sophisticated model for notice and takedown – which looks nothing like the GDPR.  La Quadrature du Net has also published extensive commentary and concrete recommendations for notice and takedown, and in its early responses to the Costeja case called for regulatory limitations to protect free expression.  These groups and others could provide thoughtful input.


Improving GDPR notice and takedown to protect online free expression would not harm the privacy rights protected by other parts of the GDPR.

Either of these approaches – invoking the eCommerce Directive, or inventing a better removal process – could be carried out without undermining the GDPR’s other achievements for data protection and privacy.


1. Protecting users’ rights to delete data tracking their online behavior

First, improved notice and takedown rules need not have any effect on rights or processes for deleting the other kinds of personal data held by Internet companies.  Much of the GDPR is designed for this important, separate purpose – giving data subjects legal erasure rights with respect to the stored, back-end data that companies hold about their online behavior.   The GDPR’s removal process seems designed for this pure user-to-business, two-party interaction.  Applying it to the very different situation that arises when one Internet user wants to delete content posted by another is dangerous to online expression, for the reasons I set out in my second post.  But using this single set of rules for both situations is a drafting choice, not a necessity. Drafters could invoke eCommerce law or other improved provisions for content notice and takedown without changing provisions for back-end data erasure at all.


2. Protecting the “Right to Be Forgotten”

Second, content removal process issues can be separated from the substantive scope of the “Right to Be Forgotten.”  European lawmakers could decide that this right is very broad, and most user erasure requests should be granted; or they could decide the opposite.  That decision should not affect, or be affected by, the procedural rules for implementing an erasure request.  Well-crafted processes remain important to protect whatever content does fall outside of the “Right to Be Forgotten,” and to prevent its being unfairly targeted and removed from the Internet.

Procedural protections are especially important because the rights and remedies created by the GDPR will be around for a long time, and affect a broad and evolving Internet ecosystem – not just the large and well-resourced companies that appear in current headlines. Some of those companies, including Google, allocate considerable resources in an effort to avoid over-removal of content under intermediary liability law.  Processing requests carefully and rejecting the ones they believe are legally unfounded is, in my opinion, an important service to users.  But it is not behavior that should be taken for granted in crafting laws of general application.  The law should not incorporate any assumptions that all intermediaries will put effort into avoiding over-removal, or even that the ones doing it now will do it forever.


3. Companies’ voluntary removals of lawful content

Finally, processes for content removal under the law can, and in this case should, be considered separately from processes companies use for discretionary content removal under their own community guidelines or policies. The two kinds of content removals pose important and related questions – about rights, procedure, and transparency in particular.  Comparison may be fruitful in other contexts.  But only one kind of removal, the one compelled by law, is being decided in the next six weeks under the GDPR.   The tools to improve protections for lawful online expression are readily available, drawing on existing intermediary liability law and models put forth by civil society groups.  Lawmakers should use them.




Public discussion of the GDPR has understandably been dominated by topics more traditionally associated with data protection, such as the data transfer provisions thrown into the spotlight by the recent Schrems case.  There has been very little public discussion of the Regulation’s notice and takedown provisions.   But principles for notice and takedown have been extensively discussed, debated, and passed into law in the field of intermediary liability.  By invoking the protections European laws create in that context, lawmakers can fix these serious problems with the GDPR while still achieving its data protection goals.


[i] A possible formulation would be, “where a data subject seeks erasure of personal data under Articles 17 and 19 from a controller that is processing data provided by a third party pursuant to its function as an intermediary protected by Articles 12-15 of the eCommerce Directive, procedures for requesting and carrying out the erasure shall be governed by Member State law implementing those Articles of the eCommerce Directive.”

A shortcoming of this formulation is that it leaves intact other nagging problems with treating Internet intermediaries as controllers under data protection law. Certain existing data protection obligations, including limitations on the processing of “sensitive” data categories such as health information, would, if truly applied to intermediaries’ processing of user-generated content, effectively make normal operations impossible.  The GDPR maintains these, and adds more requirements that sit poorly with the function of Internet intermediaries.  For example, the requirement that companies notify data subjects at the time of collecting data about them from third parties (Art. 14.3) would be very difficult for intermediaries to comply with, since an intermediary does not know when user-posted content includes personal data about another individual.  Other revisions, invoking eCommerce law broadly for intermediaries with respect to their processing of user-generated content, could solve this class of problems.

[ii] As Miquel Peguera discusses masterfully in a forthcoming article, data protection enforcers have themselves wrangled with the peculiarity of treating intermediaries as data controllers or processors under the law.  The Article 29 Working Party in 2008 recommended special treatment for search engines for this very reason.  It distinguished personal data that a search engine collects from users from personal data included in indexed, third-party content, and said that for the latter, the “formal, legal and practical control the search engine has over the personal data involved is usually limited to the possibility of removing data from its servers.”  The CJEU’s Costeja ruling, similarly, identified notice and takedown as the locus of Google’s obligations as an intermediary.



"The GDPR could state clearly that its erasure obligations, for intermediaries processing third-party content, are subject to Articles 12-15 of the eCommerce Directive .[i] Those articles cover enumerated activities such as hosting or caching user-generated content. Unlike the current GDPR, they protect online expression by only requiring intermediaries to remove unlawful content once they know about it, typically following notice and takedown processes." For art. 12 of ECD "Mere conduit" -- do I understand correctly that this would be subject to a court decision, not a notice, right?

Treating search engines as controllers – as the upcoming EU General Data Protection Regulation (GDPR) and the European Court of Justice in the Google v. Spain case do – leads to absurd results. Searching on Google for "Daphne Keller", for example, is processing of personal data by Google. If Albert Miller searched for Daphne Keller on Google, under Art. 14 GDPR Google would be obliged to inform the data subject (in this case Daphne Keller) that Albert Miller searched for her on Google. But then Google would also have to inform Albert Miller that it informed Daphne Keller about the fact that Albert Miller was searching for her. If Albert Miller searched for "John Doe broken leg", there would be no legal ground for this search, because "broken leg" is health-related sensitive data, and under Art. 9 GDPR processing sensitive data is justified only with the data subject's consent. One can find numerous absurd examples like this in the GDPR...
