Intermediary Liability and User Content under Europe’s New Data Protection Law

Cross-posted to the Internet Policy Review News & Comments and Inforrm blogs.

A big new law is coming, and a lot of companies doing business online aren’t going to like it.  Neither will many advocates of civil liberties for Internet users. Europe’s pending General Data Protection Regulation (GDPR) updates and overhauls EU data protection law – the law that produced this week’s Schrems case and last year’s “Right to Be Forgotten” ruling in the EU. Data protection has long been a field considered arcane and impenetrable by many US lawyers.  Most businesses and other entities outside Europe have rightly paid little attention in the past because the law seemingly did not apply to them.  But if draft GDPR provisions circulating at this near-final stage in the lawmaking process are enacted into law, that’s about to change.  Companies that previously fell outside data protection jurisdiction, including those with minimal ties to Europe, are being brought within its scope.  For many, compliance will entail meaningful costs in money and engineering time.  And online companies that deal in content – whether as creators and online publishers or as technical intermediaries – may find themselves receiving unprecedented erasure demands from European citizens or regulators.  Going forward, if users around the world find their Facebook reminiscences about European acquaintances disappearing – or can’t find tweets about individuals who settled fraud allegations with the FTC – this law will likely be the reason.

The GDPR is in many other respects a very good law. Europe already provides more robust legal privacy protections than many countries, including the US; this will make those protections even stronger and advance global norms around the privacy rights of Internet users.[1]  And it should surprise no one that European lawmakers, angered by the Snowden revelations and the US government’s lackadaisical response, want more control over what personal data leaves Europe and how it is protected and safeguarded.  But the GDPR has many other consequences, intended or unintended, for free expression, innovation, and the cost of doing business on the Internet.  Those deserve much more public discussion than they are currently getting.

Over the coming months, I will be unpacking these elements of the GDPR in a series of blog posts.[2]  My focus will mostly be on how the law affects Internet intermediaries – and through them, users’ ability to receive and impart information using the Internet.  Some aspects I discuss, like jurisdiction and the “Right to Be Forgotten,” will be important for other kinds of online entities as well. The series isn’t about privacy under the GDPR, and it won’t focus on data protection law governing collection and use of user data in logs or other back-end storage systems.  Great coverage of privacy aspects is available from public interest groups, law firms, and other sources.

A major goal of this series is to foster better conversation between data protection experts and practitioners focused on other parts of Internet law – particularly intermediary liability and free expression.  My own background is in Internet law.  I am not a data protection lawyer.  In my previous role as Associate General Counsel for Google, I had an immersive real-world education in data protection, most recently in relation to the CJEU’s “Right to Be Forgotten” ruling in Costeja.  But there are other areas of data protection law where I am a relative novice.  My hope is that data protection practitioners, as well as other Internet law mavens, will leave comments here or otherwise reach out with feedback, including criticism.  These posts will later be aggregated in a single publication, which will be greatly improved by your comments.

A brief background on data protection law, intermediary liability, and the GDPR

The law of data protection is generally very foreign to US lawyers.  But some version of it exists in many countries around the world, not just in Europe,[3] and provides important rights to citizens.  Data protection is enshrined in the EU Charter of Fundamental Rights as a right distinct from privacy: a broad right to limit processing of all information relating to oneself, not just information that invades personal privacy.   Where it conflicts with other fundamental rights, including rights to receive and impart information, the rights at issue must be balanced.  The 1995 Data Protection Directive sets out a detailed framework for the data protection right, including specific legal grounds for entities to process personal data.  It also establishes regulatory bodies for enforcement.  National and sub-national Data Protection Agencies (DPAs) are the primary enforcers, and have ongoing relationships with many regulated entities.  For most Internet companies, the foremost data protection issue has been, and will continue to be, the backend processing of data about users – maintaining account information, for example, or tracking behavior on a site.

The law of intermediary liability limits and defines the legal responsibility of technical intermediaries for content posted online by third parties.  In the US, key intermediary liability laws are the DMCA for copyright and CDA 230 for defamation, invasion of privacy, and most other concerns.  In the EU, intermediary liability is governed by Articles 12-15 of the eCommerce Directive, as implemented in the national laws of Member States.  Protected intermediaries generally have no obligations to police, and no liability for unlawful user content until they know about it.[4]   To comply with these laws, intermediaries operate notice and takedown systems to remove content when notified that it violates the law.  In theory intermediaries should only remove user content if the notice is correct and the content actually is illegal – but intermediaries often delete content based on inaccurate or bad faith accusations, leading to over-removal of Internet users’ lawful speech.[5]

Historically, many lawyers have not drawn a connection between data protection and the law of intermediary liability.  The two fields use very different vocabularies, and are for the most part interpreted, enforced and litigated by different practitioners.  A lawyer who views an issue through the lens of intermediary liability and one who views the same issue through the lens of data protection may have trouble even understanding each other’s concerns. 

But if the two fields were ever really separate, the CJEU’s 2014 “Right to Be Forgotten” ruling in the Costeja case changed that.  The court ruled that Google had to de-list certain search results when users searched for the plaintiff’s name.  It prescribed what is effectively a notice and takedown system to remove search results, but arrived at this remedy through the language and logic of data protection – with no reference to Europe’s intermediary liability rules.[6]  Costeja follow-on cases will likely force lower courts to grapple more directly with questions about how the two areas of law fit together.  Even as those cases progress, however, EU legislators are overhauling the governing law by replacing the Data Protection Directive with the pending GDPR.

Legislative Process for the GDPR

The GDPR has been in the works since January 2012, when the European Commission proposed a comprehensive update and reform of the 1995 Data Protection Directive.  A number of drafts from different EU governance bodies have been released since.[7]  (This discussion does not distinguish between drafts except where differences are relevant.)   The GDPR is now in a final “trilogue” process, in which remaining differences will be resolved.  One announced timeline put finalization as early as December, though such deadlines often slip.  The law will come into force two years after its publication date.  Because it is a Regulation rather than a Directive, it will not have to be implemented as separate legislation in each member state of the EU.  Rather, it will automatically go into effect.  The GDPR covers a lot of ground, with provisions addressing everything from data portability, to coordination between national DPAs, to company codes of conduct and appointment of data protection officers.  A good summary of the process and overall issues as of June is here, and a substantive Q&A from the European Parliament is here.

There is a chance that some of the sound and fury around the GDPR will come to nothing, if provisions of the GDPR are obviated by other sources of law – such as one of the pending trade agreements with the US, or laws arising from the EU’s new Digital Single Market (DSM) initiative.  This possibility of preemption could explain why trade and business groups have been relatively unengaged with the GDPR.  But the DSM process is in its infancy, and trumping the GDPR through a trade agreement seems like a long shot.  European lawmakers do not seem disposed to make major concessions to the US right now on issues of privacy and data protection.  And to the extent that US trade negotiators are seeking such concessions, their priorities may not lie with the issues I identify here.

Final passage of the GDPR will not necessarily answer the questions raised in this series about intermediaries and user access to information.  Practitioners have significant unresolved differences about how certain points in the 1995 Directive should be interpreted; the GDPR probably won’t change that.  Existing drafts are unclear on some key points, and seem likely to remain so – there can be good reasons for negotiators to choose constructive ambiguity, leaving room for DPA or court interpretation after the law is enacted.  The upshot is that we will not necessarily see expert consensus on everything the GDPR means, and what parts of the law it has changed, even once its language is finalized.

Ambiguous drafting, intentional or not, will likely leave room for litigation and policy battles about the GDPR’s impact on Internet intermediaries and user free expression.  But it is clear that overall the Regulation moves the needle in a troubling direction for online innovation and civil liberties.  It extends jurisdiction to a vast new group of Internet companies, imposing burdensome regulatory obligations on companies that have never heard of this law.  It extends “Right to Be Forgotten” content erasure requirements, applying European legal standards to require deletion of content that is legal in other places.  By the same token, it puts decisions balancing European users’ speech and privacy rights into the hands of foreign technology companies, instead of national courts.  And it tilts the playing field for the people whose rights are affected: it expands rules and institutions to vindicate privacy rights, but has no countervailing increase in resources or legal channels to protect speech and information rights.  These issues merit much closer consideration before the GDPR is finalized and brought into effect.

 

Frequently Asked Questions and Rough Answers

 

1. Core Questions About the GDPR and Intermediary Liability

Later blog posts will address these topics in more depth.

Q: What entities outside Europe will fall under GDPR jurisdiction?

A:  A lot.  The GDPR asserts jurisdiction over entities that offer services to or “monitor” EU users.  “Monitoring” seems to be defined broadly enough to include fairly standard web and app customization features, so the law reaches many online companies outside of the EU.  In practice, regulators presumably will not prioritize or dedicate limited resources to policing small and distant companies.  But the GDPR will be an issue for companies with growing EU user bases and presence in Europe; and regulators can choose to enforce the law against many more entities around the world.

Q: What’s this about Controllers and Processors?

A: These are key terms under existing data protection law and in the GDPR.  Regulated entities are generally classified as either Controllers or Processors.  Distinct legal obligations flow from that classification.  Controllers are, roughly speaking, entities that hold personal data and decide what to do with it.  Because they are the decision-makers, they have more obligations under the law – including compliance with erasure or “Right to Be Forgotten” requirements.  Processors hold personal data, but follow instructions from a controller about what to do with it.  Their legal duties are correspondingly fewer.  In a simple example, a firm that holds records about its employees is a controller of their personal information; if it outsources payroll operations under contract with a payroll company, that company is a processor.  The CJEU’s determination that Google acted as a controller in operating web search was a key holding of Costeja.  More on the controller/processor distinction is here.

Q: What about the “Right to Be Forgotten”?

A: It’s not going away.  In the GDPR, it is currently enumerated as a right to “Erasure.”  In recent drafts it has been a right exercisable only against data controllers, not data processors.  That would mean Google web search still has to do these removals.  There is room for debate about the obligations of other Internet intermediaries, such as Twitter.  Content providers can also be required to honor “Right to Be Forgotten” removal requests, but under different substantive standards for determining what to remove.

Q: Does the GDPR clear up whether intermediaries can rely on intermediary liability “safe harbors” or notice and takedown systems under the eCommerce Directive when they receive an erasure request?

A:  I don’t think so.  But there will be disagreement on this.

Q: How does the GDPR directly address free expression?

A: Article 80, which in most drafts is titled “Processing of personal information and freedom of expression,” requires Member States’ laws to include exemptions and derogations protecting speech and information rights.  That’s a lot of pressure to put on national law, which historically has varied widely in its protection of such rights.[8]  More troublingly, some drafts would offer exceptions only for the “processing of personal data carried out solely for journalistic purposes or the purpose of artistic or literary expression.”  (EDPS Art. 80)  In other words, if the work is for some other purpose, or if it has a mixed purpose, the exceptions would not apply.  

For intermediaries processing third-party data, free expression is also relevant, though in ways that can be hard to pin down in practice.  The legal basis for intermediaries’ processing in the first place is often that the processing serves “legitimate purposes.” (Art 5.1(b)) When an intermediary declines to honor a removal request on free expression grounds, the GDPR provision invoked is one that references only “legitimate interests.” (Art 6.1(f)) While undefined, such legitimate purposes and interests clearly include expression and information rights.  But the GDPR and existing law provide scant detail on how to assess these interests – this was one common critique of the Costeja ruling.  And important questions about whose interests may be considered – which come up in litigation about content removal – are not always addressed well in GDPR drafts.  For example, one draft provision allows controllers to decline to remove content based on “legitimate interests pursued by the controller, or by the third party or parties to whom the data are disclosed[.]” (6.1(f) EDPS)  Under this formulation, the interests of the speaker – the user whose content is indexed, transmitted, or hosted – fall out of the analysis.  Data protection law’s lack of detailed provisions for free expression made more sense in an era when regulated entities were assumed to be banks, employers, medical offices, and the like.  Today, inattention to the unique role of Internet intermediaries in GDPR drafting will likely lead to more removals of lawful expression – and more litigation.

Q: If parts of this law are unclear, who decides what it means?

A: It will take a while.  Initial layers of review will typically come from data protection regulators, rather than courts.  In the first instance, DPAs – largely staffed by career civil servants specialized in data protection law – will answer most questions.  Issues that affect more than one country will be resolved via important and hotly contested new “One Stop Shop” and Cooperation Procedure provisions.  Difficult questions or disagreements among national DPAs will be addressed by a new European Data Protection Board created by the GDPR, which effectively replaces the existing EU-wide Article 29 Working Party.  Entities which disagree with regulators’ interpretation of the law can eventually go to court (or the complainant can go directly to court instead of the DPA), so in the long term we will see court opinions on the hard issues.  But they may vary from country to country and even from case to case within a country – particularly in civil law countries.  The really hard and consequential questions should eventually bubble up to the Court of Justice of the EU (CJEU) or possibly the European Court of Human Rights (ECHR).

 

2. Important and Complex Questions from Experts

These are all hard questions I have heard from experts in Brussels.  They will not get extensive treatment in this series, but they matter a lot in the long term. Feedback regarding these questions is especially welcome; there is more to be said about all of them.

Q: Aren’t the eCommerce Directive and GDPR already aligned, because any intermediary that is passive enough to qualify for eCommerce protection will also qualify as a Processor for data protection purposes?

A: This question comes up because of some approximate parallels between the eCommerce and Data Protection directives.  Intermediaries lose protection under the eCommerce Directive if they are too “active” in handling user-generated content, as opposed to being “passive” and “neutral.”  Similarly, under data protection law, an entity that determines for itself how to process personal data is deemed a “controller” with significant legal obligations including data erasure; while an entity merely following a controller’s instructions about how to process data is a “processor” with fewer obligations.   There are parallels between the two classification systems: the more discretion you exercise in managing third party data or content, the more responsibility you have.  So it would be theoretically possible that the only entities that have content erasure obligations as controllers under the GDPR are ones that fall outside eCommerce Directive protections anyway – in other words, that all data protection processors are passive intermediaries protected by the eCommerce Directive, and all controllers aren’t. 

But the law doesn’t say that now, and there’s good reason it shouldn’t in the future.  There are already instances where an intermediary has been deemed a controller in data protection cases, but found to be protected by the eCommerce Directive in content removal cases.  More broadly, law about “passive” and “active” eCommerce intermediaries is a moving target.  Court rulings provide widely diverging interpretations in different cases and in different countries.  More fundamentally, the animating policy goals of data protection law and intermediary liability law are sufficiently different – and the scope of unrelated issues each must address sufficiently broad – that it seems unlikely the two would ultimately arrive at identical classification regimes.

Q: What does the GDPR have to do with freedom of expression?

A: Some thoughtful data protection experts honestly see no free expression concerns with the law, despite its strong new language requiring erasure of information.  Erasure is only required after consideration of relevant legitimate interests, including interests in free expression and access to information, so – one could reason – protection of free expression is built in. 

One problem with this analysis is the documented tendency of intermediaries to avoid risk and transaction costs by simply removing any challenged content.[9]  Putting removal decisions in the hands of technology companies – as opposed to, say, content creators or national courts – is a recipe for over-removal of lawful expression.  Another is that procedural details in the GDPR’s removal and review process tilt the playing field in favor of privacy rights, and make users’ free expression rights harder to vindicate.  A final problem is that different countries have very different laws balancing free expression against other rights, including privacy or data protection.  Content that self-evidently should be removed in Europe may be protected and lawful speech in the US and other countries.  Applying EU removal standards to content in those countries creates a free expression issue for Internet speakers and readers there.

Q:  The erasure provisions of the GDPR aren’t about liability, so how can they affect intermediary liability?

A: I’ve heard this question a couple of times from smart data protection lawyers, and I’m not sure I quite understand it.  But here’s a shot.  I think the point may be that the erasure requirements function like injunctive relief: they don’t create liability in the sense of exposure to monetary damages.  Assuming that is the argument, there are several possible responses.  One is about terminology.  The term “intermediary liability” is used by practitioners as shorthand for an array of obligations intermediaries have toward third party content, including notice and takedown.  So any law creating a removal obligation for intermediaries falls in the category of “intermediary liability” law.  (We could really use better terminology.)  Another answer is that given the new, high financial penalties for GDPR noncompliance, an intermediary risks serious financial consequences for not taking content down – even if the intermediary believes the law does not require removal.  The same may be true under current law, according to the Vidal-Hall ruling.  A third answer is that parts of the GDPR seemingly create liability for intermediaries even when they are unaware that they are processing content unlawfully.[10]  Such a departure from the eCommerce Directive’s knowledge standard would be a sea change for intermediary liability, and make the operation of open platforms for users to receive and impart information a much riskier business.

 

3. One Question I Hope Other People Are Asking About Free Expression and The GDPR

I have not seen much evidence of librarians and archivists following the GDPR, and provisions affecting them are outside the scope of this series.  I would be interested in seeing any analysis others have on this issue.

Q: Does the GDPR have other consequences for free expression and information access, aside from the Internet issues discussed here?

A: Yes.  The stand-out issue to me is the GDPR’s treatment of archives and research.  I am not an expert in this field, but I hope those who are have been tracking the GDPR and communicating with decision-makers in Brussels.  The GDPR appears to whittle away at archival uses in a number of ways.  For example, Article 83 in most drafts permits use of personal data for “historical, statistical or scientific research” only if it is impossible to conduct the research using non-identifying information.  Given the expansive definition of “personal data,” and the cost for libraries or researchers to strip out anything that meets the definition, this would appear to impose significant costs on normal and valuable research.  Many exceptions and derogations permit archival “public interest” uses only if they are specifically listed in national law.  For example, one draft says Member State law must ensure that archival data “cannot be used in support of measures or decisions affecting specific individuals, except for those measures or decisions that are specifically foreseen in Member State law.”  (EDPS draft Art. 83a) It is easy to imagine scenarios where a use unanticipated by the State has real societal value: aggregation of third party personal data to decide whether a specific individual should be charged with professional negligence, for instance, or should receive extra educational support, be protected from certain allergens in the workplace, and so on.  Unless all these scenarios are foreseen and enumerated by every national legislature in the EU, the outcome appears troubling for researchers – and the people whose lives are made better by their work.

Another possible threat to archives comes from provisions about “further processing” and the purpose limitation for data processing.  (Art. 5 and elsewhere.) Different drafts take notably distinct approaches to the situation in which personal data that was collected for one purpose is later used for a new purpose.  The issue is rightly important and contentious because of concerns about companies getting user permission for one purpose, and then using the data for other things.  Without careful drafting, though, this could also affect archives and research institutions, where information collected for one purpose can often be a treasure trove for researchers pursuing new questions.  It is not clear if this concern is fully represented in GDPR discussion.

------

[1] One important new provision of the GDPR establishes user rights to data portability, for example. (Art. 18)

[2] Many thanks to Neal Cohen, who reviewed this work with a data protection practitioner’s keen eye.  Remaining mistakes are my own.

[3] The title of Professor Graham Greenleaf’s article on point is telling:  Global Data Privacy Laws: 89 Countries, and Accelerating.

[4] Protection varies with the nature of the service.  Providers in the “mere conduit” category do not have knowledge-based removal obligations; and all intermediaries can lose legal protection if they are too actively involved in managing content.

[5] See, e.g., Jennifer Urban and Laura Quilter’s 2006 review of DMCA removals and Daniel Seng’s more recent work in the same area; Bits of Freedom’s study of Dutch intermediaries and Oxford PCMLP’s study on UK and US intermediaries; and Rishabh Dara’s detailed study of over-removals by Indian intermediaries.

[6] Some argue that Costeja de-indexing should not be called “removal,” because it leaves the same results available for different queries.  In intermediary liability parlance, this kind of partial suppression of content would still be called a “removal.”  Search engines are not expressly covered by the eCommerce Directive intermediary liability provisions, but many national courts or laws have protected them as intermediaries.  The protection of any intermediaries with respect to data protection-based content removal requests is further complicated by eCommerce Directive Article 1.5(b), which some argue excludes that content from the immunities created by Articles 12-15.  Miquel Peguera addresses the connection between the two Directives in more depth here.

[7] Winston & Strawn has a good, long list of links to drafts and commentary, mostly from government and business sources.  A pdf and app comparing major drafts are available from the European Data Protection Supervisor here.

[8]  Cambridge Professor David Erdos’s detailed survey of current national law concludes that “many Member State laws have clearly failed to provide for an effective balance between data protection and freedom of expression in the media sphere.” (P. 3)

[9] See note 5.

[10] For example, if a user uploads information about another person’s criminal record.  (Art.  9, 9a.)

 

Comments

Excellent article. Thanks a lot for that. I have a question. What do you mean by this:
“For example, one draft provision allows controllers to decline to remove content based on ‘legitimate interests pursued by the controller, or by the third party or parties to whom the data are disclosed[.]’ (6.1(f) EDPS)”
I did not understand this argument (maybe only because of language problems)... Thanks in advance.

I am curious whether this will require marketing agencies to purge their European data related to names and email addresses (used in sending e-newsletters and promotional materials) after a specific time frame as well?

Marketing agencies are sure to be affected, and depending on current activities/equipment/etc. in Europe may already be covered by current data protection law as well. Law firms like Hunton & Williams that have large data protection practices tend to have good information about questions like this on their websites.

Energy, resources and innovation similar to that invested in developing ever-improving tracking / profiling technologies in order to monetize the Internet should be dedicated to developing technologies to address global data privacy challenges facing data subjects, companies and governments. There's no reason why all parties can't ‘win’ in this situation – protecting individual privacy rights while enabling trusted use of data in compliance with EU and other international and domestic laws. See data scientist report submitted to the FTC on October 9, 2015 on this subject at http://anonos.com/images/Oct_9_2015_Sean_Clouston_Submission_To_FTC

I agree -- there should be great possibilities for technological innovation to protect privacy.

Will hosting a company's site in Europe solve most problems? German providers are cheaper than their U.S. competition anyway.

From one perspective, yes, but that entails signing up for compliance with the regulation (conceivably there are some odd exceptions). So you have to want that.

Thank you for raising some really interesting issues in this piece, Daphne – they are, as you say, definitely deserving of much wider debate. Just a few immediate thoughts. As regards jurisdiction, it is a moot point whether the GDPR will expand this. That would depend on the meaning of the "use of equipment" which results in jurisdiction under the current Directive. The Spanish DPA famously argued that in having a robot simply going to an EU-based server and taking information off it, Google's search engine was indeed using equipment within the EU. Whilst that wasn't determined one way or another by the Court of Justice, it remains a possible interpretation of the current law. If so, many other entities with no real foothold in Europe are technically caught by the current laws. Enforcement, however, is weak. This might change to an extent under the GDPR.

Secondly, there needs to be more discussion on the interface between the e-Commerce shields and data protection. The current Directive essentially states that it applies irrespective of e-Commerce shields. This would imply that, similarly to any other controller, Google's search engine (even if it is an intermediary in e-Commerce terms, which remains debatable) should be proactively ensuring compliance with data protection without waiting for requests from individuals. That would clearly impose enormous – and at some point clearly disproportionate – burdens on Google. Drafts of the new GDPR would maintain this language but also state the converse, namely, that the GDPR should be read without prejudice to the e-Commerce Directive. The relationship between these two legal regimes, therefore, remains rather murky. The EU should certainly take the opportunity to clarify what it means by all this before signing off on the new law.

Thank you, David. I'm happy to say I thought about both those things (whew!). On jurisdiction, maybe what this really does is move us from a lighter to a darker shade of gray? It seems to me that "equipment" jurisdiction had a lower chance of receiving such a broad interpretation by the courts in the long run, and "monitoring" jurisdiction has a much higher chance. The robots argument always seemed silly to me, but that doesn't stop arguments from being adopted. And I suppose for non-search index entities, there is the cookies argument for equipment jurisdiction. But I still think it is meaningfully weaker than the "monitoring" argument DPAs will have under the GDPR. I'll try to work this in for the rewrite.

The eCommerce thing I thought was in a footnote; maybe I took it out. I will definitely expand on it in a later section. You won't be surprised that I argue strongly that Controllers still get Art 12-15 protection, but I know many smart lawyers disagree.

Thanks again, really appreciate the feedback.
