What Does the DSA Say?

People keep asking me what the EU’s new Digital Services Act (DSA) says. So far, I have not found overview materials that seem like the right match for people unfamiliar with the EU legal and policy landscape. So here is my own very quick and dirty rundown.

This is not legal advice and it probably has some inaccuracies in the details. That’s both because some final points are not publicly known yet, and because I wrote this quickly and could have made mistakes. I will update it if corrections come in. I also did not try to summarize the DSA articles that are primarily about allocation of EU institutional power and enforcement capabilities.


What is the DSA?


The DSA is a once in a generation overhaul of EU law governing intermediaries’ handling of user content. It builds on the pre-existing eCommerce Directive from 2000, and preserves key ideas and legal structures from that law. The closest U.S. law analog is the DMCA. All three laws (DMCA, DSA, and eCommerce Directive) specify categories of immunized intermediaries, with immunities that may be lost if intermediaries learn about illegal content and do not act to remove it. (The devils in the details are legion, of course.) Unlike the earlier EU law, the DSA unifies many rules at an EU-wide level, with an EU-level regulator. Individual EU countries will continue to have their own speech laws (like what constitutes defamation), and national courts will surely reach different interpretations of the DSA. Still, platform obligations should generally become more consistent.

Europe has had knowledge-based platform liability for illegal content for decades. Don’t let anyone tell you that this aspect of the DSA is new. As a result, platforms operating in the EU typically run notice-and-takedown systems roughly like the DMCA’s for illegal content of all varieties. The DSA takes this pre-existing model and supercharges it. It adds a LOT of new process and regulatory rules, and also adds substantial new rules for platforms’ voluntary enforcement of their own Community Guidelines or Terms of Service (TOS) for user content.

For companies that handle user content, the DSA is something like the GDPR. It adds new compliance and process rules that will need new staffing, new internal tools, new external user interfaces, and new formal legal interactions in Europe. For Internet users, researchers, and platform critics, the DSA creates a range of new legal protections and tools for understanding or shaping platform behavior.


Where does the DSA stand procedurally?


A final draft was announced this week, but we don’t yet have a public copy. This final version is the product of a “trilogue” reconciliation process, ironing out differences between earlier Commission, Council, and Parliament drafts of the law. Those earlier versions were largely similar in the big picture and in most of the smaller points, so for those wanting more detail this earlier draft is a decent source. (It’s also formatted for easy navigation using Google Docs’ left nav bar.) For those who want more recent language, the best sources are the leaked “four column drafts” from the trilogues. Those are harder to obtain and can be painful for even dedicated wonks to follow, though.


What the DSA Says


The DSA applies to numerous Internet intermediary services. It provides both immunities and obligations. Many of its specific rules apply only to services in specific categories (access, caching, hosting, and marketplace providers, for example). A last-minute compromise brought search engines into scope, but largely left it to future courts to ascertain when search engines fit under one of the DSA’s enumerated categories, and thus what rules apply.

Much like the GDPR, the DSA asserts significant jurisdiction over companies based outside the EU. It reaches services “directed” to EU Member States. (Art. 2) It allows enforcers to assess extremely steep fines, in principle reaching up to 6% of annual revenue. (In practice, I wouldn’t expect fines of that magnitude absent serious platform intransigence.) It also sets up major new regulatory powers within the European Commission, many details of which will be hashed out later.

The DSA will come into force on January 1, 2024 for most companies, but apparently could start sooner for the very largest ones – the “Very Large Online Platforms” or VLOPs, which have at least 45 million monthly active users in the EU. The specific obligations vary based on a company’s size and the kinds of services it offers. An RA and I tried to capture this in a chart here. I have not seen an estimate of the total number of entities regulated under the DSA, but my guess is it may run into the hundreds of thousands. That’s a very rough estimate, extrapolating from the number of entities that historically registered for DMCA protection in the US, and from the UK government’s estimate that its own DSA-esque law will cover some 24,000 entities.

The DSA’s provisions fall into two general buckets: (1) prescriptive compliance obligations for most intermediaries, and (2) major new regulatory mechanisms for the VLOPs.


Prescriptive Compliance Obligations for Most Intermediaries


A lot of the DSA spells out specific operational obligations, mostly related to content moderation. These rules make it relatively clear what platforms are supposed to do, but smaller entities will need significant time and effort to hire, train, build new UIs, and so forth in order to come into compliance. The obligations vary somewhat by size and function, and in some cases platforms are exempted if they have fewer than 50 employees and under EUR 10 million in annual turnover. Medium-sized enterprises, defined as having up to 250 employees and EUR 50 million in turnover, will have all of these obligations, but apparently will have extra time to come into compliance.
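To make that tiering a bit more concrete, here is a purely illustrative sketch in Python using only the thresholds mentioned in this post: the sub-50-employee / EUR 10 million exemption, the 250-employee / EUR 50 million “medium” category, and the 45 million EU monthly active user VLOP line. The function and its simple numeric comparisons are my own shorthand, not anything in the DSA itself; the law’s actual categories also turn on the kind of service involved and on formal EU definitions, so treat this as a reading aid rather than compliance logic.

```python
# Purely illustrative: a rough sketch of the DSA's size-based tiers as described
# in this post. The real analysis depends on service type and formal EU definitions,
# not just these numbers.

def dsa_size_tier(employees: int, annual_turnover_eur: float, eu_monthly_active_users: int) -> str:
    """Return a rough DSA size tier using the thresholds discussed above."""
    if eu_monthly_active_users >= 45_000_000:
        return "VLOP: all obligations, plus the extra regulatory regime described below"
    if employees < 50 and annual_turnover_eur < 10_000_000:
        return "Micro/small: exempt from some of the compliance obligations"
    if employees <= 250 and annual_turnover_eur <= 50_000_000:
        return "Medium: full obligations, but apparently extra time to comply"
    return "Large (non-VLOP): full set of prescriptive compliance obligations"


# Example: a small forum operator would likely fall under the exemption tier.
print(dsa_size_tier(employees=30, annual_turnover_eur=5_000_000, eu_monthly_active_users=200_000))
```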

Obligations that relate to the DSA’s core concern with content moderation include:


  • Responding to government-issued orders to remove content or disclose user data (Art. 8-9)
  • Putting a lot of information about content moderation in Terms of Service, and notifying users of changes (that last part could get burdensome for platforms, and annoying for users, if it is not interpreted flexibly). (Art. 12)
  • Publishing transparency reports (Art. 13, 23, etc.)
  • Building mechanisms for users to notify platforms about prohibited content (Art. 14). The DSA does not prescribe turnaround times for response to notices. However, despite significant civil society opposition, the final draft apparently suggests a period of 24 hours for illegal hate speech.
  • Notifying users when their content has been taken down or otherwise acted against, providing an internal appeal mechanism, and engaging with and paying for alternative dispute resolution (!!) if users disagree with outcomes. (Art. 14, 15, 17, 18) Platforms can, and perhaps must, terminate users who repeatedly violate the rules or file abusive takedown requests. (Art. 20)
  • Engaging with and creating special channels for government-approved “trusted flaggers” in each EU country to notify platforms of prohibited content. (Art. 19)
  • Notifying law enforcement of suspicion of serious crimes (Art. 21 or 15a)
  • For marketplaces, extensive new duties vetting vendors and providing information to users. (Art. 22, 24a-c)
  • Controversial “crisis protocols,” allowing EU Commission officials to require content removal under to-be-determined rules in crisis situations. These are primarily for VLOPs, but seemingly may also apply to smaller platforms. (Art. 37)
  • Having a point of contact and a legal representative in Europe (Art. 10-11)


The DSA also speaks to some ongoing tensions in intermediary liability law by reiterating the EU’s longstanding (but evolving) prohibition on “general” monitoring obligations, and specifying that platforms’ voluntary efforts to find and remove illegal content should not cause them to lose immunity. (Art. 6 and 7)

A few of the DSA’s new obligations, including some added later in the legislative process, are less directly tied to content moderation. Some of these are less clearly prescriptive, and will likely require more legal judgment calls in interpretation.


  • Providing users with information, accessible directly from the UI, about ads and recommendations (Art. 24, 24a)
  • For porn sites/apps, additional content moderation obligations (Art. 24b)
  • Designing to avoid “dark pattern” nudges to shape user behavior or choices (Art. 13a)
  • No targeted ads to known minors, or to anyone based on sensitive personal information such as health or sexual orientation unless the user affirmatively provided that information. (This was negotiated to the last minute, is a big deal, in many ways has more in common with GDPR than the DSA, and is 100% worthy of real legal consultation for any affected business.)  (Art. 24)


Major New Regulatory Mechanisms for the Biggest Platforms


The biggest “VLOP” platforms have additional obligations. These are in many cases less prescriptive, and more about creating institutional governance systems to handle evolving risks. VLOPs include any platform with over 45 million monthly active users in the EU. I think that’s maybe a couple dozen platforms, the top few of which are much, much bigger than the rest. As I recall, the EU’s initial Impact Assessment document had a list of expected VLOPs (if people want to check). The list will likely have lengthened since then owing to expansive user-count methodology in later DSA drafts.


VLOPs are responsible for:

  • Annual formal risk assessments, risk mitigation plans, and third-party audits (Art. 26-28), and resulting reporting to regulators
  • Ongoing engagement with regulators, who may effectively shape platform practices through feedback on risk mitigation plans, through facilitating the development of “voluntary” industry standards and codes of conduct (Art. 34-36), and through various enforcement powers including investigations, requests for information, on-site inspections, the imposition of “interim measures,” and orders that platforms “take the necessary measures to ensure compliance” with Commission requirements under the DSA. (Art. 50-58)
  • Appointing compliance officers with special obligations and resources. (Art. 32)
  • Paying a yearly fee of 0.05% of their global revenues to fund the new regulator (reported, I think not yet in public drafts)
  • Providing vetted researchers and regulators with access to internal data (Art. 31)
  • Labeling deep fakes (Art. 30a)
  • Removing content in emergencies in compliance with crisis protocols (Art. 37)
  • Some other important obligations that are analogous to, but more expansive than, related obligations in earlier, non-VLOP-specific parts of the DSA:
      • Explaining recommender systems and allowing users to choose non-personalized versions (Art. 29)
      • Maintaining public repositories of information about ads (Art. 30)
      • Publishing transparency reports more frequently and with some additional information about risk assessment, mitigation, and audits (Art. 33)


Some of these obligations were extended to “Very Large Search Engines” late in the DSA process, but I am not yet sure which ones.


Conclusion

The DSA is a massively important new law, on par with the GDPR and DMA. Usually, I would expect to see multiple summaries of this sort freely available by now, mostly from law firms in search of clients. Presumably now that I’ve spent the afternoon writing this up, those will finally appear. I hope this is useful in the meantime!
