
Broad Consequences of a Systemic Duty of Care for Platforms

In a previous post, I described the growing calls for what I called a “systemic duty of care” (SDOC) in platform regulation. I suggested that SDOC requirements would create difficult questions in ordinary intermediary liability litigation. By encouraging or requiring platforms to review user content or exercise more control over it, SDOC laws would change courts’ reasoning in cases about YouTube’s liability for a defamatory video, for example.

Systemic Duties of Care and Intermediary Liability

Policymakers in Europe and around the world are currently pursuing two reasonable-sounding goals for platform regulation. First, they want platforms to abide by a “duty of care,” going beyond today’s notice-and-takedown-based legal models to more proactively weed out illegal content posted by users. Second, they want to preserve existing immunities, with platforms generally not facing liability for content they aren’t aware of.

Intermediary Liability 101: An Update for 2020

I've had a lot of positive feedback on the Intermediary Liability 101 slides I shared back in 2018, so I thought I'd post these updated ones now. They are based on a deck I presented to a European policymaking audience last month. Their focus tilts toward European examples, but many of the issues captured here are universal. This version also has a longer section toward the end listing emerging issues and ideas (again, with a European lens).

The CJEU’s new filtering case, the Terrorist Content Regulation, and the future of filtering mandates in the EU

This blog post briefly discusses the Glawischnig-Piesczek ruling’s relevance for future EU legislation, and in particular for the Terrorist Content Regulation. TL;DR: Glawischnig-Piesczek does not discuss when a filtering order might be considered proportionate or consistent with fundamental rights under the EU Charter. It only addresses the eCommerce Directive, holding that a monitoring injunction is not “general” — and thus is not prohibited under the Directive — when it “does not require the host provider to carry out an independent assessment” of filtered content. This interpretation of the eCommerce Directive opens the door for lawmakers to require “specific” machine-based filtering. But it seemingly leaves courts unable to require platforms to bring human judgment to bear by having employees review and correct filters’ decisions. That puts the eCommerce Directive in tension with both fundamental rights and EU lawmakers’ stated goals in the Terrorist Content Regulation.
