New EU rules on video-sharing platforms: will they really work?

The new EU Audiovisual Media Services Directive (AVMSD) has been officially adopted and published. It is now time for member States to start incorporating its provisions into their respective legal and institutional frameworks. One of the main challenges in transposing and applying the AVMSD to video-sharing platforms (VSPs) will be to articulate and implement a proper distinction between decisions taken by these operators in the exclusive application of their own internal Terms of Service, and measures adopted in their new role as enforcers of national laws in important areas such as the protection of minors, hate speech and terrorism. The AVMSD establishes important differences between the two cases, in particular for users seeking regulatory or judicial review of a VSP’s decisions. This can trigger significant interpretative conflicts and may negatively affect users’ rights, particularly the fundamental right to freedom of expression.

In October 2018, I published a short piece focusing on national media regulators’ new legal responsibilities for overseeing content on privately owned Internet platforms. These duties, included for the first time in the Directive, particularly concern hate speech, child pornography, content that may impair minors’ physical and mental development, and terrorism. National authorities (mainly independent media regulatory bodies) are also given the responsibility to verify that VSPs have adopted “appropriate measures”, which could include revising and enforcing Terms of Service; having appropriate flagging, reporting and declaring functionalities; implementing age verification or rating and control systems; establishing and operating transparent, easy-to-use and effective procedures to resolve users’ complaints; and providing media literacy tools.

I noted that VSPs will play a fundamental role in determining the boundaries of legitimate political speech or the right to adopt and express unconventional social and cultural points of view. Although they already had responsibilities as hosts to take down known unlawful content under the EU’s eCommerce Directive, the AVMSD adds a series of further responsibilities which will be subject to additional administrative oversight.

In a recent and balanced response, my colleague Lubos Kuklis (who, apart from being a recognized media law expert, is the current chair of the European Regulators Group for Audiovisual Media Services, known as ERGA) presents a more positive view of the legal text. While acknowledging that delegating regulatory powers to private companies could certainly run into trouble with the fundamental rights of the VSPs’ users (and with basic principles of constitutional democracies), Lubos also argues that the text of the Directive requires VSPs to protect users’ rights, and provides users with a wide array of safeguards, including rights to appeal, enforced by public authorities. Such safeguards will, he argues, ensure sufficient monitoring of VSPs’ content regulation activities, protect users’ rights, and ultimately provide redress for any excesses by private actors.

As mentioned in my previous contribution, it is undeniable that the “appropriate measures” the Directive requires VSPs to take should be seen as new safeguards for users regarding the private moderation of content uploaded to VSPs’ services, as well as useful tools to increase the transparency and accountability of these actors.

This being said, I also believe that the noble objectives of the Directive, namely to tackle illegal content online while protecting the right to freedom of expression, are not properly served by these provisions. Moreover, they raise serious concerns in terms of legal certainty and the proper protection of fundamental rights.

One of the reasons Lubos mentions to justify the approach of the Directive (which can also be found in its Recitals) is that, since VSPs and Internet platforms in general have become central players and facilitators of online speech, it is necessary to engage them in content moderation. That task could not, realistically, be conferred exclusively on public bodies, as it is for broadcast and other traditional, regulated media. In any case, these private actors have already put in place, and normally enforce, their own Terms of Service, which typically incorporate content rules covering essentially the areas mentioned by the Directive. These rules are of a private, self-regulatory nature and, even in the absence of any statutory provision, they already have an obvious impact on the way freedom of expression is exercised online.

The Directive enshrines as a legal obligation for VSPs the adoption of appropriate measures to protect several categories of users from content that is either illegal or harmful (in the case of provisions aimed at the protection of minors). In fact, VSPs’ Terms of Service often prohibit more or less the same content prohibited under legal provisions, including the categories now covered by the AVMSD. The Directive thus raises the issue of how these private, contractual rules relate (particularly in practical terms) to rules created under public, democratically enacted laws.

As previously mentioned, the Directive guarantees users a right to seek review, by courts or other public bodies, of particular decisions taken by platforms under the Directive that affect their right to freedom of expression. These mechanisms will most probably be operated by, or under the supervision of, current audiovisual media regulatory bodies. In any case, the Directive grants access to judicial review to properly protect such rights.

These newly created rights of review appear to be one of the Directive’s most important protections for free expression. However, there are serious questions about how effective they will really be, and about how often users will actually be able to seek review of platform decisions in practice. If it is generally assumed that traditional rule-of-law mechanisms cannot tackle the complexity and volume of content moderation decisions on online platforms, how can we expect regulators and courts to properly monitor and review users’ challenges to the huge number of expeditious, almost automatic decisions that VSPs will now be adopting on the (additional) basis of their new legal responsibilities? Another issue relates to the transmission of expression and information across national borders. Do we expect individual content creators from outside the EU to be able to identify the competent authorities and member States in order to challenge platform decisions affecting the visibility of their posts, at least in Europe? Or is it rather assumed that European consumers will be entitled and motivated to defend, before their respective national authorities, the legality of pieces of content they have been denied access to?

Another tricky element has to do with the separation between legally mandated removals and those based on the VSP’s own Terms of Service. Depending on the case, a platform decision affecting a piece of audiovisual content may be based on the application of the Terms of Service alone, or on Terms of Service provisions that exist to guarantee the application of the law. The consequences, and users’ rights to seek review, are completely different in these two scenarios: according to the text of the Directive, review mechanisms will be available only in the second one.

The most relevant challenge in this area will be, however, to properly identify which rules in the Terms of Service actually refer to legal obligations and which of them are exclusively of an internal nature. According to the Directive, this responsibility belongs prima facie to VSPs. Considering that affected users will be able to invoke the AVMSD’s external review mechanisms only when a VSP acts pursuant to legal obligations, VSPs may have reason to justify their content decisions on the basis of their internal rules only. Can public review mechanisms address disputes regarding the basis of a specific takedown? The AVMSD does not seem to exclude these “competence of the competence” issues from regulators’ purview, although in most cases, because legal and contractual content norms will be closely intertwined in the Terms of Service, it will be particularly difficult to say whether a VSP truly acted on the basis of the law or of its own discretionary rules.

Even in cases involving the application of legal content restrictions, and considering once again the volume of complaints that platforms handle and will need to manage under the new legal framework, it is fairly likely that external review mechanisms (including the judiciary) will also mostly accept the legal assessment of VSPs, without prejudice to the fact that such review bodies will be legally entitled to declare violations of users’ rights to freedom of expression, at least in exceptional and manifestly unfounded cases. In this area, it needs to be particularly underscored that most of the content areas for which the Directive requires private content moderation, i.e. hate speech, terrorism, child protection and so on, do not have common or uniform legal definitions across the European Union. Denial of the Holocaust, for example, is a hate speech crime in several countries of the Union, but under certain circumstances it is also considered the lawful exercise of the right to freedom of expression in member States like Spain. The definition of what constitutes a terrorist offence also varies among European countries, as the provisions of the specific Directive on this matter have legitimately been incorporated in different ways into the various national legal systems. This means that VSPs may either need to consider the often-intricate differences between the legal concepts and systems of 28 different member States, or else apply the most restrictive standard (or adopt such a standard under their Terms of Service) to avoid any possible liability.

And here lies precisely the most important paradox that the application of the AVMSD provisions may trigger. When confronted with accusations that their application and enforcement of legal content restrictions is “excessive” or goes “beyond the law”, VSPs will be able to argue that what they applied were, in fact, their internal rules only, and that therefore no external review mechanism applies to the dispute. To give a possible example: a platform decides to take down a video because it contains expressions that were labeled as hate speech. The author of the video challenges the decision, arguing that it was legitimate political speech, as it falls outside the narrow definition of hate speech in his or her own member State (which, let’s say, has also been endorsed by the European Court of Human Rights). The platform may easily argue that the content was taken down not on the basis of the hate speech limitations contained in the Directive and imposed on VSPs, but on its own, broader definition of the notion, which users would in this case have to respect on the basis of their contractual relationship with the service provider. In this context, the decision of the platform would be fully legitimate and no external review mechanism could, in principle, be applied (unless member States develop new jurisprudence recognizing the horizontal effect of human rights).

As has been shown, and despite the fact that the new AVMS Directive contains some significant improvements, many questions remain to be answered and several concerns still need to be properly addressed. Tackling these issues is now an important responsibility for member State legislators and regulatory bodies, as well as for broader organizations like ERGA or EPRA.
