Global Content Regulation and Jurisdiction: Who Decides?

Policymakers around the world are showing renewed interest in the rules that govern Internet information flow across national borders. New regulation may not sound like good news to those concerned about online information access and free expression, but it’s worth a hard look at what will happen without it: national courts will decide how to regulate content outside their borders. Courts aren’t well-equipped to shape national policy that touches not only on free expression rights but on foreign relations and national IT infrastructure. But that is effectively what they are being forced to do in cases about online content that violates national law. If we believe in democratic process, we should welcome a change of forum for these issues – toward non-judicial branches of government, and ideally toward serious discussion within transnational institutions to set well-considered, rights-based guidelines for these difficult cases.

Courts around the world, hearing claims ranging from privacy to trade secret misappropriation, are being asked to make meaningful changes to the way Internet users access information. In the most extreme cases, plaintiffs want content removed globally. A Canadian court affirmed a global removal order of this sort last week, saying that British Columbia courts have jurisdiction to require Google to remove search results from its services for every country in the world. The French Data Protection Authority is now asserting similar power. Global removal orders raise a lot of hard questions – some familiar to courts from other jurisdiction cases, but many quite novel. When should the forum country’s law regulate information access for the rest of the world? Is the forum country’s foreign policy a factor when a court ruling undermines the sovereignty of lawmakers and courts in other countries? If the forum country asserts this power over other countries, does it concede that other countries’ courts can regulate content access in the forum country? Should the court stay its hand out of concern about creating a “lowest common denominator” Internet, subject to the sum of all countries’ content laws?*

Other cases ask courts to compel foreign Internet defendants to use geotargeting or IP address blocking, so users in the forum country can’t see particular content. That’s what some people thought the Article 29 Working Party was asking Google to do for Right To Be Forgotten removals: just make sure that Europeans could not access the unexpurgated (or less expurgated) versions of search results. IP blocking orders avoid the global censorship questions, but raise thorny new ones. Does the harm averted by the forum country’s IP blocking order in one case offset long-term harm from further balkanizing the Internet? Is blocking Internet access to foreign businesses consistent with the forum country’s trade policies? What are the costs to online speech and innovation as the web becomes gradually less world-wide in nature? If the forum country can re-architect the Internet to isolate its citizens from banned content, is it so wrong for China or Iran to do the same? Would architectural barriers to online “travel” take us back to a world where only the wealthy can experience other legal content regimes, by physically traveling to them – and do we want that?
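For readers unfamiliar with the mechanism, the kind of geotargeting these orders contemplate is, in engineering terms, quite simple: the service looks up the apparent country of the requesting IP address and withholds the disputed content for requests that appear to originate in the forum country. The sketch below illustrates the idea; the prefix-to-country table is a hypothetical stand-in for the commercial geolocation databases real services use, and the country codes and blocklist are invented for illustration.

```python
import ipaddress

# Hypothetical mapping of network prefixes to country codes.
# Real deployments use large, frequently updated geolocation databases.
GEO_PREFIXES = {
    ipaddress.ip_network("192.0.2.0/24"): "FR",      # documentation-range addresses,
    ipaddress.ip_network("198.51.100.0/24"): "CA",   # assigned arbitrary countries here
}

# Countries where a court order requires the content to be withheld (illustrative).
BLOCKED_COUNTRIES = {"FR"}

def country_for(ip: str):
    """Return the apparent country code for an IP address, or None if unknown."""
    addr = ipaddress.ip_address(ip)
    for network, country in GEO_PREFIXES.items():
        if addr in network:
            return country
    return None

def may_serve(ip: str) -> bool:
    """Serve the disputed content unless the request appears to come
    from a country covered by a blocking order."""
    return country_for(ip) not in BLOCKED_COUNTRIES
```

The sketch also makes the technique's well-known weakness visible: the decision turns entirely on the apparent source address, so a user behind a VPN or proxy in another country sees the content anyway, while a traveler from an unblocked country may be wrongly denied it.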

These are hard questions. Without a clearer shared policy framework, a courtroom is not the right place to ask them. Judges are understandably reluctant to be guided by “pure policy” considerations, which can leave them looking to black-letter jurisprudence developed largely in a pre-Internet age. Courts also often have before them a sympathetic local plaintiff with a legitimate grievance and a difficult path to relief; and a defendant who is either a remote wrongdoer or a technical Internet intermediary, lacking relevant information and perhaps even incentive to defend content created by a user. It should not come as a surprise when many courts in this situation conclude that they have jurisdiction to grant relief by ordering global removal or IP blocking. And every assertion of jurisdiction makes it easier for the next court to do the same.

It is time to move these conversations to the right forums. The recent statement on cross-border data flows from the Council of Europe is an important start. The European Commission’s prominent inclusion of this issue in the Digital Single Market strategy is another. These questions should not be answered in the context of a single case before a judge, without considered input from other branches of government. Nor should they be answered by national institutions – such as DPAs – specialized in just some of the many issues in play. Governments with strong national interests in content restriction – China, Iran, Saudi Arabia – have been crafting policy in this area for a long time, with Russia and Turkey not far behind. Countries and institutions with strong commitments to speech and information rights for citizens need to catch up, and engage policymakers with real expertise to identify national priorities and guide courts. Leaving these questions to case-by-case analysis without such a framework is the wrong approach.



* Bonus questions for all you jurisdiction nerds: Does it matter if the law at issue is relatively consistent across national borders (counterfeiting, say) or relatively diverse (privacy, election campaigning)? Does it matter if the disputed content is illegal in the “site of harm” country but legal in the “site of publication” country? What if the defendant did not intentionally target the forum country? Does it matter if the forum country’s law creates only a “soft conflict” by removing more content than a foreign country’s law would require, versus a “hard conflict,” by actually violating rights protected by that country’s law? Does it matter if a defendant Internet company complies with the forum country’s law on a nationally targeted version of the service, but follows other countries’ laws on other versions? Where the defendant is an intermediary, does it matter if relief might be available against the actual content creator? Does the answer depend at all on that intermediary’s willingness to hand over, to plaintiff or prosecutors, private user data about the content creator?
