[Image credit: Thennicke, CC BY-SA 4.0, via Wikimedia Commons]
In March, a gunman slaughtered worshippers in two Christchurch mosques, live-streamed his crime, and became an Internet sensation. People around the world watched in horror as the news broke – and as platforms like YouTube and Facebook struggled to control new uploads of the murderer’s sickening video. For many, this crystallized an already-growing sense that lawmakers must do something to restrain violent extremist content on Internet platforms. Australia has now done just that, with a new law requiring platforms to remove footage made by perpetrators of violence. But the law – which passed after less than a day of public review – doesn’t really set new rules of the road for platforms. Instead, it tells them to set their own rules. It gives them every incentive to take down not only perpetrators’ violent content, but also important public speech and information – including news coverage of events like the atrocity in Christchurch. In a law billed as a regulation of platforms, Australia has greatly expanded regulation by platforms.
The Australian law illustrates the risk of legislating for one part of the Internet ecosystem – platforms – without regard to the consequences for legitimate information providers and consumers that depend on them. News organizations in particular are deeply vulnerable to shifts in platforms’ rules for online content. Many depend on readers who arrive via Facebook or Twitter, and some have abandoned print or other offline distribution entirely. The Christchurch video complicated the press-platform relationship still further, as journalists struggled to balance their own duties to inform the public about the video’s existence – and its continued availability on platforms – with concerns about amplifying the killer’s message of hate. Editors’ responses to this dilemma varied widely. Some initially posted the footage in its entirety, much as their predecessors aired grim scenes of the September 11th or Paris attacks. Others opted to air only excerpts, or to describe the footage without sharing it. Ethical rules for journalists covering abhorrent but newsworthy material remain very much open to debate.
The Australian legislation ensures that news organizations’ own answers to such fraught questions won’t matter, though – at least not to the considerable portion of the public that gets its news via Internet platforms. It charges platforms with deciding which news sources are using violent footage “in the public interest,” and taking down the rest. A platform that leaves up the wrong news report risks penalties of up to 10% of annual turnover, and its executives can face jail time. As a former platform lawyer, I can tell you how those incentives will play out in practice. All but the bravest or most risk-tolerant companies will err on the side of taking down news reports, substituting their own judgment for that of reporters and editors.
The law’s provisions for scientific and historical research are no better. Australia has effectively created a new quarantine system, in which only qualified researchers can gain online access to horrific but important primary source material. This kind of limitation has precedent. German libraries, for example, historically kept prohibited literature in a closely guarded Giftschrank or “poison cabinet.” Some legal systems might see a quarantine system as the right balance, containing the spread of toxic messages without purging the archives completely. But the Australian law doesn’t attempt to define the new system. It doesn’t, for example, say which researchers should qualify for access. Instead, it outlines vague standards and outsources the decisions to platforms, which risk massive penalties for granting access to the wrong researcher.
This legislative decision seems willfully blind to private platforms’ role in modern archiving and research. Organizations documenting everything from police brutality in the U.S. to human rights violations in Syria have lamented their dependence on platforms like YouTube, and endeavored – with limited success – to find alternative means to preserve and share information. They remain highly vulnerable to platforms’ removal policies – as the non-profit Syrian Archive discovered when YouTube took down over 100,000 of the videos it had gathered as evidence of crimes in Syria.
Australia’s law applies the same quarantine approach to ordinary people’s political speech and participation. Platforms can let people advocating for change to “any matter established by law, policy or practice” share or view violent material if doing so is “reasonable in the circumstances.” That uncertain standard would appear to affect potential participants in public debates on matters ranging from gun control to counter-radicalization to the Australian legislation itself. But we should not expect platforms adjudicating these people’s claims to strike the same balance that courts would. We should expect them to strictly limit access, if they permit it at all.
As lawmakers around the world consider their own responses to harmful content online, Australia provides a lesson in what not to do. But that doesn’t mean that legislators’ hands are tied. They have many ways to impose new obligations on platforms without making them substitutes for democratic government. A public agency or court could decide questions about research or journalistic content, for example, and then tell platforms what to take down. Or if platforms made their own legal judgments, the law could provide clear guidance and opportunities for after-the-fact public review. Human rights and legal experts have outlined numerous models for more responsible platform regulation, including mandating improvements to platforms’ own takedown processes. Lawmakers have no need to resort to blunt legal instruments when more tailored ones are at hand.
Australia is not the first democratic country to overhaul legal protections for speech and information rights in a moment of national trauma. The USA Patriot Act, passed in the U.S. after the September 11th attacks, is one of many precedents. But we should hope Australia will be the last to pass such unnecessarily clumsy Internet content legislation. In their efforts to govern platforms, lawmakers should not end up outsourcing the job of governing.