Law in the Internet Society

The New "Supreme Court" for Content Moderation

-- By CeliaDiederichs - 07 Dec 2019

Facebook's vast freedom to self-regulate its content has given rise to multiple attempts to restrain the network's power. State courts, particularly within the European Union, have intensified their review of Facebook's content-moderation decisions and scrutinized its governance over content deletion. In an attempt to soothe these concerns, Facebook has responded with a plan to establish something "like a Supreme Court." Zuckerberg's description of the company's new "Oversight Board" invites the question whether Facebook is merely borrowing judicial nomenclature or instead eyeing the prospect of establishing its own exclusive forum of review for claims for or against content deletion on Facebook.

Escaping Scrutiny for an Inability to Meet Constitutional Obligations

Facebook currently faces strict public inspection of its practices and a curtailment of the freedom of governance that the public forum of the 21st century has so far enjoyed. For example, a decision by Germany's Federal Constitutional Court binds private actors, in extraordinary circumstances, to constitutional rights almost to the same extent that the state is bound to guarantee its citizens protection against superior governmental power. That decision concerned the issuing of exclusion orders from football stadiums. If transferred onto social networks, the limitation on a football club's domiciliary right translates into an obligation on Facebook to govern the network's content in a way that guarantees each user their constitutional right to free speech.

But what is the basis for the "if transferred"? Show how the logic of the court justifies the transference. Otherwise, everything that follows rests on an unsupported implication.

Zuckerberg claims not to believe that companies like Facebook "should be making so many important decisions about speech on our own." The responsibility the Federal Constitutional Court's reasoning would assign to Facebook is immense. The most damaging aspect of this obligation is the immense risk of error and the public scrutiny that follows from it. With governments now looking to Facebook to make these decisions, Facebook must search for a way to avoid scrutiny of how they are made.

Or, the parties are engaged in a dialogue designed to reach some other form of consensus. It would be sensible at least to consider this form of analysis. It turns out to help with some issues that are otherwise difficult for you.

One route of escape from judicial scrutiny may be found in mechanisms of alternative dispute resolution, specifically in the form of arbitration. This requires the arbitrability of the subject matter, a valid arbitration agreement, and the constitution of an arbitral tribunal. Binding its users to arbitrate claims of wrongful content deletion would allow Facebook to evade judicial control and to adjudicate issues of content moderation behind closed doors.

More likely, it would permit them to offer what most of the governments think they want, or are at least demanding, which is faster action. Outside its own dispute resolution fora, that might give rise to claims of negligence in operation that the internal fora can be trusted to avoid.

The most prominent example of successfully establishing arbitration as an alternative to judicial control is the Court of Arbitration for Sport, which decides disputes among athletes and sports organizations in such a way that judicial control is limited to violations of public policy. Where its legitimacy was questioned, courts deemed both the institution and its arbitration agreements legitimate. Whether Facebook could fulfill these requirements on an international level remains questionable, but not impossible. The effect would be immensely valuable to the company: instead of comprehensive jurisdictional review, national courts would be limited to reviewing cases for violations of public policy.

Why Must We Care?

The prospect of arbitration itself is not necessarily a frightening one. Arbitration is well embedded in the international legal order and acknowledged by most jurisdictions as a legitimate and unbiased forum for dispute resolution. Some even see Facebook's move as one creating "accountability and transparency in content moderation." In this light, why should efforts be made to prevent Facebook from establishing its own Court of Arbitration for Content Moderation?

Content moderation features one decisive particularity that makes Facebook's sole control over this legal issue a dark prospect: it touches upon the bedrock of every democratic society, the freedom of speech. As a constitutional matter, freedom of speech is inherently tied to the unique constitutional identity of each jurisdiction. On what standard of free speech would an arbitral tribunal or Facebook's Oversight Board determine its limits?

Why should it have to? Have we also transferred to it that governmental responsibility? Not under US law, surely.

There is no global standard of free speech for Facebook to interpret or enforce over its user community. Countless judicial decisions on free speech attest to the intricacy of the distinctive historical, societal, and political limitations on speech in each jurisdiction. Drawing the line between illegal defamation and permissible critique in such a manner that the democratic values of a pluralistic society suffer no loss is a matter that must remain subject to full jurisdictional control.

Zuckerberg presents Facebook's Oversight Board as a genuine attempt to meet the need to protect victims of hate speech or sexual exploitation in the pluralistic Internet society. Undeniably, these are pressing needs. Nonetheless, a solution to them must not entail an obliteration of the freedom of speech as guaranteed within the individual constitutional order of each jurisdiction. Social networks are not equipped to conclusively assess the limitations of free speech in a global Internet society, as no company can represent global policy perspectives on a spectrum wider than any single law may reach. Recent European legislation fining Facebook for failing to delete harmful posts in time already sets the incentive for cases of doubt: where excessive deletion avoids a financial sanction, Facebook will delete when in doubt. This aligns with the societal phenomenon of cancel culture. Allowing the company to evade strict judicial review of its decisions would ultimately lead to the surrender of one of the most precious human rights and the abandonment of what is a necessity to a free democratic society.

What Can the Oversight Board Accomplish?

For now, the Oversight Board is a long way from having the characteristics of an arbitral tribunal, and Facebook has decided that the board will not consider cases of potentially illegal content. But even as currently planned, the Board allows Facebook to hide from real democratic oversight by outsourcing content moderation to a seemingly independent adjudicative body.



r2 - 11 Jan 2020 - 15:47:15 - EbenMoglen