Computers, Privacy & the Constitution

Big Tech Poses a Threat to Privacy and Free Expression

-- By AbdullahiAbdi - 12 Mar 2022 [Updated 28 April 2022]
 

Introduction

Technology companies have assumed control over how the masses communicate over the internet. While technically anyone on earth can stand up a webserver and create a communication platform, the reality is that giant private technology companies such as Facebook are becoming governors of the marketplace of ideas. They control how those who use their platforms may speak, which gravely threatens freedom of expression and speech.
Since I cannot delve into every aspect of pervasive surveillance and the ways big tech restricts free expression, in this paper I will explore how Facebook's enforcement of its content moderation rules curtails free expression.
Facebook's moderation policy
Freedom of expression is a core human right protected under international law and in domestic legal systems around the world. It encompasses the right to receive, hold, and share opinions, and it is essential to democratic governance. But increasingly, big social media companies are restricting this right through content moderation policies: sets of rules and practices that tech companies use to regulate content shared on their platforms.
There are three major problems with Facebook's moderation policies and practices. First, overreliance on automated algorithms leads to incorrect removals. Second, there is often no clear appeal process for negatively impacted individuals. Third, undue government manipulation of Facebook leads to censorship.
Before I joined Columbia Law School, I worked on a project that documented how Facebook deleted several accounts of Somali journalists for allegedly violating Facebook's Community Standards. All the journalists whose Facebook accounts were disabled said they did not understand why their accounts were shut down. They were also not given prior warnings, nor were they afforded an opportunity to appeal the removal of their accounts.
My investigation revealed two main reasons why Facebook deleted these accounts.
Use of automated algorithms

First, some of the journalists whose accounts were deleted were flagged by automated algorithms. Automated removals have long been criticized as more susceptible to error than human review. These mistakes particularly affect Facebook users outside Western countries, since Facebook's algorithms work only in certain languages, and automated tools often fail to adequately account for context or for political, cultural, linguistic, and social differences.

 
One way Facebook is trying to address this problem is by employing human content moderators to conduct the review, but recent media reports revealed that even where Facebook employs people to do this work, the staff are overworked and unable to make individualized decisions on contentious and complex issues in the time afforded. For example, in one of its facilities in Africa, Facebook required staff to review videos that were sometimes hours long, along with other content, and to reach a decision on each within one minute.

Government interference

 
The other reason is that government officials reported critical journalists and individuals to Facebook.
In removing these accounts, which often provided the only source of independent information to people in Somalia, where the press is severely restricted by the government, Facebook was essentially facilitating state censorship.
International human rights standards

International human rights standards require private companies to ensure that human rights and due process considerations are integrated at all stages of their operations. The UN Guiding Principles on Business and Human Rights, for example, require all businesses, including technology companies like Facebook, to respect human rights.

Civil society groups have been thinking of ways to address this problem and have proposed possible solutions. The 2018 Santa Clara Principles, for example, are a civil society charter that outlines minimum standards for companies engaged in content moderation. The charter sets out five foundational principles: (1) human rights and due process considerations; (2) application of understandable rules and policies; (3) integration of cultural competence; (4) recognition of the risks of state involvement in content moderation; and (5) integrity and explainability of the policies. It also sets out three operational principles that should guide moderation policies: greater transparency in content moderation, which requires companies to publicly report the actions they take to restrict or remove content; notice to individuals whose content is moderated; and an appeal process for anyone aggrieved by a moderation decision.

 
Recommendations to Facebook
Since Facebook is an important forum of mass communication and its policies affect many people across the world, there should be a transparency mandate governing its policy formation. Such a mandate should require Facebook to share more information with the public, researchers, and governments on how it makes content moderation policies.
