Computers, Privacy & the Constitution


Social Media Censorship: The Symptoms and Diagnosis

Censorship in Europe

In October 2017, the Netzwerkdurchsetzungsgesetz (Network Enforcement Act, or NetzDG) came into effect in Germany. The law requires social media companies to remove online hate speech from their platforms; failure to comply is punishable by fines of up to 50 million euros. The European Commission considered following suit, but instead opted to continue censorship through a ‘voluntary’ agreement brokered in 2015. What exactly constitutes ‘hate speech’ under these and similar laws is anyone's guess, but recent examples may be cause for concern: the prosecution of French politician Marine Le Pen for “distribution of violent images” over a tweet depicting graphic ISIS activities, or the ban of a German satirical magazine after it parodied anti-Muslim sentiments.

Censorship in the U.S.

Unfortunately, it seems that social media companies in the United States have started to adopt similar, though not governmentally directed, policies: YouTube temporarily banning far-right professor Jordan Peterson and demonetizing conservative content creators (one recent, non-conservative demonetization controversy ended in gunfire); Google slashing the visibility of far-left websites like the World Socialist Web Site; Facebook banning an account for posting, “[a]ll white people are racist. Start from this reference point or you’ve already failed”; and allegations of Twitter shadowbanning alt-right activists. The private censorship problem has both social and structural roots.

The Social Problem

Socially, U.S. politics is highly polarized. In a 2014 Pew study, 27% of Democrats said they saw the Republican Party “as a threat to the nation’s well-being” and 36% of Republicans thought the same of the Democratic Party, more than double the 1994 rates. Since 2014, polarization has worsened: the antipathy has become more personal, increasing negative perceptions of individual members of the opposing party (as opposed to the party as an institution). This, coupled with social sorting—where one’s social circle, including on social media, is homogeneous—creates a robust echo chamber. I think this hostile social climate has increased willingness to censor speech found disagreeable: as an offending speaker becomes more of a separable ‘other,’ the perceived value of that ‘other’s’ rights—here, a ‘right’ to speak and be heard—decreases; conversely, the harm that a politically aligned, similar listener experiences from such speech is emphasized. A recent Pew study revealed that 40% of millennials support the proposition that the government should be able to prevent people from making statements that are “offensive to minority groups.”

The Structural Problem

Corporations are legally driven by shareholder profit. The aforementioned millennials (and each similar, subsequent generation of adopters) are the demographic who most use social media and are thus the target demographic of social media companies. If a large, growing portion of the user base is okay with (or prefers) censorship, then it may be in a company’s interest to censor. But that can’t be the whole answer. If Facebook censors a conservative-leaning viewpoint, Facebook risks significant blowback from conservatives, and vice versa. If both sides are equal in their representation on the platform—which is the case—then “the only winning move is not to play,” because any censorship that benefits one party will be met with an equal and opposite negative reaction from the other. Unless, of course, no one cares (or one party cares less) that content is being censored at all. More likely, I think the problem is a lack of transparency regarding censorship. Other than the Facebook Files leak, which revealed Facebook’s general internal censorship guidelines, we know very little about how Facebook and other social media sites actually censor (especially in response to user reports). Moreover, whatever information the leaked guidelines provide is limited: (1) the leak was last summer, and guidelines can be changed without notice, and (2) workload volume: moderators receive millions of reports and must respond immediately—the job itself prevents consultation with the manual.

Because moderators are forced to react quickly, and their censorship largely does not undergo public scrutiny—mostly due to a lack of mandated (or even voluntary) reporting on user-initiated censorship—it’s inevitable that implicit biases (here, political) affect split-second decision-making. I think that’s how domestic censorship has played out in practice: conservative positions are more often censored because social media company ownership and operation is homogeneous and strongly left-leaning. But regardless of whether I’m correct in my assessment of how censorship has played out along partisan lines, the core problem is worse: companies have (and do exercise) the near-unilateral ability to curate content.

Is Censorship ‘Good’? The Marketplace of Ideas Counterargument

There’s an initially attractive argument that this censorship is desirable on net because it embodies the ‘marketplace of ideas’: society has simply rendered a judgment that this speech is not acceptable. But this argument presumes a free marketplace of ideas—one where opposing ideas can fight it out in pursuit of some consensus—which social media is not, because of (1) politically homogeneous ownership and operation, with values leaking into censorship decisions intentionally or unintentionally; (2) a lack of transparency regarding human and non-human (i.e., algorithmic) censorship decisions, largely barring public scrutiny; and (3) monopolization, which limits competition (from entrants who might offer freer platforms) through large barriers to entry, including critical user-base mass.

The speed at which corporations can act to address ‘offensive language’ is also dangerous: the U.S. government is designed to be slow and tedious; corporations don’t face the same limitations on hasty (and often misguided) censorship.

Conclusion

So increased polarization—leading to heightened perceptions of otherness and a greater willingness to devalue dissimilar opinions—together with the structural problems above has started us down the European path. There is some reason to be optimistic: America is still the country most tolerant of offensive speech and free expression. But there are also plenty of reasons not to be (see above). Unfortunately, the only functional difference between our current private-censorship system and the EU’s is the entity defining the content to be censored, a distinction I’m not willing to rest comfortably on. In the United States, it seems that the nearest threat to free speech comes not from government but from private corporations.

Word Count: ~990

That depends. If the German regulatory approach actually results in very large fines being regularly imposed, the results will be different from the "terms of service" enforcement forms of private censorship on the platforms.

But the conflation between speech on the platforms and speech on the Net, or on the Web, is a serious limitation in the analysis. I speak on the Web quite a bit, and I have no accounts on any of the platforms. My own servers are in the Web just as much as they are, are indexed by search engines, and provide just as much "ability to speak" as the platforms, with none of their control. People who want to speak on the Net can do so, regardless of platform company behavior. The public force, on the other hand, can impose itself where the body of the speaker is. That difference is no narrower than the space between freedom and tyranny, it seems to me.

-- EbenMoglen - 10 May 2018

Addendum

This essay deals with two parts of the private censorship problem: (1) the symptoms and (2) the pathology, or cause. I’d like to give some initial thoughts on the third part, (3) treatment: perhaps (a) broadening the state action doctrine to reach private viewpoint discrimination (Ninth Amendment?); (b) bringing back some version of the fairness doctrine for social media platforms, limited to those that do not ostensibly claim to support a certain viewpoint over another and instead pretend to be neutral; or (c) implementing other transparency reporting requirements: how many user-initiated content flags are received? How many posts are actually removed? For example, ‘provide a random sample of 1,000 or so removals and their justifications’ for the public to digest and assess for bias. The implications of these and other reforms would be interesting to examine. Where 67% of Americans (and growing) get at least some of their news from social media, the lack of transparency regarding agenda and bias is astounding. These platforms aggregate an enormous amount of power in the hands of a few individuals (see Cambridge Analytica and your recent article in the Times of India), and their use of that power deserves more scrutiny.

-- ArgiriosNickas - 13 Apr 2018

 
Revision r4 - 11 May 2018 - 21:11:47 - ArgiriosNickas
Revision r3 - 10 May 2018 - 21:06:55 - EbenMoglen
Revision r2 - 24 Apr 2018 - 03:19:45 - ArgiriosNickas
Revision r1 - 15 Apr 2018 - 23:11:58 - ArgiriosNickas