Computers, Privacy & the Constitution



It's interesting that a few of us wrote about the same problem: Facebook as the man in the middle for news. But I think we all (perhaps due to the space constraint) ran into some trouble with the solution portion. If we think of the problem as a disease, we addressed (1) the symptoms and (2) the pathology/cause, but failed to prescribe (3) a treatment. You touched on it briefly in your last paragraph, but I thought we all could use this space to develop and expand on some of those treatment options. My initial thoughts: perhaps (1) broadening the state action doctrine into some sort of viewpoint discrimination-based solution (this would require that FB be neutral in applying its internal removal guidelines, etc.); or (2) bringing back some version of the fairness doctrine for social media platforms, limited to those that do not openly claim to support one viewpoint over another and instead purport to be neutral; or (3) (to me, the most likely solution) statutorily implementing transparency reporting requirements. For example: 'FB must report how many user-initiated content flags it receives, how many posts/comments it actually removes in response, and the criteria used in the removal process, and must provide a random sample of 1,000 or so removals with their specific justifications' for the public to examine and assess for bias.

-- ArgiriosNickas - 27 Apr 2018


Hi Argirios - thank you for the comment. You're correct that I, too, ran into space problems when it came to addressing the "solution" to the Facebook as man-in-the-middle for news/opinions problem -- so I'm glad to receive the invitation to propose some solutions here!

As I see it, there are two somewhat distinct issues at play here. One, which you outline, is the problem of Facebook as censor, in which Facebook decides to censor certain content in response either to government mandate (as in some European countries and China) or to the profit motive (in order to keep advertisers and users happy). The other problem, which I attempt to outline above, involves Facebook not as censor of news/opinions but as distributor, in which its distributive decisions are informed by behavior collection and implemented in pursuit of either some government mandate or mere profitability. I think the Facebook-as-censor problem and the Facebook-as-distributor problem are two sides of the same coin: on one side, Facebook determines the baseline requirements to enter its walled garden; on the other, it designs the rules that determine how content moves within that garden. Although related, the two problems probably require different solutions.

I think you're correct that the Facebook-as-censor problem is best addressed through statutory disclosure requirements that provide users and regulators with an accurate picture of censorship activity on Facebook. But for regulation by disclosure to be effective, users need Facebook substitutes, or the willingness and ability to leave Facebook when it crosses the line. I'm skeptical that those conditions exist because, as you explain in your FirstPaper, there is not much consensus on what should and should not be censored, so it would likely take a long series of disclosed censorship screw-ups for Facebook to alienate enough users to give the disclosure regulation any teeth. On the other hand, perhaps disclosure requirements would at least make clear to users that censorship is occurring, that Facebook has substantial power as censor, and that there is other information out there that users need to get off Facebook to consume -- that might be a good outcome.
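To make the disclosure idea concrete, here is a rough sketch (in Python, purely illustrative) of the kind of quarterly record such a statute might require. None of these field names correspond to any real Facebook reporting format; they just track the items in your proposal:

# Hypothetical sketch of a statutorily required moderation-transparency report.
# All names are invented for discussion; this is not any real reporting schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RemovalRecord:
    post_id: str        # anonymized identifier for the removed post/comment
    rule_cited: str     # which internal removal guideline was applied
    justification: str  # the reviewer's stated reason for removal

@dataclass
class TransparencyReport:
    flags_received: int          # user-initiated content flags in the period
    items_removed: int           # posts/comments actually removed in response
    removal_criteria: List[str]  # the published criteria used in review
    sampled_removals: List[RemovalRecord] = field(default_factory=list)  # random sample (~1,000)

    def removal_rate(self) -> float:
        # Share of flagged items that were actually removed
        return self.items_removed / self.flags_received if self.flags_received else 0.0

Even something that simple would let outsiders track removal rates over time and compare the sampled justifications against the published criteria to look for viewpoint bias.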

 
Unfortunately, short of government-mandated standards (which of course are problematic in their own right from freedom-of-thought and 1A perspectives), I'm not sure there is a better regulatory solution to the Facebook-as-censor problem than disclosure. Transitioning Facebook to something closer to the reddit model, a semi-decentralized approach in which users are engaged as community moderators who handle most everyday censorship and moderation duties, is probably the best option, but it is unlikely to occur because Facebook is not divided into separate communities. YouTube is currently debating precisely this question and hoping to produce an algorithm that solves its censorship/content problems. Here is the article describing that process if you are interested: https://www.bloomberg.com/news/features/2018-04-26/youtube-may-be-a-horror-show-but-no-one-can-stop-watching

Overall, the censorship problem strikes me as a perfect example of the "zoning the net" framing that Larry Lessig identifies in "Reading the Constitution in Cyberspace." Just like zoning in real life, in which neighbors and developers are in conflict about what kind of use to allow on a given plot of land, on the net we have users in conflict about what kind of content to allow in cyberspace. Unfortunately, zoning IRL doesn't exactly offer a shining example for cyberspace to follow (for most of its existence, zoning has been driven by xenophobia, after all). That said, some scholars see the purpose of zoning as creating spaces for deliberation, in which process is more important than outcome. Perhaps that approach makes sense for the Facebook-as-censor problem. That is, maybe the best approach is a combination of disclosure requirements alongside government-mandated opportunities for citizen participation and input (via roles as moderators, access to the disclosures, voting on censorship standards, voting on variances from those standards, etc.). I'm curious whether that sounds reasonable.

By contrast, I think the Facebook-as-distributor problem has more straightforward solutions. I see two options, with the choice between them depending on how perniciously you think Facebook is acting, or will act, in its role as distributor of news/opinion. One solution would be to simply outlaw the use of behavior collection by news distributors. This approach would recognize a constitutional right to autonomy that prevents any public or private use of behavior-collection technology to direct news and unduly interfere in truthmaking. The challenge for this approach is defining which applications of behavior-collection tech are "undue interference" and which are business as usual in a world where we all compete to create the truth. It would also require deciding which uses of behavior-collection tech are sufficiently important to the public interest that the constitutional right should be set aside.

The alternative is an open source/disclosure model, which would permit most uses of behavior-collection technology so long as the actual operating code is available for anyone to inspect, both to check constantly for abusive practices and to understand how the tech is being used to affect truthmaking. This model would also require news distributors to disclose their use of that technology, such as a notice on targeted news articles stating "you were targeted to receive this content from [client]" or "click here to see why you were sent this content," which would lead to a basic explanation of the algorithm's criteria for sending content to you and a link to view the operating code.
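To make that notice idea concrete, here is a rough sketch of what a per-article targeting disclosure could contain. Again, this is purely hypothetical; the field names, example values, and URL are all invented:

# Hypothetical sketch of a per-article targeting disclosure under the
# open source/disclosure model; no real platform exposes this structure.
from dataclasses import dataclass
from typing import List

@dataclass
class TargetingDisclosure:
    client: str           # who paid for or requested the distribution
    criteria: List[str]   # plain-language summary of why this user was selected
    source_code_url: str  # link to the operating code of the targeting algorithm

    def notice(self) -> str:
        # Render the user-facing notice shown alongside the targeted article
        return (f"You were targeted to receive this content from {self.client}. "
                f"Why: {'; '.join(self.criteria)}. "
                f"Operating code: {self.source_code_url}")

# Example with made-up values:
print(TargetingDisclosure(
    client="Example Advocacy Group",
    criteria=["engaged with similar political pages", "located in the targeted region"],
    source_code_url="https://example.org/targeting-algorithm",
).notice())

The point is only that the disclosure would need to carry enough information, in the notice itself and in the linked code, for a user or a watchdog to reconstruct why they were shown a given piece of news.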

I'm not sure which approach I'm partial to, or whether either approach is realistic, but I'm fairly confident (as you also appear to be) that something must be done to address Facebook's unprecedented power as both news distributor and private censor.

-- AndrewBrickfield - 29 Apr 2018

 
 
