Computers, Privacy & the Constitution

MadihaZahrahChoksiFirstPaper 3 - 17 May 2018 - Main.JoeBruner

The aftermath of the 2016 presidential campaign and election urges us to reflect on the controlling features that have become an unalterable reality of the web's infrastructure, and perhaps also to reconsider our role as passive bystanders, since there is, and always was, more that we could do.
Response - Joe Bruner

Madiha, this was very thoughtful and covered a lot of ground in a concise way. Using social media's role in the 2016 election as a starting point sends me in some interesting directions.

The first interesting direction to me is thinking more specifically about manipulation and algorithms. Bots are, for now, a less interesting form of manipulation - when we get more realistic-seeming bots, that will be more dangerous, but for now they matter primarily because they interact with algorithms to make things trend and to make pages appear popular and legitimate. In the same way you spend more money to buy your ad a better place in the local newspaper, now you buy bots to make your post appear more popular and significant. What is ironic to me, and what this reveals, is that bots may be a better way to pay your way to notability than the "promotion" function Facebook and Twitter have tried to monetize. They have several advantages: if you already have a spare botnet and dummy accounts, they're free; the post does not carry that delegitimizing little "promoted" label; and they let you build a greater screen of plausible deniability between who is posting and who is spending to promote the post.

On the one hand, we might expect Twitter and Facebook to be angry about this behavior and build algorithms to ban bots. On the other hand, assuming Eben's ideas about behavior collection are correct, annoying fake comments that provoke all the real people in the machine into behaving back at things may be worth more to the platforms than the lost ad revenue. At least it may be enough of a counter-benefit that they do not find it cost-effective to try to ban all bots.

Everyone is talking a lot about algorithms but I think the discussion rarely gets specific enough about how the algorithms actually affect our behavior and thinking. In the winter I met and was really impressed by Miranda Fricker. I'll give you the Wikipedia blurbs on her biggest idea, the two forms of epistemic injustice:

"Testimonial injustice consists in prejudices that cause one to "give a deflated level of credibility to a speaker's word":[5] Fricker gives the example of a woman who due to her gender is not believed in a business meeting. She may make a good case, but prejudice causes the listeners to believe her arguments to be less competent or sincere and thus less believable. In this kind of case, Fricker argues that as well as there being an injustice caused by possible outcomes (such as the speaker missing a promotion at work), there is a testimonial injustice: "a kind of injustice in which someone is wronged specifically in her capacity as a knower."

Here, this seems relevant to me because likes, shares, and views on the platform become a de facto currency of credibility. I recently saw a message calling for bringing together left-wing "social media influencers." The importance and credibility of a speaker now rest in large part on their following on the platform, which is hypermediated (you love that word) and demarcated by the terms of interaction the platform companies determine for us. If all our political organizing and discussion is moving into the platforms, the ability to be believed or to have a say is diminished, almost certainly unjustly, if you do not use the platform and command a sufficiently large following. And obviously, if your messages are too long, nuanced, or sophisticated for the terms of interaction a platform provides, you will have no following. Even more interestingly, not having enough pictures or information about your personal life linked to the platform also diminishes your credibility in the new form of social interaction. Who is this strange person with a flag as their profile picture? Facebook and Twitter have successfully inverted the dynamics of the old forms of internet political discussion on 4chan, Something Awful, and Slashdot, where too much information about yourself was seen as foolish vanity and a sign that someone was a self-promoter not to be trusted, rather than someone authentically expressing their thoughts and interacting for the interaction's own sake. But this is actually the smaller of the two ideas for our purposes, in my opinion.

"Hermeneutical injustice, then, describes the kind of injustice experienced by groups who lack the shared social resources to make sense of their experience. One consequence of such injustice is that such individuals might be less inclined to believe their own testimony. For example, Fricker describes a woman attending a meeting in the late 1960s at which post-partum depression was discussed; in this case, the shared social resource - a linguistic label and sharing of experiences - enabled an understanding of a condition she had experienced and was previously blamed for."

Your own Zeynep Tufekci is undoubtedly correct when she points out that Youtube is now a huge radicalization engine, but I think the sketch of how this happens is usually incomplete. Sophisticated ideology for interpreting one's experiences is hard to come by; it requires dedicated focus, the ability to read and study, and so on, especially when you do not have a teacher teaching you firsthand. When I was a teenager I was reading Hayek and Engels and so on, but if my ability to pay attention had been compromised, I might have instead defaulted to sources that were easier to pay attention to. It's a lot easier for Youtube videos to hold a kid's attention, and they give clearer answers. Antisemitism is so easy: you just tell people that, because the Jews are behind everything that sucks and only care about themselves, the solution is either "another shoah" or "itbah al-yahud," depending on what language the video is in. The communism and socialism that goes around is a horrible cut-rate version, because instead of answering for the horrors of Stalin, it's much easier to say that was all bullshit. We're making it really hard for kids to use good tools to interpret their world by diminishing their attention span and providing flashy hypermediated alternatives, and we're handing them easy-to-use, convenient, shitty tools instead. But it's not only kids - the whole Trump/Hillary interplay was so shallow it left most Americans feeling totally dissatisfied, because everyone was just hitting back stupid little talking points on their social media pages, and the majority of Americans who did not vote called it out. If not voting were a candidate, it would have won by a landslide.

I suppose the next big lingering question is how to structure a networked public sphere that facilitates real, meaningful interaction and study of the questions, rather than facilitating vaguely political behavior that is reactionary in every sense of the word. Social media platforms don't engage the rational mind and aren't conducive to healthy, reasoned debate and discussion. This isn't even really controversial now; the hard question to me is what structural arrangement of humans interacting over a computing network does a good job, rather than a bad one, of approaching the Habermasian ideal. If you like Jürgen Habermas, you understand his shame at the fact that more Germans were attending the Reichsparteitag in the Luitpoldarena in Nuremberg than were ever discussing bourgeois ideas in his precious Öffentlichkeit salons. Everyone having a networked computer should give everyone a chance to participate in that kind of idealized Öffentlichkeit culture and allow us to take back power from the old broadcasters whom Eben called the Eyeball Merchants. And I actually think the tech for that is mostly fine already; we just have to facilitate a transition toward people using it.

140 characters is not enough, but this Wiki is pretty nice when people use it, don't you think?


Revision r3 - 17 May 2018 - 22:50:43 - JoeBruner
Revision r2 - 28 Apr 2018 - 05:21:02 - MadihaZahrahChoksi