Computers, Privacy & the Constitution

The Political Objective of Disinformation

-- By MadihaZahrahChoksi - 26 Apr 2018

Having read and thought at length about the debacle that was the 2016 US presidential campaign and election, I want to use this opportunity to sum up my thoughts in 1,000 words or less, and then move on from it.

A Great Deception

Jürgen Habermas defines the “public sphere” as a place where private citizens come together to form the “public.” In the 21st century, however, following a few hundred years of continued economic, industrial, and political development, an entirely new public sphere exists on the World Wide Web. The “virtual sphere,” or the “networked public sphere” as defined by Zeynep Tufekci, constitutes an open, shared social space in the digital realm. Like the Habermasian public sphere, the networked public sphere encourages the free exchange of ideas; embedded within its seamless and intuitive digital interfaces, however, are deceptive mechanisms that control that exchange. Habermasian theory fails to account for the ways in which differentiated public spheres emerge or develop over time; this is a valid criticism in a society where human-to-information interaction is increasingly curated. The criticism applies directly to the case study of the 2016 U.S. presidential campaign (USPC), during which dialogue on the Internet was artificially shifted and manipulated. Social media platforms, search engines, and web pages, acting as gatekeepers for expression on the net, heavily mediate user experience in this newly networked public space. They prescribe an information diet that confirms and conforms to the patterns of belief and taste gleaned from users’ behavior on the platform.

The deceptive nature of information disseminated during the USPC is epitomized by the relationship between the dubious nature of facts and algorithmic bias. The first, more tangible characteristic of illusory information is the unreliability of facts themselves, and their immediate ability to steer partisan politics through the networked public sphere. In the context of the USPC, for example, the false perception among the majority of Trump supporters was that he is a staunch anti-urbanist. Trump’s focus on the working class in rural areas enabled him to paint his political identity as that of a candidate fundamentally concerned with the issues faced by communities in geographically remote areas. Although Trump has never lived outside a city in his adult life, his campaign turned to social media platforms whose seamless integration into everyday life spread these alternative facts as truths, ultimately reaching those targeted, marginalized rural supporters.

In Pax Technica, Philip Howard describes how political groups in “emerging democracies” rely on the networked public sphere “…to raise funds, rally supporters, and outmaneuver opponents in policy debates.” In this way, political projects such as Mr. Trump’s, which might otherwise struggle to attract support, can find momentum through the manipulation of web-based spaces where there are few or no financial consequences. Howard claims that “[i]n more and more elections, political victory goes to the most tech-savvy campaigner. Ideological packaging seems secondary...an impressive party machine is one that uses social media to create a bounded news ecology for supporters. It mines data on shared affinity networks, and otherwise mobilizes voters on election day.” In this sense, Mr. Trump’s exasperating opinions, packed into 140 characters or fewer on Twitter, not only attracted media attention across all platforms but became a critical tool through which he further connected with his supporters.

Second, the illusory characteristics of digital interactions further constrained the networked public sphere as a site of free exchange by introducing artificial forms of manipulation such as algorithms and bots. Social media platforms in the mold of Twitter and Facebook, which depend on algorithmic control for “optimal user experience,” alongside Russia’s algorithmic manipulation, injected specious information about political ideologies into the networked public sphere. While platform companies argue that their algorithms are not trained to influence political dialogue one way or the other, the experiential reality of these features demonstrates otherwise.

Immeasurable Outcomes

Algorithms and Society

The trending algorithm does much more than spread information based on popularity; it becomes a symbolic representation with cultural significance. Regardless of the topic, the fact that a certain story or idea is “trending” or “viral” is influential in and of itself. A group of trending topics taken together can represent a pseudo-collective consciousness: a moment in time in which a certain set of ideas defined the outlook of the public. While content often reaches this status spontaneously, the ability to dependably promote content has spawned an industry. Using technologies such as “bots,” or zombie accounts that automatically like and retweet content, these trends can be controlled by experienced actors, usually marketing firms with relatively innocuous motives. However, this space is increasingly occupied by actors affiliated with national governments, whose objectives include manipulating public opinion.
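The dynamic described above, in which a handful of zombie accounts can push a topic into “trending,” can be sketched in a few lines of code. This is a hypothetical illustration only: the scoring rules, topics, and account names are invented here, and real platforms’ trending algorithms are proprietary and far more complex.

```python
from collections import Counter

def trending_by_volume(events, top_n=3):
    # Rank topics by raw engagement volume -- a toy stand-in for a
    # trending algorithm (real ones also weight recency, velocity,
    # and account reputation; this scoring rule is invented here).
    counts = Counter(topic for _, topic in events)
    return [topic for topic, _ in counts.most_common(top_n)]

def trending_by_accounts(events, top_n=3):
    # Same ranking, but each account counts at most once per topic --
    # a simple defense that blunts a small botnet.
    unique = {(account, topic) for account, topic in events}
    counts = Counter(topic for _, topic in unique)
    return [topic for topic, _ in counts.most_common(top_n)]

# Organic activity: many distinct users spread across two topics.
organic = [("user%d" % i, "environment") for i in range(40)]
organic += [("user%d" % i, "sports") for i in range(35)]

# A "botnet": five zombie accounts retweeting one topic 300 times total.
botnet = [("bot%d" % (i % 5), "fringe-claim") for i in range(300)]

events = organic + botnet
print(trending_by_volume(events))    # bot topic tops the raw count
print(trending_by_accounts(events))  # counting unique accounts demotes it
```

Under the volume rule, five automated accounts outrank forty real users; counting unique accounts instead reverses the ranking, which is why sophisticated influence operations use large networks of accounts that mimic organic behavior rather than a few high-volume bots.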

Algorithms and the Controlled Urban Experience

Algorithms and algorithmic infrastructure have become unassailable truth mechanisms. Coupled with the dissemination of untruthful and misleading information online, they wield such extraordinary power that “[t]he internet has become not just a weapon in the world’s great political battles. It has become the weapon for ideological influence, and careful use can mean the difference between winning and losing.” Whereas the boundaries of physical spaces are concrete, those of online spaces are fluid and are not subject to the kinds of outspoken, critical behavior, such as public shaming or protest, that one finds in physical public spaces.

Social media usage is frequently described as taking place outside of reality, but people are increasingly confronted with the truth that the nature of users’ experience on these platforms affects not only sociopolitical life but national security as well. Networked “urban” spaces are designed to connect like-minded people, and in the case of neo-fascists and geographically isolated Americans, algorithms reached across societal norms and physical distance to encourage interactions between users who shared similar beliefs. Rather than fulfilling their promise to foster diverse dialogue, these platforms artificially concentrate and segment users without regard for the basis of their union, be it environmentalism or eugenics. The effective control exerted over communities that blindly trust virtual sources such as Facebook and Twitter as disseminators of truth is immeasurable. Ceding control to these digital spaces is a decision made for the public without its consent, yet its consequences are felt directly by that same public, not by the digital mechanisms charged with facilitating the flow of information on their networks.

Thinking Beyond the Blame Game

Whereas in the 1990s the World Wide Web was championed for its democracy-producing effects, less than three decades later, in an era of fake news, stolen elections, and the impending death of privacy, the human-interest sales pitch of the web has been betrayed by the privatization of the net. In the context of the USPC, user agency was sidelined and wholly corrupted by the very firms that professed to enhance it through tools encouraging communication and dialogue between citizens.

The aftermath of the 2016 campaign and election urges us to reflect on the controlling features that have become an unalterable reality of the web’s infrastructure, and perhaps also to reconsider our role as agentless bystanders, for there is, and always was, more that we could do.


