
Data Collection and Loss of Privacy: Widespread Apathy and Dystopian Advances

-- By AlessiaSaracinoFendi - 27 Oct 2021

When considering the internet society we live in today, I cannot help but reflect first on my upbringing and my earliest interactions with computers and the internet as a young child. The ability of children with internet access to explore a world of endless information has created a paradigm shift whose dangers parents were not always cognizant of. How are we to protect our privacy in a world full of cameras, surveillance, and constant data collection? Although in class we are given the tools to better protect ourselves and our right to privacy on the internet, I am constantly shocked by fellow classmates who see no alarming issue with the trade and sale of our data, data that may be time-stamped or geotagged. Why are we not more suspicious of this data collection? Why are we not fighting in unison? Why will Columbia students not protest the use of CourseWorks? Is the answer simply apathy? I would like to believe that it is a lack of awareness and research; however, with the internet constantly at everyone’s fingertips, and in an academic institution in which research is a core pillar, this seems impossible.

Many of the data-collecting products, such as Facebook, are “purposefully addictive, and thereby eroding individuals’ ability to make free choices” (1). In many instances these profit-driven, data-collecting companies “exploit behavioral biases and imperfect willpower to addict users” (1). In turn, the data collected can be used to manipulate us, to invade our privacy, and to inhibit our freedom of speech. As Professor Moglen pointedly made us contemplate in class: if what we say in class and what we read for class is monitored, are we truly free, and are we engaging in intellectual exploration to the same degree we would had we not been observed?

Take Back Our Privacy

As I read Taking Back Our Privacy, written by Anna Wiener, I felt relieved and interested to learn more. Signal is a nonprofit run solely on donations and an “outlier in the tech industry” (2). Signal’s code base is “open-source—publicly available for anyone to download and comment on” (2). The founding principle of Signal was that mass surveillance should not be achievable, especially by governments and corporations. Signal itself has gone to extreme lengths to uphold this: it “cannot read the messages that its users send and does not collect user metadata. It keeps no call logs or data backups” (2).

Marlinspike, the co-founder of Signal and the central figure of the article, had “long harbored concerns that the products and business models of private technology corporations—telecom firms, e-mail providers, search engines, social networks—would be built atop rapacious data-collection networks” (2). As the pandemic has made ever more obvious, our world and our economy are becoming ever more internet-based and ever more susceptible to blatant data collection and privacy invasions. Columbia students took classes from home via Zoom for the first time ever. More than ever, white-collar workers, in jobs where it was ingrained in the political economy that employees must come into the office, are working from home or quitting in a mass exodus when employers will not allow them that flexibility. Signal is giving the public the tools to protect itself and its data, but will society take advantage of this technology and push other companies to put our interests above their profit margins?

Digital System of Social Control

When thinking about what governments could do with this continuous influx of data, China demonstrates the possibilities this data collection creates and how it can be used to exert social control over a population. As described in Ross Andersen’s article, The Panopticon Is Already Here, China’s President, Xi Jinping, hopes to use “artificial intelligence to build a digital system of social control, patrolled by precog algorithms that identify dissenters in real time” (3).

Embodying the antithesis of the principles Signal was founded on, CETC, a state-owned company that built much of Xinjiang’s surveillance system, is rolling out pilot surveillance projects in multiple Chinese cities in the long-term hope of a nationwide scheme (3). CETC is one of many parts of “China’s coalescing mega-network of human-monitoring technology” (3). Although China has not yet integrated the data it has collected or created a singular repository that we know of, there are no political or legal protections that would stop the “security state” from doing so (3). Since China’s population is extremely online, with millions of phones that keep track of every action of the mobile user, China is an ideal setting for this technological and data-based surveillance.

Prior to reading Andersen’s piece, had this situation been described to me without context, I would most likely have believed the article was describing a fictional dystopian episode of Black Mirror, a sci-fi anthology series that taps into the collective unease about the modern world, particularly the intended and unintended consequences of new technologies and their effect on society and individuals. But this is reality: China is actively pursuing a “social credit scheme” that would in essence categorize and assign a reputation score to every citizen. “China’s digital infrastructure” would without a doubt “shift the balance of power between the individual and the state” and set an alarming precedent globally (3).

Final Thought

Will it take governments building digital systems of social control out of the data so many of us have willingly given away for years before the average American wakes up and cares about how much we share and what data we are giving away?

Articles cited:

(1) https://hbr.org/2018/03/here-are-all-the-reasons-its-a-bad-idea-to-let-a-few-tech-companies-monopolize-our-data

(2) https://www.newyorker.com/magazine/2020/10/26/taking-back-our-privacy

(3) https://www.theatlantic.com/magazine/archive/2020/09/china-ai-surveillance/614197/

