Computers, Privacy & the Constitution

Bargaining Anonymity

BargainingAnonymity 2 - 21 Mar 2018 - Main.JoeBruner
Thank you, Madiha. Your writing is exceptionally clear, concise, and poignant, and I agree with what I take to be your two central arguments: the issue of user ignorance is being overlooked, and learning enough of the fundamentals is not insurmountable. On the latter argument, a year ago I wrote an essay(1) arguing that people once took pride in learning and mastering the vehicles and appliances they used, and that doing so was even an important way for people to start careers. I think the problem is much more that people find technology weird, scary, and not something they want to understand than that the learning itself is insurmountable.(2)

I agree that Tufekci is missing a discussion of user ignorance. My fear is that user ignorance exists at two points in the cognitive landscape, each of which reinforces the other.

On one end, users don't understand the technology, so in-depth discussion of it strikes them as irrelevant. On the other end, and I think far more profoundly, users don't understand the importance of anonymity. I am sure you and I and everyone in the course know plenty of people with the technical knowledge required to understand everything who don't really give a shit about their anonymity. I think this is also a great failing of most discussions of privacy in places like the NYT, the Guardian, the Law School, and so on. People assume privacy may have some low-level intrinsic value but do not understand its relationship to either the political economy of force or the power of small groups to uproot democracy. We covered the first issue well enough in class, and I think a fair conclusion is that blasphemy law is suddenly enforceable in Pakistan again without the use of vigilantes.(3) But that is over there, and even with Trump, the thought that it could happen here is, I think, still far too difficult for most Americans to really grapple with.

The latter issue is even more difficult for people to understand, and even more dangerous for democracy. During the BBC's undercover investigation of Cambridge Analytica(4), part of the sales pitch sounded closer to Eben's take on things, and summarized it more effectively, than anything I have heard before: "People are motivated primarily based on two things, their hopes and their fears. Most of these are unconscious. You have to tease them out. What we are doing is dropping the bucket further down the well than anyone ever has before." Last semester the political theorist Joseph Raz told me that democracy would inevitably morph into something unrecognizable, and that someone would write a very good Ph.D. dissertation explaining how we now have "post-democracy." Noam Chomsky, Theodor Adorno, and Antonio Gramsci all made arguments about the manufacturing of consent under the propaganda systems they observed, but bypassing the rational and deliberative faculties to a far greater degree risks upsetting the whole system. Jean-Jacques Rousseau famously argued that, in a democracy, there should not be public debate on issues, because the member of the community who is too persuasive in advocacy will unduly influence the vote. That argument was regarded as absurd and silly from the time it was published through my study of political theory in undergrad. But even if it is untenable with regard to a single human speaker, it becomes relevant again once that human's capacities to influence and manipulate are cybernetically augmented and unrestrained.

Not understanding the dangers of a lack of anonymity, or dissociating from them and pretending not to understand, makes it much easier to fail to understand or implement technological countermeasures. And not understanding the technology makes it easier to dismiss concerns about the loss of anonymity as conspiracy-theoretic ramblings. Additionally, ignorance about psychology allows people to convince themselves that only stupid people(5) could be influenced in such a way, and that enough general education and social status solves the problem. An understanding of psychology will also be critical for addressing the other issue you mention, negligence: figuring out how to present the problem to people in a way that leads them to exercise a duty of reasonable care, and to encourage those around them to do the same, rather than dissociating from the problem and becoming fatalists or, worse, accelerationists.(6)

I agree with you that user ignorance and user negligence are critical. They may be the only really hard problems involved.

-- JoeBruner - 21 Mar 2018

Notes

1 : It was actually my first essay for Moglen and is still on the wiki. You must be logged in to get to it, though. http://moglen.law.columbia.edu/twiki/bin/view/LawContempSoc/JoeBrunerFirstEssay

2 : Teenagers are more honest than adults. I had a high school student tell me that switching web browsers, and looking up programs independently to replace the substandard programs she had, was too scary. Another told me that Macs are not targeted at people like me, because normal people want a smart person to set everything up for them so it is as easy as possible.

3 : https://www.aljazeera.com/news/2017/09/pakistan-sentences-christian-man-death-blasphemy-170916091856674.html

4 : https://www.youtube.com/watch?v=mpbeOCKZFfQ, middle portion of the video

5 : Rednecks, bhakts, teenagers, reactionaries, whoever the preferred target is

6 : Who isn't tempted by the prospect of using all this to make left-liberal change? There are just as many potentials for becoming Christopher Marlowe's Dr. Faustus on 'our' side. This is part of why Ted Nelson failed. https://www.gutenberg.org/files/779/779-h/779-h.htm


 
 

BargainingAnonymity 1 - 20 Mar 2018 - Main.MadihaZahrahChoksi

That’s basically what’s at stake when 270,000 Facebook users decide to allow the thisisyourdigitallife app to scrape not only their data, but that of their friends, in return for $1 to $2 USD. In today’s New York Times opinion piece, a short and precise Facebook-slam, Zeynep Tufekci discusses Facebook’s denial that it participated in a data breach affecting 50 million American users. She outlines the facts: Facebook collected the data of those who subscribed to the app, as well as that of their friends, then sent the data back to the app’s administrators, who handed it over to Cambridge Analytica. She also alludes to a reality we know all too well: that Facebook users are, in the most basic terms, simple income units. The units behave on Facebook, Facebook monitors/spies on their behavior, and then, if that weren’t evil enough, Facebook capitalizes on their attention: their ears, and eyeballs, and, by extension, their minds [by selling slots to advertisers to be seen by increasingly precise slices/demographics].

I think what Tufekci’s piece is missing is a discussion of user ignorance, and of users’ negligence in familiarizing themselves with the technology they have so hospitably welcomed and created a space for in their daily lives. While she does touch on how difficult it is to give “informed” consent, and then in turn how hard it is to retract that consent, what could be discussed at greater length is the impenetrable knowledge gap surrounding the mechanisms controlling platforms like Facebook (algorithms, bots, the commodification of user data), and how we could work to fill it.

We had some interesting discussions in class before the break that remind me of this missed angle in today’s article. The debate was about how tech literacy isn’t for everyone; that if you don’t come from a STEM background, it’s basically impossible to comprehend the mechanisms of a desktop computer’s processing unit. Familiarizing oneself with tech was compared to familiarizing oneself with a fridge, or a car; the argument being that those of us who are not mechanics can’t be asked to fix cars, or to explain how a catalytic converter works. The point, however, isn’t to be “the whiz”. It’s not about perfecting one’s knowledge of a thing so far as to never need help from a specialist. The point, in my opinion, is to understand the basics, the fundamentals, the essence of what is happening on the inside, so you can’t be fooled by the mechanic who wants to overcharge you on a wheel alignment. It requires time, and effort too, but most importantly an appetite for dealing with information one doesn’t necessarily care much about. But again, the point isn’t to care; it’s to have a relationship with a thing you are giving a part of your precious life to [and allowing to access precious parts of your life]. With a car, it’s your physical body, whether you are the driver or a passenger. With tech, it’s your time, your privacy, your emotions, and those of your friends too – is that really worth only $2 or less?

Negligible user interest in the fundamentals of tech functionality is deeply rooted in the political economy of both the web and ICT infrastructure. The closed web, and the platforms that dominate it, rule absolutely. They do not invite open dialogue, knowledge sharing, or even curiosity: they nurture reticence.

The same goes for those devices we cherish as our most prized possessions. Their construction is seamless: perfect shape, size, colours, so alluring that we forget each is a device replacing some of our most basic human abilities, such as social interaction and geographical awareness. One can argue that thisisyourdigitallife hid its motives to scrape and share user data, and that this would have been impossible to predict. While that argument holds value, the truth helps us understand the conditions in which such an app operates: Facebook is a surveillance platform, it has always been a surveillance platform, and its data collection and dissemination routines refine themselves over time. Cambridge Analytica is certainly not the first and only company to harvest the user data of tens of millions of people; it is one of many, and the practice is not stopping just because we have revealed a bit more about it.

-- MadihaZahrahChoksi - 20 Mar 2018

 