BenMingovFirstEssay 2 - 17 Nov 2024 - Main.EbenMoglen
Perhaps a real title?
-- By BenMingov - 25 Oct 2024

The Problem
Contemporary life has eroded many of the barriers between the technological sphere of our lives and the rest of our lives.
This is not the sort of sentence you want to start with.
Technology has become a ubiquitous and indispensable part of the human experience – at least in the developed world. So many of the products we have become accustomed to, such as “smart” watches and “smart” phones, give us the power of instantaneous communication throughout the world and access to more information than anybody in generations prior could fathom, among other powerful features. But beneath the veneer of their sensory- and information-enhancing power lies another user, one that treats us as the tool. As Shoshana Zuboff argues, modern technology exploits our relationship with it, particularly our dependence on it. Every possible data point we provide our devices, both knowingly and unknowingly, is harnessed to produce profit for their owners and control over people’s autonomy and privacy.

It's Been Said Before
However, all of this has been said in countless forms by countless people before. What interests me about technology exploiting its users is that the conversations surrounding this dynamic are still in their infancy, consisting mostly of surface-level observations and truths that are plainly apparent. Critiques for their own sake are hardly compelling.
They are not compelling because they are in their infancy? You want to say something less jejune? Great. But you are 20% through the space available and you haven't even hinted at the idea yet.
Ignorance Is No Excuse
In many contemporary writings, we are told that capitalism based on the retrieval and sale of our data is one that we have not consented to and that most of us don’t know is occurring. But decades after this economy has been built, with the ever-increasing prevalence of headline-making news about big-tech companies and their “shady” practices, surely the onus must fall on the users either to reject it or to consent.
They consent to terms of service all the time. But it wouldn't be saying something new to challenge the model of consent, as I have said it quite a few times myself. Perhaps it would be worth discussing?
It doesn’t seem to be a sound logical or moral argument to continue using a product whose risks you understand and then claim that you were not apprised. As with all other aspects of our lives, we are presumed to know the rules that dictate our conduct. Ignorance is not a valid excuse. We cannot claim ignorance for breaking the law or not paying our bills on time. However unpleasant the parallel may be in this situation, it raises the question: why should ignorance excuse us from the implied contract of our data for “free” use of technologies?

Privacy Capitalism (Or not even?)
Without a federal law expressly prohibiting the mining and selling of our data, solutions to this problem lie in the hands of private action. In any effective campaign against this current form of “surveillance capitalism”, education would likely be the cornerstone of the strategy. A much greater swathe of our society needs to be made aware of the pressing risks associated with our data being for sale.
Weren't you just arguing a moment ago that if they haven't figured all this out by now the "onus" should be on them?
They must also be convinced that this information needs to be acted on, and not simply noted and forgotten.
When people must be convinced they often aren't. The recent presidential election seems to be a case in point.
Perhaps the solution to our dilemma isn’t for us to break the model of surveillance capitalism, but rather to create a viable alternative: free-market solutions, whether for-profit or nonprofit, that offer the same types of services we are accustomed to, such as streaming music, navigation, and instant messaging, but which do not sell our data. If the market accepts these products, then those using them will have solved the surveillance issue to some extent, albeit not entirely, at least with respect to themselves. It would appear that the most effective antidote to surveillance capitalism is privacy capitalism.

Real World Examples
And in fact, there are currently many products whose stated missions are to provide services without compromising user privacy. An obvious question, of course, is how we can ensure that this privacy is truly being maintained and that no duplicitous activity is occurring. But at the very least, by placing the spotlight on privacy-centric goods and services in an effort to combat constant surveillance, people are given a choice that allows them to be plugged into modern society, knowledge, and culture while maintaining some degree of anonymity. Two examples of nonprofit-produced services that position themselves in the market as privacy-centric alternatives are Signal, which is meant to compete with companies such as Telegram and WhatsApp, and the Tor Browser, which, unlike most other browsers such as Chrome, does not track its users’ activities and sell that information. On the for-profit end of the spectrum, we find companies such as the start-up phone producer Purism, which uses an open-source operating system.
Your "real-world" examples, taking up one sentence each, are all examples of free software being used to improve privacy. Perhaps it would be useful to analyze the root rather than giving a sentence to three among thousands of leaves? The forest might then become visible.
Conclusion
While overarching government regulation would be a more absolute and pervasive response, a market-based solution still provides an option to nearly all those concerned about their digital privacy. This is, of course, subject to caveats regarding access and those under the age of majority, both of which could be addressed by legal regulation and more nonprofit work. There is widespread interest in keeping our data secure: the subreddit r/privacy, for instance, has approximately 1.5 million members.
American society places great value on freedom and agency. And if we believe in the ideals of freedom and agency, shouldn’t we also accept that these ideals can lead people to results they might not want? People will always have the option to use things that might not be good for them. It’s hard to be a champion of freedom while also claiming that we must push businesses or people to conduct their affairs in a particular way. Rather, we should encourage both private-sector and nonprofit organizations to produce privacy-focused goods and services and simply say to people: user beware.
This is conclusory, but is it a conclusion?
I think the primary route to improvement is better focus. The first 200 words contribute little; the argument concerning "ignorance as excuse" seems internally contradictory, and the "real-world" references don't seem to be quite as informative about the real world as they might be. Rewriting rather than patching seems indicated, but the first draft did its job well, by getting your ideas out on clear ground.