Law in the Internet Society

JaredHopperFirstEssay 5 - 19 Dec 2023 - Main.JaredHopper
Your Netflix Subscription's on Big Tech

 -- By JaredHopper - 13 Oct 2023

Introduction

In thinking about the pitfalls and perils of the "internet society," it is tempting to dismiss the leaps and bounds that developing technology represents in human evolution. The "outsourcing" of high-analytic capabilities to AI algorithms is terrifying to most of us. We don't trust cars to drive themselves because there is an inherently human kind of caution, we think, that cannot be captured by a precarious mixture of cameras, motion sensors, and an algorithm. Even the development of delivery robots scares us: we might be okay with technology augmenting our reality, but it should not "walk" among us.
 

Marching Forwards

Where does this fear come from? Obviously, there are multiple root causes, but it may be helpful to consider just one within this essay's confines: outsourcing as dangerous to human development. It may well be that a society dependent on maximum technological integration, as I argue ours is, has come to an evolutionary halt. For example, the capacity to solve simple arithmetic, while still useful in calculating a 20% tip at (some) restaurants, is essentially vestigial at this point. However, I think the fear of totally abandoning arithmetic in our early education comes more from laziness in revising hundreds of years of pedagogy than from a fear that not learning arithmetic will stunt mental growth. The transformation of a function from essential to vestigial is, and has always been, a sign of evolution, not of sliding backwards. Looking at twentieth-century outsourcing ex post, we accept completed transfers of previously human tasks to automated systems as not only a good thing, but one essential to exponential progress. We fear changing, but we do not regret being changed.
 

Surrender

It is no secret that all our data is no longer just ours. Corporations like Alphabet have broken into our homes and raided our medicine cabinets, our safes, and our lingerie drawers. That data is sold to other private corporations for a variety of purposes, filling the coffers of data-mining titans. Free services do not exist: we pay with our privacy. Perhaps surrender is the answer. Is there a way to surrender without losing hope of retaining, or taking back, control?
 

Give the voyeurs what they want; get a buck!

The view, in short, is that we are already being tracked and invaded, so we might as well get paid for it. Tech companies sell our data, and we don’t ask to be paid for it? Bonkers! However, a few companies are starting to pay their customers for their location information. Tapestri, a young startup, is one example: your data is its primary trade. Its business is our business, the company’s front page announcing, “it’s your data, you should get paid for it,” and its app is a gamified access point for extremely accurate location data on you, 24/7. Tapestri pays anywhere from $8 to $25 a month for the data. What drives this noble business model? Some privacy experts report that, although it may benefit the consumer, its purpose is not as altruistic as it seems. These advocates of online anonymity explain that traditional sources of data are drying up: as more and more jurisdictions pass laws regulating the background tracking the big companies are now known for (you can’t use the service until you accept the terms, and there is no negotiating with Apple here), the industry is having to get creative.
This new type of business, centered on what is called zero-party data, is nothing to celebrate yet, it seems. The end result is the same, but at least the violation is more out in the open. The approach to data mining on which companies like Tapestri focus is indeed “better” in that it is a consensual transaction, but this matters little so long as the traditional “background” tracking is not eradicated. It doesn’t help that the people making laws, for the most part, don’t know how to open their email.

Conclusion

 
The elephant in the room is the result of this surrender. Although it may be nice to get paid for your personal information, is compensation really enough to make up for the invasion of privacy? Perhaps greater clarity about where the information goes and how it is used is necessary before we can feel comfortable fully surrendering to a world in which the boundaries of personhood steadily erode. It is not clear whether companies would be fully transparent about to whom the data is sold, but regulation might not be totally useless.
 
A new opportunity has presented itself: getting paid to sign over your data. The choice is yours. Is getting your Netflix subscription paid for worth welcoming Big Brother with open arms?
 



