Law in the Internet Society

Your Netflix Subscription's on Big Tech

-- By JaredHopper - 13 Oct 2023


In thinking about the pitfalls and perils of the "internet society," it is tempting to erase the leaps and bounds that developing technology signifies in human evolution. The "outsourcing" of high-analytic capabilities to AI algorithms is terrifying to most of us. We don't trust cars to drive themselves because there is an inherently human type of caution, we think, that cannot be captured by a precarious mixture of cameras, motion sensors, and an algorithm. Even the development of delivery robots scares us: we might be ok with technology augmenting our reality, but it should not "walk" among us.

Marching Forwards

Where does this fear come from? There are multiple root causes, but it may be helpful to consider just one within this essay's confines: the idea that outsourcing is dangerous to human development. It may well be that a society dependent on maximum technological integration, as I argue ours is, has reached an evolutionary halt. For example, the capacity to solve simple arithmetic, while still useful for calculating a 20% tip at (some) restaurants, is essentially vestigial at this point. Yet I think the fear of totally abandoning arithmetic in early education stems more from laziness about revising hundreds of years of pedagogy than from a genuine worry that skipping arithmetic will stunt mental growth. The transformation of a function from essential to vestigial is, and has always been, a sign of evolution, not of sliding backwards. Looking back at the outsourcing of the twentieth century, we accept completed transfers of previously human tasks to automated systems not only as a good thing, but as essential to exponential progress. We fear changing, but we do not regret having been changed.


It is no secret that our data is no longer just ours. Corporations like Alphabet have broken into our homes and raided our medicine cabinets, our safes, and our lingerie drawers. These data-mining titans then sell what they find to other private corporations, for a variety of purposes, to fill their own coffers. Free services do not exist: we pay with our privacy. Perhaps surrender is the answer. Is there a way to surrender without losing hope of retaining, or taking back, control?

Give the voyeurs what they want; get a buck!

The view, in short, is that we are already being tracked and invaded, so we might as well get paid for it. Tech companies sell our data, and we don’t ask to be paid for it? Bonkers! A few companies, however, are starting to pay their customers for their location information. Tapestri, for example, is a young startup whose primary trade is your data. Its business is our business: the company’s front page announces, “it’s your data, you should get paid for it,” and the app is a gamified access point for extremely accurate, around-the-clock location data. Tapestri pays anywhere from $8 to $25 a month for it. What drives this noble business model? Some privacy experts note that, although it may benefit the consumer, its purpose is not as altruistic as it seems. These advocates of online anonymity explain that as traditional sources of data dry up, and as more and more jurisdictions pass laws regulating the background tracking the big companies are now known for (you can’t use the service until you accept the terms, and there is no negotiating with Apple here), the industry is having to get creative.

This new type of business, centered on what is called zero-party data, is nothing to celebrate yet, at least it seems. The end result is the same, but at least the violation is more out in the open. The approach to data mining on which companies like Tapestri focus is indeed “better” in that it is a consensual transaction, but that improvement is meaningless so long as the traditional “background” tracking is not completely eradicated. It doesn’t help that the people making the laws, for the most part, don’t know how to open their email.


The elephant in the room is what this surrender amounts to. It may be nice to get paid for your personal information, but is compensation really enough to make up for the invasion of privacy? Perhaps greater clarity about where the information goes and what it is used for is necessary before we can feel comfortable surrendering fully to a world of ever less delineated personhood. It is not clear whether companies would be fully transparent about whom the data is sold to, but regulation might not be totally useless.

A new opportunity has presented itself: getting paid to sign over your data. The choice is yours. Is getting your Netflix subscription paid for worth welcoming Big Brother with open arms?




r5 - 19 Dec 2023 - 23:54:25 - JaredHopper