Law in the Internet Society

LouisAmoryFirstEssay 7 - 18 Apr 2022 - Main.EbenMoglen
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"

Your profile defines your future.

Line: 49 to 49
The “profiled human” does not freely make these choices. The “profiled human” is only a reflection of the predictable. The profiled human will be nudged to behave in the same way as he did in the past and in the same way as people with similar profiles have behaved before.

If you are a white male studying law at Columbia, you are also likely to vote for Democrats, travel to Europe, be interested in wine, play tennis, and listen to rock music. If you were born in Harlem and you are an unemployed black male, you will also be likely to vote for Democrats, but you will rather be likely not to travel, to drink beer or soda rather than wine, to play soccer or video games rather than tennis, and to listen to hip-hop instead of rock music. And this is what will be suggested to them as well as to their similarly profiled friends. The data locks people into their profiles. It makes people become what they were likely to become and, in this sense, prevents individuals from freely realizing themselves and reproduces existing social patterns. The possible is reduced to the probable.

Added:
>
>
But this presentation isn't accurate. There are more kinds of music than two, and "likely" is not an available judgment. It's true that describing a model based on four orders of magnitude more points requires two sentences, but they could be spared. "Locking-in" would better be described, perhaps, in terms of the goal of the Chinese Social Credit System. That would make the point clearly and allow your preceding analysis to be terser and more direct.


LouisAmoryFirstEssay 6 - 01 Feb 2022 - Main.LouisAmory
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"

Your profile defines your future.

Line: 8 to 8
 The "smart" objects were supposed to make our lives easier, but they were rather designed to collect our behavior patterns. Free digital services are like free dinners, if you’re not on the table, you’re on the menu. Our travels, purchases, internet searches, our heart pace are examples of the data collected by our new tech-toys/behavioral trackers.
Changed:
<
<
At the data collection stage, our behavioral data is a huge, incoherent, decontextualized mass of data. After collection, data must be processed. Algorithms will compute all the collected raw data and recognize behavioral patterns. Behavioral patterns are built on the basis of how you and other people having similar profiles behaved. Your behavioral patterns, all put together constituting your profile.
>
>
At the data collection stage, our behavioral data is a huge, incoherent, decontextualized mass of data. After collection, the data must be processed. Algorithms will compute all the collected raw data and recognize behavioral patterns. Behavioral patterns are built on the basis of how you and other people with similar profiles have behaved. Your behavioral patterns, all put together, constitute your profile.
There are multiple applications of profiling: marketing, surveillance, city management, security, etc. They all learn about, predict, and influence our behaviors. On the basis of an individual's profile, which evolves continuously as data comes in, his or her behavior can be predicted, triggered, and guided through incentives, personalized recommendations, suggestions, warnings, or other stimuli. Such guiding has at least three undesirable consequences: (i) a new non-democratic normativity regime; (ii) a new conception of the human being; and (iii) locking people into their profiles.
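A toy sketch makes this suggestion loop concrete. Everything in it (the data, the names, the similarity measure) is invented for illustration and comes from no real platform's code:

from collections import Counter

profiles = {
    "a": {"tennis", "wine", "rock", "travel_europe"},
    "b": {"tennis", "wine", "jazz"},
    "c": {"soccer", "video_games", "hip_hop"},
}

def suggest(person, profiles, k=2):
    """Offer the traits most common among the k most similar profiles."""
    me = profiles[person]
    # Rank the other profiles by overlap with mine (Jaccard similarity).
    others = sorted(
        (p for p in profiles if p != person),
        key=lambda p: len(me & profiles[p]) / len(me | profiles[p]),
        reverse=True,
    )
    votes = Counter()
    for p in others[:k]:
        votes.update(profiles[p] - me)   # only traits I do not have yet
    return [trait for trait, _ in votes.most_common()]

print(suggest("a", profiles))   # the probable, offered back as the possible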

Line: 20 to 20
 

New non-democratic normativity regime

Changed:
<
<
Because profiling is capable of influencing people's behaviors, it has normative power. Contrary to classic forms of norms, produced by law, politics, and social control, profiling normativity does not ban or constrain, but rather makes disobedience of the norm unlikely. As researcher A. Rouvroy points out, it is a “relatively subliminal mode of government that consists of directing people’s attention towards certain things, in modifying the informational or physical environment so that behaviors are no longer obligatory but necessary.” (Link here)
>
>
Because profiling is capable of influencing people's behaviors, it has normative power. Contrary to classic forms of norms, produced by law, politics, and social control, profiling normativity does not ban or constrain, but rather makes disobedience of the norm unlikely. As researcher A. Rouvroy points out, it is a “relatively subliminal mode of government that consists of directing people’s attention towards certain things, in modifying the informational or physical environment so that behaviors are no longer obligatory but necessary.” (see here)
This new form of normativity is not conducted in the name of certain shared values, a philosophy, or an ideology. It is the death of idealistic, value-based politics. Politics is all about transforming the state of affairs by means of ideas, projections, and imagination. The world of profiling does not aim at transforming things by means of a communist, capitalist, or any other ideologically supported regime. At most, behavioral predictability could claim to govern society objectively and efficiently with the sole aim of optimizing social life as much as possible, without bothering to know whether the norm is fair, equitable, or legitimate. In fact, it is a non-democratic regime of optimization of the existing state of affairs for the benefit of certain actors, be they private actors serving their private monetary interests (as in Western liberal democracies) or public actors serving social order (as in China).


LouisAmoryFirstEssay 5 - 30 Jan 2022 - Main.LouisAmory
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"

Your profile defines your future.

Line: 2 to 2
 
META TOPICPARENT name="FirstEssay"

Your profile defines your future.

Deleted:
<
<
 
Deleted:
<
<
The emergence of individual connected devices such as personal computers, smartphones, smartwatches, tablets, etc. introduced spies into our lives. The "smart" objects that are supposed to make our lives easier are designed to collect our behaviors. We generate more information in two days than humanity did in two million years. Our travels, our purchases, our internet searches, our interests, our friendships, the length of our sleep, our political opinions, the pace of our heart, and the features of our face are a few examples of the type of data collected by our new tech-toys/behavioral trackers. In 2010, there was one connected device per human on earth. Today, there are more than six connected devices per human. The "smart" objects that brought the "Internet of Things" revolution track our behaviors in return for services supposed to make our lives more convenient.

At the data collection stage, our behavioral data is a huge, incoherent, decontextualized mass of data. After collection, the data must be processed. Data about an individual will be compiled in order to define his profile. For instance: “white male going to law school at Columbia, of Catholic religion, homosexual, etc.”. Once you are categorized into a profile, algorithms will predict and influence your behaviors on the basis of how you and other people with similar profiles have behaved.

 
Changed:
<
<
There are multiple (potential) applications of profiling: marketing, surveillance, city management, security, etc. They all have one thing in common: learning about and influencing our behaviors. Once an individual's profile is compiled, algorithms are able to guide his behaviors through incentives, personalized recommendations, suggestions, or alerts. Such influence on our behaviors has at least three undesirable consequences: (1) a new non-democratic normativity regime; (2) a new conception of the human; (3) locking people into their profile.
>
>
Spying used to take a great deal of effort. Wiretapping someone's home normally required the coordinated work of a team of seasoned professionals, and intercepting communications was a task that used to take months. However, the emergence of individual connected devices placed spies on our desks, in our pockets, and on our wrists, 24/7. Our entire lives are now visible and on display.
 
Added:
>
>
The "smart" objects were supposed to make our lives easier, but they were rather designed to collect our behavior patterns. Free digital services are like free dinners, if you’re not on the table, you’re on the menu. Our travels, purchases, internet searches, our heart pace are examples of the data collected by our new tech-toys/behavioral trackers.
 
Added:
>
>
At the data collection stage, our behavioral data is a huge, incoherent, decontextualized mass of data. After collection, data must be processed. Algorithms will compute all the collected raw data and recognize behavioral patterns. Behavioral patterns are built on the basis of how you and other people having similar profiles behaved. Your behavioral patterns, all put together constituting your profile.
 
Changed:
<
<

New non-democratic normativity regime

>
>
There are multiple applications of profiling: marketing, surveillance, city management, security, etc. They all learn about, predict, and influence our behaviors. On the basis of an individual's profile, which evolves continuously as data comes in, his or her behavior can be predicted, triggered, and guided through incentives, personalized recommendations, suggestions, warnings, or other stimuli. Such guiding has at least three undesirable consequences: (i) a new non-democratic normativity regime; (ii) a new conception of the human being; and (iii) locking people into their profiles.
 
Deleted:
<
<
Because algorithms are capable of influencing people's behaviors, they have normative power. Contrary to classic laws and other forms of governmental regulation, algorithmic normativity does not act by way of constraint, but rather makes disobedience of the norm unlikely. For instance, an algorithm can predict which drivers are likely to drink alcohol and, by way of alerts, prevent such drivers from driving. A “smart” car could also potentially refuse to start if it detects that the driver has alcohol in his or her blood.
 
Deleted:
<
<
This new form of normativity is not conducted in the name of certain shared values, a philosophy, or an ideology. Algorithms claim to govern society objectively and efficiently with the sole aim of optimizing social life as much as possible, without bothering to know whether the norm is fair, equitable, or legitimate. Nor does algorithmic normativity have any democratic legitimacy, as algorithms are currently mostly used by private actors to serve their private monetary interests (at least in Western liberal democracies).
 

Changed:
<
<

The algorithmic human

>
>

New non-democratic normativity regime

 
Changed:
<
<
The second issue caused by data processing and profiling is of a philosophical nature. The philosophy of the Enlightenment envisages the modern human as a free individual, responsible, endowed with reason and free will.

The “algorithmic human”, whose behaviors are collected, processed, compiled into a profile, and influenced, shares a common feature with the free individual conceived by the philosophy of the Enlightenment. They share a logic of individualization. Insofar as it allows the environment to adapt to each profile in all its singularity (for example, the individualization of advertising), the “algorithmic human” is as individualistic as the “free human”. However, the “algorithmic human” is far from the notion of the modern man as conceived by the philosophy of the Enlightenment. He is surveilled. His behaviors are tracked and influenced. Eric Schmidt, CEO of Google from 2001 to 2011, said: “I actually think most people don't want Google to answer their questions. They want Google to tell them what they should be doing next. (...) The technology will be so good it will be very hard for people to watch or consume something that has not in some sense been tailored for them”.

>
>
Because profiling is capable of influencing people's behaviors, it has normative power. Contrary to classic forms of norms, produced by law, politics, and social control, profiling normativity does not ban or constrain, but rather makes disobedience of the norm unlikely. As researcher A. Rouvroy points out, it is a “relatively subliminal mode of government that consists of directing people’s attention towards certain things, in modifying the informational or physical environment so that behaviors are no longer obligatory but necessary.” (Link here)
 
Added:
>
>
This new form of normativity is not conducted in the name of certain shared values, a philosophy, or an ideology. It is the death of idealistic, value-based politics. Politics is all about transforming the state of affairs by means of ideas, projections, and imagination. The world of profiling does not aim at transforming things by means of a communist, capitalist, or any other ideologically supported regime. At most, behavioral predictability could claim to govern society objectively and efficiently with the sole aim of optimizing social life as much as possible, without bothering to know whether the norm is fair, equitable, or legitimate. In fact, it is a non-democratic regime of optimization of the existing state of affairs for the benefit of certain actors, be they private actors serving their private monetary interests (as in Western liberal democracies) or public actors serving social order (as in China).
 
Deleted:
<
<

The profile prison

The third regrettable consequence of data profiling is the reproduction of class systems. The “free human” should be able to become the person he/she freely chooses to become. Liberal democracies traditionally strive to give each individual the tools necessary to achieve his/her potential and to emancipate himself/herself. Individuals should be free to practice the sport they like, listen to and play the music they like, be interested in the languages and cultures they admire, etc. Such life choices should ideally be made freely.

The “algorithmic human” does not freely make these choices. The “algorithmic human” is predictable. The algorithmic human will typically behave in the same way as he did in the past and in the same way as people with similar profiles have behaved previously.

If you are a white male studying law at Columbia, you are also likely to vote for Democrats, travel to Europe, eat salads for lunch, be interested in wine, play tennis, and listen to rock music. If you were born in Harlem and you are an unemployed black male, you will also be likely to vote for Democrats, but you will rather be likely not to travel, to eat junk food for lunch, to drink beer or soda rather than wine, to play soccer or video games rather than tennis, and to listen to hip-hop instead of rock music. And this is what will be suggested to them as well as to their similarly profiled friends. The algorithm locks people into their profiles. It makes people become what they were likely to become and, in this sense, prevents individuals from freely realizing themselves and reproduces existing social patterns.

 
Deleted:
<
<
Begin by telling people what they will get from reading. Starting out with two paragraphs of exposition before we even begin to find out what your contribution is will lose readers you could have kept.

Because you have seen deeply so far, advice on the improvement of the essay involves specifics.

You attribute to "algorithms" the various effects in "guiding" behavior you describe. This error is becoming so cheap and easy that reality will never disturb it. Not only does it distract people from actually thinking about technology, it introduces into policy the idea of "algorithmic transparency," which is a favorite recommendation of tech-adjacent rather than technically expert people.

An algorithm is, strictly speaking, a procedure for making a computation, a computer program stricto sensu. An efficient method for sorting a list or computing the transcendental arccosine of floating-point data is an algorithm.

Generally speaking, the algorithms involved in ad-tech targeting and public-order "nudging" are pattern recognition algorithms. They are simple and general: if you look at them and you're a proficient reader of program code, you can see everything about them very easily and in a short time. Those programs are exposed to "training data." This data trains the program to recognize patterns. Any given set of training data will cause the simple pattern recognition program being trained on it to behave in the recognition of its relevant patterns in slightly different ways (in the concrete technical sense, in differently biased ways).

 
Deleted:
<
<
The pattern-recognition program as it has been trained is then given a flow of "real" input, on which it is trained in turn, so that it "improves" its ability to find the pre-established patterns as they are modified by the training process. The whole state of the model at any moment is the product of all its previous states. Knowing "the algorithm" is more or less useless in knowing what is actually occurring in the model.
 
Changed:
<
<
Think of a spam filter, trying to do an intelligent job filtering your email for you. It begins by training on lots of spam, presumably including the things you too consider spam, so that when you begin receiving email for it to filter, it catches a bunch of spam, not much of which you considered not-spam (called "ham" in hacker jargon). Over time, as you mark spam you didn't want to receive and ham you regret was sidelined in transit, the filter gets better at doing the job for you. (Because "spam" is actually just mail you didn't want and "ham" is actually just stuff you did, this simple Bayesian probability calculator that could be whipped up in an afternoon does a pretty good imitation of an impossible job, namely mirroring your semi-conscious subjective decisions about what email you want to see.)
>
>

The profiled human

 
Changed:
<
<
So, to be brief about it, it's not the algorithms, it's the data over time and the evolving state of the model. To be even briefer, if less comprehensible, it's the Gestalt of neural networks. Algorithmic transparency is mostly useless. People who are essentially applying our free software philosophy to this problem are unaware of the differences that matter.
>
>
The second issue caused by data processing and profiling is of a philosophical nature. The philosophy of the Enlightenment envisages the modern human as a free individual, responsible, endowed with reason and free will.
 
Changed:
<
<
Behavior guiding, then, is based not on some "algorithm for guiding people" that we should be studying, but on another simple principle, the basic life-rule of the Parasite With the Mind of God: reinforce patterns that benefit you, and discourage patterns that might benefit the human, but don't benefit you. This is the basic life rule of all parasites. This also leads to reinforcing patterns that benefit the human but also benefit the parasite. That's why parasitism can be evolutionarily advantageous. We have eukaryotes and photosynthesis for this reason, after all.
>
>
The “profiled human” shares a common feature with the free individual conceived by the philosophy of the Enlightenment. They share a logic of individualization. Insofar as it allows the environment to adapt to each profile in all its singularity (for example, the individualization of advertising), the “profiled human” is as individualistic as the “free human”. However, the “profiled human” is far from the notion of the modern man as conceived by the philosophy of the Enlightenment. He is surveilled. His behaviors are tracked and influenced.
 
Changed:
<
<
In this instance, the parasite guides first of all by reinforcing patterns of engagement. Where those patterns of engagement reduce anxiety responses in the human, the patterns reinforced are experienced as "convenient" by the human.
>
>
Eric Schmidt, CEO of Google from 2001 to 2011, said: “I actually think most people don't want Google to answer their questions. They want Google to tell them what they should be doing next. (...) The technology will be so good it will be very hard for people to watch or consume something that has not in some sense been tailored for them”. This is not a new debate. Replace Google with God, and this sentence could have been written in the 4th century, when Saint Augustine questioned individual freedom, predestination, and God’s foreknowledge. Later, Walter Benjamin studied the impact of propaganda and of persuasion media such as television. The novelty today is the automaticity, the magnitude, and the surveillance dimension of the phenomenon.
 
Deleted:
<
<
The parasite guides secondarily by reducing patterns that reduce engagement. The negative reinforcement structure is accomplished by transferring anxiety back to the human: this is experienced by the human as FOMO, or fear of social isolation, encouraging negative internal feedback experienced as depression that can be alleviated by re-engaging.
 
Deleted:
<
<
It's only at the tertiary level that the platforms to which humans allow themselves to be connected then guide behavior by presenting particular stimuli known to elicit specific consumptive responses—a process variously described as "advertising," or "campaigning," or "activating." Whether this is democratic depends on the definition of democracy that you do not give. But one might now inquire whether the inquiry results from a category error or a tautology: both predation and parasitism are processes to which the concept of democracy is not applicable.
 
Changed:
<
<
Blaming the existence of pattern-matching software for the patterns is also a category error, known colloquially among humans as "shooting the messenger." If human behavior is calculable and can be nudged in these ways, then our Enlightenment account of human-ness is incomplete, or perhaps our account of the Enlightenment and its relation to our existence after the phenomena we call Freud, Lenin, Bernays, Hitler, Skinner, Mao Zedong, Pablo Picasso and the King of the Undead Now Dead is not quite perfect. I knew that there were problems in our conception of free will before the Apple ][ existed, let alone Facebook.
>
>

The profile prison

 
Changed:
<
<
But there is a prison being built. You just don't say anything about how we can walk out from it while it is still unfinished. Now that would be one hell of an essay.
>
>
The third regrettable consequence of profiling is the reproduction of class systems. The “free human” should be able to become the person he/she freely chooses to become. Liberal democracies traditionally strive to give each individual the tools necessary to achieve his/her potential and to emancipate himself/herself. Individuals should be free to practice the sport they like, listen to and play the music they like, be interested in the languages and cultures they admire, etc. Such life choices should ideally be made freely.
 
Changed:
<
<
>
>
The “profiled human” does not freely make these choices. The “profiled human” is only a reflection of the predictable. The profiled human will be nudged to behave in the same way as he did in the past and in the same way as people with similar profiles have behaved before.
Added:
>
>
If you are a white male studying law at Columbia, you are also likely to vote for Democrats, travel to Europe, be interested in wine, play tennis, and listen to rock music. If you were born in Harlem and you are an unemployed black male, you will also be likely to vote for Democrats, but you will rather be likely not to travel, to drink beer or soda rather than wine, to play soccer or video games rather than tennis, and to listen to hip-hop instead of rock music. And this is what will be suggested to them as well as to their similarly profiled friends. The data locks people into their profiles. It makes people become what they were likely to become and, in this sense, prevents individuals from freely realizing themselves and reproduces existing social patterns. The possible is reduced to the probable.

LouisAmoryFirstEssay 4 - 28 Nov 2021 - Main.EbenMoglen
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"

Your profile defines your future.

Line: 43 to 43
If you are a white male studying law at Columbia, you are also likely to vote for Democrats, travel to Europe, eat salads for lunch, be interested in wine, play tennis, and listen to rock music. If you were born in Harlem and you are an unemployed black male, you will also be likely to vote for Democrats, but you will rather be likely not to travel, to eat junk food for lunch, to drink beer or soda rather than wine, to play soccer or video games rather than tennis, and to listen to hip-hop instead of rock music. And this is what will be suggested to them as well as to their similarly profiled friends. The algorithm locks people into their profiles. It makes people become what they were likely to become and, in this sense, prevents individuals from freely realizing themselves and reproduces existing social patterns.

Added:
>
>
Begin by telling people what they will get from reading. Starting out with two paragraphs of exposition before we even begin to find out what your contribution is will lose readers you could have kept.

Because you have seen deeply so far, advice on the improvement of the essay involves specifics.

You attribute to "algorithms" the various effects in "guiding" behavior you describe. This error is becoming so cheap and easy that reality will never disturb it. Not only does it distract people from actually thinking about technology, it introduces into policy the idea of "algorithmic transparency," which is a favorite recommendation of tech-adjacent rather than technically expert people.

An algorithm is, strictly speaking, a procedure for making a computation, a computer program stricto sensu. An efficient method for sorting a list or computing the transcendental arccosine of floating-point data is an algorithm.

Generally speaking, the algorithms involved in ad-tech targeting and public-order "nudging" are pattern recognition algorithms. They are simple and general: if you look at them and you're a proficient reader of program code, you can see everything about them very easily and in a short time. Those programs are exposed to "training data." This data trains the program to recognize patterns. Any given set of training data will cause the simple pattern recognition program being trained on it to behave in the recognition of its relevant patterns in slightly different ways (in the concrete technical sense, in differently biased ways).
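To make that concrete, here is a toy nearest-centroid classifier with invented data. The program is identical in both runs; only the training data differs, and so does the verdict on the very same input:

def train(examples):
    """Average each class's feature vectors (a nearest-centroid model)."""
    centroids = {}
    for label, vectors in examples.items():
        n = len(vectors)
        centroids[label] = [sum(v[i] for v in vectors) / n
                            for i in range(len(vectors[0]))]
    return centroids

def classify(model, x):
    """Pick the label whose centroid is closest to x."""
    return min(model, key=lambda label:
               sum((a - b) ** 2 for a, b in zip(model[label], x)))

# Two training sets with the labels swapped: same algorithm, opposite bias.
training_a = {"ok": [[0, 0], [1, 0]], "risky": [[5, 5], [6, 5]]}
training_b = {"ok": [[5, 5], [6, 5]], "risky": [[0, 0], [1, 0]]}

x = [3, 2]
print(classify(train(training_a), x))   # prints "ok"
print(classify(train(training_b), x))   # prints "risky"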

The pattern-recognition program as it has been trained is then given a flow of "real" input, on which it is trained in turn, so that it "improves" its ability to find the pre-established patterns as they are modified by the training process. The whole state of the model at any moment is the product of all its previous states. Knowing "the algorithm" is more or less useless in knowing what is actually occurring in the model.

Think of a spam filter, trying to do an intelligent job filtering your email for you. It begins by training on lots of spam, presumably including the things you too consider spam, so that when you begin receiving email for it to filter, it catches a bunch of spam, not much of which you considered not-spam (called "ham" in hacker jargon). Over time, as you mark spam you didn't want to receive and ham you regret was sidelined in transit, the filter gets better at doing the job for you. (Because "spam" is actually just mail you didn't want and "ham" is actually just stuff you did, this simple Bayesian probability calculator that could be whipped up in an afternoon does a pretty good imitation of an impossible job, namely mirroring your semi-conscious subjective decisions about what email you want to see.)
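Here, for the curious, is a minimal sketch of such a filter. It assumes nothing beyond the paragraph above; the messages are invented, and simple Laplace-smoothed word counts stand in for the Bayesian arithmetic:

import math
from collections import Counter

class SpamFilter:
    """Tiny naive-Bayes mail filter: the algorithm is fixed, the counts evolve."""

    def __init__(self):
        self.words = {"spam": Counter(), "ham": Counter()}
        self.messages = {"spam": 0, "ham": 0}

    def learn(self, text, label):
        # Every correction becomes part of the model's state, so the model
        # today is the product of all its previous states.
        self.words[label].update(text.lower().split())
        self.messages[label] += 1

    def score(self, text, label):
        total = sum(self.words[label].values())
        prior = math.log((self.messages[label] + 1) /
                         (sum(self.messages.values()) + 2))
        return prior + sum(
            math.log((self.words[label][w] + 1) / (total + 2))
            for w in text.lower().split())

    def is_spam(self, text):
        return self.score(text, "spam") > self.score(text, "ham")

f = SpamFilter()
f.learn("win free pills now", "spam")
f.learn("lunch at noon tomorrow", "ham")
print(f.is_spam("free pills"))                 # True, given this training
f.learn("free lunch for new students", "ham")  # your correction retrains it
print(f.is_spam("free lunch"))                 # now leans ham instead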

So, to be brief about it, it's not the algorithms, it's the data over time and the evolving state of the model. To be even briefer, if less comprehensible, it's the Gestalt of neural networks. Algorithmic transparency is mostly useless. People who are essentially applying our free software philosophy to this problem are unaware of the differences that matter.

Behavior guiding, then, is based not on some "algorithm for guiding people" that we should be studying, but on another simple principle, the basic life-rule of the Parasite With the Mind of God: reinforce patterns that benefit you, and discourage patterns that might benefit the human, but don't benefit you. This is the basic life rule of all parasites. This also leads to reinforcing patterns that benefit the human but also benefit the parasite. That's why parasitism can be evolutionarily advantageous. We have eukaryotes and photosynthesis for this reason, after all.

In this instance, the parasite guides first of all by reinforcing patterns of engagement. Where those patterns of engagement reduce anxiety responses in the human, the patterns reinforced are experienced as "convenient" by the human.

The parasite guides secondarily by reducing patterns that reduce engagement. The negative reinforcement structure is accomplished by transferring anxiety back to the human: this is experienced by the human as FOMO, or fear of social isolation, encouraging negative internal feedback experienced as depression that can be alleviated by re-engaging.

It's only at the tertiary level that the platforms to which humans allow themselves to be connected then guide behavior by presenting particular stimuli known to elicit specific consumptive responses—a process variously described as "advertising," or "campaigning," or "activating." Whether this is democratic depends on the definition of democracy that you do not give. But one might now inquire whether the inquiry results from a category error or a tautology: both predation and parasitism are processes to which the concept of democracy is not applicable.

Blaming the existence of pattern-matching software for the patterns is also a category error, known colloquially among humans as "shooting the messenger." If human behavior is calculable and can be nudged in these ways, then our Enlightenment account of human-ness is incomplete, or perhaps our account of the Enlightenment and its relation to our existence after the phenomena we call Freud, Lenin, Bernays, Hitler, Skinner, Mao Zedong, Pablo Picasso and the King of the Undead Now Dead is not quite perfect. I knew that there were problems in our conception of free will before the Apple ][ existed, let alone Facebook.

But there is a prison being built. You just don't say anything about how we can walk out from it while it is still unfinished. Now that would be one hell of an essay.


LouisAmoryFirstEssay 3 - 22 Oct 2021 - Main.LouisAmory
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"

Your profile defines your future.

Changed:
<
<

Collection of data

>
>
The emergence of individual connected devices such as personal computers, smartphones, smartwatches, tablets, etc. introduced spies into our lives. The "smart" objects that are supposed to make our lives easier are designed to collect our behaviors. We generate more information in two days than humanity did in two million years. Our travels, our purchases, our internet searches, our interests, our friendships, the length of our sleep, our political opinions, the pace of our heart, and the features of our face are a few examples of the type of data collected by our new tech-toys/behavioral trackers. In 2010, there was one connected device per human on earth. Today, there are more than six connected devices per human. The "smart" objects that brought the "Internet of Things" revolution track our behaviors in return for services supposed to make our lives more convenient.
 
Added:
>
>
At the data collection stage, our behavioral data is a huge, incoherent, decontextualized mass of data. After collection, the data must be processed. Data about an individual will be compiled in order to define his profile. For instance: “white male going to law school at Columbia, of Catholic religion, homosexual, etc.”. Once you are categorized into a profile, algorithms will predict and influence your behaviors on the basis of how you and other people with similar profiles have behaved.
 
Changed:
<
<

Processing data

>
>
There are multiple (potential) applications of profiling: marketing, surveillance, city management, security, etc. They all have one thing in common: learning about and influencing our behaviors. Once an individual's profile is compiled, algorithms are able to guide his behaviors through incentives, personalized recommendations, suggestions, or alerts. Such influence on our behaviors has at least three undesirable consequences: (1) a new non-democratic normativity regime; (2) a new conception of the human; (3) locking people into their profile.
 
Changed:
<
<

>
>

New non-democratic normativity regime

 
Added:
>
>
Because algorithms are capable of influencing people's behaviors, they have normative power. Contrary to classic laws and other forms of governmental regulation, algorithmic normativity does not act by way of constraint, but rather makes disobedience of the norm unlikely. For instance, an algorithm can predict which drivers are likely to drink alcohol and, by way of alerts, prevent such drivers from driving. A “smart” car could also potentially refuse to start if it detects that the driver has alcohol in his or her blood.
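A few lines of invented code show how such normativity operates; the sensor reading and the threshold are illustrative only, not any jurisdiction's rule:

BAC_LIMIT = 0.08   # illustrative threshold, not any particular jurisdiction's

def start_engine(breath_sensor_bac):
    """Hypothetical ignition interlock: the rule is not stated, it is executed."""
    if breath_sensor_bac >= BAC_LIMIT:
        return False   # the car does not forbid drunk driving; it makes it unavailable
    return True

print(start_engine(0.05))   # True: the engine starts
print(start_engine(0.12))   # False: disobedience was never an option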

This new form of normativity is not conducted in the name of certain shared values, a philosophy, or an ideology. Algorithms claim to govern society objectively and efficiently with the sole aim of optimizing social life as much as possible, without bothering to know whether the norm is fair, equitable, or legitimate. Nor does algorithmic normativity have any democratic legitimacy, as algorithms are currently mostly used by private actors to serve their private monetary interests (at least in Western liberal democracies).

The algorithmic human

The second issue caused by data processing and profiling is of a philosophical nature. The philosophy of the Enlightenment envisages the modern human as a free individual, responsible, endowed with reason and free will.

The “algorithmic human”, whose behaviors are collected, processed, compiled into a profile, and influenced, shares a common feature with the free individual conceived by the philosophy of the Enlightenment. They share a logic of individualization. Insofar as it allows the environment to adapt to each profile in all its singularity (for example, the individualization of advertising), the “algorithmic human” is as individualistic as the “free human”. However, the “algorithmic human” is far from the notion of the modern man as conceived by the philosophy of the Enlightenment. He is surveilled. His behaviors are tracked and influenced. Eric Schmidt, CEO of Google from 2001 to 2011, said: “I actually think most people don't want Google to answer their questions. They want Google to tell them what they should be doing next. (...) The technology will be so good it will be very hard for people to watch or consume something that has not in some sense been tailored for them”.

The profile prison

The third regrettable consequence of data profiling is the reproduction of class systems. The “free human” should be able to become the person he/she freely chooses to become. Liberal democracies traditionally strive to give each individual the tools necessary to achieve his/her potential and to emancipate himself/herself. Individuals should be free to practice the sport they like, listen to and play the music they like, be interested in the languages and cultures they admire, etc. Such life choices should ideally be made freely.

The “algorithmic human” does not freely make these choices. The “algorithmic human” is predictable. The algorithmic human will typically behave in the same way as he did in the past and in the same way as people with similar profiles have behaved previously.

If you are a white male studying law at Columbia, you are also likely to vote for Democrats, travel to Europe, eat salads for lunch, be interested in wine, play tennis, and listen to rock music. If you were born in Harlem and you are an unemployed black male, you will also be likely to vote for Democrats, but you will rather be likely not to travel, to eat junk food for lunch, to drink beer or soda rather than wine, to play soccer or video games rather than tennis, and to listen to hip-hop instead of rock music. And this is what will be suggested to them as well as to their similarly profiled friends. The algorithm locks people into their profiles. It makes people become what they were likely to become and, in this sense, prevents individuals from freely realizing themselves and reproduces existing social patterns.

 
Deleted:
<
<
After collecting and processing data comes the stage of using it. The applications of Big Data are multiple. However, they all have one thing in common: influencing our behavior. The first section will critically expose the new normativity brought by big data. The second section will criticize the conception of the human envisaged by algorithmic governance.

LouisAmoryFirstEssay 2 - 21 Oct 2021 - Main.LouisAmory
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"

Your profile defines your future.

Deleted:
<
<
This paper argues that data is collected around an individual's behaviours, algorithms will

Collection of data

 
Changed:
<
<
The emergence of individual technologies such as personal computers, smartphones, smartwatches, tablets, etc. introduced spies into our lives. The "smart" objects that are supposed to make our lives easier are rather meant to better collect our behaviours. Our travels, our purchases, our internet searches, our interests, our friendships, the length of our sleep, our political opinions, the pace of our heart, and the features of our face are a few examples of data collected by the spies that we introduced into our lives. The "smart-ish" objects of the "Internet of Things" are busy collecting such data in return for services supposed to make our life more convenient. We generate more information in two days than humanity did in two million years. In 2010, there was one connected device per human on earth. Today, there are more than six connected devices per human.
>
>

Collection of data

 
Deleted:
<
<
At the data collection stage, our data is a huge, incoherent, decontextualized mass. After collecting and processing data comes the stage of using it. As explained above, the applications of Big Data are multiple. However, they all have one thing in common: influencing our behavior. The first section will critically expose the new normativity brought by big data. The second section will criticize the conception of the human envisaged by algorithmic governance.
 

Processing data


LouisAmoryFirstEssay 1 - 21 Oct 2021 - Main.LouisAmory
Line: 1 to 1
Added:
>
>
META TOPICPARENT name="FirstEssay"

Your profile defines your future.

This paper argues that data is collected around an individual's behaviours, algorithms will

Collection of data

The emergence of individual technologies such as personal computers, smartphones, smartwatches, tablets, etc. introduced spies into our lives. The "smart" objects that are supposed to make our lives easier are rather meant to better collect our behaviours. Our travels, our purchases, our internet searches, our interests, our friendships, the length of our sleep, our political opinions, the pace of our heart, and the features of our face are a few examples of data collected by the spies that we introduced into our lives. The "smart-ish" objects of the "Internet of Things" are busy collecting such data in return for services supposed to make our life more convenient. We generate more information in two days than humanity did in two million years. In 2010, there was one connected device per human on earth. Today, there are more than six connected devices per human.

At the data collection stage, our data is a huge, incoherent, decontextualized mass. After collecting and processing data comes the stage of using it. As explained above, the applications of Big Data are multiple. However, they all have one thing in common: influencing our behavior. The first section will critically expose the new normativity brought by big data. The second section will criticize the conception of the human envisaged by algorithmic governance.

Processing data

After collecting and processing data comes the stage of using it. The applications of Big Data are multiple. However, they all have one thing in common: influencing our behavior. The first section will critically expose the new normativity brought by big data. The second section will criticize the conception of the human envisaged by algorithmic governance.


Revision 7r7 - 18 Apr 2022 - 16:18:40 - EbenMoglen
Revision 6r6 - 01 Feb 2022 - 03:56:42 - LouisAmory
Revision 5r5 - 30 Jan 2022 - 17:44:06 - LouisAmory
Revision 4r4 - 28 Nov 2021 - 18:22:32 - EbenMoglen
Revision 3r3 - 22 Oct 2021 - 19:18:51 - LouisAmory
Revision 2r2 - 21 Oct 2021 - 21:09:08 - LouisAmory
Revision 1r1 - 21 Oct 2021 - 19:57:54 - LouisAmory