Law in the Internet Society

Your profile defines your future.

The emergence of individual connected devices such as personal computers, smartphones, smartwatches, and tablets introduced spies into our lives. The "smart" objects that are supposed to make our lives easier are designed to record our behavior. By some estimates, we now generate more information in two days than humanity did over the preceding millennia. Our travels, our purchases, our internet searches, our interests, our friendships, the length of our sleep, our political opinions, our heart rate, and our facial features are a few examples of the data collected by our new tech-toys/behavioral trackers. In 2010, there was roughly one connected device per human on earth; today, there are more than six. The "smart" objects that brought about the "Internet of Things" revolution track our behavior in exchange for services supposed to make our lives more convenient.

At the data collection stage, our behavioral data is a huge, incoherent, decontextualized mass. After collection, the data must be processed: data about an individual is compiled to define his profile. For instance: "white male going to law school at Columbia, Catholic, homosexual, etc.". Once you are categorized into a profile, algorithms predict and influence your behavior on the basis of how you and other people with similar profiles have behaved.
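
The mechanism can be made concrete with a toy sketch. What follows is a minimal, purely illustrative nearest-neighbor "profiler"; every user, trait, and behavior in it is invented, and real systems use far richer data and models. The point is only structural: predictions about you are read off the recorded behavior of your look-alikes.

    # Toy sketch of profile-based prediction (all data invented for illustration).
    from collections import Counter

    # Each person is reduced to a set of traits: the "profile".
    profiles = {
        "user_a": {"law_school", "nyc", "podcasts", "running"},
        "user_b": {"law_school", "nyc", "wine", "tennis"},
        "user_c": {"harlem", "video_games", "hip_hop"},
    }

    # What the platform has already observed other people doing.
    past_behavior = {
        "user_b": ["voted_democrat", "traveled_europe", "bought_wine"],
        "user_c": ["played_soccer", "streamed_hip_hop"],
    }

    def jaccard(a: set, b: set) -> float:
        """Profile similarity: shared traits over all traits."""
        return len(a & b) / len(a | b)

    def predict(target: str, k: int = 2) -> list:
        """Predict the target's behavior from the k most similar other profiles."""
        others = sorted(
            (u for u in profiles if u != target and u in past_behavior),
            key=lambda u: jaccard(profiles[target], profiles[u]),
            reverse=True,
        )
        votes = Counter()
        for u in others[:k]:
            votes.update(past_behavior[u])
        return [behavior for behavior, _ in votes.most_common()]

    # user_a has never voted or traveled in this data, yet both are "predicted",
    # simply because the most similar profile (user_b) did those things.
    print(predict("user_a"))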

There are multiple (potential) applications of profiling: marketing, surveillance, city management, security, etc. They all have one thing in common: learning about and influencing our behavior. Once an individual's profile is compiled, algorithms can guide his behavior through incentives, personalized recommendations, suggestions or alerts. Such influence on our behavior has at least three undesirable consequences: (1) a new non-democratic normativity regime; (2) a new conception of the human; (3) the locking of people into their profiles.

New non-democratic normativity regime

Because algorithms are capable of influencing people's behavior, they have normative power. Contrary to classic laws and other forms of governmental regulation, algorithmic normativity does not act by way of constraint; rather, it makes disobedience to the norm unlikely. For instance, an algorithm can predict which drivers are likely to have been drinking and, by way of alerts, discourage them from driving. A "smart" car could also simply refuse to start if it detects alcohol in the driver's blood.
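
A few lines of code make the difference visible. The sketch below is hypothetical: the sensor interface and the threshold are invented, and real ignition-interlock devices are regulated hardware. What it shows is the shift from a rule that can be broken and then punished to a rule embedded in the device, where the forbidden act is never on offer at all.

    # Hypothetical sketch of "normativity by design" (invented sensor and limit).

    BAC_LIMIT = 0.08  # assumed legal blood-alcohol limit; varies by jurisdiction

    def read_breathalyzer() -> float:
        """Stand-in for a hardware sensor reading the driver's BAC."""
        return 0.11  # simulated reading, over the limit

    def start_engine() -> bool:
        """The norm is not announced, debated, or enforced after the fact:
        disobedience is pre-empted by the machine itself."""
        bac = read_breathalyzer()
        if bac >= BAC_LIMIT:
            print(f"Engine disabled: BAC {bac:.2f} over limit {BAC_LIMIT:.2f}.")
            return False
        print("Engine started.")
        return True

    start_engine()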

This new form of normativity is not conducted in the name of shared values, a philosophy or an ideology. Algorithms claim to govern society objectively and efficiently, with the sole aim of optimizing social life as much as possible, without asking whether the norm is fair, equitable or legitimate. Nor does algorithmic normativity have any democratic legitimacy: algorithms are currently mostly used by private actors to serve their private monetary interests (at least in Western liberal democracies).

The algorithmic human

The second issue caused by data processing and profiling is philosophical in nature. The philosophy of the Enlightenment envisages the modern human as a free, responsible individual, endowed with reason and free will.

The “algorithmic human”, whose behavior is collected, processed, compiled into a profile and influenced, shares one feature with the free individual conceived by the philosophy of the Enlightenment: a logic of individualization. Insofar as it allows the environment to adapt to each profile in all its singularity (for example, individualized advertising), the “algorithmic human” is as individualized as the “free human”. In every other respect, however, the “algorithmic human” is far from the modern man conceived by the Enlightenment. He is surveilled. His behavior is tracked and influenced. Eric Schmidt, CEO of Google from 2001 to 2011, said: “I actually think most people don't want Google to answer their questions. They want Google to tell them what they should be doing next. (...) The technology will be so good it will be very hard for people to watch or consume something that has not in some sense been tailored for them”.

The profile prison

The third regrettable consequence of data profiling is the reproduction of class systems. The “free human” should be able to become the person he or she freely chooses to become. Liberal democracies traditionally strive to give each individual the tools needed to achieve their potential and to emancipate themselves. Individuals should be free to practice the sport they like, to listen to and play the music they like, to take an interest in the languages and cultures they admire, etc. Such life choices should ideally be made freely.

The “algorithmic human” does not make these choices freely. The “algorithmic human” is predictable: he will typically behave as he did in the past, and as people with similar profiles have behaved before him.

If you are a white male studying law at Columbia, you are likely to vote Democrat, travel to Europe, eat salads for lunch, be interested in wine, play tennis, and listen to rock music. If you are an unemployed black male born in Harlem, you will also be likely to vote Democrat, but likely not to travel, to eat junk food for lunch, to drink beer or soda rather than wine, to play soccer or video games rather than tennis, and to listen to hip-hop instead of rock. And this is what will be suggested to each of them, and to others who share their profiles. The algorithm locks people into their profiles. It makes people become what they were already likely to become and, in this sense, prevents individuals from freely realizing themselves and reproduces existing social patterns.
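
The lock-in is visible even in a trivial recommender. In the invented sketch below, suggestions are drawn exclusively from traits already in the profile, so nothing outside the profile is ever offered; the feedback loop only restates what the system has already decided you are.

    # Toy sketch of profile lock-in (all traits and items invented).

    suggestions_by_trait = {
        "rock": ["rock_playlist"],
        "hip_hop": ["hip_hop_playlist"],
        "tennis": ["tennis_gear_ad"],
        "soccer": ["soccer_highlights"],
    }

    def recommend(profile: set) -> list:
        """Offer only items matching existing traits: no exits from the profile."""
        return [item for trait in sorted(profile)
                for item in suggestions_by_trait.get(trait, [])]

    profile = {"rock", "tennis"}
    print(recommend(profile))
    # ['rock_playlist', 'tennis_gear_ad']
    # 'hip_hop_playlist' is never offered, so the listener is never
    # exposed to anything the profile does not already contain.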
