Law in the Internet Society

Regulating behavioral collection

 -- By RohanGeorge1 - 9 Nov 2017
The scale of behavioral collection should surprise no one. Yet current privacy laws fail to protect us from it. Those laws rest on a model that has been called Privacy Self-Management, built on principles of notice, consent and purpose limitation: whoever collects, uses, discloses or stores your data must notify you, obtain your consent, and do nothing with your data beyond what was disclosed. That model does not account for new technology or the political economy that has emerged from it.
 
Additionally, as exemplified by Article 8(1) of the EU’s Charter of Fundamental Rights, which provides that “Everyone has the right to the protection of personal data concerning him or her”, society has somehow decided that what ought to be protected is information about humans, not humans themselves.
 
Instead, ‘data protection’ legislation seems focused on fostering a digital market rather than on protecting individual freedom and privacy. EU data protection legislation, for example, is framed as part of the EU’s Digital Single Market policy, which aims to capitalize on the economic opportunities presented by technology.
 
Thus, the regulatory landscape sanctions behavioral collection rather than attempting to regulate it.
 

How Should Privacy Regulation Change?

One starting point for reducing behavioral collection is adjusting the power balance between individuals and platform companies. But how did such a fundamental imbalance of power emerge? Why do platform companies collect so much information about our behavior? Where did they get the mandate to use our information for profit?
 
Societal Consensus on Regulating Platform Companies?

 
Surely there must be some societal consensus on the proper role of platform companies in society, on which lawmakers can base regulation?
 
In fact, society has never considered the issue in the first place. Instead, we have been seduced by the convenience of social networking and an appeal to the innate human desire for connection with other humans. This seduction has precluded society from seriously considering how technology has affected our lives.
 
What should consensus look like? Should everyone have a fundamental right not to have his or her behavior collected? Should individuals, and not platform companies, own the data they generate by using online services? What kind of mandate should these companies have to use individuals’ information for profit, potentially at our expense?
 
Any societal consensus must also be translated from concept into law. That means confronting the fact that individuals trade privacy for convenience and good-quality services.
 
Dealing with the convenience / e-commerce vs privacy ‘trade-off’

 
First, the idea that individuals must surrender their privacy and freedom to companies in exchange for the use of their services is misguided. Modern technology allows each person to store all their personal information on a hard drive connected to their own personal server, and to use a Raspberry Pi to automate the negotiation of data disclosure from that private storage to the relevant company for each transaction: the device can specify the purpose of disclosure, its duration, and whether the company may store the data or distribute it downstream.
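
A minimal sketch, in Python, of what such a per-transaction negotiation might look like. Everything here is an illustrative assumption rather than an existing protocol: the request fields, the policy, and all names are invented for the example.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DataRequest:
    requester: str            # e.g. "shop.example.com"
    fields: Tuple[str, ...]   # data items the service asks for
    purpose: str              # declared purpose of use
    retention_days: int       # how long the service wants the data
    wants_to_share: bool      # downstream distribution requested?

@dataclass
class Grant:
    fields: Tuple[str, ...]
    purpose: str
    retention_days: int
    may_store: bool
    may_share: bool

# The owner's standing policy: nothing leaves private storage otherwise.
POLICY = {
    "allowed_fields": {"name", "shipping_address", "email"},
    "max_retention_days": 30,
    "allow_downstream_sharing": False,
}

def negotiate(request: DataRequest) -> Optional[Grant]:
    """Narrow the request to a per-transaction grant, or refuse outright."""
    fields = tuple(f for f in request.fields
                   if f in POLICY["allowed_fields"])
    if not fields:
        return None  # the service asked only for things never disclosed
    if request.wants_to_share and not POLICY["allow_downstream_sharing"]:
        return None  # refuse rather than permit resale of our behavior
    return Grant(
        fields=fields,
        purpose=request.purpose,
        retention_days=min(request.retention_days,
                           POLICY["max_retention_days"]),
        may_store=request.retention_days > 0,
        may_share=False,
    )

if __name__ == "__main__":
    req = DataRequest(
        requester="shop.example.com",
        fields=("name", "shipping_address", "browsing_history"),
        purpose="deliver one order",
        retention_days=365,
        wants_to_share=False,
    )
    # Grants name and address for 30 days; browsing_history is dropped.
    print(negotiate(req))
```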
 
However, the above idea rests on some debatable assumptions. For one, self-management of privacy via a ‘privacy-management-bot’ presupposes a level of computer literacy far beyond the current capabilities of the layperson.
 
The constituent pieces of such a privacy-management-bot already exist, and every technological revolution is impossible until it becomes inevitable. Still, expecting individuals to configure their own privacy bot is untenable in the short term. Technology like FreedomBox needs to be made ‘idiot-proof’, in the sense that using it should not degrade the user experience.
 
The above, then, is a long-term solution. Privacy-friendly substitutes for essential services that commercially exploit mass data collection, such as social networking and search engines, will take time to develop.
 
Moreover, these substitutes will have to compete against multi-billion-dollar incumbents. The political economy of the current e-commerce industry, backed by supportive data protection legislation, militates against a new paradigm in which companies can no longer commercially exploit customer data.
 
Some trends are positive. Device manufacturers have begun to ‘privacy-wash’ their new devices; even if this is a business decision aimed at poaching revenue from the platform companies, it should also improve individual privacy.
 
Moreover, the experience of Internet.org in India exemplifies a wave of resistance to digital colonialism, challenging the entry and dominion of big US tech companies. Device manufacturers could harness these sentiments as a source of new growth.
 
However, I think there is no political will in the ‘West’ to challenge the dominion of the platform companies in the near future. There needs to be a more immediate regulatory solution to the problem of excessive behavioral collection.
 

Short-term Solutions

Perhaps the answer lies in use restrictions, as Dan Geer has suggested. For example, there could be restrictions on the queries organizations may run on their datasets, to prevent analytics premised on discriminatory correlations. Or we could regulate the training data used to develop machine-learning models, to prevent built-in algorithmic discrimination. A sketch of such a query gate follows.
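
As a rough illustration, and only that, here is one way a query-level use restriction might be enforced in code. The protected attributes, permitted purposes, and the crude SQL parsing are all assumptions invented for the sketch, not a real regulatory standard.

```python
import re

# Assumed lists for the sketch; a real regime would define these by law.
PROTECTED_ATTRIBUTES = {"race", "religion", "health_status", "sexuality"}
PERMITTED_PURPOSES = {"billing", "service_delivery", "aggregate_reporting"}

def referenced_identifiers(sql: str) -> set:
    """Crudely pull identifier-like tokens out of a SQL string."""
    return set(re.findall(r"[a-z_]+", sql.lower()))

def query_permitted(sql: str, declared_purpose: str) -> bool:
    """Allow a query only if its declared purpose is permitted and it
    touches no protected attribute that enables discriminatory analytics."""
    if declared_purpose not in PERMITTED_PURPOSES:
        return False
    return not (referenced_identifiers(sql) & PROTECTED_ATTRIBUTES)

if __name__ == "__main__":
    print(query_permitted(
        "SELECT region, AVG(spend) FROM orders GROUP BY region",
        "aggregate_reporting"))            # True: purpose and columns pass
    print(query_permitted(
        "SELECT race, credit_score FROM users",
        "aggregate_reporting"))            # False: protected attribute
```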

The main objection to use restrictions is that they inhibit freedom of expression. In a business context, this manifests as a restraint on software engineers’ creativity.

However, I see a potential middle ground, in which businesses are required to disclose certain aspects of their data-handling practices (such as the types of queries run on their datasets) without giving away trade secrets. Mandated disclosures about types of use would add transparency and accountability to quasi-public services that currently function without public oversight. The sketch below shows what such a disclosure might contain.
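
A hypothetical sketch of such a mandated disclosure: the firm publishes aggregate categories of use rather than the queries themselves, which remain trade secrets. The category names and report fields are assumptions for illustration.

```python
from collections import Counter

# A firm's internal query log, reduced for the sketch to the two facts
# the (hypothetical) regulator requires it to summarize in public.
QUERY_LOG = [
    {"category": "ad_targeting", "touches_behavioral_data": True},
    {"category": "fraud_detection", "touches_behavioral_data": False},
    {"category": "ad_targeting", "touches_behavioral_data": True},
]

def disclosure_report(log):
    """Aggregate the log into a publishable summary: counts by category
    plus how many queries touched behavioral data, but no query text."""
    return {
        "total_queries": len(log),
        "queries_by_category": dict(Counter(q["category"] for q in log)),
        "behavioral_data_queries": sum(
            q["touches_behavioral_data"] for q in log),
    }

if __name__ == "__main__":
    print(disclosure_report(QUERY_LOG))
```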

Another short-term alternative is privacy by design, which in essence mandates privacy nudges and privacy-protective default settings. These work by countering individuals’ cognitive and behavioral biases, and they are aspects of a system or user interface that could be re-engineered to offer services in a more privacy-conscious fashion; the sketch below illustrates the default-settings idea.
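
A minimal sketch of privacy-protective defaults, assuming invented setting names: the most protective options hold unless the user actively changes them, which is how defaults counter the status-quo bias described above.

```python
# Protective defaults hold unless the user actively changes them.
# The setting names are assumptions for the sketch.
PRIVACY_DEFAULTS = {
    "behavioral_ad_tracking": False,  # off until explicitly enabled
    "location_history": False,
    "share_data_with_partners": False,
    "retention_days": 30,             # shortest retention by default
}

def effective_settings(user_choices: dict) -> dict:
    """Start from protective defaults and apply only the choices the
    user actively made: no pre-ticked boxes, no silent opt-ins."""
    settings = dict(PRIVACY_DEFAULTS)
    settings.update(user_choices)
    return settings

if __name__ == "__main__":
    # A user who never opens the settings page keeps full protection.
    print(effective_settings({}))
    # An explicit, informed opt-in remains possible.
    print(effective_settings({"location_history": True}))
```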

A combination of privacy defaults and use restrictions is the best short-term legislative solution for minimizing the privacy harms occasioned by mass behavioral collection. If the laws are accompanied by credible enforcement threats, they should have the teeth to deter organizations from infringing privacy.

Importantly, these short-term solutions should not prevent the development of privacy-conscious substitutes. Such technical solutions have the advantage of shifting power away from the big corporations and back to individuals. The short-term measures should be seen as part of an ongoing movement toward those technical solutions, with use restrictions and privacy defaults marking not the zenith of privacy protection but a milestone along the way.

 


