Computers, Privacy & the Constitution

Regulating Large-Scale Behavioral Collection, ver 2.0 [amended]

-- By RohanGeorge1 - 16 June 2018

Introduction

Last term, I wrote an essay suggesting that short-term solutions like use restrictions and privacy by design should be treated as important stop-gap measures for regulating widespread behavioral collection. The Cambridge Analytica revelations made me realize that they are just that: stop-gaps papering over the underlying gap between current privacy law and the problem of behavioral collection. It is now an opportune moment to tackle this problem directly.

I will first outline why existing constitutional provisions cannot adequately constrain behavioral collection or protect freedom of thought. Then I will propose a more structural regulatory alternative: asking what rights users should possess in the digital services economy.

Current Constitutional Limitations on Regulating Behavioral Collection

Neither the Fourth Amendment nor the First Amendment currently protects privacy sufficiently.

The Fourth Amendment's main protections are against unreasonable searches and seizures of places; in the 21st century, it is identities that are searched. One's digital footprint is not a place, yet it is often searched by police. Unfortunately, the Fourth Amendment, under any particular interpretive theory, does not seem to cover identities. Moreover, under the third-party doctrine, one's reasonable expectation of privacy is surrendered by sharing information with a third party, and that happens whenever we consent to data collection by using mobile apps or services, because our identities are now stored with third parties instead of on our persons or in our houses, papers, or effects. Consent opens the gateway to extensive behavioral data gathering. The Fourth Amendment thus currently affords us little privacy protection.

For the First Amendment, the issue is slightly different. The relationship between privacy and freedom of expression lies in the freedom to read, learn, and consume content without being monitored. While there are strong protections for anonymous political speech and freedom of association, these protections have not been explicitly extended to the freedom to read and to receive knowledge. The First Amendment has not protected the space between the book and the reader at all.

An Alternative: Users' Rights

One solution relies on the Ninth Amendment's concept of retained rights: rights the people hold on to despite their not being articulated in the Constitution. Here, the specific category of retained rights is users' rights in digital services.

Any fundamental reconstruction of technology and privacy law must begin by considering what rights users of technology and digital services should have. This is far more suitable than foisting lackluster privacy protections on individuals after first ensuring that corporations can reap large profits through the creation of a large digital economy.

In this regard, extending current data subject access rights under the GDPR into a more general concept of “users' rights” will not be worthwhile, because the GDPR's starting point was fostering a digital economy.

The transactional, individual-centric model of privacy regulation ought to be replaced by a model that recognizes privacy as social or ecological, in the sense that behavioral collection produces externalities the transactional model does not account for.

Under a privacy-as-ecology framework, the most important user right is the right not to have one's behavior extensively monitored and monetized. Large internet platforms learn about our lives by collecting information about our online activity, including browsing on websites other than those the platform owns: “Facebook can still monitor what they are doing with software like its ubiquitous ‘Like’ and ‘Share’ buttons, and something called Facebook Pixel — invisible code that’s dropped onto the other websites that allows that site and Facebook to track users’ activity”. (A sketch of how such a pixel works appears below.) Platforms monetize our information by selling highly targeted access to users. But why should someone else know about every single action we take online, and even profit from that knowledge? Instant messaging and photo sharing are not ground-breaking technological discoveries; they are relatively simple to provide. Users ought to have the right to easy access to high-quality digital services without selling themselves to gain that access.
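To make the tracking mechanism concrete, here is a minimal sketch of how a third-party tracking pixel works. It is illustrative only: the endpoint tracker.example and the firePixel function are hypothetical stand-ins, not Facebook's actual Pixel code.

    // Minimal sketch of a third-party tracking pixel (illustrative only).
    // "tracker.example" and firePixel are hypothetical stand-ins.
    function firePixel(eventName: string): void {
      // Encode the visit details as query parameters.
      const params = new URLSearchParams({
        event: eventName,
        page: window.location.href,   // which page the user is viewing
        ref: document.referrer,       // where the user came from
      });
      // Requesting a 1x1 image is enough: the HTTP request itself,
      // carrying its cookies, reports the visit to the third party.
      const img = new Image(1, 1);
      img.src = "https://tracker.example/px?" + params.toString();
    }

    // The embedding site fires this on page load, so the third party
    // learns of the visit even though the user never visited its site.
    firePixel("PageView");

The point of the sketch is that an ordinary image request, carrying cookies and referrer information, is enough to report a visit to a server the user never chose to contact.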

Second, users of digital services should have the right to customize their internet services and use them as they wish. Here the Free Software Movement's freedom zero is a helpful precedent: the freedom to run programs as the user wishes should, in this context, mean the user's right to control how their data is used. This means not merely consenting, but requiring services and software to be designed so that users can actually decide whether they want to see any advertising, choose which kinds of third parties may receive their information, and change these settings whenever they wish. In this way, the community of internet users can correct or modify existing software, removing malware and improving inferior features. Individuals should be able to create a dislike button or customize their own social media newsfeed, for example (a sketch of such a modification appears after this paragraph).
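As an illustration of the kind of user-side modification this right contemplates, here is a minimal sketch of a browser userscript (of the sort run with tools like Greasemonkey or Tampermonkey) that adds a dislike button to a feed. The ".post" selector and the markup it assumes are hypothetical placeholders, not any real platform's code.

    // Minimal sketch of user-side customization, in the spirit of a
    // Greasemonkey/Tampermonkey userscript. The ".post" selector is a
    // hypothetical placeholder for a feed item, not real platform markup.
    function addDislikeButtons(): void {
      document.querySelectorAll<HTMLElement>(".post").forEach((post) => {
        if (post.querySelector(".dislike")) return; // avoid duplicates
        const btn = document.createElement("button");
        btn.className = "dislike";
        btn.textContent = "Dislike";
        // The reaction stays in the user's browser; nothing is sent
        // back to the platform's servers.
        btn.addEventListener("click", () => {
          btn.textContent = "Disliked";
        });
        post.appendChild(btn);
      });
    }

    // Re-run whenever the feed loads new posts.
    new MutationObserver(addDislikeButtons).observe(document.body, {
      childList: true,
      subtree: true,
    });
    addDislikeButtons();

Nothing in the script reports back to the platform; the customization lives entirely in the user's browser, which is the design point freedom zero suggests.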

Role of Regulators

Equally important is that users' rights be enforceable. In my view, every digital service provider must be obliged to ensure that users' rights are complied with. Beyond steep pecuniary fines for breaches of these obligations, digital service providers must be required to build users' rights into their digital services. In a sense, this is privacy by design 2.0: what is designed into the system is respect for users' rights.

Additionally, digital service providers should be required to undertake a Privacy Impact Assessment (PIA), taking into account the nature of the service and how its provision might affect individuals' rights to secrecy, anonymity, and autonomy. Like the Environmental Impact Statement required by NEPA and the Data Protection Impact Assessment required by Art. 35 of the GDPR, a PIA focused on users' rights, coupled with strict penalties for false disclosures, could be an effective way of protecting and enforcing users' rights in digital services.

In conclusion, users' rights, enforced through design obligations and impact assessments, offer an alternative to the current mechanism for regulating digital services and platform companies.

Improvement here is a matter of focus. We can remove some extraneous material and shorten the presentation of the problem: "Cambridge Analytica and the resulting furore over the data-sharing behavior of Facebook will now allow more daring thinking than before," is the point of departure, and you can put it succinctly. Free software is not directly relevant, as your text makes clear: in the economy of services, a concept of "users' rights" is not a modification of copyright law, but a form of economic regulation at a deeper structural level. So the definition of those rights isn't an "it strikes me" matter, but really the heart of the enterprise. What rights should users possess in the digital services economy? Defining the rights and explaining the mechanism of their enforcement is the primary subject. You cannot get everything explained in detail in 1,000 words, but you can present the core of your thinking as a more or less finished product.

Amended in light of your comments, Prof. Moglen. - Rohan George (16 June 2018)

