Law in the Internet Society

A Way Forward

-- By SamanthaBeatriceKing - 09 Dec 2021

Introduction

A recurring theme throughout the semester has been the idea that it isn't data that needs protecting, but rather "the people." The challenge is to articulate what, exactly, this means. What does it mean to protect the human being in the broader context of data protection? Data protection frameworks generally regulate how personal information is processed. Understood in this sense, data protection seems more concerned with the methodology of data processing than with the substantive rights of those whose data is gathered.

As an example, the General Data Protection Regulation (GDPR) expressly enumerates the rights of individuals in relation to their data: the right to be informed, the right of access, the right to rectification, the right to erasure (i.e., to be "forgotten"), the right to restrict processing, the right to data portability, the right to object, and rights in relation to automated decision-making and profiling.

Given this arguably comprehensive list of rights, one might argue that because people's rights in relation to their data are recognized (at least in the limited EU context), people are sufficiently protected by such regulations.

Except they are not.

Proceeding from the GDPR example, the bundle of recognized rights is premised on the assumption that you consented to your data being processed in the first place. If we are to protect "the people" instead of their data, should that mean cutting data collection off at the knees? Right now, it is utopian to imagine the internet without data collection. But if freedom begins in knowing another world is possible, then pushing back against data collection, however small the steps or difficult the path, is a good place to start.

Smaller and smaller circles

To see for myself how Google perceives my identity, I checked myactivity.google.com (where Google claims: "The activity you keep helps Google make services more useful for you, like helping you rediscover the things you've searched for, read, and watched.") and visited Ad Settings. It's one thing to read about it, but actually seeing how Google keeps track of your browsing history and ascribes an identity to you is mortifying. According to Google and its partnered advertisers, I am, among other things, an unmarried, childless, 25-to-35-year-old female homeowner with a high household income rating, an ideal candidate for a mortgage, whose job industry is healthcare.

Hundreds of clicks and facts about my browsing history were distilled into categories. My interests and purchasing power were laid out for advertisers to decide how much I am worth.

In the eyes of Facebook, Google, Apple, online data aggregators, and their ilk, I am not a consumer. I am a product: objectified, alienated, and labelled as more or less valuable according to the things I do online.

This is dangerous because my integrity, my wholeness as a human being, is constantly being chipped away. If my clicks determine the information that is fed to me, my browsing experience becomes a deterministic, endless loop in which the things I searched for in the past control what I see next. These customized ads are defining us, keeping us in ever-shrinking boxes. Now that I've checked under the bed, the result (which I understand is only the tip of the iceberg) is alarming, and it drives home the point that consent-based data protection will not truly safeguard privacy.

Collective reckoning

The meat of the inquiry is what we should do about it. Should we eradicate data collection? Is that even possible or practical? The scale of this question, with its impact on government, health, and economic systems, is too daunting for someone with my limited knowledge and exposure to entertain.

What is clear is that it will take the concerted effort of the various actors involved. Narrowing the scope to what we can do about data stored online and collected by Big Tech and advertisers: on an individual level, we can simply unplug and stop engaging with the product. This is simple to state but difficult to do, and it essentially plugs the hole through which our data is siphoned. The problem is that not everyone else can be expected to disengage, leaving us feeling isolated from our peers. Perhaps it is a matter of framing: would you rather be left out, or spied on?

On the corporate level, one idealistic solution would be to preserve space for advertising, but only in a general manner, free from microtargeting. This assumes that Big Tech cares enough about its users to limit the reach of advertisers, which is highly unlikely. A smaller step these companies could take is to reverse the paradigm and offer "opt-in" schemes, which would place some control in the hands of their users. The existing "opt-out" framework means that data collection is the status quo and your consent to such collection is presumed (Google advises: "You can install a browser plugin to maintain your preference to opt out of personalized ads from Google, even if you've cleared your cookies."). Under an opt-in scheme, a user must affirmatively grant the company access to the data, meaning users can withhold consent and prevent their data from being processed in the first place. Of course, the parameters of what the user can "opt in" to will be defined by the company. But this is another step forward, should companies actually make the move.
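To make the paradigm reversal concrete, here is a minimal sketch (in Python, using hypothetical names of my own; no company's actual system is implied) of how the two defaults differ: under opt-out, a new user's data may be processed from day one unless she objects, while under opt-in nothing may be processed until she affirmatively agrees.

<verbatim>
# A minimal, illustrative sketch of consent defaults.
# All names (ConsentRecord, may_process, etc.) are hypothetical;
# this models the policy difference, not any real company's API.

from dataclasses import dataclass

@dataclass
class ConsentRecord:
    user_id: str
    processing_allowed: bool

def new_user_opt_out(user_id: str) -> ConsentRecord:
    # Opt-out regime: collection is the status quo, so consent is
    # presumed until the user takes an affirmative step to withdraw it.
    return ConsentRecord(user_id, processing_allowed=True)

def new_user_opt_in(user_id: str) -> ConsentRecord:
    # Opt-in regime: withholding consent is the default, so no data
    # may be processed until the user affirmatively grants access.
    return ConsentRecord(user_id, processing_allowed=False)

def may_process(record: ConsentRecord) -> bool:
    return record.processing_allowed

if __name__ == "__main__":
    assert may_process(new_user_opt_out("alice"))      # tracked unless she objects
    assert not may_process(new_user_opt_in("alice"))   # untracked unless she agrees
</verbatim>

The entire difference sits in a single default value, which is precisely why it matters: the regime that holds by default, not the rights available on request, determines what happens to the data of every user who never opens a settings page.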

Finally, on the macro level, the protection of persons in relation to their data cannot be achieved without the hand of government. Legislative bodies are arguably the main actors, since they can enact meaningful laws protecting consumer privacy (formalizing opt-in consent, for instance). But the political will of the head of state should not be overlooked; our generation may benefit from electing leaders who understand and are alarmed by the problem.

If you did not have a Google ID you use to log in to Google services, Google would not have the ability to track your searching and reading behavior. It is therefore far easier to say what you can do than what we can do, as it is easier for me to control my own digital privacy than "ours."

But it's not hard, as you show, to understand what protecting people rather than data means, by concentrating—as any legal realist trained in the US over the last 75 years would tend instinctively to do—on remedies first. Declaring rights that cannot readily be enforced does not protect the supposed rights holders. A draft that asked how appropriate remedies should work would make great progress in answering your question.

