Law in the Internet Society

The Burden of Protection

-- By AnjaKong - 07 Dec 2019

This semester all my classes have clicked together. In addition to this class, I am in Copyright, a seminar on counseling digital innovators and tech-focused clients taught by practicing attorneys in the field, and an externship in the legal department of a large retailer, where I have worked exclusively on its compliance with privacy laws. It has been a fascinating experience to see one issue in four different contexts from four different perspectives. Given the nature of these four courses, it is inevitable that their approaches to any individual issue will not be reconcilable (in fact, this is true for most of the issues I have encountered this semester). The topic I find myself struggling with most is how the burden of privacy should be distributed.

In this course, we have talked about how it is frankly unbelievable that the current expectation is that consumers should be responsible for safeguarding their own privacy. How can users even do so within a framework that has been created to profit off of our personal information? Additionally, even if a user is ignorant of the infringement of their privacy rights and its true cost, or apathetic to the consequences, does that justify current practices? Because companies benefit from consumer ignorance and apathy, they cannot be trusted or expected to truly care about their users' privacy rights. For those who care about their privacy and seek to restrain the proliferation, sale, and use or abuse of their personal information, the burden of doing so becomes gargantuan.

Regulations like the GDPR and CCPA that shift the burden onto companies seem to be a step in the right direction. Through my externship, however, I have witnessed how the flaws in these regulations place a disproportionate burden on small and medium-sized businesses. My placement was at a large retailer that collected customer data for business purposes, including marketing and analytics. Over the course of the externship, I learned a great deal about what types of data the company collected, how it used them, and the various vendors it contracted with to accomplish those purposes. Most were innocuous, used for things like email marketing and customer recordkeeping. But a few were scarier from a privacy viewpoint, such as the vendor that allowed the company to record sessions of a customer using the website, linked to an IP address, and thus, very likely, an identity.

Of course, the purpose for which the company used this vendor and this technology was an innocent one. The company uses the recordings to figure out what issues users are having with its website so that the IT team can fix and improve it. User-submitted complaints and reviews were not enough for the team to understand or replicate problems; recordings made the problems clear, and thus more easily fixed. But without context (and frankly, even with context), this technology is creepy. And it is certainly being used by others with not-so-innocent purposes. Given this, it becomes apparent that current privacy regulations are focused on the wrong actors.

This is something that I have come to realize over the semester. The instructors and guest speakers I have learned from in my other classes come from private sector backgrounds. They acknowledge that privacy issues are extremely important and are similarly disturbed by the headline scandals, like Cambridge Analytica. But they are more comfortable leaving the burden on the individual consumer to take control over their personal information and privacy. For those I have encountered outside of this class whose day jobs focus on the nitty-gritty of complying with the GDPR and CCPA, these privacy regulations are onerous, costly, and misguided.

Now, at the end of the semester, though I am much more educated and experienced in these issues than I was at the beginning, I find myself more on the fence than before. If asked at the start of the semester, I would have wholeheartedly agreed that the burden of protecting individual privacy rights should be placed on those who stand to profit from the collection, use, and sale of personal information. Unglamorously, this opinion arose partly out of a belief that this was the morally right arrangement and partly out of my own complacency with how things are, and thus my reluctance to actively protect myself or change my internet habits. Today, while I still believe that regulation and enforcement are needed, I find myself sympathizing with the average business (not data behemoths like Facebook or Google). I am also more active in reining in the spread of my own personal information, for example by using a VPN as a default rather than only in specific instances. As much as I believe the burden should fall on the system and not on the individual, I find myself taking increasing steps to protect my privacy because the system is so flawed.
It’s ironic that those who believe in protecting their privacy and take steps to do so are in some way relieving the system of that burden and thus enabling it to stay the way it is.

This is a stronger draft than either of the drafts of the previous essay, reflecting the education process that you are also describing.

The central metaphor of this analysis is "on the fence." Implicitly, that assumes there are two possible outcomes and a line between. Not surprisingly, this turns out to be the land of laissez-faire and the land of too much regulation, with a fence in between. Also not surprisingly, this is neither intellectually satisfying nor reflective of your personal behavior, as you show.

This is why I have tried to emphasize that there are always three parts to our understanding of any problem in this area: technology, law and politics are always present and must be thought through consiliently. Nor have I been a big supporter of "data protection" legislation, after all. But the consumer's burden is a technological and political as well as legal question. We need a politics, therefore a system of education, that empowers people to think about their needs and concerns collectively, not just individually. Even people who sort their trash need public trash collection and processing to work on their and the environment's behalf. We need public investment in helping consumers act as well as learn, which doesn't mean public dollars building FreedomBoxes, but does mean helping pro-privacy technologies built by people for people to compete with privacy-destroying centralized versions of the same services.

Once there are multiple domains in which to work, the fence between pro- and anti-regulatory regimes becomes an imaginary structure. A draft which uses all that has clicked together for you to declick yourself from the imaginary fence between the imaginary lands would be a terrific culmination of your term.




r2 - 12 Jan 2020 - 10:33:48 - EbenMoglen