Law in the Internet Society

JakeBlecherSecondEssay 4 - 02 Feb 2020 - Main.JakeBlecher
The Regimes of Privacy

-- By JakeBlecher - 02 Feb 2020
 
The announcement of the CCPA marked a shift in American society; all of a sudden, everyone cares about data privacy. Eighteen other states have entered the legislative process to some extent, whether through discussions of a potential law or by fully passing their own. Some states, like New Jersey, which were once content to couch data privacy in existing consumer protection laws, have realized the need to explore more comprehensive solutions. But as the movement toward establishing protections takes off, are we actually heading in the wrong direction?
 

The Notice and Consent Regime

The current regime of data privacy law is summed up by one idea: notice and consent. For years, this meant that a corporation had limitless rights as far as data collection and use were concerned, so long as it published a list of categories for consumers to see. At first blush, this seems like an optimal system; just as in contract law, the personal autonomy of the interested parties is preserved, while everyone stands on a level playing field as far as knowledge of the transaction goes. Time showed that this was not the case, however. Companies would adopt a “wide net” policy, requesting permission to store, use, and sell every possible data point they could think of, just for the sake of having it, and would feel justified in doing so because their Privacy Policy said as much. Conversely, consumers would be faced with enormous and incomprehensible documents “informing” them of what they were agreeing to, and would simply not read them. Feeling the need to use the service regardless, consumers were essentially forced into accepting terms they weren’t even aware of.
 
For notice and consent in this form to “work,” there would need to be some means of ensuring that people were actually made aware of their agreements. Imagine a system in which a user, upon visiting a website for the first time, is presented with a full copy of the Privacy Policy. Then, in order to use the service, the user must pass a “quiz” of sorts; this could be exceedingly simple, such as requiring users to fill in the blanks on a handful of phrases selected from the Privacy Policy. That way, the website could feel confident that the user didn’t simply check the “I Agree” box without paying any attention to the agreement being made. Now consider that few services, if any, collect no data at all (and this hypothetical ignores the fact that in-person services collect data too). Every single web page would require a quiz. Consumers would riot. Not to mention the economic inefficiency of such a change: productivity would be slowed so much by this new imposition that we shouldn’t ask for it, even if it would “work.”
 

The Updated Regime

The Cambridge Analytica scandal made one thing overwhelmingly clear: people had no idea what they were agreeing to in exchange for the ability to use certain services, and many of them cared about the privacy they had unknowingly forsaken. In reaction, the existing regime has been widened, albeit only slightly. Notice and consent still reigns supreme, but legislatures have finally started crafting statutory requirements that can’t be contracted away. As there is no federal law at present, these requirements differ by jurisdiction, but common ones include the “right to be forgotten,” which requires data collectors to maintain a data deletion protocol available upon request; the right to view your own personal data; and the right to opt out of data collection without discrimination, meaning without losing access to the product.

The last of these is the most problematic. The right to opt out of data collection without discrimination requires a business to operate under two different models: one in which it runs on the profits of data collection, and one in which it must operate independently of those funds. Theoretically, everyone could opt out of data collection, and under this legal requirement businesses would still need to provide their services. What motivation does Facebook have to continue serving users it can’t collect data from, when the majority of its profits are generated from that data collection? Any legislation containing such a right opens the door to a host of new issues, and any legislation missing it is a half-measure meant to make users feel like they have control without addressing the problem in any meaningful way.
 

An Ecological Regime

The ecological approach to data privacy law, setting meaningful requirements that can’t be contracted around, is another possibility. Like global warming, the data privacy crisis is one in which no one can make their own choices without affecting others.

And yet a fully legislatively driven approach to better data privacy policy has its own hindrances, chief among them the speed, or rather the lack thereof, with which lawmaking bodies operate. Technology evolves at a rate the law has shown no ability to match; a recent amendment to New Jersey’s data breach notification statute was outdated before it went into effect because it lacked language addressing alternative identification methods such as biometric data. As it stands, lawmakers are capable of neither understanding the nuances of technology nor keeping pace as would be necessary to produce truly comprehensive laws. Administrative rulemaking, while carried out by better-informed parties, moves even more slowly than the legislative process; regulations would be more accurate, but their currency would be sacrificed further.

The only logical path, then, is to match laws to industry standards and address the problems from the industry side. Unfortunately, industry is resistant to change, and, as with global warming, it seems unlikely that the big players will act to their own detriment, even if that detriment is only perceived.
 
