Law in the Internet Society

NataliaNegretSecondEssay 3 - 10 Jan 2022 - Main.NataliaNegret
How social media platforms are wrecking our rights and what we must do

 -- By NataliaNegret - 08 Dec 2021

Facebook, Twitter, and Instagram are not "mere intermediaries" in the relationship between users and the internet. Social media is presented as a space for others' virtual interactions. This vision must change so that platforms are held accountable for the services they provide. I believe public policy must shift towards users' protection: social media platforms need to be accountable for creating the space, profiting from users' data, and filtering and curating the content shown on their feeds. There is a colossal lie behind the idea of social media "empowering" users. The apparent voice users receive by creating an account is subject to the platform's terms and conditions, policies, and algorithms. It is therefore a voice subject to the directives these companies pursue as private businesses. The rules and limits that apply to others who collect and disseminate information and data should apply to the platforms as well.
 

Does the EU provision on platform liability fail to address the problem?
 
The privacy-rooted policy recently implemented in the EU is a wake-up call to act, but it fails to address the root issue. This EU policy rests on the idea that the EU must protect the internet and, to that end, grants "limited liability" to online platforms. How are we supposed to protect users' rights if the platforms are not accountable? Within this tangle of efforts to do something, the EU courts have created grandiose rights, such as the right to be forgotten, which, like "limited liability," protects no one.
The EU policy avoids policing the internet and boasts of protecting net neutrality. The internet can be free, and so can the individuals who use it; these two freedoms do not oppose each other. Nonetheless, privacy and limited liability are not the pillars that secure them. Public policy should impose liability on those who aim to take control and to profit.
 
What kind of responsibility must we push for?

 
People who use social media become so vulnerable that the platforms that create and nurture digital dependence must somehow be held liable. Under a liability regime that promotes a free internet and users' rights, platforms' digital curation and data collection can be changed. A free internet is, without a doubt, something we must protect. The idea of "freedom," however, must rest on free access and on the possibility of choosing. Once the platforms are held accountable, the "free" services offered in exchange for our data will be over. Those who run the platforms do not act in good faith.
 
As users, we must fight for transparency while acknowledging that we lose our ability to choose when we engage with these platforms. The public policy enacted to hold platforms liable and to restore users' capacity to make decisions must be structured around four pillars. First, educate the public and disclose how our data produces revenue for the platforms. Second, disclose the ways users are targeted through the platforms' algorithms and advertising. Third, explain how targeting diminishes, indeed extinguishes, our freedom of thought and decision: every engagement, like, swipe, and click feeds both the revenue and the empire of control these platforms exercise over our minds. The "hidden" influence must be shown for what it is: the direct imposition and control of behavior and thought. Lastly, educate the public on how this manipulation of thought and behavior is achieved.
 
If users understood that the manipulation cycle begins with engagement with the platform; that this engagement becomes data for the platform; that the platform uses and manipulates that data to produce the behavior it desires; and that this induction of actions (ranging from further engagement to fixing ideas and imposing shopping patterns) deprives people of their ability to choose, then many more would reconsider their presence on and engagement with those platforms. In this regard, to make the platforms accountable we must first educate people about their impact, their harm, and their oppression of the independence and freedom of human nature.
 
The people responsible for dictating and implementing our public policy on the free internet must act. Beyond imposing liability, they must provide education on creating and operating software, storage, and services. With accountability, the platforms' centralized knowledge can, and would, be spread. They are not doing rocket science: they collect data, manipulate it, and profit from it, and it is in our hands to pursue change. The vulnerability into which they have plunged the people who use and engage with their services must be reversed, or at least made visible.

Revision r3 - 10 Jan 2022 - 05:18:28 - NataliaNegret
Revision r2 - 02 Jan 2022 - 16:37:48 - EbenMoglen
Revision r1 - 08 Dec 2021 - 21:10:12 - NataliaNegret