Law in the Internet Society

 -- By KifayaAbdulkadir - 22 Oct 2021
Arguably, one of the greatest evils prompted by global digitalization is surveillance capitalism. Personal data is the currency of the digital world, and we have been handing it over on a silver platter. Social media has infiltrated our lives to the extent that it is impossible to exist in this day and age without leaving a digital imprint. The amount of online data available has increased, and so has the monetary value of that information to companies. In 2020 alone, targeted online advertising generated over 84.2 billion U.S. dollars for Facebook, amounting to 97.9% of Facebook’s total global revenue; 182.5 billion U.S. dollars for Alphabet, Google’s parent company; 3.2 billion U.S. dollars for Twitter, amounting to 82% of its total global revenue; and 25.1 billion U.S. dollars for Amazon.
Targeted advertising is a form of digital advertising that allows companies to direct their sponsored posts towards a specific demographic. Its basis is affinity profiling, in which users are categorized not only according to their ‘actual personal attributes’ but also according to their ‘assumed interests’, based on their likes and interactions on the platform. Social media platforms track users’ activity, collate that information, and share it with advertising companies, which can then send ads to users with the specific attributes they are looking for and reach their target audience. Beyond being a violation of privacy, targeted advertising has been used by companies to unlawfully discriminate against traditionally marginalized groups.
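
To make the mechanism concrete, here is a minimal sketch in Python of how attribute- and affinity-based audience building can silently exclude people. It is a hypothetical model, not any platform's actual API: the field names (declared_attributes, inferred_affinities) and the exclusion option are illustrative assumptions.

```python
# Hypothetical sketch of affinity-based audience building -- not any platform's
# real API. Field names and the exclusion option are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    user_id: int
    declared_attributes: dict                                # e.g. {"age": 30}
    inferred_affinities: set = field(default_factory=set)    # e.g. {"hispanic_affinity"}


def build_audience(users, required_attrs=None, excluded_affinities=None):
    """Return the subset of users an ad would actually be shown to."""
    required_attrs = required_attrs or {}
    excluded_affinities = set(excluded_affinities or [])
    audience = []
    for user in users:
        # Drop users who do not match the advertiser's declared-attribute targets.
        if any(user.declared_attributes.get(k) != v for k, v in required_attrs.items()):
            continue
        # Drop users carrying any excluded *inferred* affinity label.
        if user.inferred_affinities & excluded_affinities:
            continue
        audience.append(user)
    return audience


if __name__ == "__main__":
    users = [
        UserProfile(1, {"age": 30}, {"hispanic_affinity"}),
        UserProfile(2, {"age": 30}),
        UserProfile(3, {"age": 55}),
    ]
    # A housing ad aimed at 30-year-olds that excludes one inferred affinity.
    audience = build_audience(users, {"age": 30}, {"hispanic_affinity"})
    print([u.user_id for u in audience])   # -> [2]; user 1 is silently filtered out
```

The point of the sketch is that exclusion happens at audience-construction time: the filtered users never see the ad and never learn that there was an ad to be excluded from.
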
In 2016, Facebook came under fire for its multi-cultural affinity feature, which allowed advertisers to exclude users from a particular ad campaign based on ‘ethnic affinities’. In carrying out its investigation, ProPublica purchased an ad in Facebook’s housing category and was given the option to exclude African-Americans, Asian-Americans and Hispanics. Facebook claimed that ethnic affinity was not synonymous with race but simply an assumption of one’s closeness to a particular ethnicity based on likes and interactions. Its assertion was that this did not amount to discrimination, as it was an inference of one’s ethnicity and not always an accurate assessment. While Facebook removed the ‘ethnic affinity’ feature from the platform, advertisers can still target their ads to specific zip codes, thereby excluding locations with certain demographics, such as predominantly Hispanic or Black communities. Facebook has also used its advertising system to perpetuate gender and age discrimination in job advertisements. This, however, is not an indictment of Facebook alone: last year, Google was exposed for allowing advertisers on its platform to exclude non-binary and transgender people from job and housing ads.
 
2021, however, marked a trend of tech companies attempting to ‘protect’ data privacy. Apple, in a new update, required users to choose whether an app could track them, in order to eliminate the automatic collection of private information. Google announced that it would stop tracking users across websites by removing third-party cookies and would instead target ads at groups of users with similar interests rather than at individuals. Facebook also indicated that it would remove sensitive ad-targeting categories such as age, race and ethnicity from its platform, starting from January 19, 2022. This would, however, leave other categories open, through which discrimination could still be carried out.
 
Big Tech companies’ primary concern has been, and always will be, profitability. Any changes they make to ‘remedy’ violations of privacy are merely a ruse. These companies will not develop a moral conscience; it would be idealistic to expect them to provide solutions to the problems they created. The proposed changes also neglect a fundamental problem: algorithmic bias. A study found that the bias in Facebook ads was built into the algorithm itself. Even where companies do not segment their adverts by age, for instance, the delivery algorithm will still steer them towards a specific age or gender group. Google’s search engine has likewise been accused of being sexist and racist. Technology is as biased and flawed as the humans behind it. These companies have designed their algorithms in a manner that permits and reflects biases in the real world, and that is how those biases present themselves in the digital world.
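
A minimal sketch of how this can happen, using invented numbers rather than any platform's real model: if the delivery system simply ranks users by engagement rates learned from skewed historical data, a job ad with no demographic targeting at all still ends up concentrated in one group.

```python
# Hypothetical sketch of ad "delivery optimization" -- the engagement rates are
# invented for illustration and do not come from any real platform.

# Click rates the platform has "learned" from skewed historical data.
LEARNED_CLICK_RATE = {"men_18_34": 0.09, "women_18_34": 0.03, "users_55_plus": 0.01}


def deliver(impression_budget, users):
    """Greedily show the ad to the users with the highest predicted click rate."""
    ranked = sorted(users, key=lambda u: LEARNED_CLICK_RATE[u["group"]], reverse=True)
    return ranked[:impression_budget]


if __name__ == "__main__":
    # Equal numbers of users in each group; the advertiser targets no one
    # and excludes no one.
    users = [{"id": f"{group}_{i}", "group": group}
             for group in LEARNED_CLICK_RATE
             for i in range(1000)]

    shown = deliver(1500, users)
    counts = {}
    for u in shown:
        counts[u["group"]] = counts.get(u["group"], 0) + 1
    print(counts)   # -> {'men_18_34': 1000, 'women_18_34': 500}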
 
To be clear, my argument here is not for an ‘equality of infringement of rights’. It would be counter-intuitive to argue that this invasive practice should be ‘fair and unbiased’ amongst all users while at the same time arguing for its elimination. The bottom line is that the mining of personal data is an inherent violation of privacy and a practice that needs to be done away with. The purpose here is simply to highlight the varied layers of inequality presented by surveillance advertising. If the ultimate goal is not the total elimination of technology but the creation of a people-centered internet, one that is not propelled by the trading of private data, then it is important to draw out the essential components contributing to the current state of affairs. Targeted advertising has been the profit driver for these companies, and a main contributor to its discriminatory practices is the algorithm they have created and continue to use. If we are to achieve a digital future centered on the protection of human values, then the first step must be ensuring algorithmic accountability and transparency.
 

