Law in the Internet Society

What We Want To See and What We Need To See (Second Draft)

-- By RochishaTogare - 09 Jan 2021

From the moment you download and open TikTok, the algorithm watches what you watch in order to build your For You Page. And this isn't unique to TikTok.

YouTube does the same in order to develop your Recommended Videos page. Turned off browsing and search history on the platform? It's not going to change much; your IP address, your watch time, your interactions, and the interactions of those connected with you are all still tracked and analyzed. Tech companies trace your metaphorical internet footprints to ensure you have the "best viewing experience."

When social media platforms are questioned about why and how it's acceptable to track user data as closely as they do, two responses are common.

“It’s OK, it’s anonymous.”

The first response is that the data is anonymized. Anonymized data is data that can't be linked back to the user it came from. Much of the modern world runs on this anonymized data, and while it's true that an individual data point usually can't be traced back to a user, the various pieces of data collected from a user can eventually be combined into a profile that is akin to human DNA, just without the name attached. In fact, there was a whole scandal surrounding paparazzi photos of celebrities entering taxi cabs in New York. Using the medallion numbers plainly visible in those photos, an individual was able to track each cab and determine the start and end locations of the trip, the total fare, and the tip the celebrity paid.

Taxi cabs and paparazzi photos are decidedly low-tech compared with the amount of data that Big Tech companies collect. We all know that the data these platforms collect, anonymized or otherwise, is sold to advertisers. The tradeoff is that we get access to content for free in exchange for this information. That said, even ad-free platforms we pay for, like Spotify Premium or YouTube Premium, still track our data. In fact, many companies require, or at least claim to require, this data in order to operate their services.

A right to privacy, a right to honesty

If we argue that privacy is a right, just as some argue that access to the world wide web is a right, would things change? We could then say that if privacy is a right, Big Tech creates harm by jeopardizing it. We can start simple and pose that the least tech companies should do is protect their users from false information targeted to their data profiles (think the Cambridge Analytica mess), or that, given all the benefits Big Tech gains from user data, they should spend more on protecting their user bases. Yet the big four tech companies alone (Amazon, Apple, Facebook, and Google) spent $55 million on lobbying in 2018, and have instead seemed to redirect their efforts toward shirking responsibility for the millions of users' rights they hold in their hands.

Even if we don't want to consider the right to privacy, we should at least consider the right to the truth, and the harm individuals face when platforms ignore the repercussions of misinformation on their users. Even without delving into the divisive world of politics, we see cause for concern in Facebook's years-long refusal to let individuals request takedowns of their own photos. Indeed, even copyright law gives us few rights to our own likeness: a photographer who takes a photo of you owns that photo, despite it being your image. Similarly, for years there was no way to take down a photo of you that someone uploaded to Facebook without your consent. Only more recently have states adopted some measures to protect victims in cases of revenge porn, yet that remains the exception.

Curated but not curated

Platforms escape liability for user-posted content by arguing that they bear no editorial responsibility to the user. Platforms, the argument goes, are not editing or curating, but are simply hosting content others publish. This is their second defense. Unlike in the past, when, as with newspapers, curators, editors, and publishers were largely one and the same, modern law provides platforms safe harbor from their users' actions. That is, Facebook is not liable for the content posted on its platform, and neither is TikTok, Instagram, Snapchat, YouTube, or any other social media platform. As of now, there is no clear liability for the way companies use advertising to reach users, and similarly, beyond standard advertising law, these platforms' terms and conditions allow anonymized but otherwise individualized data to be used to target users in this manner.

Is there a solution?

We cannot deny that there are some conveniences to having our data collected. We open YouTube and get videos likely to interest us, or we scroll through Spotify and find songs that are definitely our jam. Are these conveniences worth it? One possible solution is to base liability on the intent of the algorithm being used (to connect people with friends versus to connect people with advertisers), but such intent is hard to regulate, and it's certainly not how individuals and non-tech companies are regulated. Even if used, intent is but one piece of the larger picture. Regardless, such determinations will undeniably require case-by-case analysis.

Ultimately, I believe that for any solution to take hold, we need to value privacy as much as we value convenience. And this valuing has to come from the user base that is so beholden to these platforms. We need to demand change with our actions, perhaps first and foremost by giving up the convenience of these platforms, because regulation is far from creating the change needed to recognize, let alone protect, our rights.




r3 - 10 Jan 2022 - 05:00:39 - RochishaTogare