Law in the Internet Society

What We Want To See and What We Need To See

-- By RochishaTogare - 06 Dec 2021

From the moment you download and open TikTok, the algorithm is watching what you watch. How long you spend on a video before scrolling, how you interact with it, which videos you skip and which you share: everything is used to create your "For You" page. And this isn't unique to TikTok.

YouTube does the same to build your Recommended videos page. Turned off browsing and search history on the platform? It won't change much; your IP address, your watch time, your interactions, and the interactions of those connected with you are all still tracked and analyzed. Furthermore, Alphabet, YouTube's parent company, which also owns Google and the broader suite of G-products like Gmail and Google Drive, scans your emails, tracks your search history, and traces your metaphorical internet footprints to ensure you have the "best viewing experience."

“It’s OK, it’s anonymous.”

When platforms are asked why, and how, it's acceptable for them to track user data as closely as they do, two responses are common.

The first response is that the data is anonymized. Anonymized data is data that can't be linked back to the user it came from. Much of the modern internet runs on such data, and while it is true that a single data point usually can't be traced to a particular person, the many pieces of data collected from one user can eventually be combined into a profile akin to human DNA: unique, just without a name attached.

Curated but not curated

The second is that platforms bear no responsibility for their users' content: they argue they are not editors or curators, but merely hosts. Unlike in the past, when, as in the context of newspapers, curators, editors, and publishers were largely one and the same, modern law (in the United States, Section 230 of the Communications Decency Act) provides platforms a safe harbor from liability for their users' actions. Facebook is not liable for the content posted on its platform, and neither are TikTok, Instagram, Snapchat, YouTube, or any other social media service. As of now, there is no clear liability for the way companies use advertising to reach users, and, beyond standard advertising law, these platforms' terms and conditions allow anonymized but otherwise individual data to be used to target users in this way.


One possible solution is to attach liability to the intent of the algorithm being used: distinguishing algorithms designed for ease of use, such as connecting friends with friends or building local relationships, from algorithms designed to produce targeted advertising and political influence. Such determinations will undeniably require case-by-case analysis. In the end, I believe we must arrive at some solution, legal or otherwise, to categorize and regulate the influence that social media, AI, and internet data undeniably have over our lives.

This has served the purpose of a first draft: to clear away the brush and expose the issue. But the next draft needs to do what this one does not. Too much space here is spent describing a problem with which we are all familiar, having spent a semester talking about it. So we can put all that into a few sentences and come to the idea. What we have now is that computer programs should be regulated according to their "intent." That's not viable: we don't try to regulate human or corporate conduct that way either. We need an act, and harm, and duty to avoid the harm, and then—in some situations—we say we concern ourselves with "intent." But, as Oliver Wendell Holmes Jr. showed more than 125 years ago, we don't really mean it. Is federal legislation predicated on the interstate commerce power to regulate software on the basis of "the algorithm's intent," where maximizing engagement or delivering advertisements are to be treated as malign intentions? Surely we can agree that's not happening. We need, then, a different form of idea than a magic "solution."




r2 - 07 Jan 2022 - 19:02:15 - EbenMoglen