Law in the Internet Society


-- By KifayaAbdulkadir - 09 Dec 2021

It has been said that the key to being the best professional boxer is agility and the ability to think quickly on your feet. When one is in the ring facing an opponent, all planned tactics should go out the window. To win, you must observe your opponent, learn from their moves, analyze, and adapt. From time immemorial, we have been in this hyperbolic boxing ring with Big Tech, getting obliterated yet throwing the same punches. Is it perhaps time to switch stances?

The prolonged metaphor uses many words, but does it add enough to be worth the baggage? Almost 10% of the space gone, and I have only a very general sense of your point. Why not start by telling the reader what she most needs to hear?

We are at a point in society where our lives are so intertwined with technology that envisioning a future without it is quixotic. Much of our current reality was, not long ago, considered a dystopian future. Tech monopolies have amassed so much power that they have been able to operate largely without legal consequences. While we can agree that technology has brought numerous challenges, is the best response to detach from it completely? For all its evils, technology has drastically changed our lives, in some ways for the better. We cannot realistically plan a future centered on eliminating technology unless we opt to go completely off-grid. So what is the solution to this enigma?

Really a false opposition, isn't it? The implicit premise is that the only possible technology is of a kind that concentrates power in this undesirable fashion. But this premise is not established. I have been laboring to show why I believe it is indeed false. If we can use technology in ways that support our interconnection without delivering behavior data on us all into the hands of a few entities, "private" or "public" as they may be, wouldn't this apparent dichotomy dissolve?

Numerous recommendations have been proposed. Users have been advised to switch to alternative applications to minimize dependency on Big Tech, for example using DuckDuckGo instead of Google, since these alternatives do not mine as much data. On the legal side, governments have been doubling down on antitrust laws in an attempt to reduce the monopoly power of these large companies. Increased competition in the technology sector would supposedly incentivize these platforms to give their best; once consumers have a wide variety of options to pick from, they can settle on those that prioritize their privacy. However, antitrust laws alone cannot suffice in a world of ever-evolving technology. And though there is speculation that Big Tech might move away from trading private data, I must confess I am unconvinced of the plausibility of this scenario. Suppose, hypothetically, that Big Tech companies do decide to step away from trading private data: who is to say other companies won't step in to fill the vacuum? In the capitalist world we live in, there will always be buyers and sellers, regardless of a change in the 'service providers'.

So what are our options?

This is the real question this draft is valuably unearthing. What do you want to learn about the options?

Cutting off technology, as we've seen, is not a solution. We cannot place the task entirely on users to be vigilant in protecting their privacy. And while we can encourage adults to do so, we cannot reasonably expect young children to take these protective measures or even to understand the exigency of the matter.

Oh, on the contrary. The curiosity of children is the most powerful of social forces. But it's not a matter of teaching them to deal with an exigent threat. Given an opportunity to explore how things work and to explain them to one another, children can invent and accomplish anything they set out to do. They could master how servers work just as easily as they master iPads, and in the process they can make Big Tech less useful to everyone around them, thus less powerful.

Relying on these platforms to develop a conscience is obviously out of the question.

I do not profess to have all the answers on the best way forward, but as a lawyer I am hard-wired to look to the law to solve crises.

Now is the time to soften up the hardwiring; that's what we're here for. American legal realism begins from predicting what courts will do in fact and ends caring about what social outcomes are in fact, regardless of what it says in the books. That's why in my 1L course we don't define "law," only "lawyering," which is defined as making things happen in society using words. Lawyers are therefore people using words to make things happen. Teaching, making software, giving counsel, appearing in court, talking to policy-makers, negotiating with and among businesses—the point is to learn effectiveness in lawyering, not law.

"Hardwiring" really means "implicit unquestioned assumptions." The purpose of recognizing them is not to dispose of them, but rather to put them in dialogue with the rest of what we know and can learn.

The legal solutions provided thus far have been woefully inadequate and myopic. I believe the biggest problem contributing to the inefficacy of the laws is that regulators have been struggling to catch up with Big Tech. Technology evolves exponentially, and because the law is slow to keep up, regulatory gaps are exposed. This is why these companies have continuously been able to skirt the operation of the law. By the time a regulation is implemented, these companies have introduced new innovations that raise new legal and ethical implications and do not fit the available categories.

To conquer this behemoth, we need to address the problem not only from the status quo but through the prism of the future. We need to think ahead and use the law to obviate problems, not merely to solve them once they have permeated society. Granted, the challenges brought about by disruptive technology are unprecedented. But technological innovations do not occur spontaneously: if policymakers analyzed the data surrounding technology, they could tell which innovations are likely to hit the market and which require extra regulation.

It would be simpler to talk to the people who make it, which they do all the time.

Elon Musk, for example, recently announced that his company Neuralink would begin implanting brain chips in humans next year to assist those with spinal injuries. Facebook had also confirmed that it was working on a similar brain-interface project that would read brain signals and convert them to text, but it quickly abandoned that effort and opted instead for one that reads muscle signals in the arm. It would not be far-fetched to assume that neural-interface technology lies somewhere in the near future.

My dear friend Hal Abelson at MIT has been telling me for almost thirty years that the direct neural interface to computers is twenty years away. And he's right: at the end of some twenty-year period it will happen, but those twenty years haven't begun yet.

Policy-makers have to sort hype from actionable intelligence all the time, taking into account just how much bullshit they are subjected to and the real scope of possible action given the political context within which they make policy. You are underestimating the sophistication of their efforts, whatever their level of resources. Smart people do this work, and they try not to get blown around by smoke. Musk, as you are aware, is always partly smoke.

Presumably the reason we are not fully there yet is that technological hurdles must first be overcome. Can we imagine the data-privacy concerns that will arise once the line between machine and brain is completely blurred? It is therefore not unfathomable for regulators to predict where innovation is headed, so long as they keep abreast of the industry and analyze trends.

They don't have to do this by themselves. There are people who have been thinking about these questions for two generations already. Some of the policy-makers could be my grandchildren, if I'd ever had children, so the issues have been thought about literally since their grandparents' time. You are a really important and indeed decisive participant in that conversation, because yours is the last generation that gets a choice. But you will be more effective if you begin knowing that you are joining a conversation and don't have to invent it.

The more technology advances, the more intrusive it will get and the more problems regulators will have on their hands. We have identified why things have not been working, and like a professional boxer, we need to learn, analyze, and adapt. The first step should be to change the way we use the law in this fight against Big Tech. The best approach to this regulatory dilemma is for policymakers to be dynamic and proactive: use the law to curb problems before they grow, not only to fix them once they have become intractable. Flexibility is the most important trait in the art of boxing: when things are not going according to plan, one needs to change tactics. Perhaps once we shift tactics and identify a new path to follow, the light will become clearer as we hobble along.

Returning to your metaphor gets in the way.

This is a fine first draft: it clears away the brush and shows the central question the next draft benefits from learning about: What are the options? By learning about the options you examine two implicit assumptions: that technology we can use as we use present services inevitably leads to centralized private or public behavior collection; and that legal determinism is, like technological determinism, a sound analytic bet. If, as I believe, neither law nor technology determines social outcomes—that another future is always possible—then learning about Nelson's Xanadu, Tim Berners-Lee's re-decentralization of the Web, or FreedomBox might help in imagining practicable options, in which technology, law, and politics all play necessary and interrelated parts.




r2 - 30 Dec 2021 - 15:49:01 - EbenMoglen