Law in the Internet Society

KifayaAbdulkadirSecondEssay 3 - 11 Jan 2022 - Main.KifayaAbdulkadir
 
 -- By KifayaAbdulkadir - 09 Dec 2021
Changed:
<
<
It has been said that the key to being the best professional boxer is agility and the ability to think quickly on your feet. When you are in the ring facing an opponent, all planned tactics go out the window. To win, you must observe your opponent, learn from their moves, analyze, and adapt. From time immemorial, we have been in this hyperbolic boxing ring with Big Tech, getting obliterated yet throwing the same punches. Is it perhaps time to switch stances?
>
>
From time immemorial, we have been in a hyperbolic boxing ring with Big Tech, getting obliterated yet throwing the same punches. Is it perhaps time to switch stances? We are at a point in society where our lives are so intertwined with technology that envisioning a future without it is idealistic. Much of our current reality was, not long ago, considered a dystopian future. Tech monopolies have amassed so much power that they have been able to usurp our rights without legal consequences. While we can agree that technology has brought about numerous challenges, is the best response to detach from it completely? For all of its evils, technology has drastically changed our lives, in some ways for the better. We cannot realistically plan a future centered on the elimination of technology unless we opt to go completely off-grid. But at the same time, coexisting with it should not mean sacrificing our fundamental human rights.
 
Changed:
<
<
The prolonged metaphor uses many words, but does it add enough to be worth the baggage? Almost 10% of the space gone, and I have only a very general sense of your point. Why not start by telling the reader what she most needs to hear?
>
>
So what is the solution to this enigma?
 
Changed:
<
<
We are at a point in society where our lives are so intertwined with technology that envisioning a future without it is quixotic. Much of our current reality was, not long ago, considered a dystopian future. Tech monopolies have amassed so much power that they have been able to operate largely without legal consequences. While we can agree that technology has brought about numerous challenges, is the best response to detach from it completely? For all of its evils, technology has drastically changed our lives, in some ways for the better. We cannot realistically plan a future centered on the elimination of technology unless we opt to go completely off-grid. So what is the solution to this enigma?
>
>
Numerous recommendations have been made. Users have been advised to transition to alternative applications to minimize dependency on Big Tech: using DuckDuckGo instead of Google, for example, since these alternatives do not mine as much data. On the legal side, the government has been doubling down on antitrust laws in an attempt to curb the monopoly power of these large companies. Increasing competition in the technology field would supposedly incentivize these platforms to perform at their best; once consumers have a wide variety of options to pick from, they can settle on those that prioritize their privacy. However, antitrust laws alone do not suffice in a world of ever-evolving technology. And though there is speculation that Big Tech might be moving away from trading private data, I must confess I am unconvinced about the plausibility of this scenario. Even if Big Tech companies did decide to step away from trading private data, who is to say other companies would not step in to fill the vacuum? In the capitalist world we live in, there will always be buyers and sellers, regardless of a change in the ‘service providers’.
 
Changed:
<
<
Really a false opposition, isn't it? The implicit premise is that the only possible technology is of a kind that concentrates power in this undesirable fashion. But this premise is not established. I have been laboring to show why I believe it is indeed false. If we can use technology in ways that support our interconnection without delivering behavior data on us all into the hands of a few entities, "private" or "public" as they may be, wouldn't this apparent dichotomy dissolve?
>
>
So what are our options? Cutting off technology, as we have seen, is not a solution. We also cannot place the task entirely on users to be vigilant in protecting their privacy. Relying on these platforms to develop a conscience and prioritize consumers' rights is obviously out of the question. Additionally, given the multifaceted nature of technology, we cannot rely on the law alone to address this concern.
 
Changed:
<
<
Numerous recommendations have been made. Users have been advised to transition to alternative applications to minimize dependency on Big Tech: using DuckDuckGo instead of Google, for example, since these alternatives do not mine as much data. On the legal side, the government has been doubling down on antitrust laws in an attempt to curb the monopoly power of these large companies. Increasing competition in the technology field would supposedly incentivize these platforms to perform at their best; once consumers have a wide variety of options to pick from, they can settle on those that prioritize their privacy. However, antitrust laws alone do not suffice in a world of ever-evolving technology. And though there is speculation that Big Tech might be moving away from trading private data, I must confess I am unconvinced about the plausibility of this scenario. Even if Big Tech companies did decide to step away from trading private data, who is to say other companies would not step in to fill the vacuum? In this capitalist world we live in, there will always be buyers and sellers, regardless of a change in the ‘service providers’.
>
>
The challenges technology poses today have been debated for years, yet Big Tech has maintained its stronghold. The main reason is that we have left the future of technology entirely in the hands of technologists. We have let them decide for us what is important, and when faced with competing interests, they will always prioritize profit over human values. In order to conquer this behemoth, we need to change the manner in which technology products are created and subsequently used. There is a power imbalance between consumers and tech companies, one which the companies have exploited to our detriment. The only way to reclaim our rights is to become a well-informed society. A surveillance economy is not the only option for our use of technology; it is the option Big Tech prefers because it is more profitable. If we are aware of the better options available to technologists, then perhaps we can sway the path of technological development.
 
Changed:
<
<
So what are out options?
>
>
If we change our position from being merely consumers to prosumers, then we can achieve the outcome we seek: a harmonious coexistence with technology, one rooted in the protection of our human values. In the words of Antoine de Saint-Exupéry: ‘If you want to build a ship, don't drum up people to collect wood and don't assign them tasks and work, but rather teach them to long for the endless immensity of the sea.’ If we deeply learn about digital technology, about the algorithms and the functioning of the system, then we can avoid being used by it. This approach is not one that can be achieved by individuals or by one generation alone. Shielding children from the dangers of technology will not help curb the problem. Education is the engine through which we can achieve change. By teaching young children about the design of technological products and the ethical concerns behind them, we can raise a generation of knowledgeable, civic-minded citizens who will prioritize social concerns in technological development. Only then can we trust that the next phase of technology will better protect our human rights.
 
Changed:
<
<
Minus the typo, this is the real question this draft is valuably unearthing. What do you want to learn about the options?
>
>
We are the key to a better digital future. The legal solutions provided thus far have been woefully inadequate and myopic because regulators have been playing catch-up with Big Tech. Technology is evolving exponentially, and because the law is slow to keep up, regulatory gaps are exposed. This is why companies have continuously been able to skirt the operation of the law. By the time a regulation has been implemented, these companies have introduced new innovations that raise new legal and ethical questions and do not fit into the existing categories. By being prosumers, we blur the lines between technologists and consumers. Legal interventions can then be holistic, as they arise from a fully informed perspective, with lawyers, technologists, policymakers, and consumers at the decision-making table.
 
Added:
>
>
The more technology advances, the more intrusive it will get and the more problems we will face. If we intervene at the inception stage, we can curb the problem before it becomes intractable. Big Tech has proven it cannot be trusted to be a proper custodian of technology. This is a systemic problem that cannot be addressed by individuals alone; it requires an overhaul of the entire system. Every citizen, not only technologists, should have a voice in determining the things that will affect us. By weakening technologists' epistemic monopoly, we empower consumers to govern technology instead of being governed by Big Tech.
 
Deleted:
<
<
Cutting off technology, as we've seen, is not a solution. We cannot place the task entirely on users to be vigilant in protecting their privacy. And while we can encourage adults to do so, we cannot reasonably expect young children to take these protective measures or even to understand the exigency of the matter.

Oh, on the contrary. The curiosity of children is the most powerful of social forces. But it's not a matter of teaching them to deal with an exigent threat. Given an opportunity to explore how things work and to explain them to one another, children can invent and accomplish anything they set out to do. They could master how servers work just as easily as they can master iPads, and in the process they can make Big Tech less useful to everyone around them, thus less powerful.

Relying on these platforms to develop a conscience is obviously out of the question.

I do not profess to have all the answers on the best way forward, but as a lawyer I am hard-wired to look towards the law to solve crises.

Now is the time to soften up the hardwiring; that's what we're here for. American legal realism begins from predicting what courts will do in fact and ends caring about what social outcomes are in fact, regardless of what it says in the books. That's why in my 1L course we don't define "law," only "lawyering," which is defined as making things happen in society using words. Lawyers are therefore people using words to make things happen. Teaching, making software, giving counsel, appearing in court, talking to policy-makers, negotiating with and among businesses—the point is to learn effectiveness in lawyering, not law.

`"Hardwiring" means actually "implicit unquestioned assumptions." The purpose of recognizing them is not to dispose of them, but rather to put them in dialogue with the rest of what we know and can learn.

The legal solutions provided thus far have been woefully inadequate and myopic. I believe the biggest problem contributing to the inefficacy of the laws is that regulators have been struggling to catch up with Big Tech. Technology is evolving exponentially, and because the law is slow to keep up, regulatory gaps are being exposed. This is why these companies have continuously been able to skirt the operation of the law. By the time a regulation has been implemented, these companies have introduced new innovations that raise new legal and ethical questions and do not fit into the existing categories.

In order to conquer this behemoth, we need to address this problem not only from the status quo but through the prism of the future. We need to think ahead and use the law to obviate problems, not merely to solve them once they have permeated society. Granted, the challenges brought about by disruptive technology are unprecedented. But technological innovations do not occur spontaneously; if policymakers analyzed the data surrounding technology, they would be able to tell which innovations are likely to hit the market and which ones require extra regulation.

It would be simpler to talk to the people who make it, which they do all the time.

Elon Musk, for example, recently announced that his company Neuralink would begin implanting brain chips in humans next year, to assist those with spinal injuries. Facebook had also confirmed that it was working on a similar brain-interface project that would read brain signals and convert them to text, but it quickly abandoned that effort and opted instead for one that reads muscle signals in the arm. It would not be far-fetched to assume that neural-interface technology lies somewhere in the near future.

My dear friend Hal Abelson at MIT has been telling me for almost thirty years that the direct neural interface to computers is twenty years away. And he's right: at the end of some twenty-year period it will happen, but those twenty years haven't begun yet.

Policy-makers have to sort out hype from actionable intelligence all the time, taking into account just how much bullshit they are subjected to all the time and the real scope of possible action given the political context within which they make policy. You are underestimating the sophistication of their efforts, whatever their level of resources. Smart people do this work, and they try not to get blown around by smoke. Musk, as you are aware, is always partly smoke.

Presumably the reason we are not fully there yet is that technological hurdles still need to be overcome. Can we imagine the data-privacy concerns that will arise once the line between machine and brain is completely blurred? It is therefore not unfathomable for regulators to predict where innovation is headed, so long as they keep abreast of the industry and analyze trends.

They don't have to do this by themselves. There are people who have been thinking about these questions for two generations already. Some of the policy-makers could be my grandchildren, if I'd ever had children, so the issues have been thought about literally since their grandparents' time. You are a really important and indeed decisive participant in that conversation, because yours is the last generation that gets a choice. But you will be more effective if you begin knowing that you are joining a conversation and don't have to invent it.

The more technology advances, the more intrusive it will get and the more problems regulators will have on their hands. We have identified why things have not been working, and like a professional boxer, we need to learn, analyze, and adapt. The first step should be to change the way we are using the law in this fight against Big Tech. The best approach to this regulatory dilemma is for policymakers to be dynamic and proactive. Let us use the law to curb problems before they grow, and not only attempt to fix them once they have become intractable. Flexibility is the most important trait in the art of boxing: when things are not going according to plan, one needs to change tactics. Perhaps once we shift tactics and identify a new path to follow, the light will become clearer as we hobble along.

Returning to your metaphor gets in the way.

This is a fine first draft: it clears away the brush and shows the central question the next draft would benefit from exploring: What are the options? By learning about the options you examine two implicit assumptions: that technology we can use as we use present services inevitably leads to centralized private or public behavior collection; and that legal determinism is, like technological determinism, a sound analytic bet. If, as I believe, neither law nor technology determines social outcomes, and another future is always possible, then learning about Nelson's Xanadu, Tim Berners-Lee's Re-Decentralization of the Web, or FreedomBox might help in imagining practicable options, in which technology, law, and politics all play necessary and interrelated parts.

 


