Law in the Internet Society

YanisAlioucheFirstEssay 5 - 18 Jan 2024 - Main.YanisAliouche
The Failure of Digital Contact Tracing during COVID-19

-- By YanisAliouche - 13 Oct 2023

The COVID-19 epidemic represented a massive shift in governments' and populations' reliance on technology to ensure the continuation of work, education, and communications amidst the health crisis. The question thus arose as to whether new digital platforms could help combat the ever-developing epidemic. Governments put this to the test through the implementation of digital contact-tracing apps. This essay addresses how governments' faith in digital contact tracing was misplaced in combatting the spread of the virus, and how it instead provided a venue for empowering surveillance capitalism and weakening democratic principles.
 

The failure of digital contact tracing in combatting the epidemic

Contact-tracing apps had too many flaws and inefficiencies to be useful in combatting the virus. First, app developers confronted the difficulty of coordination within governments, as they were sent from office to office across administrative departments to obtain the necessary permissions, delaying the apps' release. Rushed releases led to numerous issues: Norway's Smittestopp app was swiftly shut down because "the risks of intensified surveillance outweighed the app's as of yet unproven public health benefits". In India, the app was found to leak users' precise locations. This approach of releasing apps and refining them later eroded people's trust in their government, a crucial component of the pandemic fight.
 
The apps functioned through Bluetooth and GPS, which posed significant challenges: the systems knew nothing of surrounding circumstances, such as whether individuals were wearing masks or were separated by a wall. Additionally, a study of Bluetooth contact tracing on a tram revealed highly inaccurate distance measurements, resulting in a 50% false-positive and false-negative rate.
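The fragility described above can be illustrated with a short sketch. This is not any app's actual code; it is a generic log-distance path-loss model of the kind commonly used to turn Bluetooth signal strength (RSSI) into a distance estimate, with assumed parameter values. Because the estimate is exponential in the RSSI reading, the modest signal swings caused by walls, bodies, or a phone in a pocket move the estimated distance by several metres:

```python
# Illustrative sketch (assumed parameters, not any contact-tracing app's
# real code): estimating distance from Bluetooth RSSI with a
# log-distance path-loss model.

def estimate_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate separation in metres from a received RSSI reading.

    tx_power_dbm: assumed RSSI at 1 m (device-dependent).
    path_loss_exponent: 2.0 in free space; higher indoors.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same true separation can produce very different readings once
# obstacles attenuate the signal, so the estimate sweeps a wide band:
for rssi in (-60, -65, -70, -75):
    print(f"RSSI {rssi} dBm -> about {estimate_distance(rssi):.1f} m")
```

A 15 dB swing in the reading, easily caused by a wall or a body, moves the estimate from roughly one metre to over six, which is why proximity notifications based on such readings produce both false positives and false negatives.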
 
This technology was meant to accompany traditional tracing methods and public health practices, not replace them. Yet governments chose to advertise the apps as "digital vaccines" and a "road to recovery". In Norway, the Prime Minister assured that "if many people download the Smittestopp app, we can open up society more and get our freedom back". We knew this was wrong at the time, and it remains untrue today.
 

Why invest in this technology?

There are winners in the failure of digital contact tracing: tech companies, and in some ways, governments themselves.
 

Big Tech, hero of crises?

Former Google CEO Eric Schmidt announced that the pandemic would make people "grateful" for Big Tech. Global crises such as the pandemic present pivotal moments for tech companies to collect new information from us, translate it into data, and capitalize on it. Google had long wanted to get hold of health data, and the pandemic presented the perfect opportunity. Shoshana Zuboff notes that "while it is a crisis for all of us, it is something like business as usual for surveillance capitalists, in the sense that it is an opportunity to, possibly, significantly enhance their behavioral data supply chains".
 
In the aftermath of 9/11, tech companies were new; they portrayed their surveillance and privacy intrusions as exceptional measures and dressed themselves up as heroes in a time of fear. Populations did not know exactly what the companies were doing with their information, and in their fear they trusted these entities. Twenty years later, those same companies have grown into empires on the data they accumulated, claiming it as their own even though it was never supposed to be theirs to begin with, and selling it to interested ears.
 

Empty promises

Circumstances have thus changed, and so the image of heroism had to transform for Big Tech to appease public concerns and continue to assert and expand its data-driven dominance during the pandemic. To do so, Apple, Google and other companies promised that their technology safeguarded privacy, that the measures would be temporary, and that the information would be used solely for combating the pandemic. Can we trust them? Of course not. The absence of a regulatory framework means we can rely only on their self-regulation and hope the companies stay true to their word. But we know all too well how that goes. When Facebook bought WhatsApp, it promised WhatsApp would remain its own separate company. Oops.
 
During the pandemic, the companies' actions hinted at empty promises. Lobbyists got California legislators to agree to delay implementing new privacy laws under the pretext of the pandemic. Research from the digital security firm Surfshark revealed that 60% of contact-tracing apps were vague about their tracking methods, lacked transparent terms and conditions, and used intrusive methods, such as surveillance camera footage, to keep tabs on users.
 

Government partnership

Private companies are not bound by the same constitutional provisions as governments: they do not have to act in the public interest, and little regulatory framework constrains them. By collaborating with tech companies, governments solidify their surveillance systems and begin to bypass their own democratic safeguards. In the UK, the NHS announced a deal with private technology companies to combine and cross-reference the data that the NHS and its partners hold; both the NHS and the private companies thus gained access to a large array of new data. Health Secretary Matt Hancock signed legal backing for the NHS to set aside its duty of confidentiality in data-sharing agreements.
 

What worked? The future we need

 
The focus on developing these apps should have been redirected toward more efficient means of combatting the epidemic, notably sticking to what governments know best: traditional public health methods. Manual contact tracing identified the spread of the virus more precisely by directly identifying individuals and their recent contacts and settings. Investment was needed in infrastructure where personnel and equipment were lacking: more hospital beds, COVID tests, masks. In combatting COVID-19, the public traditionally places its trust in government to do the right thing. Professionals are bound by scientific and professional norms, and public sector operations to protect health are to be conducted solely in the public interest. These spy-apps were allowed to exist, thanks to the lack of any regulatory framework, for the benefit of surveillance capitalism, all while hindering democratic principles.
