Computers, Privacy & the Constitution

Government Usage of Facial Recognition Technology: A Threat to Privacy and Constitutional Rights

 -- By MoneshDevireddy - 04 Mar 2024
Facial recognition technology (FRT) is a modern innovation that compares two or more pictures of faces to determine whether they represent the same person. FRT's contemporary usage by government agencies poses disturbing implications for society, chief among them a threat to our individual expectation of privacy and other important constitutional rights.
 

I. Facial Recognition Technology and Government Usage

FRT is widely used at the federal and state levels of government, from the FBI to local law enforcement agencies, and even at airports and other transit hubs to facilitate travel and protect national security. The FBI's facial recognition system gives it access to criminal and civil photos from various databases, such as visa applicant photos, mugshots, and driver's license photos; access to such databases means that law enforcement face recognition affects more than 117 million American adults, with one in two adults in a face recognition network. Despite the staggering prevalence of its use, FRT remains largely unregulated, and government agencies widely deploy the technology without democratic oversight or transparency as to its specific uses and technical limitations, a fact that must change if we are to protect our civil liberties and constitutional rights.
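To ground the discussion, here is a minimal sketch of the one-to-one "verification" comparison described above, in which two photographs are declared a match when their feature vectors are sufficiently similar. The embed_face function is a hypothetical stand-in for the neural-network embedding models real systems use, and the threshold value is purely illustrative.

import numpy as np

def embed_face(image):
    """Hypothetical stand-in for a real face-embedding model.

    Deployed FRT systems run a neural network that maps a face image to a
    fixed-length feature vector; here we simply flatten and normalize the
    pixel values so the sketch runs end to end.
    """
    vec = np.asarray(image, dtype=float).ravel()
    return vec / (np.linalg.norm(vec) + 1e-12)

def same_person(img_a, img_b, threshold=0.8):
    """One-to-one verification: cosine similarity of embeddings vs. a threshold."""
    return float(np.dot(embed_face(img_a), embed_face(img_b))) >= threshold

# Toy usage: a photo and a slightly perturbed copy of it should match.
rng = np.random.default_rng(0)
photo = rng.random((64, 64))
similar = photo + rng.normal(scale=0.01, size=photo.shape)
print(same_person(photo, similar))  # True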
 

II. Privacy

The Fourth Amendment protects our reasonable expectations of privacy by guarding against unreasonable searches and seizures by the government. Academics evaluating FRT's implications for our Fourth Amendment rights have come out on both sides, concluding either that FRT threatens our right to privacy or that it does not. Those who argue the latter position rely heavily on the Supreme Court's reasoning in Katz v. United States, under which what a person knowingly exposes to the public is not protected. It follows, then, that we subject ourselves to identification and examination of our faces when we step into public places, so facial recognition technology does not violate our privacy rights in this context any more than would a random passerby looking us over. Additionally, individuals vehemently opposed to even a brief scan of their faces can don a mask as an extra preventative measure, an action that has become a social norm since the COVID-19 pandemic.
But this cursory analysis does not paint the whole picture. With the pervasiveness of CCTV cameras recording much of our public movements, police body cameras, and other media of potential surveillance, it is very possible to use FRT in a more invasive form. Such an application has been termed "dragnet, real-time" usage, in which law enforcement can use FRT in a "suspicionless and surreptitious" way, scanning people's faces as they go about their day and linking those faces to identifying information such as name, address, criminal history, and immigration status. Such usage extends the Katz analysis, introducing the argument that, while we may consent to superficial inspections of our faces by others in public settings, we do not invite strangers to determine intimate details about our work and personal lives based on such inspections.
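The "dragnet, real-time" usage just described amounts to a one-to-many search: every captured face is scored against an entire gallery of enrolled identities. The sketch below is a toy illustration built on invented data (a five-entry gallery, placeholder identity records, an arbitrary threshold); it is not a description of any agency's actual system.

import numpy as np

# Hypothetical gallery of enrolled embeddings linked to identity records.
# A real dragnet deployment might hold millions of driver's-license or
# mugshot embeddings.
rng = np.random.default_rng(1)
gallery = rng.random((5, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)
records = [{"name": f"person_{i}", "address": "(placeholder)",
            "criminal_history": "(placeholder)"} for i in range(5)]

def identify(probe, threshold=0.9):
    """One-to-many search: return the best-matching record, or None.

    Every face captured on camera is scored against the whole gallery;
    a single similarity threshold decides whether a match is reported.
    """
    probe = probe / np.linalg.norm(probe)
    scores = gallery @ probe               # cosine similarity against all rows
    best = int(np.argmax(scores))
    return records[best] if scores[best] >= threshold else None

# A noisy street-camera capture of person_3 still links to their record.
capture = gallery[3] + rng.normal(scale=0.01, size=128)
print(identify(capture))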

Other judicial holdings bear on this issue as well. Carpenter v. United States held that the use of advanced technologies for prolonged and sustained surveillance of a person's public activities may raise Fourth Amendment concerns, and Kyllo v. United States indicated that a search occurs when an instrument not available to the general public is used for an investigation that, without the technology, would be constitutionally protected. More information about how exactly government agencies are using FRT is therefore required to determine whether such usage fits within the boundaries of our jurisprudence concerning privacy and surveillance.

 

III. Equal Protection

Aside from Fourth Amendment concerns, facial recognition technology has serious potential to undermine our equal protection rights. The United States is no stranger to discriminatory applications of its policing power, and FRT may become the latest tool for law enforcement to perpetuate racist and anti-activist surveillance, widening pre-existing inequalities. Professor Amanda Levendowski terms this inequitable application of FRT, disadvantageous to certain minority communities, "deployment bias": it allows the government to "weaponiz[e] surveillance technologies, such as face surveillance, against marginalized communities…render[ing] their movements hypervisible to law enforcement." Providing this ability to the government without proper safeguards might affect not only racial minorities but also other marginalized populations, such as undocumented immigrants (by ICE) or Muslim citizens (by, e.g., the NYPD). Whether we attack FRT on equal protection grounds or on another theory, efforts must be made to curb the technology's ability to further inequities in our society.
 

IV. The Road Ahead

 
It is now time to think about prophylactic measures that can be taken to prevent a quasi-Orwellian state of affairs in which the government can access intimate details about our personal, vocational, and criminal backgrounds as easily as it can run our license plates.
 
First, proper legislative oversight of FRT must be implemented before its rampant use is allowed to continue. Regulatory agencies should collaborate with experts from diverse fields, including law, ethics, and technology, to develop guidelines that prioritize privacy protection and mitigate discriminatory outcomes. These regulations should encompass transparent data collection practices, stringent security measures, and accountability mechanisms to hold entities responsible for FRT misuse or abuse.
 
Furthermore, enhancing transparency surrounding FRT is essential to build public trust and accountability. Entities deploying FRT should provide clear information about its usage, including the purposes, methodologies, and potential risks involved. Transparency reports detailing data practices, algorithmic biases, and performance metrics should be regularly published to enable independent auditing and scrutiny. Moreover, accountability mechanisms such as oversight boards or independent regulators should oversee FRT deployment to ensure compliance with legal and ethical standards.
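As one illustration of the kind of performance metric such a transparency report could publish, the sketch below computes a false match rate per demographic group from an audit log. The log format and group labels are invented for the example and are not drawn from any agency's actual reporting.

from collections import defaultdict

# Hypothetical audit log: (group, system_reported_match, truly_same_person).
trials = [
    ("group_a", True,  True), ("group_a", True,  False), ("group_a", False, False),
    ("group_b", True,  True), ("group_b", False, False), ("group_b", False, False),
]

def false_match_rates(log):
    """False match rate per group: the share of different-person pairs
    that the system nonetheless reported as a match."""
    false_matches = defaultdict(int)
    nonmated = defaultdict(int)
    for group, reported_match, same in log:
        if not same:                 # only pairs of genuinely different people
            nonmated[group] += 1
            false_matches[group] += int(reported_match)
    return {g: false_matches[g] / nonmated[g] for g in nonmated}

print(false_match_rates(trials))     # {'group_a': 0.5, 'group_b': 0.0}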
 
Finally, respecting individuals' rights to autonomy and consent is fundamental to ethical FRT deployment. Entities collecting facial data should obtain informed consent from individuals, clearly outlining the purposes and scope of data usage. And individuals should have the right to access, correct, or delete their facial data held by FRT systems, empowering them to exercise control over their personal information.
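A brief sketch of what these data-subject rights could look like in software follows. The FacialDataRegistry class and its methods are hypothetical, intended only to show consent-gated enrollment alongside access, correction, and deletion; a production system would also need authentication, audit logging, and much more.

from dataclasses import dataclass, field

@dataclass
class FacialRecord:
    subject_id: str
    embedding: list
    consent_purposes: set = field(default_factory=set)

class FacialDataRegistry:
    """Consent-gated storage with subject access, correction, and deletion."""

    def __init__(self):
        self._records = {}  # subject_id -> FacialRecord

    def enroll(self, record):
        # Informed consent for at least one stated purpose is a precondition.
        if not record.consent_purposes:
            raise ValueError("informed consent required before enrollment")
        self._records[record.subject_id] = record

    def access(self, subject_id):
        return self._records[subject_id]                      # right of access

    def correct(self, subject_id, new_embedding):
        self._records[subject_id].embedding = new_embedding   # right to rectify

    def delete(self, subject_id):
        self._records.pop(subject_id, None)                   # right to erasure

# Usage: enroll with consent for a stated purpose, then withdraw.
reg = FacialDataRegistry()
reg.enroll(FacialRecord("alice", [0.1, 0.2], {"airport_screening"}))
reg.delete("alice")   # Alice withdraws; her template is gone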
 


