Computers, Privacy & the Constitution

MitchellKohlesSecondPaper 4 - 03 May 2021 - Main.MitchellKohles
 Nevertheless, mass facial recognition tools present a unique danger to the First Amendment values of political deliberation. To avert this danger, state actors have an obligation to secure public life from constant visual surveillance, aggregation, and analysis.
 
Facial Recognition and Consent

 
The ACLU is currently litigating a case against Clearview AI over its dissemination of facial recognition software. The ACLU says that when Clearview built its database—by scraping the web for photos of all kinds, then analyzing them to build “faceprints” of millions of individuals, which Clearview’s customers can use to identify and track those individuals—it violated an Illinois state law, BIPA, that requires anyone selling any person’s biometric information to get that person’s consent. Clearview’s product is not entirely novel. The FBI and law enforcement agencies across the country employ facial recognition technology of various kinds, and those agencies are now, according to Clearview, the company’s sole clients.
 
However, there is a meaningful difference—turning on Fourth Amendment logic—between the use of photos secured pursuant to arrests, border crossings, or the state’s issuing of a driver’s license and what Clearview does.
 
In its briefs attacking BIPA, Clearview argues that because its technology merely sweeps up information that is “public”—e.g., photos posted to Facebook—its service doesn’t implicate privacy interests. Motion to Dismiss at 22. But Clearview’s invocation of the Fourth Amendment’s third-party doctrine is not only flawed; it shows why the doctrine itself fits so poorly with the Fourth Amendment.
 
The third-party doctrine ultimately depends on consent, whether that be consent to share with your carrier the phone number you’d like to dial or consent to tell your credit card issuer when and where you ate lunch. But a photo on the internet does not necessarily convey any consent of the photographed person, not even the low bar of consent contemplated by Smith v. Maryland, 442 U.S. 735 (1979). Consent in this context breaks down entirely. Surveillance by consent, on Clearview’s theory, is accomplished only by proclaiming the entire internet an open field, in which a selfie John Doe posts to Instagram implies no more privacy interest (and no less consent) than a photo of a young woman in her bedroom uploaded by her stalker. Clearview’s argument, despite its clear flaw, reveals how cartoonishly ill-suited the third-party doctrine is to modern life.
 
Facial Recognition and Expression

Facial recognition tools have implications for the rights of expression, assembly, and association because—insofar as metadata powers the insights these tools promise—they can chill the exercise of these rights by anyone who wants to keep their expression and associations anonymous.
 
With mass facial recognition tools, the state gains the power to identify individuals based on their possible affinity for one viewpoint or another, or their decision to walk down a certain street or visit a certain shop. Metadata makes this possible: where the technology is trained on visual data that includes information about location, time of day, the presence of other known “faceprints,” and more, these tools can correlate one’s face with innumerable privacies of life—a doctor’s visit, a church service, a rally—simply because physical movements can be caught on camera. For example, nothing in the Constitution prevents Clearview from contracting with every private entity that installs cameras inside and outside its business for that data (the kind prosecutors acquire while investigating a crime).
 
On the other hand, where faceprints merely assist prosecutors in identifying criminal suspects (for example, by comparing photos to the name associated with a driver’s license), no impermissible chilling occurs. Ironically, collection of the nearly immutable aspect of one’s identity—the face—does not by itself greatly threaten privacy. But a unique faceprint that conveys information correlated to countless data and metadata erodes the privacy of one’s person over time.
 
An index of faceprints—where training metadata is accessible to users or otherwise informs the insights available to them—means that state actors can identify not only the speaker at a public rally in the park, but also passive participants who may want to express support for the speaker’s message, evaluate the message, or develop their own criticisms. And the state could do so on the basis of a participant’s decision merely to attend the rally. Such bystander actions are critical to the discussion and deliberation that the Founders sought to protect in the First Amendment. Deliberation of viewpoints is a means to ensure that individuals can hold their governments accountable in ways deemed appropriate by the community.
 
This danger animates the Supreme Court’s chilling effect doctrine, wherein a state can be found to violate the First Amendment by acting in a way that chills the exercise of rights, even where the aggrieved party has not been found to violate the state’s laws. To be clear, what matters is a state actor’s purpose in using a deep database of faceprints. Profiling individuals outside the context of a lawfully predicated criminal investigation is an impermissible purpose. But such purposes are made possible by the unrestricted hoovering up of so-called “public” data, and the chilling effect of making metadata-infused insights available to state actors is constitutionally cognizable.
 
If the incorporation against the states of the First Amendment’s non-abridgement principle demands that the state tolerate minority viewpoints, and the use of these surveillance technologies may have a meaningful chilling effect on deliberation of those viewpoints, the Constitution requires that state actors refrain from using these tools to build a surveillance apparatus that delivers insights beyond the mere fact of one’s faceprint.
 


