
Biometric Data and the "Psychics" of the Future

"Your entire life is online. And it might be used against you. Be Vigilant."

This message appears at the end of a Belgian video campaign for safe internet banking, but its warning extends well beyond internet banking. In the video, participants are invited, seemingly at random, to appear on what they are told is a new TV show, and a self-proclaimed psychic proceeds to reveal things about each participant that he could not possibly know, including the location of their tattoos, their favorite vacation spots, the color of their house, and their credit card numbers. Just as the participants begin to look impressed, if not spooked, a curtain falls to expose the true "psychic": a team of hackers working in real time behind the scenes, a reveal designed to highlight just how many of people's intimate details can be found online.

Presumably, the psychic in the video obtained the participants' names as a starting point for the hacks. Imagine how much more terrifying the video would be if knowing their names were not even necessary. We generally don't wear name tags everywhere, but we do wear our faces and bodies constantly, and this type of data has become an increasingly attractive target for government and private organizations alike.

Biometric technology has grown increasingly popular since the September 2001 terrorist attacks, after which government agencies escalated data collection in the name of security. Many airports and cities rushed to install cameras with facial recognition in hopes of identifying terrorists and criminals in crowds, often with questionable rates of accuracy. This endeavor has taken on particularly epic scope abroad: databases operated by American, NATO, and local forces hold biometrics on over 1.5 million Afghans, which translates into 1 in 20 Afghan residents and roughly 1 in 6 males of fighting age. In Iraq, the figure rises to 2.2 million, or 1 in 14 citizens and roughly 1 in 4 males of fighting age.

Biometrics gained further prominence in June 2011, when social media giant Facebook debuted its facial recognition tool as an opt-out feature, without first obtaining user consent to collect the biometric data the tool needs to operate. In response, the Electronic Privacy Information Center ("EPIC") filed a complaint with the Federal Trade Commission, while Germany threatened to sue for violations of German and EU data protection law. To many, Facebook's behavior was alarming, as it exposed, if not reaffirmed, the kind of surreptitious data collection and dissemination that organizations like Facebook feel entitled to conduct regularly, often with little legal repercussion. (Two other examples are In re Facebook Privacy Litigation and In re Zynga Privacy Litigation.) For those who remain skeptical of the hazards of biometric data collection, here is why you should care:

1) The U.S. government has a huge, expanding data pool to draw from.

Biometric data collection is no longer limited to border crossings or serious criminal offenses. Because many police officers carry mobile fingerprint scanners, an encounter as minor as a stop for a moving violation can trigger data collection. The LAPD, for instance, routinely collects fingerprints from day laborers standing on street corners, and domestic violence victims may find themselves fingerprinted when police are unsure who initiated the violence. This is particularly troubling because, post-9/11, several executive orders and changes in the law have promoted data sharing (among federal government databases, but also with private organizations and foreign governments), meaning that biometrics collected under very low standards of suspicion can later be used for criminal justice purposes. For example, under Secure Communities, states send fingerprints to federal agencies to identify whether people have been charged with crimes outside their jurisdiction. But because the FBI shares fingerprints with the Department of Homeland Security, tens of thousands of people who were never convicted of any crime have been deported, including thousands of U.S. citizens caught up through database error. Moreover, there is no guarantee that the government will be transparent, as seen in EPIC's lawsuits against the Transportation Security Administration and the U.S. Marshals Service for misleading the public about the true capabilities of body scanners, or in the more recent (non-biometric) litigation of United States v. Jones, which illustrates how police use of technology can skirt the boundaries of constitutional law.

2) Private organizations increasingly collect this data.

Many people are aware that casinos use facial recognition to screen for card counters, but the Venetian now employs facial recognition in its digital displays to tailor advertising on the spot. Kraft and Adidas plan to implement similar technology, and a handful of retailers currently use camera-equipped mannequins to identify customers' age, gender, and race. While facial scanners on their own may yield limited data, the potential grows considerably once that data is matched to information on people's social media profiles. And because private companies sell data to one another as well as to the government, there is even greater uncertainty over how this data is eventually aggregated and used.

3) The technology is constantly improving.

In the last decade, facial recognition technology has advanced substantially, including 3D scanning that can capture faces in profile and skin-scanning technology that can map pores, skin texture, scars, and other microfeatures. Technology is also being developed to identify individuals by their gait. Japan, which already uses facial recognition in billboards and vending machines, at truck stops (to gauge drivers' alertness), and in workplaces (to gauge worker enthusiasm), offers a glimpse of what the U.S. could easily become.

Biometric data is unique in that the cost of leakage is both high and irreversible. If we lose a credit card, we can obtain a new one with a different identifier; if we lose a biometric, we are exposed for life. Moreover, our faces and bodies are on public display, meaning the data can be collected almost instantaneously without the slightest notice. While any technology can be put to both beneficial and detrimental uses, amassing such quantities of centralized, commercialized, interoperable, and often uniquely identifying information is a risky proposition. Historically, the FBI has surveilled individuals such as Martin Luther King, Jr., people suspected of communism by weak association, and, more recently, Occupy Wall Street participants. Erroneous biometric matches can also be highly prejudicial: consider John Gass of Massachusetts, who spent ten days defending himself to the DMV against alleged identity fraud simply because his face resembled another man's, or the thousands who have already been wrongfully deported as illegal immigrants.

As the EFF notes, with biometrics we must be wary of getting the worst of both worlds: a system that enables greater surveillance of the population in general but fails to provide increased protection. Even without deliberate abuse, such surveillance would have an extraordinary chilling effect on artistic and scientific inventiveness and on political expression, because we have historically relied on a degree of anonymity through obscurity in how we interact with the world.

How would you feel if one day a "psychic," peering into your iris through Google Glass, could tell you everything about yourself (and your potential future actions) that even your own recollection could not? Would he be able to see infractions that you cannot predict, have forgotten, or never even committed?

-- LindaShen - 14 Apr 2013
