  From: Heather Schneider <hms2103@columbia.edu>
  To  : <CPC@emoglen.law.columbia.edu>
  Date: Thu, 3 Mar 2005 14:25:45 -0500

Paper 1: Breaking Down Architectures of Control

Breaking Down Architectures of Control 
By Heather Schneider, Spring 2005

"I will not be pushed, filed, stamped, indexed, briefed, debriefed, or
numbered!"
- The Prisoner (1960s British TV series)
        
        I will discuss how legal and technical constraints combine to
create architectures that control our behavior. An example of a
real-world architecture that facilitates control is the "panopticon"
discussed by Jeremy Bentham.[1] It contains a ring of jail cells with a
guard post in the middle. A shield covers the guard so you can't see
him, but he can see you. It provides the perfect physical architecture
to monitor and prevent misbehavior. The new architectures of control
combine legal and technical structures to achieve a similar effect.[2]
The technical and legal constraints work in tandem; when one is breached
the other steps in to fill the void.  
        
        One example is the use of Digital Rights Management (DRM)
software to prevent the unauthorized access or copying of copyrighted
material.[3] The cycle goes like this: the law says you can't violate
copyright. But the law is difficult to enforce in practice. So
technology steps in to provide code that makes it impossible for the
average user to break the law. However, the technology isn't perfect
either, and gets cracked by a 16-year-old kid.[4] So the law steps back
in to punish the breach. The cycle continues, with law and technology
supporting each other to develop a system of "perfect technological
justice."[5] Disney is happy, the police are happy, and all is right
with the world.

	Why should we care about DRM if we aren't planning to violate
copyright laws?  One answer is fair use. This is part of the bargain
that makes copyright compliant with the First Amendment. Although there
is a static list of "fair uses" in the copyright statute, the real
growth of the doctrine is through friction and evolution in the courts
after someone tries to do something new with copyrighted material, like
making copies of a magazine article for scientists in a lab (not a fair
use)[6] or recording a TV show with a VCR (a fair use).[7] But with
privately-implemented technological constraints we no longer have room
to experiment. Fair use can't evolve in a system of perfect
technological justice; we will always be stuck with the status quo.

	A different example is provided by the Grokster case.[8]
Secondary liability for copyright infringement usually requires that the
third party have the ability to control the information that passes
through its systems. The architecture of Grokster abandoned this
control. Unlike the old Napster, it has no centralized file servers or
search indexes; it is a peer-to-peer (P2P) network with some machines
acting as supernodes to facilitate the file sharing process. Even if we
are not sympathetic to the creators of Grokster (who may have designed
the system for the sole purpose of evading liability) we should still be
concerned about this case, since it is an explicit battle over
architectures of control on the Internet. The motion
picture and recording industries want the Supreme Court to hold Grokster
liable for implementing a P2P system with no centralized control. 

	The DVD and Grokster cases involve designing technology to
prevent illegal activity. But the police want to go further and actually
predict who will commit crimes. They can do this by piggybacking on the
architectures of private marketing databases run by companies like
Acxiom and ChoicePoint.  The law and the technology work together to
enable a new predictive model of law enforcement. In this panopticon for
the 21st century our data is the prisoner and the AI-like data mining
software is the all-seeing guard in the middle. It sounds like the
perfect recipe for another system of perfect technological justice.

	But what happens when there are bugs in the architecture?
Remember that all this code is just written by the guys you read about
in O'Harrow's book (including a high-school dropout computer prodigy[9]
and an ex-drug running mercenary[10]). I don't know which is scarier,
having these men write code that will decide whether or not we're
terrorists, or allowing individual police the ability to label us as
"criminal extremists."[11] Both scenarios hold great potential for
mistakes and abuse.

	Even if there aren't bugs or breaches in the system, there is
harm in knowing that your data is out there beyond your control.
Recently Daniel Solove gave a lunch talk about his new book "The Digital
Person" where he called this the "architecture of vulnerability." He
compared it to the harm you feel knowing that your front door doesn't
have a lock - even if you don't get robbed you still don't sleep very
well at night. 

        So how can we break down these architectures of control? One
possibility is by opening the source code for the technological
constraints. I admit that I have never been an ardent supporter of the
free software movement like Prof. Moglen. I always thought "Why would I
want to see the source code of Windows? Who cares?" But I'm beginning to
understand that it's not just about seeing and learning from the code
(although that is a benefit), it's about preventing unseen forces of
control. I may not care about the source code to Windows, but I'd love
to see the terrorist detection algorithms in the Matrix system. If the
database is the modern panopticon, then seeing the source code may be
like removing the shield from the guard's enclosure. Although I can't
force ChoicePoint to let me see behind the curtain, I hope the
government demands a look before they start arresting people on the mere
basis of the code's inferences.

        We can also design open software systems whose architectures
just don't allow centralized control, like the BitTorrent P2P software
Prof. Moglen mentioned in class.  But if the Supreme Court rules against
Grokster, systems like BitTorrent may become illegal. This highlights
the fact that both technological AND legal changes are needed to break
the architectures of control. To end the cycle of law and technology
working together to promote control we need to reverse the loop. We need
technologies that promote freedom, supported by laws that promote
freedom, supported by technology, supported by law, and so on.

        In this paper I've introduced what I think is the problem and a
few tentative technical solutions. I plan to revisit this topic at the
end of the semester to propose more comprehensive technical and legal
solutions.

Notes
[1] Robert O'Harrow, Jr. No Place to Hide, Free Press, 2005, page 170.
[2] See Lawrence Lessig, Reading the Constitution in Cyberspace, 45
Emory L.J. No. 3, 2 (1996) (For example, a legal constraint on a
policeman's search of your home is the 4th Amendment; a technical
constraint is that policemen don't yet have x-ray vision.).
[3] Id. at 32.
[4] See Universal City Studios v. Reimerdes, 111 F. Supp. 2d 294
(S.D.N.Y. 2000); Universal City Studios v. Corley, 273 F.3d 429 (2d Cir.
2001).
[5] Lessig, supra note 2, at 34 (citing Bruce Ackerman, Social Justice
in the Liberal State 21-32 (1980)).
[6] See American Geophysical Union v. Texaco, Inc., 60 F.3d 913 (2d Cir.
1994).
[7] See Sony Corp. of America v. Universal City Studios, Inc., 464 U.S.
417 (1984).
[8] For information, including briefs submitted by the parties and
amicus briefs, see http://www.eff.org/IP/P2P/MGM_v_Grokster/.
[9] O'Harrow, supra note 1, at 145-146 (describing the background of
Jeff Jonas, creator of the NORA software which is now used by
ChoicePoint).
[10] O'Harrow, supra note 1, at 110-113 (describing the background of
Hank Asher, founder of Seisint).
[11] O'Harrow, supra note 1, at 274-277 (describing non-violent
political activists in Denver who were labeled as criminal extremists by
the Denver Police Department).



-----------------------------------------------------------------
Computers, Privacy, and the Constitution mailing list