Law in the Internet Society

Sleepwalk With Me

-- By EileenHutchinson - 22 Dec 2012

“For the interesting puzzle in our times is that we so willingly sleepwalk through the process of reconstituting the conditions of human existence.” - Langdon Winner

Introduction

For much of this class, I have struggled and failed to understand my own behavior as it relates to technological advances and my own autonomy. I consider myself a reasonably logical person who, where possible, seeks to protect myself from abuse or intrusion at the hands of third parties. Why, then, did I join Facebook and Gmail? Why have I stayed on both services even as it has become clear that my personal information is no longer personal, and that my use of these “services” is in fact a means by which a tremendous amount of data about me can be compiled, sold and used by third parties to influence my behavior in ways I have not yet imagined? I recently stumbled upon an answer to what I believed was a uniquely 21st-century conundrum in Langdon Winner’s twenty-six-year-old essay “Technologies as Forms of Life.”

“Technological Somnambulism”

In 1986, Winner coined the term “technological somnambulism” to describe the way humans sleepwalk through their interactions with technology. People who are unwilling to alter their lives to fulfill political, social or religious ideals nonetheless make tremendous changes in their daily routines to accommodate new technology. Such changes are rarely accompanied by any reflection on whether the technology, and the consequences of adopting it, are positive or negative.

Winner points to a number of factors that help explain this phenomenon. First, humans view technologies as tools that may be picked up, used and put down. Just as a caveman picked up a sharpened rock, used it and put it down, so the average teenager picks up a smartphone, uses it and puts it down. This conventional view leads naturally to a narrow conception of how we use technology – our interactions are “occasional, limited and nonproblematic.” The primary question when a new tool is introduced is therefore how we will use it to achieve a specific purpose. The use perspective leads further to a view that technological objects are morally neutral, since they can be put to ends both good and bad; a knife may be used to spread butter on bread, just as it may be used to stab an innocent person. And though individual objects are considered neutral, the idea that technological progress is both positive and inevitable has characterized the American ethos for centuries.

Winner rejects the narrow view of technology as “tools” and “uses,” positing that technology changes not only what people do but how they think. Technological advances, while novel on first encounter, are quickly incorporated into the recurring patterns of daily life and thereby change the fabric of our social, political and economic existence. As technology is woven into everyday life, new techniques and devices become part of human identity. The introduction of the automobile in the United States provides a good example of this process. When the car was introduced, users viewed it as a tool to be used occasionally to move from Point A to Point B at high speed. Yet its introduction also produced profound changes in human activity and institutions. The car demanded infrastructure – roads, highways and traffic lights – as well as laws and law enforcement. Cities like Los Angeles and Phoenix were built on the premise that everyone drove. Individuals could move anonymously and quickly from one end of the country to the other. By using cars, Americans transformed the physical and social world they lived in. From this perspective, the view of the car as a tool that may be turned on, used and turned off is hopelessly naïve.

Of course, Winner points out, the transformative role of technology is obvious in hindsight. What is interesting is our refusal to perform the same type of analysis on new technology as it emerges. When a new device hits the market, we ask whether it provides a useful service, whether it is more efficient than its predecessors and whether it is likely to succeed as a product. We dive into new contracts – both economic and social – in order to get from Point A to Point B more quickly, rarely if ever stopping to examine the greater changes likely to accompany the adoption of new technology. It must be noted that Winner is far from hopeless. He rejects technological determinism, the idea that technological innovation determines societal change and is resistant to political or cultural influence. Winner emphasizes human choice in the realm of technological innovation. The technological consumer’s questions must not be limited to “What does this do for me?” but must extend to the larger issue of “What kind of world are we making?” Will new devices or techniques lead to greater freedom, increased education, truer equality? Or will they lead to oppression, authoritarianism, consumerism?

Winner’s Philosophy and Modern Consumer Culture

In reading Winner’s essay, I realized that I have taken a narrow, use-focused approach to essentially every technological decision I have ever made. I bought an iPhone because it made accessing the Internet through a browser easier. I use a Barnes & Noble card because I get a discount. I use Gmail because it offers unlimited storage and because I can Gchat with friends and family. I gave Mint.com complete access to my personal finances so that I could receive a weekly email informing me that I spent 15% of my weekly budget on coffee. What I failed to contemplate was that these were not merely tools to be picked up, used and put down, but technological devices that would change the patterns of my daily life and shape the world I live in.

Private corporations offer a plethora of devices and services, many of them free, that promise to get us from Point A to Point B. In return, we offer up tremendous amounts of personal data that may be used or sold for the ultimate purpose of influencing how we think and what choices we make. We are enabling a new generation of marketing that I believe will be troublingly effective and will fundamentally change the way individuals make consumer choices. An example of this new marketing was detailed in a recent BBC article describing a “personal concierge” service offered by DBS Bank. The service works in the following way. A law student enters a luxury goods store. Her smartphone knows where she is thanks to the GPS technology embedded in it, and it alerts her bank through an automated system she has signed up for. The bank knows that the student has a history of buying from similar stores; however, she is low on cash, so the chances of her making a purchase are low. Suddenly her phone beeps. She is informed by text message that if she makes a purchase in the next 30 minutes, she can borrow money at a good rate and she will get 20% off the purchase. DBS Bank calls this “right place, right time” marketing. I would argue it is a process by which we are sleepwalking toward a future in which individuals have outsourced their own decision-making to private corporations seeking ever more consumers.
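To illustrate how little machinery such a system needs, here is a minimal sketch, in Python, of the decision logic the scenario above implies. Every name, field and threshold in it is hypothetical; it is reconstructed from the BBC description alone and does not represent any actual DBS system.

# Hypothetical sketch of "right place, right time" offer logic, reconstructed
# from the scenario described above. All names, fields and thresholds are invented.

from dataclasses import dataclass

@dataclass
class Customer:
    purchase_history: dict    # hypothetical: store category -> number of past purchases
    account_balance: float
    phone_number: str

def maybe_send_offer(customer, store_category, typical_price):
    """Decide whether to text a time-limited loan-plus-discount offer.

    Assumes the customer has opted in and her phone has already reported
    that she entered a store of the given category.
    """
    likes_this_kind_of_store = customer.purchase_history.get(store_category, 0) >= 3
    cannot_afford_it = customer.account_balance < typical_price
    # The case the bank cares about: she wants to buy but lacks the cash right now.
    if likes_this_kind_of_store and cannot_afford_it:
        return ("Buy within 30 minutes and we will lend you the money "
                "at a preferred rate, plus 20% off your purchase.")
    return None

# Example: a student who frequents luxury stores but is low on cash.
student = Customer(purchase_history={"luxury": 5}, account_balance=80.0,
                   phone_number="+65-0000-0000")
offer = maybe_send_offer(student, "luxury", typical_price=400.0)
if offer is not None:
    print("SMS to " + student.phone_number + ": " + offer)

The point of the sketch is not technical sophistication but the opposite: a few lines of logic, fed by location data and purchase history we hand over voluntarily, are enough to intervene in a purchasing decision at its most vulnerable moment.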

This draft works well for its current purpose: helping you to think about your behavior, and the behavior of others around you, as users of technology. But technology also has designers, whose awareness of the way of life being designed for may be quite acute. Not all of those designers work for corporations seeking consumers. Hence the free software movement and its various successors, allies, and neighbors, which are about designing technology to achieve other social objectives, to spread other ways of life.

Choices in using, then, can and should involve, as you suggest, awareness of "way of life" consequences. (In fact, we could call these "way of life consequences," ideology. We could conclude that technology is a material reflection of ideology. We could insist that "our" technology should reflect "our" ideological objectives. At this point, we would notice that the objectives of those who own are different from the objectives of those who work. But we are now required to stop thinking, because any more thinking would be dangerous.) If we choose technology with awareness of its consequences for our way of life, we can benefit from understanding who has designed it and why. Perhaps you need FreedomBox more than you need Facebook and Google?

