Law in the Internet Society

BahradSokhansanjSecondPaper 18 - 04 Sep 2012 - Main.IanSullivan
Line: 1 to 1
Changed:
<
<
META TOPICPARENT name="SecondPaper"
>
>
META TOPICPARENT name="SecondPaper2011"
 

Intellectual Property and Thought Control


BahradSokhansanjSecondPaper 17 - 20 Mar 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

Intellectual Property and Thought Control

Ready for review.
Changed:
<
<
We believe that in a free society, government enforces laws that limit freedom of action in order to protect our safety and a democratically determined social order. We'd like to believe that our thoughts can't be restricted. Maybe we could accept a limit on what we can read or hear -- if only rarely, when needed to keep us safe from our darkest fears, terrorists, child pornographers, identity thieves.... But, our sense of freedom recoils from the notion of the state imposing and enforcing limits on how we think, even where thoughts are not physically manifested. And yet, the laws that apply force to support the concept of intellectual property violate this core principle.
>
>
The ease of digital production and distribution disrupts the economic case for intellectual property. Beyond this, the computer's central role in creation, reproduction, and distribution also illuminates the basic immorality of IP law, its incompatibility with a free society.

We presume that in a free society, the state can enforce laws that limit freedom of action, within democratic limits. We may even tolerate limits on what we can read or hear, if rarely and only when told that it will keep us safe from our darkest fears, like terrorists, child pornographers, identity thieves. Still, our sense of freedom recoils from the notion of the state policing any thoughts themselves, if they are not put into action.

 

Line: 13 to 15
 

Changed:
<
<
When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or "tablet" instead of "tablet computer." Kindles and Nooks are "e-readers." We call desktop PCs that have joysticks instead of keyboards "game consoles," and we cannot see and are not shown the computers in our Blu-Ray players and automobiles. But, these are all programmable, universal computers.
>
>
When we think about computers, we don't usually think about the processing but the results, the output of the software or the content on the display. Most people relate to computers as passive, invisible entities, not even "computers," really. They are "smartphones" or "tablets," or more remotely "game consoles," or in most cases just "the DVD player" and "the car." But, the computers in all these devices are generally programmable, universal computers.
Universal computers are special, because they can execute any algorithm. An algorithm is thought broken down into pieces -- a set of processes and rules that can be described using logic. What algorithms may be run on a computer is limited only by the speed of its circuitry and capacity to store data. It is always important to keep in mind that any computer is a "thinking machine." Computers process concrete, logical instructions. In that sense, computer thinking seems to differ from people thinking -- but nevertheless, computers do an increasing amount of our thinking for us.
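The claim that an algorithm is just thought reduced to logical rules can be made concrete with a toy sketch (my illustration, not part of the original essay): Euclid's procedure for finding a greatest common divisor is a piece of reasoning written down as one repeatable rule, and any universal computer can run it.

```python
def gcd(a, b):
    # Euclid's algorithm: a chain of reasoning reduced to a single
    # repeatable rule -- replace (a, b) with (b, a mod b)
    # until the remainder runs out.
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # prints 6
```

Nothing about the rule depends on the machine that runs it; a phone, a game console, or the embedded processor in a Blu-Ray player executes it identically.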
Line: 25 to 27
 
The "Information Age" is characterized by the word "information." Information is a long, Latin-rooted, technical-sounding word. We understand it, when read or heard, at an intellectual remove from our living experience. "The Information Age" is basically a marketing phrase, used to sell people on the idea that money can be made by buying and selling information. And the idea of commercializing thought would be a tougher sell. Marketers avoid words like "knowing" and "thought." To control the marketplace of thought would mean having to control thought. We don't like to contemplate what that means for a free society. Would it mean that just as state force helps control the market of land and things, it must also guarantee the marketplace of thought? That seems scary. Instead, "information" may be bought, sold, and owned, even as thoughts remain free.
Changed:
<
<
So, the Information Age marketer sells a piece of information, which is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. A universal computer can run any algorithm with which it is programmed. Duplicating what it has stored in its memory, even when it's only cached there temporarily, is really easy. This means that profits can't be extracted from the scarcity of information.
>
>
The Information Age marketer sells a piece of information that has been carved out and productized. The information is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. Of course, a universal computer can run any algorithm with which it is programmed. So duplicating what it has stored in its memory, even when it's only cached there temporarily, is really easy. The ease of translation, duplication, and transmission makes it hard to extract profits from scarcity, because information can never actually be scarce.
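Just how easy the duplication is can be shown with a small sketch (illustrative only; the filenames and sample bytes are invented): on any universal computer, producing a bit-for-bit copy of a stored "information product" is a single library call.

```python
import os
import shutil
import tempfile

# A seller's "information product" is ultimately just bytes.
data = b"a piece of information, carved out and productized"

with tempfile.TemporaryDirectory() as workdir:
    src = os.path.join(workdir, "original")
    dst = os.path.join(workdir, "copy")
    with open(src, "wb") as f:
        f.write(data)
    shutil.copyfile(src, dst)  # duplication: one library call
    with open(dst, "rb") as f:
        duplicate = f.read()

# The copy is indistinguishable from the original, so no
# scarcity survives the act of storage itself.
assert duplicate == data
```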
 
Changed:
<
<
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But, these have been foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reports on unauthorized access when a computer goes online, or even shuts down the computer's operating system and its ability to function entirely. This is especially common in computers that are marketed in ways that avoid calling them "computers," like smartphones, tablets, game consoles, and embedded devices.
>
>
In response to this, sellers have tried increasingly sophisticated algorithms that lock up and control access to information products. But, these technological measures have been foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reports on unauthorized access when a computer goes online, or even shuts down the computer's operating system and its ability to function entirely. This is especially common in computers that are marketed in ways that avoid calling them "computers," like smartphones, tablets, game consoles, and embedded devices.
 Anything thought builds, though, thought can undo. Thoughts, implemented as algorithms running on computers, can be used to break all the most sophisticated locks placed on information. The knowledge of how to circumvent can be restricted by banning certain algorithms, censoring the websites that publicize them, and watching those who seek them. Algorithms, imagined by knowledge applied creatively, can go around all these measures. So, the only solution is to ban the thoughts behind the algorithm -- to punish the people who think about them and try to learn about them.
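The point that anything thought builds, thought can undo, can be illustrated with a deliberately trivial lock (a toy of my own, not any real DRM scheme): a repeating-key XOR scramble. The same algorithm that applies the lock removes it, which is the structural weakness shared, in more elaborate form, by every scheme that must still let paying customers read the content.

```python
def xor_lock(data: bytes, key: bytes) -> bytes:
    # A toy "content lock": XOR each byte with a repeating key.
    # XOR is its own inverse, so locking a second time unlocks.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret = b"a locked-up thought"
locked = xor_lock(secret, b"key")

assert locked != secret                    # unreadable while locked
assert xor_lock(locked, b"key") == secret  # undone by the same algorithm
```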
Line: 37 to 39
 

Changed:
<
<
Intellectual property laws are all about the use of force to support artificial boundaries around thoughts, to make them scarce and thus commercially valuable. The state applies force by banning the "bad" algorithms that pierce the boundaries. The state therefore must, by necessity, police thought. The state must, by force -- the threat and actuality of imprisonment -- prevent the execution of certain algorithms, and the creation of others. Thought is restricted, even where that thought may improve our lives in small ways, like by making the transfer of large amounts of data more efficient, or big ways, like by allowing dissidents to organize protests more safely. More urgently than just our convenience, there is our basic assumption that a free society is based on the freedom to think. The laws that try to regulate thought will inevitably lead to injustice. Innovators who seek to benefit society as a whole fall under the hammer of the law, while those who seek to use algorithms to commit crimes in the shadows slip around the enforcers. The powerless, too unskilled or lacking in resources to escape detection, are the ones who are caught.
>
>
IP laws represent the use of force to support artificial boundaries around thoughts, to make them scarce and commercially valuable. In a world in which thoughts are processed by computers and distributed through communication networks, enforcing the law necessarily means that the police must ban algorithms that pierce these boundaries, backed by the threat of prison. Thoughts embodied by these algorithms must be restricted, even when they may improve lives, by making the transfer of large amounts of data more efficient, or save lives, by allowing dissidents to organize protests more safely. Innovators who seek to benefit society as a whole fall under the hammer of the law, while those who seek to use algorithms to commit crimes in the shadows slip around the enforcers. The powerless, those too unskilled or lacking means to escape detection, are the ones caught and punished.
 
Changed:
<
<
If intellectual property were really about the act of creation by thought -- if it were the way to create a free market of ideas -- could it be compatible with a legal regime that inhibits the freedom of thought and the freedom to create? The digital age reveals what intellectual property has always been about, the parcelization and control of thoughts to benefit those who did not create but rather owned the means of amplification and distribution -- the industrialists who run printing presses, the studio executives, and the financiers behind them. There is no moral case for intellectual property as protection for creators. How can one exist, when the laws needed to enforce intellectual property take away the freedom of people to create?
>
>
The digital age reveals what IP law has always been about, the parcelization and control of thoughts to benefit those who did not create but rather owned the means of amplification and distribution: the industrialists who run printing presses, the studio executives, and the financiers behind them. Those who support the IP regime often couch arguments in terms of economic utility, but underlying the rhetoric is an assumption of a moral case, protecting creators from their ideas being "stolen." We can argue over whether thoughts can be stolen, but more urgently, what does it mean for this moral case when the state's enforcement will inevitably take away the freedom of people to think, and thus create?
 
Changed:
<
<
-- BahradSokhansanj - 18 Mar 2012
>
>
-- BahradSokhansanj - 19 Mar 2012
 
 

BahradSokhansanjSecondPaper 16 - 19 Mar 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"
Changed:
<
<

We Are All Prometheus Now

>
>

Intellectual Property and Thought Control

 
Changed:
<
<
Ready for review.
>
>
Ready for review.
 
Changed:
<
<
The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer, and upon reflection it is also a reaction to reading some articles by Robert Hale.

We believe that in a free society, government enforces laws that limit freedom of action in order to protect our safety and a democratically determined social order. We'd like to believe that our thoughts can't be restricted. Maybe we could accept a limit on what we can read or hear -- if only rarely, when needed to keep us safe from our darkest fears, terrorists, child pornographers, identity thieves.... But, our sense of freedom recoils from the notion of the state imposing and enforcing limits on how we think, independent of any manifested action.

>
>
We believe that in a free society, government enforces laws that limit freedom of action in order to protect our safety and a democratically determined social order. We'd like to believe that our thoughts can't be restricted. Maybe we could accept a limit on what we can read or hear -- if only rarely, when needed to keep us safe from our darkest fears, terrorists, child pornographers, identity thieves.... But, our sense of freedom recoils from the notion of the state imposing and enforcing limits on how we think, even where thoughts are not physically manifested. And yet, the laws that apply force to support the concept of intellectual property violate this core principle.
 

Line: 15 to 13
 

Deleted:
<
<
Computers challenge our idea of a free society with freedom of thought and conscience. Computers are the main way we share knowledge. They run 3-D printers that build physical objects. They run machines to manipulate DNA and modify microorganisms. Governments may enforce laws to stop computers from copying movies, building counterfeit or dangerous goods, or synthesizing patented or dangerous microorganisms. But, controlling what we can do with a computer doesn't just limit the freedom to do. It also infringes on the freedom to think.
When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or "tablet" instead of "tablet computer." Kindles and Nooks are "e-readers." We call desktop PCs that have joysticks instead of keyboards "game consoles," and we cannot see and are not shown the computers in our Blu-Ray players and automobiles. But, these are all programmable, universal computers.

Universal computers are special, because they can execute any algorithm. An algorithm is thought broken down into pieces -- a set of processes and rules that can be described using logic. What algorithms may be run on a computer is limited only by the speed of its circuitry and capacity to store data. It is always important to keep in mind that any computer is a "thinking machine." Computers process concrete, logical instructions. In that sense, computer thinking seems to differ from people thinking -- but nevertheless, computers do an increasing amount of our thinking for us.

Line: 41 to 37
 

Changed:
<
<
The concept of intellectual property has always been about the control and restriction of thought. The IP regime depends on the state's police power to enforce artificial boundaries around thoughts, so as to create a scarcity in the supply of information, the demand for which can then give it a price. The ability to freely distribute and display with networked computers reveals that this loss of freedom is based on a flawed economic bargain. But what about the other part of this -- that our safety and security depend on restricting what we can do with computers? There is no algorithm that can protect us without being circumvented by someone thinking up a hack for it. A state regime that backs up "good" countermeasure algorithms will thus have to police thought about "bad" hacker algorithms. And just as in the IP regime, the state will fail, except in its ability to punish those who aren't skillful enough to avoid capture, and to make life harder for dissidents who use these algorithms to circumvent restrictions on other things governments want to control, like the ability to organize protest -- and ultimately, to stifle the power to develop real solutions to the problems posed by new technologies.

-- BahradSokhansanj - 5 Mar 2012

>
>
Intellectual property laws are all about the use of force to support artificial boundaries around thoughts, to make them scarce and thus commercially valuable. The state applies force by banning the "bad" algorithms that pierce the boundaries. The state therefore must, by necessity, police thought. The state must, by force -- the threat and actuality of imprisonment -- prevent the execution of certain algorithms, and the creation of others. Thought is restricted, even where that thought may improve our lives in small ways, like by making the transfer of large amounts of data more efficient, or big ways, like by allowing dissidents to organize protests more safely. More urgently than just our convenience, there is our basic assumption that a free society is based on the freedom to think. The laws that try to regulate thought will inevitably lead to injustice. Innovators who seek to benefit society as a whole fall under the hammer of the law, while those who seek to use algorithms to commit crimes in the shadows slip around the enforcers. The powerless, too unskilled or lacking in resources to escape detection, are the ones who are caught.
 
Changed:
<
<
Very interesting piece. One preliminary comment: I'm not quite persuaded yet that this piece has demonstrated that the choice between freedom and safety is a false one. I think there can be very persuasive arguments made that in many circumstances, freedom is a better choice than safety. However, I'm not sure the tension between the two dissolves so easily. I think your piece does more to make a case for freedom (since safety is impossible due to locks being circumventable) than it does to demonstrate that freedom and safety are not in tension. There may be a point to be made though that increased freedom in some circumstances increases safety - that might also be what you are getting at. If that's the case, I think that point could be more explicit. However, specifically with respect to lab synthesis of biological warfare agents, I think the argument that freedom increases safety might be difficult to make. Maybe the safety increase could come from full freedom to share information leading to antidotes for the weapons. But what if there are no antidotes? Then there would seem to be safety 'costs' to the freedom.

I think the argument for freedom would have to be made from first principles (that freedom is precious), from futility (that restricting freedom would not work), or from the claim that restricting freedom in the area in question would have offsetting perverse consequences somewhere else. I think the piece as it is now leans towards the 'it's futile to try to restrict freedoms in this area' argument.

-- DevinMcDougall - 20 Jan 2012

Thank you very much for your thoughtful comments, Devin. I'm going to have to think about this... I'm not sure what an argument for freedom from first principles might look like. I'm trying to start from the initial point that we associate the core of freedom with freedom of thought, and that's what's being challenged by all of this -- so if you want the restrictions, then you have to accept the loss of that core freedom (and then what freedoms are really left?), and then, that this would be futile anyway, so you're not really trading freedom for anything but illusory security -- and in fact, real solutions for the security problems can only come from human creativity, which requires the freedom to think about these unthinkable algorithms.

-- BahradSokhansanj - 21 Jan 2012

>
>
If intellectual property were really about the act of creation by thought -- if it were the way to create a free market of ideas -- could it be compatible with a legal regime that inhibits the freedom of thought and the freedom to create? The digital age reveals what intellectual property has always been about, the parcelization and control of thoughts to benefit those who did not create but rather owned the means of amplification and distribution -- the industrialists who run printing presses, the studio executives, and the financiers behind them. There is no moral case for intellectual property as protection for creators. How can one exist, when the laws needed to enforce intellectual property take away the freedom of people to create?
 
Changed:
<
<
I've changed the article, but now it's rougher and represents a couple of conflicting ideas. Maybe this really needs to be split into two essays, or I should just focus on the freedom/security false balance (for example, by taking the time to explain how thinking about algorithms led to secure commerce better than the solution that government tried to provide through control).
>
>
-- BahradSokhansanj - 18 Mar 2012
 
Deleted:
<
<
-- BahradSokhansanj - 24 Jan 2012
 
 

BahradSokhansanjSecondPaper 15 - 06 Mar 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus Now

Ready for review.
Changed:
<
<
The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer.
>
>
The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer, and upon reflection it is also a reaction to reading some articles by Robert Hale.
 
Changed:
<
<
We believe that in a free society, government enforces laws that limit our freedom of action in order to protect our safety and order society the way we'd like. We'd like to believe that our thoughts can't be restricted. Maybe we could accept a limit on what we can read or hear -- if only rarely, when needed to keep us safe from our darkest fears, terrorists, child pornographers, identity thieves.... Still, there can't be limits imposed on thought alone.
>
>
We believe that in a free society, government enforces laws that limit freedom of action in order to protect our safety and a democratically determined social order. We'd like to believe that our thoughts can't be restricted. Maybe we could accept a limit on what we can read or hear -- if only rarely, when needed to keep us safe from our darkest fears, terrorists, child pornographers, identity thieves.... But, our sense of freedom recoils from the notion of the state imposing and enforcing limits on how we think, independent of any manifested action.
 

Line: 15 to 15
 

Changed:
<
<
Computers challenge our idea of a free society based on freedom of thought and conscience. Computers are now the way we gain and share knowledge. They can run 3-D printers to build physical objects and devices. They can run machines to manipulate DNA and modify microorganisms. Governments may enforce laws to stop computers from copying movies, building counterfeit or dangerous goods, or synthesizing patented or dangerous microorganisms. But, controlling what we can do with a computer doesn't just infringe on the freedom to do, it also infringes on the freedom to think.
>
>
Computers challenge our idea of a free society with freedom of thought and conscience. Computers are the main way we share knowledge. They run 3-D printers that build physical objects. They run machines to manipulate DNA and modify microorganisms. Governments may enforce laws to stop computers from copying movies, building counterfeit or dangerous goods, or synthesizing patented or dangerous microorganisms. But, controlling what we can do with a computer doesn't just limit the freedom to do. It also infringes on the freedom to think.
 
Changed:
<
<
When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or "tablet" instead of "tablet computer." Kindles and Nooks are "e-readers." PlayStations are "game consoles," even though they are basically desktop PCs, and we usually ignore the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
>
>
When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or "tablet" instead of "tablet computer." Kindles and Nooks are "e-readers." We call desktop PCs that have joysticks instead of keyboards "game consoles," and we cannot see and are not shown the computers in our Blu-Ray players and automobiles. But, these are all programmable, universal computers.
 
Changed:
<
<
Universal computers are special, because they can execute any algorithm. Algorithms are just thoughts that have been broken down into pieces, a set of processes and rules that can be described using logic. What algorithms computers can run is limited only by the speed of their circuitry and capacity to store data. Computers are "thinking machines," even though that's a concept that usually comes up in exotic, metaphysical discussions of artificial intelligence and silicon consciousness, the stuff that Kurzweil writes about. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else's or a collective's thoughts, and then display the results.
>
>
Universal computers are special, because they can execute any algorithm. An algorithm is thought broken down into pieces -- a set of processes and rules that can be described using logic. What algorithms may be run on a computer is limited only by the speed of its circuitry and capacity to store data. It is always important to keep in mind that any computer is a "thinking machine." Computers process concrete, logical instructions. In that sense, computer thinking seems to differ from people thinking -- but nevertheless, computers do an increasing amount of our thinking for us.
 

Line: 27 to 27
 

Changed:
<
<
The "Information Age" is characterized by the word "information." Information is a long, Latin-rooted, technical-sounding word. We understand it, when read or heard, at an intellectual remove from our living experience. "Knowing" means basically the same thing, but it's not used as much in this context. "The Information Age" is basically a marketing phrase, used to sell people on the idea that money can be made by buying and selling information. But "knowing" is "thinking." Commercializing thought is a tougher sell. To control the marketplace of thought would mean having to control thought, and we don't like to contemplate what that means for a free society. Maybe advertising really is about that, but we don't like to think about what that implies. So we use "information" instead, to feel more comfortable. Information may be bought, sold, and owned, but thoughts are still free.
>
>
The "Information Age" is characterized by the word "information." Information is a long, Latin-rooted, technical-sounding word. We understand it, when read or heard, at an intellectual remove from our living experience. "The Information Age" is basically a marketing phrase, used to sell people on the idea that money can be made by buying and selling information. And the idea of commercializing thought would be a tougher sell. Marketers avoid words like "knowing" and "thought." To control the marketplace of thought would mean having to control thought. We don't like to contemplate what that means for a free society. Would it mean that just as state force helps control the market of land and things, it must also guarantee the marketplace of thought? That seems scary. Instead, "information" may be bought, sold, and owned, even as thoughts remain free.
 So, the Information Age marketer sells a piece of information, which is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. A universal computer can run any algorithm with which it is programmed. Duplicating what it has stored in its memory, even when it's only cached there temporarily, is really easy. This means that profits can't be extracted from the scarcity of information.
Line: 41 to 41
 

Changed:
<
<
The concept of intellectual property has always been about the control and restriction of thought. IP means the use of the weapons of the state to enforce artificial boundaries around thoughts, so that information can be made scarce to give it a price. The ability to freely distribute and display with networked computers reveals that this loss of freedom is a false bargain: promoting creativity at the cost of our freedom to think about owned thoughts. Computers that can be plugged into 3-D printers to build weapons challenge the false bargain of sacrificing our freedom for personal safety and security. Any technical countermeasure, enforced by government, will have to control thought about algorithms. It will necessarily fail, except in its ability to punish those who aren't skillful enough to avoid capture, and to make it harder for dissidents to use these algorithms to circumvent restrictions on other things governments want to control, like the ability to organize protest. The solution isn't to control and punish thought, but rather to free thought, and to allow for creative solutions.
>
>
The concept of intellectual property has always been about the control and restriction of thought. The IP regime depends on the state's police power to enforce artificial boundaries around thoughts, so as to create a scarcity in the supply of information, the demand for which can then give it a price. The ability to freely distribute and display with networked computers reveals that this loss of freedom is based on a flawed economic bargain. But what about the other part of this -- that our safety and security depend on restricting what we can do with computers? There is no algorithm that can protect us without being circumvented by someone thinking up a hack for it. A state regime that backs up "good" countermeasure algorithms will thus have to police thought about "bad" hacker algorithms. And just as in the IP regime, the state will fail, except in its ability to punish those who aren't skillful enough to avoid capture, and to make life harder for dissidents who use these algorithms to circumvent restrictions on other things governments want to control, like the ability to organize protest -- and ultimately, to stifle the power to develop real solutions to the problems posed by new technologies.
 
Changed:
<
<
-- BahradSokhansanj - 24 Jan 2012
>
>
-- BahradSokhansanj - 5 Mar 2012
 

BahradSokhansanjSecondPaper 14 - 24 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus Now

Line: 34 to 34
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But, these have been foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reports on unauthorized access when a computer goes online, or even shuts down the computer's operating system and its ability to function entirely. This is especially common in computers that are marketed in ways that avoid calling them "computers," like smartphones, tablets, game consoles, and embedded devices.

Anything thought builds, though, thought can undo. Thoughts, implemented as algorithms running on computers, can be used to break all the most sophisticated locks placed on information. The knowledge of how to circumvent can be restricted by banning certain algorithms, censoring the websites that publicize them, and watching those who seek them. Algorithms, imagined by knowledge applied creatively, can go around all these measures. So, the only solution is to ban the thoughts behind the algorithm -- to punish the people who think about them and try to learn about them.
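The point that any lock is itself just an algorithm, reversible by another algorithm, can be made concrete with a deliberately toy sketch (my own illustration, not from the essay; the key and the scheme are invented and are nothing like a real DRM system):

```python
# Toy illustration: a "lock" on information is itself an algorithm,
# so a second algorithm can undo it. Here the lock is a trivial XOR
# scramble; the key is a made-up constant, not any real scheme.
SECRET_KEY = 0x5A  # hypothetical key the lock's designer relies on

def lock(plaintext: bytes) -> bytes:
    """The seller's 'protection': scramble each byte with the key."""
    return bytes(b ^ SECRET_KEY for b in plaintext)

def unlock(scrambled: bytes) -> bytes:
    """The circumventing thought: XOR with the same key is its own inverse."""
    return bytes(b ^ SECRET_KEY for b in scrambled)

message = b"a locked-up piece of information"
assert unlock(lock(message)) == message  # the lock is undone
```

Real schemes are far more elaborate, but the structure is the same: because the lock must run on the customer's universal computer, the knowledge needed to reverse it is already present there.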

Deleted:
<
<
This is why copyright law in the digital age is inconsistent with our conception of free society based on freedom of thought. Enforcement means outlawing circumvention. It means limiting thought, and punishing it when it goes out of bounds. Still, enforcing these laws can't prevent anything. The police can only go after violators after the fact, after the locks have been broken, and the information products -- thoughts -- go free, breaking that carefully constructed market built on false scarcity.
 

Line: 43 to 41
 

Changed:
<
<
Soon, many of us will ask governments to stop evil people from using 3-D printers to build weapons and from synthesizing infectious agents. But, any technological countermeasures will fail, no matter how sophisticated they are, and no matter how heavily backed by laws and their enforcement. The only recourse will be to more severely punish those who are caught -- only after the locks are already broken. There is a better alternative. We can finally set aside the false choice between freedom and safety. We can stop avoiding hard problems by punishing thinking, and instead share our thoughts and work to build solutions.
>
>
The concept of intellectual property has always been about the control and restriction of thought. IP means the use of the weapons of the state to enforce artificial boundaries around thoughts, so that information can be made scarce to give it a price. The ability to freely distribute and display information with networked computers reveals that this loss of freedom rests on a false bargain: promoting creativity at the cost of our freedom to think about owned thoughts. Computers that can be plugged into 3-D printers to build weapons challenge the false bargain of sacrificing our freedom for personal safety and security. Any technical countermeasure, enforced by government, will have to control thought about algorithms. It will necessarily fail, except in its ability to punish those who aren't skillful enough to avoid capture, and to make it harder for dissidents to use these algorithms to circumvent restrictions on other things governments want to control, like the ability to organize protest. The solution isn't to control and punish thought, but rather to free thought, and to allow for creative solutions.
 
Changed:
<
<
-- BahradSokhansanj - 17 Jan 2012
>
>
-- BahradSokhansanj - 24 Jan 2012
 
Line: 60 to 58
Thank you very much for your thoughtful comments, Devin. I'm going to have to think about this... I'm not sure what arguing for freedom from first principles might look like. I'm trying to start from the initial point that we associate the core of freedom as being the freedom of thought, and that's what's being challenged by all of this -- so if you want the restrictions, then you have to accept the loss of that core freedom (and then what freedoms are really left?) and then, that this would be futile anyway, so it's not really like you're trading freedom for anything but illusory security -- and in fact, real solutions for the security problems can only come from human creativity, which requires freedom to think about these unthinkable algorithms.

-- BahradSokhansanj - 21 Jan 2012

Added:
>
>
I've changed the article, but now it's rougher and represents a couple of conflicting ideas. Maybe this really needs to be split in two essays, or I should just focus on the freedom/security false balance (for example, take the time to explain how thinking about algorithms led to secure commerce better than the solution that government tried to provide through control).

-- BahradSokhansanj - 24 Jan 2012

 
 
<--/commentPlugin-->
\ No newline at end of file

BahradSokhansanjSecondPaper 13 - 21 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus Now

Line: 55 to 55
 I think the argument for freedom would have to be from first principles, that freedom is precious, futility - that restricting freedom would not work, or that restricting freedom in the area in question would have offsetting perverse consequences somewhere else. I think the piece as it is now leans towards the 'it's futile to try to restrict freedoms in this area' argument.

-- DevinMcDougall - 20 Jan 2012

Added:
>
>

Thank you very much for your thoughtful comments, Devin. I'm going to have to think about this... I'm not sure what arguing for freedom from first principles might look like. I'm trying to start from the initial point that we associate the core of freedom as being the freedom of thought, and that's what's being challenged by all of this -- so if you want the restrictions, then you have to accept the loss of that core freedom (and then what freedoms are really left?) and then, that this would be futile anyway, so it's not really like you're trading freedom for anything but illusory security -- and in fact, real solutions for the security problems can only come from human creativity, which requires freedom to think about these unthinkable algorithms.

-- BahradSokhansanj - 21 Jan 2012

 
 
<--/commentPlugin-->

BahradSokhansanjSecondPaper 12 - 20 Jan 2012 - Main.DevinMcDougall
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus Now

Line: 48 to 48
 -- BahradSokhansanj - 17 Jan 2012
Added:
>
>

Very interesting piece. One preliminary comment: I'm not quite persuaded yet that this piece has demonstrated that the choice between freedom and safety is a false one. I think there can be very persuasive arguments made that in many circumstances, freedom is a better choice than safety. However, I'm not sure the tension between the two dissolves so easily. I think your piece does more to make a case for freedom (since safety is impossible due to locks being circumventable) than it does to demonstrate that freedom and safety are not in tension. There may be a point to be made though that increased freedom in some circumstances increases safety - that might also be what you are getting at. If that's the case, I think that point could be more explicit. However, specifically with respect to lab synthesization of biological warfare implements, I think the argument that freedom increases safety might be difficult to make. Maybe the safety increase could come from full freedom to share information leading to antidotes for the weapons. But what if there are no antidotes? Then there would seem to be safety 'costs' to the freedom.

I think the argument for freedom would have to be from first principles, that freedom is precious, futility - that restricting freedom would not work, or that restricting freedom in the area in question would have offsetting perverse consequences somewhere else. I think the piece as it is now leans towards the 'it's futile to try to restrict freedoms in this area' argument.

-- DevinMcDougall - 20 Jan 2012

 
 
<--/commentPlugin-->

BahradSokhansanjSecondPaper 11 - 18 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus Now

Added:
>
>
 Ready for review.

The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer.

Changed:
<
<
We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We like to believe that thoughts cannot be restricted or punished. We may accept limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe from our darkest fears, like terrorism or child pornography -- but we don't see these as limiting thoughts.
>
>
We believe that in a free society, government enforces laws that limit our freedom of action in order to protect our safety and order society the way we'd like. We'd like to believe that our thoughts can't be restricted. Maybe we could accept a limit on what we can read or hear -- if only rarely, when needed to keep us safe from our darkest fears, terrorists, child pornographers, identity thieves.... Still, there can't be limits imposed on thought alone.
 
Changed:
<
<
Computers challenge our ability to differentiate between a law that infringes the freedom to do something with the freedom to think about it. This matters because computers are now the way we acquire and transmit knowledge.They can be combined with 3-D printers to manufacture physical objects and devices. They can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, how compatible are these laws with what we think is a free society?
>
>
********
 
Changed:
<
<
When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or "tablet" instead of "tablet computer." Kindles and Nooks are "e-readers." Playstations are "game consoles," even though they are basically desktop PCs, and we usually ignore the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
>
>
Computers challenge our idea of a free society based on freedom of thought and conscience. Computers are now the way we gain and share knowledge. They can run 3-D printers to build physical objects and devices. They can run machines to manipulate DNA and modify microorganisms. Governments may enforce laws to stop computers from copying movies, building counterfeit or dangerous goods, or synthesizing patented or dangerous microorganisms. But, controlling what we can do with a computer doesn't just infringe on the freedom to do; it also infringes on the freedom to think.

When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or "tablet" instead of "tablet computer." Kindles and Nooks are "e-readers." Playstations are "game consoles," even though they are basically desktop PCs, and we usually ignore the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.

Universal computers are special, because they can execute any algorithm. Algorithms are just thoughts that have been broken down into pieces, a set of processes and rules that can be described using logic. What algorithms computers can run is limited only by the speed of their circuitry and capacity to store data. Computers are "thinking machines," even though that's a concept that usually comes up in exotic, metaphysical discussions of artificial intelligence and silicon consciousness, the stuff that Kurzweil writes about. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else's or a collective's thoughts, and then display the results.
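The idea that an algorithm is just a thought broken into logical steps can be sketched with a miniature machine (the three-operation instruction set here is invented purely for illustration; it is not any real architecture):

```python
# A minimal sketch of a computer as a follower of concrete, logical
# instructions. The instruction set is made up for illustration.
def run(program, value):
    """Execute a list of (operation, operand) steps on one register."""
    acc = value
    for op, arg in program:
        if op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
        elif op == "mod":
            acc %= arg
        else:
            raise ValueError(f"unknown operation: {op}")
    return acc

# The 'thought' "double it, then add three" written as machine steps:
double_plus_three = [("mul", 2), ("add", 3)]
print(run(double_plus_three, 10))  # -> 23
```

The machine has no idea what the program "means"; it only applies rules. Yet any thought that can be reduced to such rules, it can carry out.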
Added:
>
>

********
 
Changed:
<
<
The "Information Age" is characterized by the word "information." This is interesting, because information is a long, Latin-rooted word. "Information" is a word that removes itself intellectually from our living experience. "Knowing" means basically the same thing, but it's not used as much. This is because "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. But "knowing" is "thinking." Commercializing "thought" is a tougher sell. To control the marketplace of thought would mean having to control thought, and we don't like to contemplate what that means for a free society. Maybe advertising really is about that, but we don't like to think about what that implies. So we use "information" instead, to feel more comfortable. "Information" may be bought, sold, and owned, but "thoughts" are still free.
>
>
The "Information Age" is characterized by the word "information." Information is a long, Latin-rooted, technical-sounding word. We understand it, when read or heard, at an intellectual remove from our living experience. "Knowing" means basically the same thing, but it's not used as much in this context. "The Information Age" is basically a marketing phrase, used to sell people on the idea that money can be made by buying and selling information. But "knowing" is "thinking." Commercializing thought is a tougher sell. To control the marketplace of thought would mean having to control thought, and we don't like to contemplate what that means for a free society. Maybe advertising really is about that, but we don't like to think about what that implies. So we use "information" instead, to feel more comfortable. Information may be bought, sold, and owned, but thoughts are still free.
 So, the Information Age marketer sells a piece of information, which is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. A universal computer can run any algorithm with which it is programmed. Duplicating what it has stored in its memory, even when it's only cached there temporarily, is really easy. This means that profits can't be extracted from the scarcity of information.
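How easy that duplication is can be seen in a few lines (a sketch of the general point, not of any particular product; the "product" string is invented):

```python
# Sketch: once information exists as stored numbers, a perfect copy
# is a single assignment away -- there is no natural scarcity.
original = b"a 'product' of the information age" * 1_000

copy_one = bytes(original)   # a perfect duplicate of the original
copy_two = bytes(copy_one)   # a perfect duplicate of the duplicate

# Every copy is indistinguishable from the first, at negligible cost.
assert copy_one == original and copy_two == original
```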

In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But, these have been foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed on computers that secretly reports unauthorized access when a computer goes online, or even shuts down the computer's operating system and its ability to function entirely. This is especially common in computers that are marketed in ways that avoid calling them "computers," like smartphones, tablets, game consoles, and embedded devices.

Changed:
<
<
Anything thought builds, though, thought can undo. All the most sophisticated means of locking up information can be broken. The knowledge of how to circumvent can be restricted by banning certain algorithms, censoring the websites that publicize them, and watching those who seek them. Still, an algorithm running on a computer can go around all these measures. All it takes is knowledge and thought. So, the only solution is to ban the thoughts behind the algorithm -- to punish the people who think about them and try to learn about them.
>
>
Anything thought builds, though, thought can undo. Thoughts, implemented as algorithms running on computers, can be used to break all the most sophisticated locks placed on information. The knowledge of how to circumvent can be restricted by banning certain algorithms, censoring the websites that publicize them, and watching those who seek them. Algorithms, imagined by knowledge applied creatively, can go around all these measures. So, the only solution is to ban the thoughts behind the algorithm -- to punish the people who think about them and try to learn about them.
 
Changed:
<
<
This is why copyright law in the digital age is inconsistent with what we think of as being a free society. Enforcement means making circumvention illegal, and that means limiting thought, punishing it when it goes out of bounds. And, it still can't prevent anything. The police can only go after people after the fact, after the locks have been broken, and the information -- thoughts -- runs free, and the marketplace based on an artificial scarcity is broken.
>
>
This is why copyright law in the digital age is inconsistent with our conception of a free society based on freedom of thought. Enforcement means outlawing circumvention. It means limiting thought, and punishing it when it goes out of bounds. Still, enforcing these laws can't prevent anything. The police can only go after violators after the fact, after the locks have been broken, and the information products -- thoughts -- go free, breaking that carefully constructed market built on false scarcity.
 
Changed:
<
<
Soon, we will try to stop people from 3-D printing weapons and synthesizing microbes so that we can stay safe. But, any technological countermeasures will fail, no matter how sophisticated they are, and no matter heavily they are supported by laws and enforced by government action. The only recourse will be to more severely punish those whom are caught -- only after the locks are already broken. There is a better alternative. We can finally set aside the false choice between freedom and safety. We can stop avoiding hard problems by imposing punishment, and instead harness our shared thinking to actually solve them.
>
>
********

Soon, many of us will ask governments to stop evil people from using 3-D printers to build weapons and from synthesizing infectious agents. But, any technological countermeasures will fail, no matter how sophisticated they are, and no matter how heavily backed by laws and their enforcement. The only recourse will be to more severely punish those who are caught -- only after the locks are already broken. There is a better alternative. We can finally set aside the false choice between freedom and safety. We can stop avoiding hard problems by punishing thinking, and instead share our thoughts and work to build solutions.
 -- BahradSokhansanj - 17 Jan 2012

BahradSokhansanjSecondPaper 10 - 18 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus Now

Changed:
<
<
Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer, which I strongly recommend.
>
>
Ready for review.
 
Changed:
<
<
We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We believe that at the foundation of a free society, thoughts cannot be restricted or punished. We may expect that there be limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe from our darkest fears, like terrorism or child pornography -- but we don't see these as limiting thoughts.
>
>
The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer.

We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We like to believe that thoughts cannot be restricted or punished. We may accept limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe from our darkest fears, like terrorism or child pornography -- but we don't see these as limiting thoughts.

Computers challenge our ability to differentiate between a law that infringes the freedom to do something and one that infringes the freedom to think about it. This matters because computers are now the way we acquire and transmit knowledge. They can be combined with 3-D printers to manufacture physical objects and devices. They can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, how compatible are these laws with what we think is a free society?
Line: 16 to 18
 So, the Information Age marketer sells a piece of information, which is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. A universal computer can run any algorithm with which it is programmed. Duplicating what it has stored in its memory, even when it's only cached there temporarily, is really easy. This means that profits can't be extracted from the scarcity of information.
Changed:
<
<
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But, these are consistently foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reports on unauthorized access when a computer goes online, or even shuts computer's operating system and ability to function entirely. This is especially common in computers that are marketed in ways that avoid calling them "computers," like smartphones, tablets, game consoles, and embedded devices.
>
>
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But, these have been foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed on computers that secretly reports unauthorized access when a computer goes online, or even shuts down the computer's operating system and its ability to function entirely. This is especially common in computers that are marketed in ways that avoid calling them "computers," like smartphones, tablets, game consoles, and embedded devices.
 
Changed:
<
<
Anything thought builds though, thought can undo. All the most sophisticated means of locking up information can be broken. The knowledge of how to circumvent can be restricted by banning certain algorithms, censoring the websites that publicize them, and watching those who seek them. Still, an algorithm running on a computer can go around all these measures. All it takes is knowledge and thought. So, the only solution is to ban the thoughts behind the algorithm -- to punish the people who think about them and try to learn about them.
>
>
Anything thought builds, though, thought can undo. All the most sophisticated means of locking up information can be broken. The knowledge of how to circumvent can be restricted by banning certain algorithms, censoring the websites that publicize them, and watching those who seek them. Still, an algorithm running on a computer can go around all these measures. All it takes is knowledge and thought. So, the only solution is to ban the thoughts behind the algorithm -- to punish the people who think about them and try to learn about them.
 This is why copyright law in the digital age is inconsistent with what we think of as being a free society. Enforcement means making circumvention illegal, and that means limiting thought, punishing it when it goes out of bounds. And, it still can't prevent anything. The police can only go after people after the fact, after the locks have been broken, and the information -- thoughts -- runs free, and the marketplace based on an artificial scarcity is broken.
Changed:
<
<
There will be harder questions as in the future. We will want to stop people from 3-D printing weapons and synthesizing microbes so that we can stay safe. But, when we try to restrict the use of computers to do these things, we must realize that our countermeasures will fail -- nothing can be prevented, all that can be done is punish whom we can catch after the locks are already broken. There's something else we can do, though. We can choose to set aside the principle that freedom and safety are in conflict, that we must sacrifice one for the other. We can instead use our freedom to think in our free society to actually deal with the consequences of technology, and not try to avoid them with futile, spiteful laws.
>
>
Soon, we will try to stop people from using 3-D printers to build weapons and from synthesizing microbes so that we can stay safe. But, any technological countermeasures will fail, no matter how sophisticated they are, and no matter how heavily they are supported by laws and enforced by government action. The only recourse will be to more severely punish those who are caught -- only after the locks are already broken. There is a better alternative. We can finally set aside the false choice between freedom and safety. We can stop avoiding hard problems by imposing punishment, and instead harness our shared thinking to actually solve them.
 -- BahradSokhansanj - 17 Jan 2012

BahradSokhansanjSecondPaper 9 - 18 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus Now

Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer, which I strongly recommend.

Changed:
<
<
We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We may expect that there be limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe, whether related to fears of terrorism or concerns about child pornography. And yet, our moral intuition is that the freedom to do can be curtailed, but for us to be free, our government can't punish thought itself.
>
>
We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We believe that at the foundation of a free society, thoughts cannot be restricted or punished. We may expect that there be limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe from our darkest fears, like terrorism or child pornography -- but we don't see these as limiting thoughts.
Computers challenge our ability to differentiate between a law that infringes the freedom to do something and one that infringes the freedom to think about it. This matters because computers are now the way we acquire and transmit knowledge. They can be combined with 3-D printers to manufacture physical objects and devices. They can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, how compatible are these laws with what we think is a free society?
Changed:
<
<
When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or “tablet” instead of “tablet computer.” Kindles and Nooks are "e-readers." Playstations are "game consoles," even though they are basically desktop PCs, and we usually ignore the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
>
>
When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or "tablet" instead of "tablet computer." Kindles and Nooks are "e-readers." Playstations are "game consoles," even though they are basically desktop PCs, and we usually ignore the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
 
Changed:
<
<
Universal computers are special, because they can execute any algorithm. Algorithms are just thoughts that have been broken down to pieces, a set of process and rules that can be described using logic. What algorithms computers can run is limited only by the speed of their circuitry and capacity to store data. Computers are "thinking machines," even though that’s a concept that usually comes up in exotic, metaphysical discussions of artificial intelligence and silicon consciousness, the stuff that Kurzweil writes about. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else's or a collective's thoughts, and then display the results.
>
>
Universal computers are special, because they can execute any algorithm. Algorithms are just thoughts that have been broken down into pieces, a set of processes and rules that can be described using logic. What algorithms computers can run is limited only by the speed of their circuitry and capacity to store data. Computers are "thinking machines," even though that's a concept that usually comes up in exotic, metaphysical discussions of artificial intelligence and silicon consciousness, the stuff that Kurzweil writes about. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else's or a collective's thoughts, and then display the results.
 
Changed:
<
<
The "Information Age" is characterized by the word "information." This is interesting, because information is a long, Latin-rooted word. “Information” is a word that removes itself intellectually from our living experience. "Knowledge" means basically the same thing, but it's not used as much. This is because "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. The word "knowledge" is bound up with "knowing," to human thought. Commercializing “thought” would be a tougher sell. We intuitively recognize that to control the marketplace of thought, means controlling thought itself. That’s actually the basis of marketing, really, but we don’t like to think about what that implies, so we prefer the word “information.” But the choice of word can’t avoid reality.
>
>
The "Information Age" is characterized by the word "information." This is interesting, because information is a long, Latin-rooted word. "Information" is a word that removes itself intellectually from our living experience. "Knowing" means basically the same thing, but it's not used as much. This is because "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. But "knowing" is "thinking." Commercializing "thought" is a tougher sell. To control the marketplace of thought would mean having to control thought, and we don't like to contemplate what that means for a free society. Maybe advertising really is about that, but we don't like to think about what that implies. So we use "information" instead, to feel more comfortable. "Information" may be bought, sold, and owned, but "thoughts" are still free.
 
Changed:
<
<
The problem is that the information the eager Information Age marketer sells is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. A universal computer can run any algorithm with which it is programmed. Duplicating what it has stored in its memory, even when it’s only cached there temporarily, is really easy. This means that profits can’t be extracted from the scarcity of information.
>
>
So, the Information Age marketer sells a piece of information, which is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. A universal computer can run any algorithm with which it is programmed. Duplicating what it has stored in its memory, even when it's only cached there temporarily, is really easy. This means that profits can't be extracted from the scarcity of information.
 
Changed:
<
<
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But, these are consistently foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reports on unauthorized access when a computer goes online, or even shuts computer’s operating system and ability to function entirely. As Cory Doctorow says, "digital rights management always converges on malware." This is especially common in computers that are marketed as smartphones and tablets, or embedded in systems like DVD players, a de-functioning disguised by avoidance of the word “computer.”
>
>
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But, these are consistently foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reports on unauthorized access when a computer goes online, or even shuts down the computer's operating system and its ability to function entirely. This is especially common in computers that are marketed in ways that avoid calling them "computers," like smartphones, tablets, game consoles, and embedded devices.
 
Changed:
<
<
Anything thought builds though, thought can undo. All the most sophisticated means of locking up information can be broken. The knowledge of how to circumvent can be restricted by punishing people who come up with the algorithms, censoring the websites that publicize them, and watching those who seek them. But, an algorithm running on a computer can go around all these measures. All it takes is knowledge and thought.
>
>
Anything thought builds, though, thought can undo. All the most sophisticated means of locking up information can be broken. The knowledge of how to circumvent can be restricted by banning certain algorithms, censoring the websites that publicize them, and watching those who seek them. Still, an algorithm running on a computer can go around all these measures. All it takes is knowledge and thought. So, the only solution is to ban the thoughts behind the algorithm -- to punish the people who think about them and try to learn about them.
 
Changed:
<
<
This is why copyright law in the digital age is inconsistent with what we think of as being a free society. Enforcement means making circumvention illegal, and that means limiting thought, punishing it when it goes out of bounds, all to support an already obsolete business model. Not to mention that the enforcement can’t prevent anything. It can only go after people after the fact, after the locks have been broken, and information runs free.
>
>
This is why copyright law in the digital age is inconsistent with what we think of as being a free society. Enforcement means making circumvention illegal, and that means limiting thought, punishing it when it goes out of bounds. And, it still can't prevent anything. The police can only go after people after the fact, after the locks have been broken, the information -- thoughts -- runs free, and the marketplace based on artificial scarcity is broken.
 
Changed:
<
<
There will be harder questions as computers do more stuff, and we try to stop people from 3-D printing and synthesizing microbes in ways that we fear will harm safety and social order. But, when we try to restrict the use of computers to do these things, we need to recognize first, that our countermeasures will fail, and second, all we can do is punish anyone caught after the restrictions are already broken. And, in weighing the laws we want as punishment, we need decide how enforcing them sacrifices our own personal freedom to think, in what we believe is a free society.
>
>
There will be harder questions in the future. We will want to stop people from 3-D printing weapons and synthesizing microbes so that we can stay safe. But, when we try to restrict the use of computers to do these things, we must realize that our countermeasures will fail -- nothing can be prevented; all that can be done is to punish those we can catch after the locks are already broken. There's something else we can do, though. We can choose to set aside the principle that freedom and safety are in conflict, that we must sacrifice one for the other. We can instead use our freedom to think in our free society to actually deal with the consequences of technology, and not try to avoid them with futile, spiteful laws.
 -- BahradSokhansanj - 17 Jan 2012

BahradSokhansanjSecondPaper 8 - 17 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"
Changed:
<
<

We Are All Prometheus

>
>

We Are All Prometheus Now

 
Changed:
<
<
Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer. It’s a really great talk that expresses much more clearly the ideas that have been bouncing around my own head for a long time now. I strongly recommend watching it -- certainly over actually reading what I wrote here!
>
>
Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer, which I strongly recommend.
 
Changed:
<
<
We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We may expect that there be limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe, whether related to fears of terrorism or concerns about child pornography. But, our moral intuition is that the freedom to do can be curtailed, but for us to be free, our government can't punish thought itself.
>
>
We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We may expect that there be limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe, whether related to fears of terrorism or concerns about child pornography. And yet, our moral intuition is that the freedom to do can be curtailed, but for us to be free, our government can't punish thought itself.
 
Changed:
<
<
But, computers challenge our ability to differentiate between a law that infringes the freedom to do something with the freedom to think about it. Computers are now the way we acquire and transmit knowledge. Computers can be combined with 3-D printers to manufacture physical objects and devices. Computers can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, these laws will necessarily punish thought.
>
>
Computers challenge our ability to differentiate between a law that infringes the freedom to do something and one that infringes the freedom to think about it. This matters because computers are now the way we acquire and transmit knowledge. They can be combined with 3-D printers to manufacture physical objects and devices. They can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, how compatible are these laws with what we think is a free society?
 
Changed:
<
<
When we think about computers, we don't usually think about what computers actually are. We usually think about what computers can do. What can software that runs on a computer do? What can we do on the Internet we access through the computer? The computer is just a passive entity, largely invisible and transparent. We don't even call most computers, "computers." The word isn't found in the term "smartphone." We usually drop off the last half of the bulky phrase "tablet computer;" Kindles and Nooks are "e-readers." Playstations are "game consoles," even when physically and functionally indistinguishable from desktop PCs, and we don't even think about the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
>
>
When we think about computers, we don't usually think about what computers actually are, just what they do -- the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," or "tablet" instead of "tablet computer." Kindles and Nooks are "e-readers." Playstations are "game consoles," even though they are basically desktop PCs, and we usually ignore the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
 
Changed:
<
<
Universal computers are special, because they can execute any algorithms. Algorithms are just thoughts that have been broken down to pieces, a set of process and rules that can be described using logic. They are limited only by the speed of the circuitry to run through the algorithm's instructions and its capacity to store data produced and used by the algorithm work.
>
>
Universal computers are special because they can execute any algorithm. Algorithms are just thoughts that have been broken down into pieces, a set of processes and rules that can be described using logic. What algorithms computers can run is limited only by the speed of their circuitry and their capacity to store data. Computers are "thinking machines," even though that's a concept that usually comes up in exotic, metaphysical discussions of artificial intelligence and silicon consciousness, the stuff that Kurzweil writes about. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else's or a collective's thoughts, and then display the results.
 
Changed:
<
<
Computers are "thinking machines," a concept that usually comes up in metaphysical discussions of artificial intelligence, contemplation of an era of computers that can think creatively like we do, and even be conscious, like science fiction robots or Kurzweilian spiritual machines. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else's or a collective's thoughts, and then display the results.
>
>
The "Information Age" is characterized by the word "information." This is interesting, because information is a long, Latin-rooted word. “Information” is a word that removes itself intellectually from our living experience. "Knowledge" means basically the same thing, but it's not used as much. This is because "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. The word "knowledge" is bound up with "knowing," to human thought. Commercializing “thought” would be a tougher sell. We intuitively recognize that to control the marketplace of thought, means controlling thought itself. That’s actually the basis of marketing, really, but we don’t like to think about what that implies, so we prefer the word “information.” But the choice of word can’t avoid reality.
 
Changed:
<
<
The "Information Age" is characterized by the word "information." Information is a long, Latin-rooted word that puts itself at an intellectual remove, as a concept floating outside our human experience. "Knowledge" means basically the same thing, but it's disfavored. This makes sense. The "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. The word "knowledge" is bound up with "knowing," to human thought. Commercializing thought is a tougher sell. To control the marketplace of thought, means controlling thought itself, which is practically difficult. Information is a less troublesome term that looks a lot better on a prospectus. The problem is, no matter what its called, the same basic reality applies.
>
>
The problem is that the information the eager Information Age marketer sells is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. A universal computer can run any algorithm with which it is programmed. Duplicating what it has stored in its memory, even when it’s only cached there temporarily, is really easy. This means that profits can’t be extracted from the scarcity of information.
 
Changed:
<
<
The problem is that the information the eager Information Age marketer sells is translated into a series of logical processes represented by numbers that are sent through a universal computer. And, a universal computer is, well, universal. It can run any algorithm with which it is programmed. Duplication of the stuff stored in a computer's memory is really easy. So, it is impossible to make money based on the scarcity of information.
>
>
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But, these are consistently foiled again and again. Universal computers can run the algorithms that defeat the restrictions, because they have to be leaky for the information to be distributed and read by paying customers. Information sellers respond by developing restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reports on unauthorized access when a computer goes online, or even shuts down the computer's operating system and its ability to function entirely. As Cory Doctorow says, "digital rights management always converges on malware." This is especially common in computers that are marketed as smartphones and tablets, or embedded in systems like DVD players, a de-functioning disguised by avoidance of the word "computer."
 
Changed:
<
<
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to restrict access. But, these are consistently foiled again and again because, universal computers are universal -- and they can be programmed with the algorithms to defeat the restrictions. In response, the restrictions have been getting more and more fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reporting on violations of access restrictions when a computer goes online, or even shuts computer’s operating system and ability to function entirely. Cory Doctorow says, "digital rights management always converges on malware." This de-functioning is especially common for computers that are marketed as smartphones, tablets, and the innards of DVD players -- the marketers try to disguise this by avoiding the word "computer."
>
>
Anything thought builds though, thought can undo. All the most sophisticated means of locking up information can be broken. The knowledge of how to circumvent can be restricted by punishing people who come up with the algorithms, censoring the websites that publicize them, and watching those who seek them. But, an algorithm running on a computer can go around all these measures. All it takes is knowledge and thought.
 
Changed:
<
<
Anything thought builds; thought can undo. All the most sophisticated means of restricting access can be circumvented. The knowledge of how to circumvent can be restricted by punishing people who come up with the algorithms, censoring the websites that publicize them, and watching those who seek them. But, even these measures can themselves be circumvented with an algorithm and a computer. All you need is knowledge and thought.
>
>
This is why copyright law in the digital age is inconsistent with what we think of as being a free society. Enforcement means making circumvention illegal, and that means limiting thought, punishing it when it goes out of bounds, all to support an already obsolete business model. Not to mention that the enforcement can’t prevent anything. It can only go after people after the fact, after the locks have been broken, and information runs free.
 
Changed:
<
<
This means that opposing copyright law is easy. To enforce it in the digital age inevitably means punishing thought about how to circumvent. It means supporting an already obsolete business model is worth putting a penalty on certain kinds of thinking. No economic argument for copyright could possibly win out over preserving the basis of free society. Creating an infrastructure that can be used to regulate or punish any thoughts, not just the duplication of artistic work, is clearly overkill.
>
>
There will be harder questions as computers do more stuff, and we try to stop people from 3-D printing and synthesizing microbes in ways that we fear will harm safety and social order. But, when we try to restrict the use of computers to do these things, we need to recognize first, that our countermeasures will fail, and second, that all we can do is punish anyone caught after the restrictions are already broken. And, in weighing the laws we want as punishment, we need to decide how enforcing them sacrifices our own personal freedom to think, in what we believe is a free society.
 
Changed:
<
<
There will be harder questions as computers do more stuff. We need to recognize that restricting what the computer does will come at the cost of our personal freedom to think. And, any technical restrictions on computers, no matter how clever, cannot actually prevent anything that can't be circumvented. They can only be used punish the thoughts of those who try to circumvent them and get caught -- and challenge what we believe it is to be a free society.

-- BahradSokhansanj - 12 Jan 2012

>
>
-- BahradSokhansanj - 17 Jan 2012
 
 
<--/commentPlugin-->

BahradSokhansanjSecondPaper 7 - 16 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus

Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer. It’s a really great talk that expresses much more clearly the ideas that have been bouncing around my own head for a long time now. I strongly recommend watching it -- certainly over actually reading what I wrote here!

Changed:
<
<
In a free society, government enforces laws that may restrict actions. You can’t build or buy a certain kind of gun, or a certain kind of sex toy. You can’t copy and reprint a book. There may even be limited prohibitions on reading and listening. Our moral intuition is that the freedom to do can be curtailed, but governments in free states ought not punish thought itself.
>
>
We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We may expect that there be limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe, whether related to fears of terrorism or concerns about child pornography. But, our moral intuition is that the freedom to do can be curtailed, but for us to be free, our government can't punish thought itself.
 But, computers challenge our ability to differentiate between a law that infringes the freedom to do something with the freedom to think about it. Computers are now the way we acquire and transmit knowledge. Computers can be combined with 3-D printers to manufacture physical objects and devices. Computers can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, these laws will necessarily punish thought.
Changed:
<
<
When we think about computers, we don't usually think about what computers actually are. We usually think about what computers can do. What can software that runs on a computer do? What can we do on the Internet we access through the computer? The computer is just a passive entity, largely invisible and transparent. We don't even call most computers, "computers." The word isn’t found in the term “smartphone.” We usually drop off the last half of the bulky phrase "tablet computer;" Kindles and Nooks are "e-readers." Playstations are still "game consoles" even after their capabilities exceed that of many desktop PCs, and we don't even think about the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
>
>
When we think about computers, we don't usually think about what computers actually are. We usually think about what computers can do. What can software that runs on a computer do? What can we do on the Internet we access through the computer? The computer is just a passive entity, largely invisible and transparent. We don't even call most computers, "computers." The word isn't found in the term "smartphone." We usually drop off the last half of the bulky phrase "tablet computer;" Kindles and Nooks are "e-readers." Playstations are "game consoles," even when physically and functionally indistinguishable from desktop PCs, and we don't even think about the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
Universal computers are special, because they can execute any algorithm. Algorithms are just thoughts that have been broken down to pieces, a set of processes and rules that can be described using logic. They are limited only by the speed of the circuitry to run through the algorithm's instructions and its capacity to store data produced and used by the algorithm's work.
Changed:
<
<
Computers are “thinking machines,” a concept that usually comes up in metaphysical discussions of artificial intelligence, contemplation of an era of computers that can think creatively like we do, and even be conscious, like science fiction robots or Kurzweilian spiritual machines. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else’s or a collective’s thoughts, and then display the results.
>
>
Computers are "thinking machines," a concept that usually comes up in metaphysical discussions of artificial intelligence, contemplation of an era of computers that can think creatively like we do, and even be conscious, like science fiction robots or Kurzweilian spiritual machines. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else's or a collective's thoughts, and then display the results.
 
Changed:
<
<
The "Information Age" is characterized by the word “information." Information is a long, Latin-rooted word that puts itself at an intellectual remove, as a concept floating outside our human experience. "Knowledge" means basically the same thing, but it’s disfavored. This makes sense. The "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. The word "knowledge" is bound up with "knowing,” to human thought. Commercializing thought is a tougher sell. To control the marketplace of thought, means controlling thought itself, which is practically difficult. Information is a less troublesome term that looks a lot better on a prospectus. The problem is, no matter what its called, the same basic reality applies.
>
>
The "Information Age" is characterized by the word "information." Information is a long, Latin-rooted word that puts itself at an intellectual remove, as a concept floating outside our human experience. "Knowledge" means basically the same thing, but it's disfavored. This makes sense. The "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. The word "knowledge" is bound up with "knowing," to human thought. Commercializing thought is a tougher sell. To control the marketplace of thought means controlling thought itself, which is practically difficult. Information is a less troublesome term that looks a lot better on a prospectus. The problem is, no matter what it's called, the same basic reality applies.
 The problem is that the information the eager Information Age marketer sells is translated into a series of logical processes represented by numbers that are sent through a universal computer. And, a universal computer is, well, universal. It can run any algorithm with which it is programmed. Duplication of the stuff stored in a computer's memory is really easy. So, it is impossible to make money based on the scarcity of information.
Changed:
<
<
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to restrict access. But, these are consistently foiled again and again because, universal computers are universal -- and they can be programmed with the algorithms to defeat the restrictions. In response, the restrictions have been getting more and more fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reporting on violations of access restrictions when a computer goes online, or even shuts computer’s operating system and ability to function entirely. Cory Doctorow says, “digital rights management always converges on malware.” This de-functioning is especially common for computers that are marketed as smartphones, tablets, and the innards of DVD players -- the marketers try to disguise this by avoiding the word “computer.”
>
>
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to restrict access. But, these are consistently foiled again and again because universal computers are universal -- and they can be programmed with the algorithms to defeat the restrictions. In response, the restrictions have been getting more and more fundamental to the operation of the computer. For example, software can be silently installed in computers that secretly reports on violations of access restrictions when a computer goes online, or even shuts down the computer's operating system and its ability to function entirely. Cory Doctorow says, "digital rights management always converges on malware." This de-functioning is especially common for computers that are marketed as smartphones, tablets, and the innards of DVD players -- the marketers try to disguise this by avoiding the word "computer."
 Anything thought builds; thought can undo. All the most sophisticated means of restricting access can be circumvented. The knowledge of how to circumvent can be restricted by punishing people who come up with the algorithms, censoring the websites that publicize them, and watching those who seek them. But, even these measures can themselves be circumvented with an algorithm and a computer. All you need is knowledge and thought.

This means that opposing copyright law is easy. To enforce it in the digital age inevitably means punishing thought about how to circumvent. It means supporting an already obsolete business model is worth putting a penalty on certain kinds of thinking. No economic argument for copyright could possibly win out over preserving the basis of free society. Creating an infrastructure that can be used to regulate or punish any thoughts, not just the duplication of artistic work, is clearly overkill.

Changed:
<
<
There will be harder questions as computers do more and more stuff. We need to recognize that restricting what the computer does needs to be balanced against what that restriction costs in terms of our personal freedom to think. They couldn’t actually stop Prometheus. They were only able to chain him to a rock after the fact. Ultimately, any technical restrictions on computers, no matter how devious, cannot actually prevent anything that can’t be circumvented. They can only be used punish those who try to circumvent them and get caught -- to punish their illegal thoughts.
>
>
There will be harder questions as computers do more stuff. We need to recognize that restricting what the computer does will come at the cost of our personal freedom to think. And, any technical restrictions on computers, no matter how clever, cannot actually prevent anything that can't be circumvented. They can only be used to punish the thoughts of those who try to circumvent them and get caught -- and challenge what we believe it is to be a free society.
 -- BahradSokhansanj - 12 Jan 2012

BahradSokhansanjSecondPaper 6 - 15 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus

Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer. It’s a really great talk that expresses much more clearly the ideas that have been bouncing around my own head for a long time now. I strongly recommend watching it -- certainly over actually reading what I wrote here!

Changed:
<
<
In a free society, government enforces laws that may restrict actions. You can’t build or buy a certain kind of gun, or a certain kind of sex toy. You can’t copy and reprint a book. There may even be limited prohibitions on reading and listening. In general, the freedom to do can be curtailed, but governments in free states ought not punish thought itself.
>
>
In a free society, government enforces laws that may restrict actions. You can’t build or buy a certain kind of gun, or a certain kind of sex toy. You can’t copy and reprint a book. There may even be limited prohibitions on reading and listening. Our moral intuition is that the freedom to do can be curtailed, but governments in free states ought not punish thought itself.
 But, computers challenge our ability to differentiate between a law that infringes the freedom to do something with the freedom to think about it. Computers are now the way we acquire and transmit knowledge. Computers can be combined with 3-D printers to manufacture physical objects and devices. Computers can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, these laws will necessarily punish thought.

BahradSokhansanjSecondPaper 5 - 15 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus

Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer. It’s a really great talk that expresses much more clearly the ideas that have been bouncing around my own head for a long time now. I strongly recommend watching it -- certainly over actually reading what I wrote here!

Changed:
<
<
In a free society, government enforces laws that restrict actions. They do not restrict thoughts. You can’t build or buy a certain kind of gun, or a certain kind of sex toy. You can’t copy and reprint a book. The freedom to do can be curtailed, but the freedom to think, to read, to listen -- this personal human right is inviolate. Certainly, government can’t punish thought.
>
>
In a free society, government enforces laws that may restrict actions. You can’t build or buy a certain kind of gun, or a certain kind of sex toy. You can’t copy and reprint a book. There may even be limited prohibitions on reading and listening. In general, the freedom to do can be curtailed, but governments in free states ought not punish thought itself.
But, computers challenge our ability to differentiate between a law that infringes the freedom to do something and a law that infringes the freedom to think about it. Computers are now the way we acquire and transmit knowledge. Computers can be combined with 3-D printers to manufacture physical objects and devices. Computers can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, these laws will necessarily punish thought.

BahradSokhansanjSecondPaper 4 - 14 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"

We Are All Prometheus

Changed:
<
<
stream of consciousness draft
>
>
Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer. It’s a really great talk that expresses much more clearly the ideas that have been bouncing around my own head for a long time now. I strongly recommend watching it -- certainly over actually reading what I wrote here!
 
Changed:
<
<
In a recent speech, Cory Doctorow casts the current battles of copyright law, like the continued escalation of digital rights management and the debates surrounding the DMCA and SOPA, as the relatively low-stakes forerunners of a broader, emerging, and truly significant war on the general purpose computer. The challenge the general purpose computer poses to copyright is that a computer implicitly contains the power to circumvent any digital measure meant to guarantee copyright limitations. Similarly, the Internet, powered by general purpose computing, provides the ability to circumvent any measures to limit online access. The result, then, is that the only way to really enforce copyright through digital means is to limit the capabilities of general purpose computers and the general purpose Internet. On the desktop, or laptop, or anywhere else a computer may be found, this means secret spyware and malware on computers to monitor and control -- and rootkits that prevent the installation of alternate operating systems or the running of unauthorized software to circumvent them. Online, this means active surveillance and censorship.
>
>
In a free society, government enforces laws that restrict actions. They do not restrict thoughts. You can’t build or buy a certain kind of gun, or a certain kind of sex toy. You can’t copy and reprint a book. The freedom to do can be curtailed, but the freedom to think, to read, to listen -- this personal human right is inviolate. Certainly, government can’t punish thought.
 
Changed:
<
<
The reason why this is a broader war is, of course, that computers -- and networks -- are not just found on the desktop, or laptop, or even a smartphone. As chips, storage, and wireless communicators have gotten smaller, faster, and cheaper, it makes more sense to just have a general purpose computer, with I/O access to the outside world, be the embedded device in applications that require computing power. Doctorow's examples include the computers we will embed in our bodies, like digital hearing aids, the computers that are embedded in vehicles and may soon provide even more control in self-driving cars, the computers that power DNA sequencers and DNA synthesizers, and the computers that drive 3-D printers. The question for governments and corporations is no longer just about enforcing copyright, or restricting the flow of information. It becomes: how do we control the modification of a self-driving car's computer to maintain traffic control? How do we restrict the synthesis of viruses and microorganisms -- on one level because of "bioterrorism fears," but more significantly, to protect GMO and biopharma patents? How do we prevent 3-D printers from being used to produce counterfeit goods -- or to make what is needed to make a semi-automatic handgun fully automatic?
>
>
But, computers challenge our ability to differentiate between a law that infringes the freedom to do something and a law that infringes the freedom to think about it. Computers are now the way we acquire and transmit knowledge. Computers can be combined with 3-D printers to manufacture physical objects and devices. Computers can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But, these laws will necessarily punish thought.
 
Added:
>
>
When we think about computers, we don't usually think about what computers actually are. We usually think about what computers can do. What can software that runs on a computer do? What can we do on the Internet we access through the computer? The computer is just a passive entity, largely invisible and transparent. We don't even call most computers "computers." The word isn’t found in the term “smartphone.” We usually drop off the last half of the bulky phrase "tablet computer;" Kindles and Nooks are "e-readers." Playstations are still "game consoles" even after their capabilities exceed those of many desktop PCs, and we don't even think about the computers in Blu-Ray players and inside cars. But, these are all programmable, universal computers.
 
Changed:
<
<
-- BahradSokhansanj - 12 Jan 2012
>
>
Universal computers are special because they can execute any algorithm. Algorithms are just thoughts that have been broken down into pieces, a set of processes and rules that can be described using logic. A computer is limited only by the speed of the circuitry that runs through an algorithm's instructions and by its capacity to store the data the algorithm produces and uses.
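To make this concrete, here is a roughly 2,300-year-old thought, Euclid's method for finding the greatest common divisor, broken down into a rule any universal computer can follow. (This sketch is added purely as illustration of the essay's point, not drawn from Doctorow's talk.)

```python
def gcd(a, b):
    """Euclid's algorithm: a human thought reduced to a logical rule."""
    while b != 0:
        # Replace (a, b) with (b, a mod b) until the remainder is zero.
        a, b = b, a % b
    return a

print(gcd(48, 18))  # -> 6
```

Any machine that can run loops and comparisons like these can, given enough time and memory, run any algorithm at all; that is what makes it universal.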

Computers are “thinking machines,” a concept that usually comes up in metaphysical discussions of artificial intelligence, contemplation of an era of computers that can think creatively like we do, and even be conscious, like science fiction robots or Kurzweilian spiritual machines. The reality of computers seems much more mundane; they just follow concrete, logical instructions. But, computers are already thinking for us, if not exactly like us. Computers execute our thoughts, or someone else’s or a collective’s thoughts, and then display the results.

The "Information Age" is characterized by the word “information." Information is a long, Latin-rooted word that puts itself at an intellectual remove, as a concept floating outside our human experience. "Knowledge" means basically the same thing, but it’s disfavored. This makes sense. The "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. The word "knowledge" is bound up with "knowing,” to human thought. Commercializing thought is a tougher sell. To control the marketplace of thought, means controlling thought itself, which is practically difficult. Information is a less troublesome term that looks a lot better on a prospectus. The problem is, no matter what its called, the same basic reality applies.

The problem is that the information the eager Information Age marketer sells is translated into a series of logical processes represented by numbers that are sent through a universal computer. And, a universal computer is, well, universal. It can run any algorithm with which it is programmed. Duplicating what is stored in a computer's memory is trivially easy. So, it is impossible to make money based on the scarcity of information.
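The ease of duplication is not a metaphor. A minimal sketch, with stand-in data: once a work exists as bytes in a computer's memory, a perfect copy is a single operation.

```python
# Once a work is just numbers in a computer's memory,
# a perfect, bit-for-bit duplicate costs essentially nothing.
original = b"a copyrighted work, reduced to numbers"  # stand-in data
copy = bytes(bytearray(original))  # duplicate into fresh memory

assert copy == original  # the copy is indistinguishable from the original
```

No scarcity survives this: every reader's computer necessarily holds such a copy just to display the work at all.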

In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to restrict access. But these are foiled again and again because universal computers are universal -- they can be programmed with the algorithms to defeat the restrictions. In response, the restrictions have been getting more and more fundamental to the operation of the computer. For example, software can be silently installed on computers that secretly reports violations of access restrictions when the computer goes online, or even shuts down the computer’s operating system and ability to function entirely. Cory Doctorow says, “digital rights management always converges on malware.” This de-functioning is especially common for computers that are marketed as smartphones, tablets, and the innards of DVD players -- the marketers try to disguise this by avoiding the word “computer.”

 
Added:
>
>
Anything thought builds, thought can undo. Even the most sophisticated means of restricting access can be circumvented. The knowledge of how to circumvent can be restricted by punishing people who come up with the algorithms, censoring the websites that publicize them, and watching those who seek them. But, even these measures can themselves be circumvented with an algorithm and a computer. All you need is knowledge and thought.
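The claim that anything thought builds, thought can undo, can be sketched with a toy example. (Purely illustrative: real DRM schemes are far more elaborate, but they share the property that the same universal computer that applies a restriction can run the algorithm that removes it.)

```python
def scramble(data: bytes, key: int) -> bytes:
    """A toy 'access restriction': XOR every byte with a secret key."""
    return bytes(b ^ key for b in data)

locked = scramble(b"the work", key=42)   # the seller's restriction
unlocked = scramble(locked, key=42)      # the circumvention: the same algorithm
assert unlocked == b"the work"           # the restriction is undone
```

The restriction and its defeat are both just algorithms, and a universal computer will run either one without distinguishing between them.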

This means that opposing copyright law is easy. To enforce it in the digital age inevitably means punishing thought about how to circumvent it. It means deciding that propping up an already obsolete business model is worth putting a penalty on certain kinds of thinking. No economic argument for copyright could possibly win out over preserving the basis of a free society. Creating an infrastructure that can be used to regulate or punish any thoughts, not just the duplication of artistic work, is clearly overkill.

There will be harder questions as computers do more and more. We need to recognize that restricting what the computer does needs to be balanced against what that restriction costs in terms of our personal freedom to think. They couldn’t actually stop Prometheus. They were only able to chain him to a rock after the fact. Ultimately, any technical restriction on computers, no matter how devious, cannot actually prevent anything, because it can be circumvented. It can only be used to punish those who try to circumvent it and get caught -- to punish their illegal thoughts.

-- BahradSokhansanj - 12 Jan 2012

 
 

BahradSokhansanjSecondPaper 3 - 12 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"
Changed:
<
<

Digital Freedom

>
>

We Are All Prometheus

 stream of consciousness draft
Changed:
<
<
In his speech, Cory Doctorow casts the current battles of copyright law, like the continued escalation of digital rights management and the debates surrounding the DMCA and SOPA, as the relatively low-stakes forerunners of a broader, emerging, and truly significant war on the general purpose computer. The challenge the general purpose computer poses to copyright is that a computer implicitly contains the power to circumvent any digital measure meant to guarantee copyright limitations. Similarly, the Internet, powered by general purpose computing, provides the ability to circumvent any measures to limit online access. The result, then, is that the only way to really enforce copyright through digital means is to limit the capabilities of general purpose computers and the general purpose Internet. On the desktop, or laptop, or anywhere else a computer may be found, this means secret spyware and malware on computers to monitor and control -- and rootkits that prevent the installation of alternate operating systems or the running of unauthorized software to circumvent them. Online, this means active surveillance and censorship.
>
>
In a recent speech, Cory Doctorow casts the current battles of copyright law, like the continued escalation of digital rights management and the debates surrounding the DMCA and SOPA, as the relatively low-stakes forerunners of a broader, emerging, and truly significant war on the general purpose computer. The challenge the general purpose computer poses to copyright is that a computer implicitly contains the power to circumvent any digital measure meant to guarantee copyright limitations. Similarly, the Internet, powered by general purpose computing, provides the ability to circumvent any measures to limit online access. The result, then, is that the only way to really enforce copyright through digital means is to limit the capabilities of general purpose computers and the general purpose Internet. On the desktop, or laptop, or anywhere else a computer may be found, this means secret spyware and malware on computers to monitor and control -- and rootkits that prevent the installation of alternate operating systems or the running of unauthorized software to circumvent them. Online, this means active surveillance and censorship.
The reason why this is a broader war is, of course, that computers -- and networks -- are not just found on the desktop, or laptop, or even a smartphone. As chips, storage, and wireless communicators have gotten smaller, faster, and cheaper, it makes more sense to just have a general purpose computer, with I/O access to the outside world, be the embedded device in applications that require computing power. Doctorow's examples include the computers we will embed in our bodies, like digital hearing aids, the computers that are embedded in vehicles and may soon provide even more control in self-driving cars, the computers that power DNA sequencers and DNA synthesizers, and the computers that drive 3-D printers. The question for governments and corporations is no longer just about enforcing copyright, or restricting the flow of information. It becomes: how do we control the modification of a self-driving car's computer to maintain traffic control? How do we restrict the synthesis of viruses and microorganisms -- on one level because of "bioterrorism fears," but more significantly, to protect GMO and biopharma patents? How do we prevent 3-D printers from being used to produce counterfeit goods -- or to make what is needed to make a semi-automatic handgun fully automatic?
Deleted:
<
<
The first thing to do is to recognize that there are really two questions here. One has to do with the extent of what should be regulated -- what should governments regulate? What should the manufacturer of a device that includes a general purpose computer be allowed to regulate? To what extent can they enforce these limitations? Should it be limited to contract -- you lose your right to the warranty, or can they sue you for enhanced damages?
 
Changed:
<
<
There are the intertwined questions of what can be regulated, and how can that regulation be enforced? And then there is the question of what the government can do -- regulation and enforcement -- and what can a private entity do, in terms of regulation of users and then how can they enforce their restriction?

We should recognize that these questions are separable. We can permit private entities to regulate user behavior, but limit the enforcement -- such as preventing the exercise of a warranty where there is user modification. Is this really useless? Well, it saves the private entity money. We can also allow a government to force private entities not to regulate users in certain ways -- such as preventing a social network from retaining user data, or forcing them to allow a user to withdraw their data. Or, by forcing a company to allow the operation of multiple OS's on a smartphone, as opposed to merely exempting it from the DMCA. Should it be illegal for a company to change its terms of service, as Sony did when it prevented the installation of other OS's on the PlayStation for anyone who wanted to use the PlayStation Network?

The core question then becomes Doctorow's question -- how can general purpose computers be limited, or can they be limited? And how open must that limitation be? If the limitation is so deeply embedded -- as in the rootkit that bricks the system on a violation, or the spyware that reports any violation, should that be allowed? And what does that mean about our control over systems that may be embedded within us?

The key point is to recognize that this is not just about freedom to do what we want with a device that we own. This goes to important freedoms of thought and privacy. We are not used to the idea that governments can control the things we possess without our knowing about it -- and corporations can do the same thing. Also, there is no reason why all general purpose computers would not be compromised, forcing people who want at least one uncompromised device to build it themselves and hope to keep it free. By extension, does this also mean that all computers will be restricted?

-- BahradSokhansanj - 11 Jan 2012

>
>
-- BahradSokhansanj - 12 Jan 2012
 

 

BahradSokhansanjSecondPaper 2 - 12 Jan 2012 - Main.BahradSokhansanj
Line: 1 to 1
 
META TOPICPARENT name="SecondPaper"
Changed:
<
<

How Patents Threaten Your Health

>
>

Digital Freedom

 
Changed:
<
<
Since I only have 1000 words at my disposal, my second essay is a follow-up to my first. As Eben pointed out, there are ways in which the future development of medicine may be "distorted" by the current property-exclusivity system -- in particular by Big Pharma as actors, though the key actors aren't only the firms commonly identified as "Big Pharma." Hopefully this is ok and I don't need to go to an entirely new topic, in which case I'll change this.
>
>
stream of consciousness draft
 
Changed:
<
<
A few things I want to talk about:
>
>
In his speech, Cory Doctorow casts the current battles of copyright law, like the continued escalation of digital rights management and the debates surrounding the DMCA and SOPA, as the relatively low-stakes forerunners of a broader, emerging, and truly significant war on the general purpose computer. The challenge the general purpose computer poses to copyright is that a computer implicitly contains the power to circumvent any digital measure meant to guarantee copyright limitations. Similarly, the Internet, powered by general purpose computing, provides the ability to circumvent any measures to limit online access. The result, then, is that the only way to really enforce copyright through digital means is to limit the capabilities of general purpose computers and the general purpose Internet. On the desktop, or laptop, or anywhere else a computer may be found, this means secret spyware and malware on computers to monitor and control -- and rootkits that prevent the installation of alternate operating systems or the running of unauthorized software to circumvent them. Online, this means active surveillance and censorship.
 
Changed:
<
<
1. Everything can be patented in the current regime, and what is clearly not patentable -- because the patents have expired -- are increasingly being covered by exclusivity regimes that are effectively patents.
>
>
The reason why this is a broader war is, of course, that computers -- and networks -- are not just found on the desktop, or laptop, or even a smartphone. As chips, storage, and wireless communicators have gotten smaller, faster, and cheaper, it makes more sense to just have a general purpose computer, with I/O access to the outside world, be the embedded device in applications that require computing power. Doctorow's examples include the computers we will embed in our bodies, like digital hearing aids, the computers that are embedded in vehicles and may soon provide even more control in self-driving cars, the computers that power DNA sequencers and DNA synthesizers, and the computers that drive 3-D printers. The question for governments and corporations is no longer just about enforcing copyright, or restricting the flow of information. It becomes: how do we control the modification of a self-driving car's computer to maintain traffic control? How do we restrict the synthesis of viruses and microorganisms -- on one level because of "bioterrorism fears," but more significantly, to protect GMO and biopharma patents? How do we prevent 3-D printers from being used to produce counterfeit goods -- or to make what is needed to make a semi-automatic handgun fully automatic?
 
Changed:
<
<
2. Health is not equal to drug therapy -- and therapy is also not just about drugs.
>
>
The first thing to do is to recognize that there are really two questions here. One has to do with the extent of what should be regulated -- what should governments regulate? What should the manufacturer of a device that includes a general purpose computer be allowed to regulate? To what extent can they enforce these limitations? Should it be limited to contract -- you lose your right to the warranty, or can they sue you for enhanced damages?
 
Changed:
<
<
3. Definition of what is a doctor.
>
>
There are the intertwined questions of what can be regulated, and how can that regulation be enforced? And then there is the question of what the government can do -- regulation and enforcement -- and what can a private entity do, in terms of regulation of users and then how can they enforce their restriction?
 
Changed:
<
<
4. Forces that are not Big Pharma that like the current regime.
>
>
We should recognize that these questions are separable. We can permit private entities to regulate user behavior, but limit the enforcement -- such as preventing the exercise of a warranty where there is user modification. Is this really useless? Well, it saves the private entity money. We can also allow a government to force private entities not to regulate users in certain ways -- such as preventing a social network from retaining user data, or forcing them to allow a user to withdraw their data. Or, by forcing a company to allow the operation of multiple OS's on a smartphone, as opposed to merely exempting it from the DMCA. Should it be illegal for a company to change its terms of service, as Sony did when it prevented the installation of other OS's on the PlayStation for anyone who wanted to use the PlayStation Network?
 
Changed:
<
<
-- BahradSokhansanj - 28 Nov 2011
>
>
The core question then becomes Doctorow's question -- how can general purpose computers be limited, or can they be limited? And how open must that limitation be? If the limitation is so deeply embedded -- as in the rootkit that bricks the system on a violation, or the spyware that reports any violation, should that be allowed? And what does that mean about our control over systems that may be embedded within us?

The key point is to recognize that this is not just about freedom to do what we want with a device that we own. This goes to important freedoms of thought and privacy. We are not used to the idea that governments can control the things we possess without our knowing about it -- and corporations can do the same thing. Also, there is no reason why all general purpose computers would not be compromised, forcing people who want at least one uncompromised device to build it themselves and hope to keep it free. By extension, does this also mean that all computers will be restricted?

-- BahradSokhansanj - 11 Jan 2012

 

 

BahradSokhansanjSecondPaper 1 - 28 Nov 2011 - Main.BahradSokhansanj
Line: 1 to 1
Added:
>
>
META TOPICPARENT name="SecondPaper"

How Patents Threaten Your Health

Since I only have 1000 words at my disposal, my second essay is a follow-up to my first. As Eben pointed out, there are ways in which the future development of medicine may be "distorted" by the current property-exclusivity system -- in particular by Big Pharma as actors, though the key actors aren't only the firms commonly identified as "Big Pharma." Hopefully this is ok and I don't need to go to an entirely new topic, in which case I'll change this.

A few things I want to talk about:

1. Everything can be patented in the current regime, and what is clearly not patentable -- because the patents have expired -- are increasingly being covered by exclusivity regimes that are effectively patents.

2. Health is not equal to drug therapy -- and therapy is also not just about drugs.

3. Definition of what is a doctor.

4. Forces that are not Big Pharma that like the current regime.

-- BahradSokhansanj - 28 Nov 2011

 

Revision 18r18 - 04 Sep 2012 - 22:02:22 - IanSullivan
Revision 17r17 - 20 Mar 2012 - 00:19:20 - BahradSokhansanj
Revision 16r16 - 19 Mar 2012 - 04:40:52 - BahradSokhansanj
Revision 15r15 - 06 Mar 2012 - 02:49:49 - BahradSokhansanj
Revision 14r14 - 24 Jan 2012 - 15:05:53 - BahradSokhansanj
Revision 13r13 - 21 Jan 2012 - 20:00:06 - BahradSokhansanj
Revision 12r12 - 20 Jan 2012 - 21:59:59 - DevinMcDougall
Revision 11r11 - 18 Jan 2012 - 23:11:40 - BahradSokhansanj
Revision 10r10 - 18 Jan 2012 - 19:37:10 - BahradSokhansanj
Revision 9r9 - 18 Jan 2012 - 16:53:01 - BahradSokhansanj
Revision 8r8 - 17 Jan 2012 - 23:50:06 - BahradSokhansanj
Revision 7r7 - 16 Jan 2012 - 13:50:39 - BahradSokhansanj
Revision 6r6 - 15 Jan 2012 - 22:02:06 - BahradSokhansanj
Revision 5r5 - 15 Jan 2012 - 01:45:47 - BahradSokhansanj
Revision 4r4 - 14 Jan 2012 - 14:41:50 - BahradSokhansanj
Revision 3r3 - 12 Jan 2012 - 16:08:35 - BahradSokhansanj
Revision 2r2 - 12 Jan 2012 - 02:17:19 - BahradSokhansanj
Revision 1r1 - 28 Nov 2011 - 21:28:04 - BahradSokhansanj
This site is powered by the TWiki collaboration platform.
All material on this collaboration platform is the property of the contributing authors.
All material marked as authored by Eben Moglen is available under the license terms CC-BY-SA version 4.