Notes for Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America

Key concepts: automatic programming, closed world, compiler, connectionism, cyborg discourse, discourse, green world, interpreter, language-games, master tropes, microworlds, obligatory passage point, recombinant subjectivity, physical symbol systems, social construction of technology, subject positions, support, technowar.


Related theorists: Wiebe Bijker, Vannevar Bush, Noam Chomsky, Wendy Chun, Jay Forrester, Michel Foucault, Northrop Frye, James William Gibson, David Golumbia, Donna Haraway, Sherman Hawkins, Jonathan Jacky, Mark Johnson, George Lakoff, Bruno Latour, Claude Levi-Strauss, J.C.R. Licklider, Adrian Mackenzie, John McCarthy, Robert McNamara, John von Neumann, Allen Newell, Trevor Pinch, Oliver Selfridge, Claude Shannon, Herbert Simon, Alan Turing, Sherry Turkle, Ludwig Wittgenstein, Shoshana Zuboff.

Preface
(ix) The primary weapons of the Cold War were ideologies, alliances, advisors, foreign aid, national prestige—and above and behind them all, the juggernaut of high technology.
(ix) This book argues that we can make sense of the history of computers as tools only when we simultaneously grasp their history as metaphors in Cold War science, politics, and culture.


Acknowledgments
(xvii)
The Closed World began its life as a doctoral dissertation in the Board of Studies in History of Consciousness of the University of California at Santa Cruz, under the incomparable Donna Haraway. . . . Robert Meister, another superlative teacher, taught me how to understand the connections between subjectivity and political systems.
(xviii) Richard Gordon provided crucial guidance and material support as well as an intellectual home, through his Silicon Valley Research Group, where I carried out much of the original dissertation research. At Stanford University, Terry Winograd—an external advisor on my Ph.D. committee—offered extensive comments on the dissertation and vital, ongoing support.
(xviii-xix) I am particularly indebted, first, to the sociology and history of science and technology represented by such figures as Bruno Latour, Michel Callon, Steve Woolgar, Trevor Pinch, Thomas Parke Hughes, Wiebe Bijker, John Law, Donald MacKenzie, Steven Shapin, and Simon Schaffer. . . . Second, my work owes much to poststructuralist critical theorists such as Roland Barthes, Michel Foucault, Jacques Derrida, James Clifford, Fredric Jameson, Louis Althusser, and Hayden White. . . . Third, philosophical studies of artificial intelligence by Hubert Dreyfus, John Searle, Joseph Weizenbaum, Douglas Hofstadter, Harry Collins, and Terry Winograd form part of this book's deeper background. Last but not least, the interpretive sociology of computer communities represented by the work of Sherry Turkle, Rob Kling, Jon Jacky, Lucy Suchman, and Leigh Star played a major part in my thinking about the relations among technology, metaphors, science, and subjectivity.


1
“We Defend Every Place”: Building the Cold War World

Closed world metaphor derives from feedback control model as dome of global oversight.

(1) As machine, computer controlled vast systems of military technology central to the globalist aims and apocalyptic terms of Cold War foreign policy. First air defenses, then strategic early warning and nuclear response, and later the sophisticated tactical systems of the electronic battlefield grew from the control and communications capacities of information machines. As metaphors, such systems constituted a dome of global technological oversight, a closed world, within which every event was interpreted as part of a titanic struggle between the superpowers. . . . Computers made the closed world work simultaneously as technology, as political system, and as ideological mirage.

Cyborg discourse obvious tie to Golumbia.

(1-2) Both the engineering and the politics of closed world discourse centered around problems of human-machine integration. . . . As symbol-manipulating logic machines, computers would automate or assist tasks of perception, reasoning, and control in integrated systems. Such goals, first accomplished in World War II-era anti-aircraft weapons, helped form both cybernetics, the grand theory of information and control in biological and mechanical systems, and artificial intelligence (AI), software that simulated complex symbolic thought. At the same time, computers inspired new psychological theories built around concepts of “information processing.” . . . Cyborg discourse, by constructing both human minds and artificial intelligence as information machines, helped to integrate people into complex technological systems.
(2) The cyborg figure defined not only a practical problem and a psychological theory but a set of subject positions. Cyborg minds—understood as machines subject to disassembly, engineering, and reconstruction—generated a variety of new perspectives, self-interpretations, and social roles.

Three theses, three scenes: Operation Igloo White, Turing machines, the Terminator.

(2-3) In exploring these ideas, I will develop three major theses. First, I will argue that the historical trajectory of computer development cannot be separated from the elaboration of American grand strategy in the Cold War. . . . Second, I will link the rise of cognitivism, in both psychology and artificial intelligence, to social networks and computer projects formed for World War II and the Cold War. . . . Finally, I will suggest that cyborg discourse functioned as the psychological/subjective counterpart of closed-world politics. . . . Cyborgs, with minds and selves reconstituted as information processors, found flexibility, freedom, and even love inside the closed virtual spaces of the information society.
(3) This chapter sets the stage for the book's argument with three short scenes from the closed world.

Scene 1: Operation Igloo White
(3) Inside the ISC [Infiltration Surveillance Center at Nakhon Phanom in Thailand] vigilant technicians pored over banks of video displays, controlled by IBM 360/65 computers and connected to thousands of sensors strewn across the Ho Chi Minh Trail in southern Laos.
(3) The sensors—shaped like twigs, jungle plants, and animal droppings—were designed to detect all kinds of human activity, such as the noises of truck engines, body heat, motion, even the scent of human urine.
(4-5) Operation Igloo White's centralized, computerized, automated method of “interdiction” resembled a microcosmic version of the whole United States approach to the Vietnam War.
(5) To control the budget, [Robert] McNamara introduced a cost-accounting technique known as the Planning Programming Budgeting System (PPBS), which was built on the highly quantitative tools of systems analysis.
(5) The OSD [Office of the Secretary of Defense] literally micromanaged the bombing campaign, specifying the exact targets to be attacked, weather conditions under which missions must be canceled or flown, and even the precise qualifications of individual pilots.
(6) High-technology communications and computing equipment, nuclear weapons and Cold War nuclear anxiety, quantitatively oriented “scientific” administrative techniques, and the global objectives of U.S. military power combined to drive forward the centralization of command and control at the highest levels. At the same time, this drive created serious—and in the case of Vietnam, finally fatal—impediments both to effective action and to accurate understanding of what was going on in the field.
(6-7) From start to finish the Cold War was constructed around the “outputs” of closed systems like Igloo White and the PPBS. . . . The official language of the Cold War, produced by think tanks such as the Rand Corporation, framed global politics in the terms of game-theoretic calculation and cost-benefit analysis.
(7) None of this—metaphors, weapons, strategy, systems, languages—sprang into being fully formed.
(7) This book locates a key part of the answer to these questions at the intersection of politics, culture, and computer technology, in the ways computers and the political imagination reciprocally extended, restricted, and otherwise transformed each other.

Closed-world discourse: Edwards's term, repeated by many others, for the language, technologies, and practices that supported the vision of centrally controlled, automated global power—as if the machines had taken over long ago and were systematically controlling humanity at this moment, in part due to poor execution of the Kemeny vision. Run in human brains and machines alike as media, the discourse is created and sustained by computers, which both constitute military control systems and support the metaphorical understanding of world politics as a technically manageable system.

(7) I use the phrase “closed-world discourse” to describe the language, technologies, and practices that together supported the visions of centrally controlled, automated global power at the heart of American Cold War politics. Computers helped create and sustain this discourse in two ways. First, they allowed the practical construction of central real-time military control systems on a gigantic scale. Second, they facilitated the metaphorical understanding of world politics as a sort of system subject to technological management.

The Postwar World as a Closed System
(8) Containment, with its image of an enclosed space surrounded and sealed by American power, was the central metaphor of closed-world discourse.
(8) The language of global closure emerged early in the Truman administration as a reflection of perceived Soviet globalist intentions.
(9) That culture saw communism both as an external enemy to be contained or destroyed by overt economic manipulation, covert political intervention, and military force, and as an internal danger to be contained by government and civil surveillance, infiltration, public denunciations, and blacklisting.
(9) World-systems theory holds that the intrinsic logic of capitalism drives it to seek international economic integration: the elimination of trade barriers of all sorts (economic, political, social, and military) to foster free-market exchange. . . . According to this theory, when a single hegemonic power emerges within the world system, its structural position leads it to attempt to force other nations to abandon autarky in favor of free trade and free capital flows.
(10) Thus the world-system formed one kind of closed world, while the Soviet Union and its satellites formed another. The Cold War struggle occurred at the margins of the two, and that struggle constituted the third closed world: the system formed from the always-interlocking traffic of their actions.
(10-11) Under the Truman Doctrine and the Marshall Plan, the world had become a system to be both protected and manipulated by the United States. . . . Bilateralism created a systematic vision of the world by making all third-world conflicts parts of a coherent whole, surrogates for the real life or death struggle between the Free World and its communist enemies.
(11) By 1950, with the U.S. entry into the Korean War, the administration had defined American interests in totally global terms.
(11-12) Acheson, in the MacArthur hearings, explained that Korea itself mattered very little. Rather, American security now depended not only upon strategic might but also upon ideological power. To demonstrate the free world's strength, the United States must now actively repel communist aggression anywhere in the world. . . . [John Foster] Dulles threatened “massive retaliation”—implying nuclear force—in response to communist aggression anywhere in the world.

Characterizing the Closed World
(12) A “closed world” is a radically bounded scene of conflict, an inescapably self-referential space where every thought, word, and action is ultimately directed back toward a central struggle. It is a world radically divided against itself. Turned inexorably inward, without frontiers or escape, a closed world threatens to annihilate itself, to implode.

Literary criticism origin of closed world and green world.

(12-13) The term descends from the literary criticism of Sherman Hawkins, who uses it to define one of the two major dramatic spaces in Shakespearean plays. Closed-world plays are marked by a unity of place, such as a walled city or the interior of a castle or house. Action within this space centers around attempts to invade and/or escape its boundaries.
(13) The alternative to the closed world is not an open world but what Northrop Frye called the “green world.” The green world is an unbounded natural setting such as a forest, meadow, or glade. . . . Green-world drama thematizes the restoration of community and cosmic order through the transcendence of rationality, authority, convention, and technology. . . . Rather, the opposition is between a human-centered, inner, psychological logic and a magical, natural, transcendent one.
(13) Postwar American politics, as well as those of divided Europe, were in fact dominated by the same unity of place that characterizes closed-world drama. . . . The action was one of attempts to contain, invade, or explode a closed communist world symbolized by phrases like “the Iron Curtain” and physically instantiated by the Berlin Wall.

Rhetorical importance of simulations.

(14) Inside the closed horizon of nuclear physics, simulations became more real than the reality itself, as the nuclear standoff evolved into an entirely abstract war of position. Simulations—computer models, war games, statistical analyses, discourses of nuclear strategy—had, in an important sense, more political significance and more cultural impact than the weapons that could not be used.

Techniques, technologies, practices, fictions, and languages formed closed world discourse.

(15) “Closed-world discourse” thus names a language, a worldview, and a set of practices characterized in a general way by the following features and elements.

Scene 2: Turing's Machines
(16) Turing had considered the relationship between the infinite set of “configurations” of a simple imaginary computing machine—known today as the “universal Turing machine,” of which all possible digital computers are more or less incomplete instances—and the mental states of human beings.
(16) Elsewhere in his 1937 paper Turing made clear that the essential move in this analogy was to reduce each “state of mind” of the (human) computer to a single unit. . . . Any mechanical computer would necessarily perform each step in the course of performing the operation; ergo, the steps would functionally define discrete mental states. The mechanical computer might then be said to have a kind of mind, or alternatively, the human computer could be defined as a machine.
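Turing's construction is concrete enough to sketch. A minimal illustration (mine, not Edwards's or Turing's): a finite table of (state, symbol) rules reading and writing a tape, each table entry functionally defining one discrete “state of mind.”

```python
# Minimal Turing-machine sketch. Each table entry maps
# (state, symbol) -> (symbol to write, head move, next state).
# This machine flips every bit on the tape and halts at the first
# blank; states and rules are illustrative only.

TABLE = {
    ("scan", "0"): ("1", +1, "scan"),
    ("scan", "1"): ("0", +1, "scan"),
    ("scan", " "): (" ", 0, "halt"),
}

def run(tape, state="scan", pos=0):
    tape = list(tape) + [" "]          # blank-terminated tape
    while state != "halt":
        write, move, state = TABLE[(state, tape[pos])]
        tape[pos] = write
        pos += move
    return "".join(tape).rstrip()

print(run("0110"))  # -> "1001"
```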
(18) Churchill placed the GCCS [Government Code and Cypher School in Bletchley Park] work among his top priorities and personally ordered that the group's requests for personnel and equipment be instantly and fully satisfied.
(18-19) This prediction—not those that herald the actual existence of thinking machines—is the second major theme of this book. . . . By the late 1980s phrases like “expert systems,” “artificial intelligence,” and “smart” and even “brilliant weapons” were part of the everyday vernacular of the business and defense communities and the popular press. . . . Within certain subcultures, such as computer hackers and child programmers, highly articulated descriptions of the computer as a self with thoughts, desires, and goals, and of the human self as a kind of computer or program, were commonplace.

Cyborgs
(19) In psychology the new view, then still unnamed, opposed behaviorism's emphasis on external observables and simple conditioning with complex internal-process models based on metaphors of computers and information processing. It reached maturity in the middle 1960s with the publication of Ulric Neisser's Cognitive Psychology.

Turkle second self and object to think with.

(19-20) This new and powerful conception of psychology evolved in a reciprocal relationship with a changing culture of subjectivity for which computers became, in Sherry Turkle's words, a “second self.” As she has shown, the analogy between computers and minds can simultaneously decenter, fragment, and reunify the self by reformulating self-understanding around concepts of information processing and modular mental programs, or by constituting an ideal form for thinking toward which people should strive. . . . With the emergence of global computer networks and “virtual reality” technologies for creating and inhabiting elaborated simulated spaces, by the 1990s cyberspace became a reality.
(20) World War II-era weapons systems in which humans served as fully integrated technological components were a major source of the ideas and equipment from which cognitivism and AI arose.
(20) Contemporary high-technology armed forces employ a second generation of computerized weapon systems that take computer-assisted control to its logical conclusion in fully automatic and, potentially, autonomous weapons.
(20) The word I will use to describe these and similar technologies, ranging from artificially augmented human bodies and human-machine systems to artificial intelligences, both real and hypothetical, is “cyborg.”
(21) Turing thus predicted the emergence of a language of intelligent machines that I will call “cyborg discourse.” This discourse is primarily concerned with the psychological and cultural changes in self-imagining brought on by the computer metaphor. . . . It is both an account and an expression of the view that the computer is an “object to think with,” in Turkle's phrase. . . . While closed-world discourse is built around the computer's capacities as a tool of analysis and control, cyborg discourse focuses on the computer's mind-like character, its generation of self-understanding through metaphor.
(21) Cyborg discourse is the field of techniques, language, and practice in which minds are constructed as natural-technical objects (in Donna Haraway's phrase) through the metaphor of computing.
(22) Cyborg discourse is also political, though the politics in question are more often socio-cultural than governmental.

Scene 3: Cyborgs in the Closed World
(22) The closed world of computer-controlled global hegemony and the image of the computer as a cyborg, a mind-like artifact, come together powerfully in The Terminator (1984), a relatively low-budget science-fiction/horror film directed and co-written by James Cameron.
(24) The Terminator thus blends images of a perverse, exaggerated masculine ideal—the ultimate unblinking soldier, the body-builder who treats his body as a machine—with images of computer control and robotic single-mindedness, complete with an alien subjective reality provided by the Terminator's-eye sequences.
(24-25) The subplot of The Terminator is about arming women for a new role as soldiers, outside the more traditional contexts of marriage and male protectorship. The message is also that women are the final defense against the apotheosis of high technology, militaristic masculinity represented by the Terminator—not only because they harbor connections to emotion and love, as in more traditional imagery, but because they are a source of strength, toughness, and endurance: “good soldiers.”
(26) Humans have built subjective, intelligent military machines but are reduced to a militaristic, mechanical, emotionless subjectivity in order to fend off their own products.
(26-27) Just as facts—about military computing, artificial intelligence, nuclear weapons, and powerful machines—give credibility to fiction, so do fictions—visions of centralized remote control, automated war, global oversight, and thinking machines—give credibility and coherence to the disparate elements that comprise these discourses.

Cyborg discourse of human automata took a particular trajectory under closed-world discourse through the creation of iconographies and political subject positions that persisted through the 1980s in the US.

(27) Cyborg discourse is the discourse of human automata: of cybernetic organisms for whom the human/machine boundary has been erased. Closed-world discourse represents the form of politics for such beings: a politics of the theorization and control of systems. Thus the third theme of this book is the interactive construction of facts and fictions through the creation of iconographies and political subject positions—maps of meaning, possible subjectivities, narrative frames—within the dramatic spaces of the closed world.

Tools, Metaphors, and Discourse

What Are Computers?
(27) Computers are clearly tools or machines, technical levers usefully interposed between practical problems and their solutions. But two essential features distinguish computers from all other machines. These are (a) their ability to store and execute programs that carry out conditional branching (that is, programs that are controlled by their own results to an arbitrary level of complexity), and (b) their ability to manipulate any kind of symbolic information at all, including numbers, characters, and images.
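Both distinguishing features fit in a few lines. A toy sketch (my illustration, not Edwards's): in (a) the branch taken at each step is controlled by the program's own prior result; in (b) the same machine re-manipulates the numeric results as characters. The Collatz rule is just a stand-in.

```python
# (a) Conditional branching: the next step is decided by the
# program's own prior result, to an arbitrary depth.
def collatz_trace(n):
    steps = []
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1   # branch on own result
        steps.append(n)
    return steps

# (b) Symbol manipulation: the same machine re-encodes the
# numeric trace as characters.
trace = collatz_trace(6)                   # [3, 10, 5, 16, 8, 4, 2, 1]
print(" -> ".join(str(s) for s in trace))
```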
(28) The computer's extraordinary flexibility and its special nature as an information machine make it attractive as an analogy for other complex processes less well understood. Thus the computer has also become a culturally central metaphor for control, for scientific analysis, and for the mind. . . . For heavy users [Turkle argues], the computer can become a kind of virtual reality—a domain of experience and a way of life.

Tools as Metaphors
(30) Language is a prominent element in this [Weizenbaum's term] “imaginative reconstruction.” Complex tools like computers and cars evolve complex languages for talking about their functioning, their repair, and their design.

Concepts of Discourse
(31) I intend “discourse,” by contrast, to be both broader and more neutral with respect to the truth or falsity of belief, emphasizing the constructive and productive elements of the interaction of material conditions with knowledge, politics, and society.
(32) “Discourse,” in my usage, will be neither so hermetic nor so coherent as “paradigm” has often been interpreted to be.

Pinch and Bijker social construction of technology approach.

(33) Trevor Pinch and Wiebe Bijker's research program in the social construction of technology signals the power of an analysis of technology guided first and foremost by its role in social groups.
(34) As Pinch and Bijker themselves have noted, few studies have managed fully to engage the relationship between the meanings of scientific facts or technological artifacts and their sociopolitical milieu.
(34) In the larger sense I will employ here, though, discourse goes beyond speech acts to refer to the entire field of signifying or meaningful practices: those social interactions—material, institutional, and linguistic—through which reality is interpreted and constructed for us and with which human knowledge is produced and reproduced. A discourse, then, is a way of knowledge, a background of assumptions and agreements about how reality is to be interpreted and expressed, supported by paradigmatic metaphors, techniques, and technologies and potentially embodied in social institutions.

Wittgenstein: Language-Games and Meaning as Use
(35) Once a basic vocabulary is established by training, new language can be learned by explanation. Only at this point can language begin to seem primarily representational. This is the force of Wittgenstein's slogan “meaning is use.”
(35) Language-games are profoundly public and conventional in nature.
(36) Wittgenstein's ultimate conclusion is that the process of grounding knowledge comes to an end within language-games—not in a reality external to the social world.
(36-37) Ultimately, for Wittgenstein, language-games are elements of “forms of life,” larger, more general, mutually reinforcing patterns of action, language, and logic. In Leviathan and the Air-Pump, Shapin and Schaffer offer an extended example. They use Wittgenstein's concept to describe the “experimental life” constructed by Robert Boyle and his colleagues at the Royal Society in the seventeenth century.

Foucault and the Idea of Discourse
(37) But Foucault focuses on a factor Wittgenstein generally ignores: competition among discourses, motivated by power relationships among human groups.
(38) Discourses are the Wittgensteinian forms of life which institutions and traditions structure for their inhabitants.
(38) When Foucault describes a discourse as an economy, he means that like the economy of wealth, social institutions constitute self-elaborating and above all productive systems with their own elements and logic. . . . He rejects semiotic or linguistic models because they seem to reduce knowledge to the possession of meaningful symbols, whereas knowledge is for him the result of continuous micropolitical struggles.

Foucault support as object studied and invented by surrounding discourse applied to computers.

(38) The sense of a constantly regenerated and changing discourse differentiates Foucault's concept from the more monolithic stability of Wittgenstein's “forms of life.” . . . The support is the object at once studied and invented by the discourse that surrounds it. I will use this concept to describe the role of computers in closed-world and cyborg discourses.
(39) Such figures as the electronic control center (the War Room, for example) and the cyborg soldier are supports, in this sense, for closed-world discourse.
(39) The metaphor of a discursive economy also ties the self-elaborating logic of discourse to the reality of social power. . . . People who think they are being watched tend to do what they think they are supposed to do, even when they are not.
(39-40) More simply, power determines what can count as true and false. This is the force of Foucault's concept of “power/knowledge”: true knowledge is an effect of power relationships, since power sets the norms for acceptable statements and also sets in motion the process of generating and evaluating those statements—but also itself produces power, since true knowledge enables its possessors to achieve their practical goals.
(40) Finally, the constant exchanges of language and knowledge in which a discourse is enacted actually help to constitute individual subjects and describe and mold the social body.

Discourse: Technology as Social Process
(40) A discourse, then, is a self-elaborating “heterogeneous ensemble” that combines techniques and technologies, metaphors, language, practices, and fragments of other discourses around a support or supports. It produces both power and knowledge: individual and institutional behavior, facts, logic, and the authority that reinforces it. It does this in part by continually maintaining and elaborating “supports,” developing what amounts to a discursive infrastructure. It also continually expands its own scope, occupying and integrating conceptual space in a kind of discursive imperialism. Like a paradigm, much of the knowledge generated by a discourse comes to form “common sense.”
(40-41) As applied to computers in the postwar world, my concept of discourse accepts neither the billiard ball imagery of technological “impacts” on society nor the too-frequent conspiracy imagery of technological “choices” as governed by dominant social groups. Instead it views technology as one focus of a social process in which impacts, choices, experiences, metaphors, and environments all play a part.
(41) Science and engineering normally proceed not so much by the application of well-codified methods to well-defined problems as by what Claude Levi-Strauss called bricolage, or “tinkering.” The models, metaphors, research programs, and standards of explanation that make up a scientific paradigm are assembled piece by piece from all kinds of heterogeneous material. To see science and engineering as tinkering—as discourse—is to blur and twist the sharp, neat lines often drawn between them and the knowledges and practices that constitute other human endeavors such as politics, commerce—or war.


2
Why Build Computers?: The Military Role in Computer Research

(44) For reasons we will explore below, analog computing was more easily adapted to the control applications that constituted the major uses of computers in battle. Only in retrospect does it appear obvious that command, control, and communications should be united within a single technological frame (to use Wiebe Bijker's term) centered around electronic digital computers.
(44) More often than not it was civilians, not military planners, who pushed the application of computers to military problems. Together, in the context of the Cold War, they enrolled computers as supports for the far-reaching discourse of centralized command and control—as an enabling, infrastructural technology for the closed-world political vision.

The Background: Computers in World War II
(45) One of the Allies' most pressing problems in World War II was the feeble accuracy of antiaircraft guns. . . . The problem was solved by fitting the guns with “gun directors,” a kind of electromechanical analog computer able to calculate the plane's probable future position, and “servomechanisms,” devices that controlled the guns automatically based on the gun director's output signals.
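The gun director's task reduces to a lead-prediction calculation. A minimal sketch under stated assumptions (straight-line flight, constant shell speed; the function and numbers are invented for illustration)—real directors solved this continuously in electromechanical analog hardware:

```python
import math

# Aim where the plane WILL be: find the flight time t at which
# |p + v*t| = shell_speed * t, i.e. shell and plane arrive at the
# same spot together, then return that predicted position.
def intercept_point(px, py, vx, vy, shell_speed):
    a = vx**2 + vy**2 - shell_speed**2
    b = 2 * (px * vx + py * vy)
    c = px**2 + py**2
    disc = math.sqrt(b * b - 4 * a * c)   # assumes an intercept exists
    t = min(r for r in ((-b + disc) / (2 * a), (-b - disc) / (2 * a))
            if r > 0)                      # earliest positive flight time
    return px + vx * t, py + vy * t        # predicted future position

print(intercept_point(1000.0, 2000.0, -100.0, 0.0, 800.0))
```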

Vannevar Bush: Creating an Infrastructure for Scientific Research
(46) The Moore School's 1930s collaboration with the BRL [Ballistics Research Laboratory], each building a differential analyzer under Bush's supervision, was to prove extremely important. During World War II, the two institutions would collaborate again to build the ENIAC, America's first full-scale electronic digital computer.

The ENIAC project
(50) In 1943 Moore School engineers John Mauchly and J. Presper Eckert proposed the ENIAC project. They based its digital design in part on circuitry developed in the late 1930s by John Atanasoff and Clifford Berry of Iowa State College.

Serial processing of instruction stream key component of von Neumann architecture.

(50-51) The great mathematician John von Neumann became involved with the ENIAC project in 1944, after a chance encounter with Herman Goldstine on a train platform. By the end of the war, with Eckert, Mauchly, and others, von Neumann had planned an improved computer, the EDVAC. The EDVAC was the first machine to incorporate an internal stored program, making it the first true computer in the modern sense. (The ENIAC was programmed externally, using switches and plugboards.) The plan for the EDVAC's logical design served as a model for nearly all future computer control structures—often called “von Neumann architectures”—until the 1980s.
(51 footnote 20) Goldstine distributed this plan, under von Neumann's name but unbeknownst to him, as the famous and widely read “Draft Report on the EDVAC.” Because von Neumann's name was on its cover, the misunderstanding arose that he was the report's sole author. . . . The most essential feature of the so-called von Neumann architecture is serial (one-by-one) processing of the instruction stream.
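The serial instruction stream is easy to miniaturize. A toy stored-program machine (opcodes invented for illustration, not the EDVAC's): instructions and data share one memory, and a single control loop fetches and executes them strictly one at a time.

```python
# Toy stored-program machine: one memory holds both instructions
# and data; a single loop fetches and executes instructions one
# by one -- the serial stream of the so-called von Neumann design.
def run(memory):
    acc, pc = 0, 0                     # accumulator, program counter
    while True:
        op, arg = memory[pc]           # fetch, one instruction at a time
        pc += 1
        if op == "LOAD":    acc = memory[arg]
        elif op == "ADD":   acc += memory[arg]
        elif op == "STORE": memory[arg] = acc
        elif op == "HALT":  return memory

program = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None),
           2, 3, 0]                    # data lives in the same memory
print(run(program)[6])                 # -> 5
```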

Directing Research in the Postwar Era
(52) For a number of reasons, the looming fiscal constraints never materialized. Postwar federal expenditures for R&D remained far higher than before the war, with most of the money channeled through the armed forces.

Transference and Apocalypse
(53) In the eyes of many Americans, communism replaced fascism as an absolute enemy.

American Antimilitarism and a High-Technology Strategy
(57) Antimilitarism, because it required that the number of men under arms be minimized, also helped to focus strategic planning on technological alternatives. The Strategic Air Command came to dominate U.S. strategic planning because it controlled the technological means for intercontinental nuclear war.

Support for Research and Development
(58) In his famous 1945 tract Science: The Endless Frontier, composed at President Roosevelt's request as a blueprint for postwar science and technology policy, Vannevar Bush called for a civilian-controlled National Research Foundation to preserve the government-industry-university relationship created during the war.

The Military Role in Postwar Computer Research
(61) [Kenneth] Flamm estimates that in 1950 the federal government provided between $15 and $20 million (current) per year, while industry contributed less than $5 million—20 to 25 percent of the total. The vast bulk of federal research funds at that time came from military agencies.

Consequences of Military Support
(62) First, military funding and purchases in the 1940s and 1950s enabled American computer research to proceed at a pace so ferocious as to sweep away competitors from Great Britain, the only nation then in a position to become a serious rival. . . . The Manchester University Mark I became, in June 1948, the world's first operating stored-program electronic digital computer.

Why Build Computers?
(65) The speed and complexity of high-technology warfare have generated control, communications, and information analysis demands that seem to defy the capacities of unassisted human beings.

Analog vs. Digital: Computers and Control
(66) Most modern computers perform three basic types of functions: calculation, communication, and control.

Pinch and Bijker closure not reached for digital computers during their first decade, but then they took command, as Manovich now claims has occurred with software.

(70) Clearly, in the decade following World War II digital computers were a technology at the early phase of development that Trevor Pinch and Wiebe Bijker describe as, in essence, a solution in search of a problem. The technology of digital computation had not yet achieved what they call “closure,” or a state of technical development and social acceptance in which large constituencies generally agree on its purpose, meaning, and physical form.

Computers Take Command
(71) The automation of command clearly runs counter to ancient military traditions of personal leadership, decentralized battlefield command, and experience-based authority.


3
SAGE

(75) By almost any measure—scale, expense, technical complexity, or influence on future developments—the single most important computer project of the postwar decade was MIT's Whirlwind and its offspring, the SAGE computerized air defense system.

Whirlwind and the Trek from Analog to Digital Control
(76) Exploring this transition will highlight the simultaneously technical, social, and institutional character of technological choice.

Computers for Command and Control
(79) As the conflict over funding approached a critical phase, Forrester began to cast about for a new, more urgent, and more fundamental military justification.

Compare language of self-representation of the Whirlwind project to that of Linux kernel as indexical icon by MacKenzie.

(81) From this point on, Forrester's commitment to the goal of real-time military control systems increasingly differentiated Whirlwind from other digital computer projects. . . . These commitments were realized not only in Whirlwind's technical efforts, but in the language of its self-representation.

Mutual Orientation: Constructing the Future
(82) The source of funding, the political climate, and their personal experiences oriented Forrester's group toward military applications, while the group's research eventually oriented the military toward new concepts of command and control.

Cold War Politics, Strategic Doctrine, and Air Defense
(83) Almost without debate, city bombing became the nuclear strategic policy of the new Air Force.

“Prompt Use”
(84) In essence, this was a doctrine of preemptive strike. The Air Force planned an all-out nuclear attack against the USSR in any situation where it appeared the USSR might be about to launch a strike of its own.

“A Dangerous Complacency”: Resisting Air Defense
(87) In a period of intense interservice competition, the Air Force saw the Nike-Ajax project as worse than no defense at all, because it might lead not only to a “dangerous complacency” but to Army control of continental air defense.

From Whirlwind to SAGE
(91) The timing of [George E.] Valley's encounter with Whirlwind was serendipitous.

Converting the Air Force to Air Defense
(94) These committees, led in their thinking by Valley's group, constructed a grand-scale plan for national perimeter air defense controlled by central digital computers that would automatically monitor radars on a sectoral basis. In the event of a Soviet bomber attack, they would assign interceptors to each incoming plane and coordinate the defensive response. The computers would do everything, from detecting an attack to issuing orders (in the form of flight vectors) to interceptor pilots.

Centralizing Command, Mechanizing Control
(96) After a protracted conflict, the issue was taken to Secretary of Defense Wilson, who resolved it in favor of centralized control under SAGE. In interesting contrast, the USSR's eventual air defense program favored a decentralized approach, also advocated by early Rand Corporation and Stanford Research Institute studies.

Technological and Industrial Influences of SAGE
(99) A central thesis of this book is that computer technology and closed-world discourse were mutually articulated. If this is true, closed-world politics shaped nascent computer technology, while computers supported and structured the emerging ideologies, institutions, language, and experience of closed-world politics. Nothing better illustrates this mutual influence than the history of Whirlwind and SAGE.

Technology
(100) Readers unfamiliar with computer technology may not appreciate the extreme importance of these developments to the history of computing. Suffice it to say that much-evolved versions of all of them remain in use today. Some, such as networking and graphic displays, comprise the very backbone of modern computing.

Social construction of technology illustrated in reliability, speed, and networking of military equipment.

(100) Many of Whirlwind's technical achievements bear the direct imprint of the military goals of the SAGE project and the political environment of the postwar era. As a result, despite their priority of invention, not all of these technologies ultimately entered the mainstream of computer development via Whirlwind and SAGE. Some, such as core memory, almost immediately made the transition to the commercial world. Others, such as algebraic languages, had to be reinvented for commercial use, for reasons such as military secrecy and their purpose-built character.

Industry
(101) SAGE contributed devices and ideas to commercial computer technology, increased the competitiveness of American manufacturers vis-a-vis their foreign competitors, and swayed the fortunes of individual companies.

Legacy of Foucaultian support for closed-world politics.

(103-104) Despite these many technical and corporate impacts, I would argue that the most essential legacy of SAGE consisted in its role as a support, in Michel Foucault's sense, for closed-world politics. For SAGE set the key pattern for other high-technology military enclosures for global oversight and control. It silently linked defense- and offense-oriented strategic doctrines—often portrayed as incompatible opposites—around centralized computer systems. It provided the technical underpinnings for an emerging dominance of military managers over a traditional experience- and responsibility-based authority system. At the same time, ironically, SAGE barely worked.

SAGE as Political Iconography
(104) A SAGE center was an archetypal closed-world space: enclosed and insulated, containing a world represented abstractly on a screen, rendered manageable, coherent, and rational through digital calculation and control.

Strategy and Automated Command
(107) SAGE—Air Force project 416L—became the pattern for at least twenty-five other major military command-control systems of the late 1950s and early 1960s (and, subsequently, many more). These were the so-called “Big L” systems, many built in response to the emerging threat of intercontinental ballistic missiles (ICBMs).

Conclusion
(110) The Air Force located most SAGE direction centers at SAC bases. This decision had only one possible strategic rationale: SAC intended never to need SAGE warning and interception; it would strike the Russians first.
(110) Still, in another important sense, SAGE did “work.” It worked for the research community, which used it to pursue major intellectual and technical goals. It worked as industrial policy, providing government funding for a major new industry. Perhaps most important, SAGE worked as ideology, creating an impression of active defense that assuaged some of the helplessness of nuclear fear.
(111) Seen in this light, SAGE was far more than a weapons system. It was a dream, a myth, a metaphor for total defense, a technology of closed-world discourse.


4
From Operations Research to the Electronic Battlefield

(114) This chapter traces the coevolution of computer technology, grand strategy, and closed world politics into the 1960s. We begin with the Rand Corporation, the Air Force think tank where natural scientists, social scientists, and mathematicians worked side by side to anticipate and prepare for the future of war.

Operations Research, Systems Analysis, and Game Theory at Rand
(114) New disciplines, such as operations research, cybernetics, information theory, communication theory, game theory, systems analysis, and linear programming, succeeded in devising such algorithms for difficult problems in communication and the control of highly complex systems.

Systems Analysis, Strategy, and Technology
(119) In his history of Rand, [Fred] Kaplan argues that the Rand forward-basing studies were a major goad for the U.S. nuclear buildup of the 1950s and 1960s. . . . The Rand analysts' formal models, increasingly assisted by computerized data processing, rapidly became the standard for military thinking about strategy and planning for the future “weapon systems” necessary to actualize it.

Rand and Computer Science
(121) Much of the corporation's early work in the computer field was and remains classified, but a general picture of its activities can be pieced together from its published documents, oral histories, and studies of other aspects of Rand research.

Robert McNamara, Systems Analysis, and Military Management
(126) The electorate chose Kennedy in part for his promise of an aggressive response to Rand-inspired predictions (leaked by Gaither Report commissioners) of a looming “missile gap” between the United States and the USSR. As his Secretary of Defense, Kennedy appointed the cerebral, 44-year-old Robert S. McNamara.

The Office of Systems Analysis
(127) The Pentagon systems analysis group also came to be called the “whiz kids,” after McNamara's Ford group.

Military Management: Integrating the Armed Services
(129) McNamara saw centralization of DoD decision-making as an imperative imposed not only by problems of cost, strategy, and technological choice, but by standard management practice.

Command and Control
(131) Centralized military management remained—and remains—in tension with the command tradition, which presumes nested spheres of responsibility within which detailed planning and control devolve to the lowest levels of authority. Nuclear forces, by contrast, flatten their hierarchies as much as possible and retain authority at the upper levels.

Command, control, communications and information also merge in modern computer operating systems (Chun).

(131) Computerization supported this flattening by automating many tasks and permitting rapid, accurate, and detailed central oversight. . . . A decade later, “command, control, communications, and information” (C3I) had become a single unified process.

Computers as Icons
(133) Whether the systems analysts required room-size IBM mainframes or whether they used desktop adding machines made no difference; the computer became their icon.

Mutual orientation of discourses.

(134) As with SAGE, the McNamara era reflects a discourse process of mutual orientation in which computers played a key part. Civilians, in this case data-oriented managers and economists, sought—using computers to implement the PPBS and the OSA to institutionalize systems analysis—to centralize and rationalize DoD procurements. Their discursive categories—systems, options, data, flexibility, limited war—required the development of program choices that linked strategy with technology and cost. The military, in response to these essentially managerial requirements as well as to rapidly evolving technology, constructed strategic options that depended upon increased centralization of command. The discursive categories of command and control, in turn, motivated the increasing sweep of automated, computerized command systems.

Vietnam
(134) With Sputnik and the space race came a new iconography of global closure: an Earth floating in the black void, encircled by orbiting spacecraft.

Computers and the “Production Model of War”
(137) To understand the role of computers and closed-world discourse in the Vietnam War, it is necessary to understand the war's enormous scale.

Gibson technowar.

(138-139) In his meticulous study of Vietnam, The Perfect War, James William Gibson argues that the institutions responsible for the war conceived the problem it was supposed to solve in the mechanistic terms of physical science. Metaphors of falling dominoes, popping corks, and chain reactions were used to describe the diplomatic situation. Communist governments and armies were depicted as demoniac machines, conscripting their people as parts and consuming their energy; Gibson calls this imagery “mechanistic anticommunism.” The entire transaction was understood as an accounting procedure in which capitalists scored “credits” and communists “debits.” Thus its planners were managers who saw the war as a kind of industrial competition. . . . Counterinsurgency, the new technology of limited war, would allow the prosecution of “technowar” in the revolutionary jungles of the Third World.

On the Electronic Battlefield
(142) Operation Igloo White, like the SAGE system it so closely resembled, was a product of civilian scientists attempting to offer a high-technology defensive strategy to replace the unpopular “retaliatory” (but in fact offensive) bombing campaign. . . . Like the Valley Committee and the Summer Study Groups that created SAGE, the Jasons were a blue-ribbon commission of civilian scientists who spent their summers working on military problems. . . . Their idea was to draw a line of fire between North and South Vietnam that would be so intense and so deadly accurate the NVA could not cross it.

Conclusion
(144) It was an attempt to apply the logic of the closed world to war in a green-world jungle.

Zuboff Information panopticon.

(144-145) The political purpose of the electronic battlefield was to build a deadly version of what Shoshana Zuboff has called an “information panopticon.” . . . On the pseudo-panoptic battlefields of the Vietnam War, soldiers subjected to panoptic control—managed by computers—did exactly what workers in panoptic factories often do: they faked the data and overrode the sensors.
(145) Pure information, “light without heat,” would illuminate future war. . . . Political leaders could achieve the ideal of American antimilitarism: an armed force that would function instantly and mechanically, virtually replacing soldiers with machines. The globe itself would become the ultimate panopticon, with American soldiers manning its guard tower, in the final union of information technology with closed-world politics.


5
Interlude: Metaphor and the Politics of Subjectivity

(147) According to received histories, in their early days all three of these fields [cybernetics, cognitive psychology, AI] were speculative and theoretical, without much practical import. Their significance lay mainly in their perspective on human nature: they pictured minds as nested sets of information processors capable of being duplicated, in principle, in a machine. In this sense, then, they are all “cognitive” theories. In contrast, I will argue that the cyborg discourse generated by these theories was from the outset both profoundly practical and deeply linked to closed-world discourse. It described the relation of individuals, as system components and as subjects, to the political structures of the closed world.

Politics, Culture, and Representation
(148) We start with politics, which we can define as the contest among social groups for power, recognition, and the satisfaction of interests.

The Power of Metaphor

Lakoff and Johnson master tropes.

(153-154) The linguist George Lakoff, working with the philosopher Mark Johnson, has been the foremost recent exponent of the view that language and thought are essentially structured by metaphor. . . . A unique feature of their theory is that it does not picture conceptual structure as a reflective representation of external reality. Instead, it views concepts as essentially structured by human life and action, and especially by the human body in its interaction with the world.
(158) Some metaphors become entrenched so deeply that they guide and direct many other systems of description. These “master tropes” provide what amount to basic structures for thought and experience. They may also actually provide constitutive frameworks for institutions.

Computers as Metaphors
(159) The Turing test makes the linguistic capacities of the computer stand for the entire range of human thought and behavior. . . . The manipulation of written symbols by computer and human being become processes exactly analogous to, if not identical with, thought.

Computer metaphor now pervades everyday self-understanding (Turkle).

(160-161) The computer metaphor contributes to the understanding of “mind” its far greater concreteness and vastly more detailed structure. . . . For the wider world, it eventually came to constitute a cultural background whose terms—like those of psychoanalysis, as Turkle has argued—increasingly pervade the self-understanding of ordinary people.

Entailments of Computer Metaphors
(161) What are the entailments of the Turing-test metaphor THE MIND (OR BRAIN) IS A COMPUTER?

Other Metaphors for the Mind
(162-163) First, consider the classical animal-machine metaphor, ANIMALS ARE REFLEX MACHINES. . . . The REFLEX MACHINE metaphor has certain parallels with the COMPUTER metaphor, but it leads in wholly different directions. . . . The metaphor directs attention toward the external variables controlling a response rather than toward internal transformations.

Subject Positions and Cyborg Discourse
(165) The phrase “cyborg discourse,” introduced in chapter 1, captures the COMPUTER metaphor's creative potential for structuring subjectivity.

Salient patterns of cyborg subjectivity include programming styles.

(166-167) By exploring how such subcultures use the COMPUTER metaphor in articulating key cultural formations such as gender and science, we can discern some of the salient patterns in cyborg subjectivity.
(167) Turkle thus establishes a dualism at the heart of cyborg subjectivity, a “hard” self and a “soft” one. This is an evocative, problematic, and paradoxical dichotomy.

“Objects to Think With”
(169) One way to think with a computer—in their first three decades, almost the only way—is to learn its “language.” . . . Thus, if a language is a medium for thought, the kind of thinking computer languages facilitate is quite different from the reasoning processes of everyday life.

Jacky on computer languages encouraging programming styles reflecting subject positions.

(169-170) Computer scientist Jonathan Jacky has observed that each computer language tends to encourage a particular programming style, as do subcultures associated with each one. . . . The ongoing invention and spread of new computer languages is a symptom of the search not only for convenience of interaction, but for styles of thinking—subject positions—congenial to different kinds of users and their projects.
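Jacky's observation can be illustrated by writing one task in two styles; the example is mine, not his. Each style foregrounds a different way of thinking about the same computation: explicit state and control flow versus composition of whole-collection operations.

```python
# One task (sum of the squares of the even numbers), two styles.

# Imperative style: mutable state and explicit control flow.
def sum_sq_imperative(xs):
    total = 0
    for x in xs:
        if x % 2 == 0:
            total += x * x
    return total

# Functional style: a single whole-collection expression.
def sum_sq_functional(xs):
    return sum(x * x for x in xs if x % 2 == 0)

assert sum_sq_imperative(range(10)) == sum_sq_functional(range(10)) == 120
```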

Manageable complexity of microworlds.

(171) What gives the computer this “holding power,” and what makes it unique among formal systems, are the simulated worlds within the machine: what AI programmers of the 1970s began to call “microworlds,” naming computer simulations of partial, internally consistent but externally incomplete domains.
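A minimal microworld in the spirit of the 1970s blocks worlds (my sketch, not an example from the text): internally consistent—its few rules are the only physics—but externally incomplete, since nothing outside blocks and a table exists.

```python
# Blocks-world microworld: the state maps each block to whatever
# it sits on; clear() and move() are the entire physics.
world = {"A": "table", "B": "table", "C": "A"}

def clear(block):
    return block not in world.values()      # nothing rests on it

def move(block, dest):
    if clear(block) and (dest == "table" or clear(dest)):
        world[block] = dest                  # the only law of this world
        return True
    return False

move("C", "table")     # succeeds: C is clear
move("B", "C")         # succeeds: both are clear now
print(world)           # {'A': 'table', 'B': 'C', 'C': 'table'}
```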


6
The Machine in the Middle: Cybernetic Psychology and World War II

(175) New forms of technological power based on the amplification and insertion of soldiers' bodies inside electromechanical systems produced new ways of approaching the nature of the mind and led to the construction of new kinds of biopsychological explanatory spaces. These new tools and metaphors rejuvenated human experimental psychology, involving academic psychologists in practical design projects as consultants on the “machine in the middle” of complex human-machine systems.

Psychology as Power/Knowledge
(175) Psychology is the discipline that constructs and maintains the human individual as an object of scientific knowledge.

Cognitivism, Behaviorism, and Cybernetics
(179) Where behaviorism emphasized comparisons between animal and human behavior, and psychoanalysis concentrated on human social and discursive effects, cognitive psychology reconstructed both humans and animals as cybernetic machines and digital computers.

Cybernetics: The Behavior of Machines
(181) The most important element of the Rosenblueth-Wiener-Bigelow theory was the concept of “negative feedback,” or circular self-corrective cycles, in which information about the effects of an adjustment to a dynamic system is continuously returned to that system as input and controls further adjustments. In 1943 the three published the landmark article “Behavior, Purpose, and Teleology,” which emphasized comparisons between servo devices and the behavior of living organisms guided by sensory perception.
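The negative-feedback cycle is compact enough to sketch (the setpoint and constants are illustrative, not from the text): the measured effect of each adjustment is returned as input and controls the next adjustment.

```python
# Minimal negative-feedback loop: each cycle measures the effect
# of the last adjustment (the error) and corrects against it,
# so the system converges toward its goal state.
def regulate(setpoint=70.0, temp=50.0, gain=0.4, steps=12):
    for _ in range(steps):
        error = setpoint - temp        # feedback: effect returned as input
        temp += gain * error           # correction opposes the error
        print(round(temp, 2))

regulate()   # temperature converges toward the 70-degree setpoint
```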

Serres effect of heterogenous list.

(184 footnote 33) Cybernetics purposively cast itself as a metatheory, an explanation of how everything is connected to everything else. To do so it made extensive use of a “literary device” Geoffrey Bowker calls the “Serres effect,” namely the heterogeneous list.

Psycho-Engineering
(185) Bioengineered organisms would differ greatly from machines in structure and materials, but psychologically relevant (i.e., behavioral) categories would remain the same.

The Macy Conferences
(189) The group discussed the possibility of establishing a research center after the war at Princeton or MIT. Despite considerable enthusiasm on the part of Wiener and others, this never materialized. Instead, in 1946, McCulloch persuaded the Macy Foundation, via Fremont-Smith, to fund a series of interdisciplinary conferences under the eventual title “Cybernetics: Circular Causal and Feedback Mechanisms in Biological and Social Systems.” Ten conferences were held between 1946 and 1953, each involving between twenty and thirty “regular” participants and from two to five invited guests.

The First Meeting: Computers as Brains
(190) The very first presentation, by von Neumann, consisted of a description of general-purpose digital computers. . . . Lorente de No followed up von Neumann's talk with a complementary discussion of neurons as digital processing units.

Exploring the Metaphor
(191) Verbatim transcripts of the last five meetings were made and published. Since the presentations were informal and the ensuing discussions were also transcribed, these documents provide an invaluable source for analyzing the genesis of a discourse. They show disciplines in the act of merging, sciences in the process of becoming, and the role new machines can take in creating new forms of discourse.

Challenges to Computational Metaphors
(195) The competing theories of von Foerster and Kohler demonstrate by counterexample a number of facts about the cybernetics group that help to explain why alternative analogies failed to capture attention.

Vision and Tracking and Targeting

Compare to discussion of contested narratives by Hayles, who emphasizes rejected recommendations by Kubie and others.

(196) He [Kubie] conjectured that the central nervous system acts like a computer using a periodic scanning mechanism, possibly based on the brain's alpha rhythm.
(198) Like Kubie, Kluver downplayed the significance of the new metaphors by reading them as disguised or rephrased versions of traditional problems in psychological theory.

Project X: Noise, Communication, and Code

Shannon as American counterpart to Turing.

(200) In many ways Shannon was the American counterpart to Turing: a mathematician, interested in what would soon be called digital logic, whose wartime contributions involved him in cryptology.

The Chain of Command
(205-206) The listener as destination became an essentially mechanical linkage in the command circuit, carrying out the orders propagated by the electronic components and mediating among them. Human listeners in military roles were themselves conceived as X systems, natural-technical devices for decoding signals.


7
Noise, Communication, and Cognition

(209) The cybernetics community's leaders were primarily mathematicians and engineers. They succeeded in altering psychological approaches precisely because they were not psychologists. . . . Instead, cybernetic ideas filtered slowly into psychological theory, channeled through interpreters within the discipline rather than taking it by storm.

The Psycho-Acoustic Laboratory and the Problems of Noise
(210) The airplane, the tank, and the submarine were primitive examples of what would eventually be labeled “cyborgs”: biomechanical organisms made up of humans and machinery. . . . The limits of communication under noisy conditions formed, therefore, ultimate limits to their effectiveness.

The Chain of Communication
(214) War noise thus helped to constitute communication as a psychological and psychophysical problem.

Language Engineering
(217) The PAL investigated the relative intelligibility of words and phrases used as alphabetic equivalents, telephone directory names, and tactical call signs in noisy radio and telephone links.

The Systems Research Laboratory

The Postwar Era

Psychological laboratory as Latour obligatory passage point.

(220) Thus in less than four years, the Harvard Psycho-Acoustic Laboratory created a number of innovations that proved vital for the future of cognitive theory. First, its psychoacoustic studies created a background for an emerging psycholinguistics. Second, it helped to construct visions of human-machine integration as problems of psychology. Third, it established the psychological laboratory as what Bruno Latour has called an “obligatory passage point” in the study of human-machine systems.

Psychoacoustics and Cognition: George A. Miller
(223) Out of this experience [at PAL] came both a doctoral thesis, The Design of Jamming Signals for Voice Communications, and a book, Transmission and Reception of Sounds under Combat Conditions, the NDRC summary technical report describing PAL's history and wartime research.

Language and Communication
(223-224) Miller's second book, Language and Communication, was the first major American text on the psychology of language and the first textbook in the field now known as psycholinguistics.

Bringing Information Theory to Psychology

Early potential of critical programming expressed by Miller.

(226) He spent the summer of 1950 at the Institute for Advanced Study, studying mathematics and computers with John von Neumann. There, he said, “I learned that anything you can do with an equation you can do with a computer, plus a lot more. So computer programs began to look like the language in which you should formulate your theory.”

A Cognitive Paradigm
(228-229) There he had his first realization, “more intuitive than rational, that human experimental psychology, theoretical linguistics, and the computer simulation of cognitive processes were all pieces from a larger whole.”
(229) At the [1956 second MIT Information Theory] symposium, Shannon and others gave papers on coding theory in engineering. Newell and Simon, fresh from the 1956 Dartmouth Summer Seminar on artificial intelligence organized by John McCarthy, presented their Logic Theory Machine. Noam Chomsky discussed an early version of transformational-generative (TG) grammar in his “Three Models of Language.” . . . Miller himself presented his theory of “chunking” from the “Magical Number Seven” paper.

Plans and the Structure of Behavior
(230) This triad [Miller, Galanter, Pribram] embodied the complex personal, institutional, and intellectual interconnections of postwar cybernetic psychology.
(230) Miller's discussions with Galanter and Pribram rapidly became a collaboration that produced the immensely influential book Plans and the Structure of Behavior. This book used a detailed computer metaphor to describe human behavior. It adopted the terms “Plan” and “Metaplan” (roughly, program and system program) to describe the structure of purposive activity. In postulating a complex internal logical structure, Miller and his colleagues abandoned the notion that observation alone could support the construction of psychological theory.
(230) The central theoretical concept of PSB was that of the TOTE unit, an acronym for “test-operate-test-exit.”
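
The TOTE unit maps directly onto a guarded loop: Test whether the current state matches the Plan, Operate if it does not, Test again, and Exit on congruity. A hedged sketch in Python (the nail-driving example follows the book's flavor of illustration, but the code and names are mine):

    # TOTE unit: Test-Operate-Test-Exit as a guarded loop.
    def tote(test, operate):
        while not test():   # Test: does the state match the Plan?
            operate()       # Operate: act to remove the incongruity
        # Exit: control leaves the unit

    # Toy Plan: drive a nail until it is flush.
    nail = {"height_mm": 5}
    tote(test=lambda: nail["height_mm"] <= 0,
         operate=lambda: nail.update(height_mm=nail["height_mm"] - 1))
    print(nail)  # {'height_mm': 0}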

The Center for Cognitive Studies
(234) Over the years it became a gathering place for those scientists who were most active in the blending of psychology, linguistics, computer modeling, philosophy, and information theory that Miller and Bruner were backing. Noam Chomsky, Nelson Goodman, Benoit Mandelbrot, Donald Norman, Jerrold Katz, Thomas Bever, Eric Lenneberg, and Joseph Weizenbaum were only a few of the dozens of visitors who spent a year or more at the Center between 1960 and 1966.

Conclusion
(236-237) The social construction of theory, the centrality of metaphor, and the interconnections among social levels point us once more toward the concept of discourse. Miller and the PAL did much more than run experiments and prove theories. They helped set the terms of cyborg discourse, the understanding of minds as information processors and information machines as potential minds. . . . Building the scaffolding of cognitive theory from computer metaphors and information processing, Miller and the PAL produced a psychology of humans as natural cyborgs. In so doing, they helped create the cyborg subject, for whom experience and knowledge are built bit by bit, from pure information.


8
Constructing Artificial Intelligence

(239) AI established a fully symmetrical relation between biological and artificial minds through its concept of “physical symbol systems.” It thus laid the final cornerstone of cyborg discourse.

From Cybernetics to AI: Symbolic Processing
(240) The founders of cybernetics were mostly mathematicians, like Wiener and Pitts, working with brain scientists, like Rosenblueth and McCulloch. While their theories of information and communication succeeded in erasing the boundary between humans and machines, allowing the construction of cyborgs, the cyberneticians themselves remained largely within the perspective of the mechanical. . . . The subject, in both senses, of cybernetics was always the embodied mind.

The Turing Machine
(242) Many of the key elements of the electronic digital computer were already present in this purely hypothetical device: binary logic (squares are either blank or marked), programs (“configuration”), physical “memory” (the tape), and basic reading and writing operations.
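
All four quoted elements fit in a few lines of Python (a sketch under my own simplifications: a dict stands in for the infinite tape, " " is the blank, and a plain transition table stands in for Turing's “configurations”):

    # A minimal Turing machine: binary marks, configurations (the table),
    # tape memory, and basic read/write/move operations.
    def run(table, tape, state="start", head=0, limit=1000):
        while state != "halt" and limit > 0:
            symbol = tape.get(head, " ")             # read the scanned square
            write, move, state = table[(state, symbol)]
            tape[head] = write                       # write
            head += 1 if move == "R" else -1         # move the head
            limit -= 1
        return tape

    # Example program: run right past a block of marks and add one more.
    table = {("start", "1"): ("1", "R", "start"),
             ("start", " "): ("1", "R", "halt")}
    print(run(table, {0: "1", 1: "1"}))  # {0: '1', 1: '1', 2: '1'}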

Symbolic Computing: Levels of Description

Nested levels from hardware electronics, digital logic, machine language, assembly language, and high-level languages, to user interfaces and operating systems.

(244) The operation of computing machinery can be described at a number of levels.
(245) Each level is conceptually independent of the ones below and above, while remaining practically dependent on the lower levels.
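
A quick way to see two of these levels side by side (my illustration, using Python's standard dis module): the same statement read as high-level algebra and as the interpreter's instruction list, each level intelligible without reference to the other.

    # One operation, two levels of description.
    import dis

    def add(a, b):
        return a + b      # high-level language: algebraic form

    dis.dis(add)          # instruction level: LOAD_FAST a, LOAD_FAST b,
                          # a binary-add opcode, RETURN_VALUE (exact opcode
                          # names vary with the interpreter version)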

Practical, contingent origins of symbolic computation in programming craft and cyborg discourse, rather than determined by theoretical concerns.

(246) An intellectual history might say, anachronistically, that those practical conditions simply developed the genesis of symbolic computing, a development that merely played out an inevitable conceptual logic. But in fact symbolic computation did not emerge mainly from theoretical concerns. Instead, its immediate sources lay in the practice of the programming craft, the concrete conditions of hardware, computer use, and institutional context, and the metaphors of “language,” “brain,” and “mind”: in other words, the discourse of the cyborg.

Writing Programs, Building Levels: Programming and Symbolic Computing
(246) Until the late 1940s all program code was written in machine language; and numerical data had to be converted from decimal to binary form before the machine could process it.

Short Code as first interpreted assembly language.

(247) In 1949 Eckert and Mauchly introduced Short Code (for the ill-fated BINAC). Employing alphanumeric equivalents for binary instructions, Short Code constituted an “assembly language” that allowed programmers to write their instructions in a form somewhat more congenial to human understanding. . . . A separate machine-language program called an “interpreter” translated Short Code programs into machine language, one line at a time. . . . But instructions still had to be entered in the exact form and order in which the machine would execute them, which frequently was not the conventional form or order for composing algebraic statements.
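
A toy line-at-a-time interpreter in the spirit of this description (the mnemonics are my invention; the real Short Code used two-character codes): each line is translated and executed strictly in the order the programmer wrote it.

    # Line-at-a-time interpretation: translate one instruction, execute it,
    # move to the next -- exactly in the order written, as with Short Code.
    def interpret(program):
        env = {}
        for line in program:
            op, *args = line.split()
            if op == "SET":                          # SET x 2
                env[args[0]] = float(args[1])
            elif op == "ADD":                        # ADD x y z  means  z = x + y
                env[args[2]] = env[args[0]] + env[args[1]]
            elif op == "OUT":                        # OUT z
                print(env[args[0]])
        return env

    interpret(["SET a 2", "SET b 3", "ADD a b c", "OUT c"])  # prints 5.0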

Earliest programmers were mathematicians and engineers who were close to the hardware; higher-level languages and compilers were needed for nonexperts, requiring more memory and machine time to support their execution.

(247) The earliest programmers were primarily mathematicians and engineers who not only programmed but also designed logic structures and/or built hardware. . . . Short, efficient algorithms and highly mathematical code ruled this culture's values. . . . Somewhat ironically, perhaps, programming a computer became a kind of art form.
(247-248) But in order for nonexperts to write computer programs, some representation was required that looked much more like ordinary mathematical language. Higher-level languages, in turn, required compilers able not merely to translate statements one-for-one into machine code, as interpreters did, but to organize memory addressing, numerical representation (such as fixed- or floating-point), and the order of execution of instructions. . . . Such programs required what then amounted to exorbitant quantities of memory and machine time.
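
The contrast with a compiler can be sketched in a few lines (again my toy, not a historical system): instead of translating statements one-for-one, it takes an algebraic expression in conventional nested form and itself decides the order of the machine-level instructions.

    # A toy compiler pass: a nested algebraic expression goes in; a flat,
    # correctly ordered instruction list for a stack machine comes out.
    def compile_expr(expr):
        if isinstance(expr, tuple):                  # ("op", left, right)
            op, left, right = expr
            return compile_expr(left) + compile_expr(right) + [("APPLY", op)]
        return [("PUSH", expr)]                      # constant or variable name

    # c = (a + b) * 2, written algebraically, emitted in execution order:
    print(compile_expr(("*", ("+", "a", "b"), 2)))
    # [('PUSH', 'a'), ('PUSH', 'b'), ('APPLY', '+'), ('PUSH', 2), ('APPLY', '*')]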

Change in what was considered programming evident in compilers as automatic programming; strangely no mention of Hopper.

(249) In an attempt to kick-start the field, the Office of Naval Research sponsored symposia on “automatic programming” in 1954 and 1956. The name “automatic programming” for compilers is itself revealing. “Programming” still referred to the composition of the machine-language instruction list. Algorithms written in a higher-level language were not yet seen as actual programs, but rather as directions to the compiler to compose a program; compilers thus performed “automatic programming.” The independence of symbolic levels in computing had not yet achieved the axiomatic status it later acquired.

Intelligence as Software
(250) Another epiphany for Newell was a November 1954 Rand presentation by Oliver Selfridge, who was then working at Lincoln Laboratories. . . . At Lincoln, Selfridge was working on the computerized pattern-recognizing system that later evolved into the highly influential “Pandemonium,” a device that recognized letter forms and simple shapes.

The Dartmouth Conference
(252) In 1956, the Summer Research Project on Artificial Intelligence met for two months at Dartmouth College. The Dartmouth conference is generally recognized as the conceptual birthplace of AI (though Newell and Simon's work in fact predates it).

Very ambitious goals by McCarthy for an artificial language for the two-month conference.

(253) McCarthy's own goal for the summer was “to attempt to construct an artificial language which a computer can be programmed to use on problems requiring conjecture and self-reference. It should correspond to English in the sense that short English statements about the given subject matter should have short correspondents in the language and so should short arguments or conjectural arguments. I hope to try to formulate a language having these properties and in addition to contain the notions of physical object, event, etc.”

Rejecting cybernetic, embodied brain models enshrined an Enlightenment-like, closed mind.

(255-256) In rejecting the cybernetic brain models and learning machines, AI also rejected a model of mind as inherently embodied. . . . In its Enlightenment-like proposals for an encyclopedic machine, AI sought to enclose and reproduce the world within the horizon and the language of systems and information. Disembodied AI, cyborg intelligence as formal model, thus constructed minds as miniature closed worlds, nested within and abstractly mirroring the larger world outside.

Time-Sharing: Linking AI to Command and Control
(256) At this point, through a rather convoluted series of historical connections, AI intersected with another practical concern of programmers: the availability of computer time.

Like Netflix network usage, symbolic processing as a resource hog.

(257-258) By 1960, courses in computing, thesis work, data reduction, and numerical processing devoured most of the available computer time. Yet symbolic processing—such as virtually all the work McCarthy, Minsky, and their AI group wanted to do—required an increasingly large proportion (in 1960, 28 percent of total research computing time). . . . By early 1961 an MIT report on computer capacity . . . recommended acquiring an “extremely fast” computer with a “very large core memory” and a time-sharing operating system as the solution to the time bottleneck.

McCarthy's thinking aids as subjective environment required interactive time-sharing, shifting the social structure as well, away from the batch-processing priesthood toward a personal, private encounter.

(258-259) McCarthy needed time-sharing to provide the right subjective environment for AI work. . . . McCarthy and many of his coworkers wanted not simply to employ but to interact with computers, to use them as “thinking aids” (in their phrase), cyborg partners, second selves. They wanted a new subjective space that included the machine.
(259) Through McCarthy's work, AI became linked with a major change not only in computer equipment but in the basic social structure and the subjective environment of computer work.

The Advanced Research Projects Agency
(260) In Licklider and his Information Processing Techniques Office, the closed-world military goals of decision support and computerized command and control found a unique relationship to the cyborg discourses of cognitive psychology and artificial intelligence.

J.C.R. Licklider
(263) The early 1950s were a period of profound intellectual excitement in these laboratories. In the years following Licklider's arrival, the list of his RLE and EE Department colleagues reads like an almost deliberate merger of cyberneticians, Macy Conference veterans, Psycho-Acoustic Laboratory staff, and AI researchers.

“Man-Computer Symbiosis”
(264) Licklider proposed a more integrated arrangement to which computers would contribute speed and accuracy, while men (sic) would provide flexibility and intuition, “programming themselves contingently,” as he put it in a perfect cyborg metaphor.

Licklider's “Man-Computer Symbiosis” as a key text in philosophy of computing.

(266) “Man-Computer Symbiosis” rapidly achieved the kind of status as a unifying reference point in computer science (and especially in AI) that Plans and the Structure of Behavior, published in the same year, would attain in psychology. It became the universally cited founding articulation of the movement to establish a time-sharing, interactive computing regime.

The Information Processing Techniques Office
(267-268) [Jack] Ruina assigned Licklider responsibility not only for IPTO, but also for ARPA's Behavioral Sciences program. . . . Thus a range of existing connections both broad and deep carried Licklider into ARPA as an advocate of interactive computing, time-sharing systems, cognitive simulation, and artificial intelligence.

Conclusion: The Closed World and the Cyborg
(272) Thus, however basic and benign it may have seemed to the researchers working on individual projects, the process of producing knowledge of (and for) cyborgs was militarized by the practical goals and concrete conditions of its formation.

Heterogeneous discourse around Foucaultian support of electronic digital computer rather than deliberate plan.

(272) Academic psychologists and computer scientists generally did not understand the major role they played in orienting the military toward automation, “symbiosis,” and artificial intelligence as practical solutions to military problems. . . . They could do so precisely because for the most part there was no scheme, in the sense of some deliberate plan or overarching vision. Instead, this larger pattern took the form of a discourse, a heterogeneous, variously linked ensemble of metaphors, practices, institutions, and technologies, elaborated over time according to an internal logic and organized around the Foucaultian support of the electronic digital computer.
(273) Ultimately, closed-world discourse represents the political logic of the cyborg. Seen against its backdrop, military support for cognitive research and artificial intelligence is part of the practical future of military power. The closed world, with its mathematical models, tactical simulations, and electronic battlefields, represents the form of politics and war for brains seen as computers and minds conceived as information processors.
(273) Cyborg discourse is the political subject position, the “psychologic,” of the closed world's inhabitants. Artificial intelligence, man-computer symbiosis, and human information processing represent the reductions necessary to integrate humans fully into command and control. The cybernetic organism, with its General Problem Solvers, expert systems, and interactive real-time computing, is the form of a mind in symbiotic relationship with the information machines of the electronic battlefield.


9
Computers and Politics in Cold War II

(275) In the early 1980s, discourses of the closed world and the cyborg found their apotheosis. . . . The most controversial military program of the period, the Strategic Defense Initiative, relied to an unprecedented degree on centralized computer control, while its rhetoric employed extraordinary closed-world iconography. The related Strategic Computing Initiative sought to fulfill the promise of military artificial intelligence in autonomous weapons and battle management systems, putting cyborg theory into practice.

The Era of Détente
(276) Nixon and Kissinger strove to create an image of themselves as stern but cooperative negotiators ready for peaceful coexistence with the Russians.

Cold War Redux
(278) Carter's policy appointees, drawn from a wide ideological spectrum, also included a number of certified Cold Warriors, especially his national security advisor Zbigniew Brzezinski.

Computers and War in the 1980s

The Return of Military-Led R&D
(283) As figure 9.1 shows, the rise of détente strikingly paralleled this decline in military support for computer research. Equally remarkable, however, is the direct parallel between Cold War II and renewed federal investment in computing.

Computer Failure and Nuclear Anxiety
(284) The prominence of new peace and disarmament coalitions strongly suggests that Cold War II resurrected, at least for a large minority, powerful anxieties about nuclear holocaust. . . . Such anxieties found additional fuel in new fears of accidental war caused by computers.

The Dangers of Complexity
(286) Their very ubiquity made them increasingly problematic not only when they failed, but also when they worked normally.

Failure of Ada as universal language to address proliferation of computer languages.

(287) Similarly, software problems caused by the proliferation of computer languages had reached an acute stage. . . . Simply maintaining existing programs, in the face of rapidly changing hardware and increased interlinkage of software systems, had become a monumental task. In 1975 the DoD called for proposals for a new, universal computer language in which all future programs would be written. By 1983, the department had frozen specifications for the new language, known as Ada. As might have been expected, though, Ada's very universality proved problematic. Many within the computer science community decried the huge Ada instruction set, with its baroque, something-for-everyone character, and some questioned whether efficient compilers could ever be written.

SAGE Reborn: The Strategic Defense Initiative
(288) Such critiques had little impact on Reagan-era military planning. In March 1983 President Reagan appeared on national television to call for a Strategic Defense Initiative (SDI), a high-technology weapons program of a scope and scale matched only by the Manhattan Project. In the original version Reagan delivered, new weapons for ballistic missile defense would be based on outer space, shooting down Soviet missiles with laser beams powered by nuclear explosions.

Flawless specifications required closed system assumption.

(292) New techniques, such as artificial intelligence-based automatic programming from system specifications and automated program-proving procedures, were proposed as ways of eliminating human error and speeding the coding process, but few if any software engineers believed such methods would completely eradicate mistakes.
(292) Even if they did work, nothing could prevent errors in the specifications themselves. Every possible contingency would have to be anticipated and accounted for. The program would have to respond correctly to a virtually infinite set of possible conditions. It would, in other words, have to capture the whole world within its closed system.

The Strategic Computing Initiative
(293-294) About six months after Reagan put forth his vision, DARPA announced a five-year, $600 million program in advanced computer technology called the Strategic Computing Initiative (SCI). The SCI linked university, industry, and government research in a comprehensive program ranging from research infrastructures (e.g., computer networks and microchip foundries) to “technology base” development of high-speed microprocessors and parallel computer architectures. . . . At this writing, over ten years later, many of the Strategic Computing projects continue, under the aegis of a new High Performance Computing program.
(294) The original plan's scale was extraordinary, but it became controversial for breaking precedent in three other ways: its proposals for battlefield artificial intelligence systems, its use of military applications as research goals, and its connections with the SDI command-control problem.

Artificial Intelligence
(294) It proposed finally to cash in on DARPA's decades of investment in artificial intelligence in order to build autonomous weapons, artificial assistants, and AI systems for battle management.

Battlefield Technology
(296) By coordinating research efforts around development goals, the SCI's planners hoped to achieve a specific set of “scheduled breakthroughs” that would rapidly turn basic advances into usable military technology. The plan contained a detailed ten-year timeline, setting dates for each breakthrough and showing how results would be coordinated to produce the final products.

Strategic Computing and Star Wars
(298) The fourth and most controversial reason for debates over the SCI was the explicit connection of its AI research with the battlefield management software planned for the Strategic Defense Initiative.

MITI challenge in the fifth generation of hardware and software systems.

(298) (Like almost everyone else, DARPA planners knew nothing of the SDI until Reagan announced it on television.) Instead, they were responding largely to the Fifth Generation Computer program launched by the Japanese Ministry of International Trade and Industry (MITI) in 1981.

Five generations of hardware and software systems: vacuum tube, transistor, IC, and VLSI, with imagined decentralized parallel architectures as the fifth, matched by machine code, assembly, symbolic, and structured languages, with imagined intelligent knowledge-based systems as the fifth.

(298 footnote 51) In a generally accepted genealogy of hardware and software, the first four “generations” of digital computers were constituted, in essence, by successive hardware innovations leading to order-of-magnitude reductions of scale. Vacuum tube, transistor, integrated circuit (IC), and very-large-scale integrated circuit (VLSI) technologies superseded each other at roughly ten-year intervals. Machine code, assembly languages, symbolic languages (such as FORTRAN or COBOL), and structured programming languages (Pascal) represented software innovations roughly corresponding to these stages. The fifth generation, as envisioned in the early 1980s, would involve machines built around decentralized “parallel” architectures, which execute numerous instructions simultaneously. The equivalent software innovation would be “intelligent knowledge-based systems” descended from the expert-systems branch of AI. The new generation was expected to understand natural language, “reason” like human beings within limited domains of knowledge, “see” with excellent acuity and comprehension, and command other “intelligent” processes.

Resealing the Dome: AI and the Closed World
(299-300) In terms that strikingly resemble J.C.R. Licklider's vision of “man-computer symbiosis,” the Strategic Computing planning document celebrated the imminent arrival of new computing technologies. . . . It took a further step, though, heralding the genuine artificial intelligences that would take up their place beside human soldiers, and not merely as assistants.


10
Minds, Machines, and Subjectivity in the Closed World

(303) Star Wars, the movie, graphically represented a whole set of facts about the ongoing militarization of space, the social lives of computers and robots, cultural relativism as a Turing-test problem, and contests and collaborations over identity in a world of cyborgs.

In the Theater of the Mind: Fiction and Cyborg Subjectivity
(304) Narratives and images, like the scientific theories and metaphors we discussed in chapters 5-8, can represent possible subject positions: imaginary, yet coherent and emotionally invested ways of living within a discourse.

Closed Worlds
(307) The closed world, as a dramatic archetype, is a world radically divided against itself. It is consumed, but also defined, by a total, apocalyptic conflict. The closed world is generally represented in fiction by enclosed artificial environments such as buildings, cities, underground rooms, or space stations.
(307) The archetype of the closed world is Homer's Iliad; the Trojans cannot leave their walled city, while the Greeks cannot leave off their siege.

Immanent human forces sustaining closure: rationality, political authority, technology.

(308) The architecture and ambiance of the closed world mirror the psychological and political constraints against which characters struggle. The forces that sustain closure are immanent, human in origin: rationality, political authority, and technology.

Green Worlds

Green versus closed worlds in terms of my favorite 1980s computer games: Ultima III and Castle Wolfenstein.

(310) Green-world drama contrasts with closed-world drama at every turn, as shown in table 10.1.
(311) Green-world drama has the character of a heroic quest rather than that of a siege; its archetype is the Odyssey.
(311) Green-world powers are dangerous because they exceed human understanding and control, not because they are evil.

“More Human than Human”: Second Selves
(313) Computers, computerized robots, and cyborgs became subjects: speaking, active agents.

Fictional Closed Worlds in the Early Cold War
(316) The first major popular works to treat the role of electronic machines (not only computers, but electronic communication and coding devices) in nuclear war were Sidney Lumet's Fail-Safe and Stanley Kubrick's Dr. Strangelove, both released in 1964.
(316 footnote 8) For various reasons, though, I have chosen not to consult this [critical] literature.

Fail-Safe
(316) In Fail-Safe, a highly unusual sequence of events causes the “fail-safe box” on one group of nuclear bombers accidentally to emit the command code that launches them on a mission to bomb Moscow. This command, once issued, cannot be reversed.

Dr. Strangelove
(319) The computer in Dr. Strangelove is merely a machine, not a decision-maker; blame devolves upon the war-planners whose technological powers exceed their capacity to anticipate the future. The crises in both Dr. Strangelove and Fail-Safe come about because human rational plans fail to foresee unlikely sequences of events.

2001: A Space Odyssey
(321) HAL inhabits the closed world of the spaceship Discovery, but in a different way from the human characters. Its mind and its presence fill the space; it sees and senses everything that goes on inside the ship, not only visually and aurally but through electrical contact with all the ship's systems. In a sense, the ship is HAL's body.

Colossus: The Forbin Project
(326) The ultra-powerful, ultra-rational machines built to protect humanity end up dominating it instead; the price of eternal peace is slavery. Ironically, Colossus achieves this in part by cooperating with its Soviet counterpart, something the humans failed to do for themselves.

Cyborg Subjectivity in the 1980s
(327) By the early 1980s, the simplistic computers-out-of-control narratives that dominated the 1960s and 1970s were replaced by a more sophisticated awareness not only of the machines themselves, but of the cultural networks and identities that had arisen around them: hackers, video gamesters, and teenage whiz kids. This had much to do with the arrival of home or “personal” computers, which provided a new opportunity for hands-on exposure to the machines.

War Games
(330) Disembodied AI turns to fearful foe, but then turns back again, rehabilitated by the touch of innocents (the teenagers David and Jennifer). This pattern of cyborg rehabilitation through communion with caring human beings recurred frequently in the 1980s, as computers were transformed from alienating instruments of corporate and government power to familiar tools of entertainment and communication.

Tron
(331) For Tron provided a breathtaking glimpse into the surrogate sensory world inside the computer, which would soon after come to be called “cyberspace” or “virtual reality.”

Cyborgs in the Green World: The Star Wars Trilogy
(332) It illustrates the contrast with closed-world fictions, but its importance in the present context extends, as well, to striking closed-world imagery, interpretations of machine subjectivity in the green world, and the rehabilitation of cyborg figures in its final moments.

Star Wars as Green-World Drama
(334) The Death Star, no mere vehicle but a planetoid or moon, is a kind of ultimate closed-world image: an inverted world, a world turned inside out, stripped of natural elements, and carrying all its life inside it.

Machine Subjectivity in the Green World
(336) Star Wars presents an embodied machine subjectivity that is friendly, familiar, unthreatening, and personal.

Rehabilitating the Cyborg
(338) This pattern of rehabilitation and sacrifice of cyborg figures became common in mid-1980s film, especially—as with Star Wars—in sequels to originals depicting cyborgs as threatening.
(339) The extremely popular late-1980s television series Star Trek: The Next Generation connected cyborg rehabilitation directly with the increasing multicultural awareness of the 1980s.

Multiculturalism as Turing test escapes the technical orbit in which philosophers of computing work by default; compare the crossing of a perceptual threshold, at which nonthreatening computers became ubiquitous, to Turkle's robotic moment and to others who notice its passing: for it happened, we are past it, and need to catch up by learning programming again.

(339) Commander Data's sometimes comic efforts to comprehend the human condition, particularly the emotions and intuitive thought he lacks, stand beside the human crew's efforts to comprehend the thoughts and feelings of alien cultures. As an android, Data simply presents one more problem in cultural relativism: here the cyborg Other, fully rehabilitated, is truly reduced to the status of merely another. Star Trek: The Next Generation's implicit equation between the problems of machine intelligence and those of encounters with alien cultures makes multiculturalism itself a kind of Turing test.
(339) What explains the cyborg rehabilitation of the 1980s, at the very height of the Reagan Cold War? Star Trek: The Next Generation suggests that a sort of perceptual threshold had been crossed in popular culture. Ubiquitous, easily operated computers were less frightening and less representative of government and corporate power.

Conclusion: Recombinant Theater in Blade Runner and Neuromancer
(341) Transcendence is impossible in the closed world because, like a curved Einsteinian universe, it has no outside. Whatever begins to exceed it is continually, voraciously re-incorporated.

Recombinant cyborg subjectivity as only rebellion in closed world; Haraway trickster.

(341) Recombination—to appropriate a 1980s biotechnological information metaphor—is the only effective possibility for rebellion in the closed world: taking the practices of disassembly, simulation, and engineering into one's own hands. Coming to see oneself as a cyborg, fitting oneself into the interstices of the closed world, one might open a kind of marginal position as a constructed, narrated, fragmented subjectivity, capable of constant breakdown and reassembly.
(342) Subverting the closed world by an interstitial engagement rather than a green-world transcendence, the cyborg becomes an always partial, self-transforming outlaw/trickster living on the margins of panoptic power by crisscrossing the borders of cyberspace.

Blade Runner
(343) The Voigt-Kampff test is a kind of Turing test for a post-Turing world, where people cease to define themselves as Aristotelian rational animals—as Sherry Turkle has observed of contemporary computer cultures—and become, instead, emotional machines.

Neuromancer
(346) William Gibson's Neuromancer became the vastly popular flagship work of “cyberpunk” science fiction, a new subgenre that linked computerization, mass-media artificial experience, biotechnology, and multiculturalism to a dark political future of massive urbanization and militarized corporate hegemony.
(349) If everything is optional and recombinant in Neuromancer—the body, experience, culture, reality itself—everything is also for sale.
(349) The new matrix-mind, self-assembled in a recombinatory genetics of information machines from the technological resources of the network, is pure technology, pure and purely disembodied AI.
(350) In closed-world discourse, as Haraway puts it, the “organic, hierarchical dualisms ordering discourse in 'the West' since Aristotle . . . have been cannibalized, or . . . 'techno-digested.'” . . . The energy product of that techno-digestion, however, powered the construction of a closed world, whose vast scale would allow neither escape nor transcendence.

Are there any green-world discourses to inhabit, or has techno-digestion consumed and reconstituted everything as closed world?

(350) The transcendent organic holism of pure green-world discourse has become very nearly impossible to inhabit. Only its vestiges survive. We might name animistic religions, feminist witchcraft, certain Green political parties, and the deep ecology movement as some of these vestigial locations. . . . The real green world has been largely contained within the closed world, trapped inside the boundaries of land-island national parks, the systems disciplines of ecology and genetic engineering, and the global-management aspirations of the Club of Rome and its successors.

Political subject position of recombinant cyborg as possible habitation, in the sense of sojourning rather than dwelling, inside the closed world; compare to Berry on good streams.

(350-351) Thus the possibility that remains, the only possibility for genuine self-determination, is the political position of the cyborg. The subjectivity of recombination encourages, even demands, troubled reconstructions of traditional relationships among rationality, intelligence, emotion, gender, and embodiment. It accepts, in the most fundamental of ways, the existence of a space of interaction between human minds and computerized Others. It provides an interstitial, marginal, unholy, and unsanctioned subjectivity, not a blessing or a God's-eye vision. It offers no escape, no redemption, no unity or wholeness. But the recombinant cyborg mind, riding the flow of information, can cross and recross the neon landscapes of cyberspace, where truth has become virtuality, and can find a habitation, if not a home, inside the closed world.


Epilogue: Cyborgs in the World Wide Web
(353) Ironically, in one sense the Cold War's end marked the ultimate achievement of world closure: the realization of a global market economy—a complete world-system, in a favored phrase of economic historians.
(353) In the 1991 Persian Gulf War, the computerized weapons built for the Cold War were finally used to devastating effect. At the same time, another kind of information technology—Pentagon media control—filtered the war and its high-tech weapons into the homes of millions through a carefully constructed information sieve. America once again aimed to enforce the rules of a world-system economy by preventing a resource-rich state from withdrawing into autarky.

The Persian Gulf War
(354-355) In August 1990, the Iraqi army of Saddam Hussein invaded Kuwait. Soon afterward, President George Bush announced the dawn of a New World Order. Bush meant to promote a new international political and financial arrangement for interventions to restore (or maintain) peace.
(355) American audiences saw “smart” computers embodied in weapons, proto-Terminators seeking out targets and destroying them with awesome force and fully hyped “precision.”
(355) In those moments a worldwide television audience experienced the joining of cyborg subjectivity with closed-world politics.

Neural Networks and AI

Neural networks simulated on digital computers reemerge as a viable AI approach after the failure of symbolic processing.

(356-357) After just a few trials well-designed neural networks could recognize images and forms whose interpretations had eluded symbolic AI for decades. . . . A new school of programming, known variously as “connectionism” and “parallel distributed processing,” saw these new techniques as the salvation of computerized intelligence.
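
A minimal connectionist sketch (a single perceptron; my illustration, far simpler than the multilayer “parallel distributed processing” networks of the 1980s): the pattern is learned from examples in a few trials rather than programmed as symbolic rules.

    # A single trainable unit: weights are adjusted by error feedback
    # until the unit recognizes the pattern (here, logical OR).
    def train(samples, lr=0.1, epochs=10):
        w, b = [0.0, 0.0], 0.0
        for _ in range(epochs):                      # "a few trials"
            for (x1, x2), target in samples:
                out = 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
                err = target - out                   # error drives learning
                w = [w[0] + lr*err*x1, w[1] + lr*err*x2]
                b += lr * err
        return w, b

    samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    print(train(samples))  # weights and bias that compute OR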
(357) AI, once experienced in popular culture as the threat of disembodied, panoptic power, now came to represent the friendly future of the embodied, pseudo-biological machine. The paradigmatic cyborg of the 1990s was not HAL but Commander Data.

Terminator 2: Judgment Day

Rehabilitation of war cyborg for post-postmodern world.

(357) In T2, as its ad campaign called it, all the icons semiotically restructured by The Terminator—the emotional woman, the mechanical man, the white nuclear family—are systematically reconstituted for a post-Cold War, postfeminist, post-postmodern world through the rehabilitation of the war cyborg.

Human control over autonomous technology paves the way to the robotic moment.

(362) Terminator 2 seems to suggest that we are waking up from a nightmare run by cyborg Others to a world in which we control them. . . . The figures of the violent, unfeeling, but rational and devoted father-protector and the emotional, unstable, but loving mother return to form the basis of the post-Cold War social order. The cyborg-machine returns to its place as technological servant, intimating a renewed sense of human control over autonomous technology.
(363) Despite its ideological retreat, Terminator 2 remains profoundly distrustful of white men. . . . The ersatz frontiers to which it returns us are neither transcendent nor green but technological, suburban, the ordinary contexts and struggles of ordinary lives--”learning the value of human life.”

Problem for cyborg politics of disingenuous multiculturalism leading to easy relativism, with nothing besides postmodern embrace of cultural chaos to fill voids left by prior great narratives except retrenchment in nationalism, fundamentalism, and global trade.

(363) This integration points to one of the fundamental problems cyborg politics may now encounter: co-optation by a disingenuous version of multiculturalism.
(363-364) This simplistic doctrine flattens all cultural difference into two categories. Exotic differences function as the “interesting” resources for a Believe-It-Or-Not pseudo-anthropology. . . . It is all too easily transformed into the utterly mundane, and anthropology gives way to Disney World. As common places of everyday life, mundane cultural differences form the basis for an easy relativism.
(364) These ways of conceiving (and perceiving) cultural differences ignore the really pressing problems of an America that has become multicultural in fact long before it knows how to become so in values. . . . The postmodern embrace of cultural chaos seems no answer, yet the old secular authorities—science, Western liberalism, Marxism—have collapsed. In their place, increasingly, reign nationalism, fundamentalist religions, and global trade.
(365) Where the AIs, robots, and human-machines of the closed world had to negotiate the centralized, militarized power and totalizing ideologies of the Cold War, the new connectionist cyborgs will have to pay their way at the toll booths of the information superhighway. . . . Cyborgs in the World Wide Web will face the tripartite tension among the global bonds of communication and control technology; the ideological individualism of cyberspace, with its totally malleable personal identities and disposable virtual communities; and the deepening crisis of culture in an increasingly rootless world.


Edwards, Paul N. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, MA: The MIT Press, 1996. Print.