Notes for N. Katherine Hayles, My Mother Was a Computer: Digital Subjects and Literary Texts

Key concepts:

A profession of faith in being already posthuman, and an ethical trajectory toward not being as dominating as Bacon.

Related theorists: Anne Balsamo.

Prologue: Computing Kin

On the Title

Title inspired by Anne Balsamo, whose mother worked as a computer, a bit of family history Balsamo uses in her study of the gender implications of IT.

(1) [Anne] Balsamo's mother actually did work as a computer, and she uses this bit of family history to launch a meditation on the gender implications of information technologies. . . . The sentence stands, therefore, as a synecdoche for the panoply of issues raised by the relation of Homo sapiens to Robo sapiens, humans to intelligent machines.

For electronic literary projects like symposia it is all about different versions of the posthuman.

(2) In the twenty-first century, the debates are likely to center not so much on the tension between the liberal humanist tradition and the posthuman but on different versions of the posthuman as they continue to evolve in conjunction with intelligent machines.
(2-3) Although I have not abandoned my commitment to the importance of embodiment, it seems to me that contemporary conditions call increasingly for understandings that go beyond a binary view to more nuanced analyses. . . .
Writing Machines, How We Became Posthuman, and this book form a trilogy that arcs from the mid-twentieth century to the present, a trajectory that moves from a binary opposition between embodiment and information through an engagement with the materiality of literary texts to a broadening and deepening of these ideas into computation and textuality.
(3) Materiality, as I defined it in Writing Machines, is an emergent property created through dynamic interactions between physical characteristics and signifying strategies. . . . This view of materiality goes hand in hand with what I call the Computational Universe, that is, the claim that the universe is generated through computational processes running on a vast computational mechanism underlying all of physical reality. . . . In this context, “My mother was a computer” can be understood as alluding to the displacement of Mother Nature by the Universal Computer.

Including sound studies.

(4) Rather, I am interested in the complex dynamics through which the Computational Universe works simultaneously as means and metaphor in technical and artistic practices, producing and also produced by recursive loops that entangle with one another and with the diverse meanings of computation as technology, ontology, and cultural icon. This dynamic interplay of recursive, multiple causalities becomes, I argue, the fertile ground for re-envisioning and remaking a wide variety of cultural artifacts, including computer simulations and literary texts.

Kittler: subvocalization and the mother's voice consummate the print era; but why stop here with beeps and clicks when we have text-to-speech of foreign languages and new possibilities for reading even our natural languages?

(4) reading functions as “hallucinating a meaning between the letters and lines.” . . . these practices gave 'voice' to print text, particularly novels – and the voice most people heard was the same voice that taught them to read, namely, the mother's, which in turn was identified with Mother Nature and a sympathetic resonance between the natural world and human meaning. . . . To an extent, then, the mother's voice that haunted reading has been supplanted by another set of stimuli: the visual, audio, kinesthetic, and haptic cues emanating from the computer. If the mother's voice was the link connecting subjectivity with writing, humans with natural environments, then the computer's beeps, clicks, and tones are the links connecting contemporary subjectivities to electronic environments, humans to the Computational Universe.
(5) Mystifying the computer's actual operation, anthropomorphic projection creates a cultural Imaginary in which digital subjects are understood as autonomous creatures imbued with human-like motives, goals, and strategies. This projection also has a reverse undertow, for it brings into question the extent to which human beings can be understood as computer programs.
(5) What resources do we have to understand the world around us? As Nicholas Gessler, among others, has pointed out, these resources can be grouped into three broad categories: mathematical equations, simulation modeling, and discursive explanations.

Ocularcentric literary letter versus oral; old narrative like semaphoric telegraphy compared to simulation modeling in Sterne.

(6) Narrative is much older than simulation modeling in artificial media – almost as old, many anthropologists believe, as the human species itself. . . . Although they can be rendered in visual forms . . . these appearances are generated through algorithms that operate first and foremost with numerical quantities. Because computers are much better equipped than human minds to carry out the staggeringly tedious calculations involved in creating simulations, simulations are closely associated with artificial intelligence and with postbiological subjectivities.
(6-7) The traffic between language and code that this similarity/opposition sets up is one of the principal ways in which digital subjects and literary texts are interrogated and articulated together in this book. The two other modalities highlighted here are the interpenetration of print and electronic text, and the dialectic between analog and digital representations. . . . Making, storing, and transmitting can be thought of as modalities related to information; they also help to constitute the bodies and subjects of texts. . . . As an embodied art form, literature registers the impact of information in its materiality, in the ways in which its physical characteristics are mobilized as resources to create meaning.

Completely misses text-to-speech simulation for this discussion of intermediation (see page 201); compare to Barthes's grain as embodiment effect.

(7) This entanglement of the bodies of texts and digital subjects is one manifestation of what I call “intermediation,” that is, complex transactions between bodies and texts as well as between different forms of media.

Method and Scope
(8) From a systematic comparison of Saussure's semiotics, Derrida's grammatology, and programming languages, implications emerge that reveal the inadequacy of traditional ideas of signification for understanding the operations of code. . . . The point is not simply to jettison the worldviews of speech and writing – even if by some miraculous fiat this were possible – but rather to understand the processes of intermediation by which they are in active interplay with the worldview of code.

Great perspective for symposia project as dynamic assemblage of texts in embodied VR framework.

(9) This chapter [4] argues that taking materiality seriously requires different models of subjectivity than those usually assumed in textual editing, as well as changed concepts of how embodied texts relate to one another. . . . we should conceptualize texts as clustered in assemblages whose dynamics emerge from all the texts participating in the cluster, without privileging one text as more “original” than any other.
(11) In a recursive loop appropriate for intermediation, the epilogue cycles back to reconsider in a new light the issues raised in the first chapter, rethinking computation and embodiment not as opposed visions of a posthuman future but as intermediating modalities, both of which are crucial to the human world in the present.
(11) exploring how computer simulations may be related to human subjectivity and consciousness. . . . Needed are new theoretical frameworks for understanding the relation of language and code; new strategies for making, reading, and interpreting texts; new modes of thinking about the material instantiation of texts in different media; and new ways to put together scientific research with cultural and literary theory.


1 MAKING
Language and Code

A division between human and machine already provides four perspectives on making: machine language, machine code, human language, human code. Not all, of course, are worth studying; still, a quick survey of the enumerated combinations gives possibly very fruitful philosophical digressions a chance to be tested.

1 Intermediation
Textuality and the Regime of Computation
Language and Code

Is there really no human code equivalent? I thought she makes the very point that disciplinary specialisms develop codes that form the contours of their discourses.

(15) Among the differences are the multiple addressees of code (which include intelligent machines as well as humans), the development of code by relatively small groups of technical specialists, and the tight integration of code into commercial product cycles and, consequently, into capitalist economics. . . . If, as Stephen Wolfram, Edward Fredkin, and Harold Morowitz maintain, the universe is fundamentally computational, code is elevated to the lingua franca not only of computers but of all physical reality.

Language plus code.

(15-16) The scholarship on human language is, of course, immense, and a smaller but still enormous quantity of research exists on programming languages. To date, however, criticism exploring feedback loops that connect the two kinds of language has been minimal. . . . Language alone is no longer the distinctive characteristic of technologically developed societies; rather, it is language plus code.

A great take on the timeline from orality to literacy and a characterization of the beyond as digital computer code, shortened to code, along with her key theorists; this paragraph is a good model for establishing approach and methodology.

(16-17) In the next chapter, I consider the three principal discourse systems of speech, writing, and digital computer code. To focus my discussion, I choose as representative of speech the semiotic theories of Ferdinand de Saussure, and of writing, the early texts of Jacques Derrida, especially Of Grammatology, Positions, Writing and Difference, and Margins of Philosophy, work where he discusses the language system as theorized by Saussure and contrasts it with his theory of grammatology. My remarks on code are drawn from a number of sources; particularly important is the work of Stephen Wolfram, Edward Fredkin, Harold Morowitz, Ellen Ullman, Matthew Fuller, Matthew Kirschenbaum, and Bruce Eckel. . . . From the comparison of these worldviews emerges a series of tensions and problematics that will form the basis for the arenas of interaction – making, storing, and transmitting – central to contemporary creative practices in scientific simulations, digital arts, electronic literature, and print literature.

The Regime of Computation
(17) Perhaps no philosopher in the last century has been more successful than Jacques Derrida in exposing the impossible yearning at the heart of metaphysics for the “transcendental signified,” the manifestation of Being so potent it needs no signifier to verify its authenticity.

She never defines computing; like many, she signals her position by alluding to the Universal Turing machine.

(18) Alan Turing gave a formalist definition of computation in his famous 1936 article describing an abstract computer known as the Turing machine, the most general version of which is called the Universal Turing machine. The Universal Turing machine, as its name implies, can perform any computation that any computer can do, including computing the algorithm that constitutes itself.
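
To keep her undefined "computation" concrete for my own purposes, a minimal C++ sketch (the machine, states, and symbols are my own illustration, not from Hayles): the simulator is one fixed program, and any particular machine is just data handed to it, which is the sense in which the Universal Turing machine can run any computation whatsoever.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <utility>

// One general simulator ("universal" in miniature): the particular machine
// is pure data -- a transition table -- handed to the same fixed program.
struct Rule { std::string next; char write; int move; };   // move: -1 left, +1 right
using Table = std::map<std::pair<std::string, char>, Rule>;

std::string run(const Table& delta, std::map<int, char> tape, int head, std::string state) {
    while (state != "halt") {
        char sym = tape.count(head) ? tape[head] : '_';    // '_' = blank cell
        auto it = delta.find({state, sym});
        if (it == delta.end()) break;                      // no applicable rule: stop
        tape[head] = it->second.write;
        head += it->second.move;
        state = it->second.next;
    }
    std::string out;
    for (const auto& cell : tape) out += cell.second;      // read off the tape
    return out;
}

int main() {
    // Illustrative machine: binary increment, head starting on the rightmost bit.
    Table delta = {
        {{"inc", '1'}, {"inc",  '0', -1}},   // carry propagates left
        {{"inc", '0'}, {"halt", '1',  0}},   // absorb the carry
        {{"inc", '_'}, {"halt", '1',  0}},   // overflow into a fresh cell
    };
    std::map<int, char> tape = {{0,'1'}, {1,'0'}, {2,'1'}, {3,'1'}};  // 1011 = 11
    std::cout << run(delta, tape, 3, "inc") << "\n";                  // prints 1100 = 12
}
```
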
(18) The wide-reaching claims made for the Regime of Computation are displayed in Stephen Wolfram's A New Kind of Science.
(19) Wolfram's slide from regarding his simulations as models to thinking of them as computations that actually generate reality can be tracked at several places in his massive text.
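
Wolfram's evidence comes largely from elementary cellular automata; Rule 110, later proven Turing-complete, is the emblematic case. A minimal sketch of the update rule (my own illustration, not code from A New Kind of Science): the "rule" is just an 8-bit lookup table indexed by each cell's three-cell neighborhood.

```cpp
#include <iostream>
#include <vector>

// Elementary cellular automaton of the kind Wolfram studies: the rule
// number's bits form a lookup table over the 8 possible neighborhoods.
std::vector<int> step(const std::vector<int>& row, int rule) {
    std::vector<int> next(row.size(), 0);
    for (size_t i = 1; i + 1 < row.size(); ++i) {
        int idx = (row[i-1] << 2) | (row[i] << 1) | row[i+1];  // neighborhood as 0..7
        next[i] = (rule >> idx) & 1;                           // read that bit of the rule
    }
    return next;
}

int main() {
    std::vector<int> row(64, 0);
    row[62] = 1;                              // single seed cell
    for (int t = 0; t < 20; ++t) {
        for (int c : row) std::cout << (c ? '#' : '.');
        std::cout << '\n';
        row = step(row, 110);                 // Rule 110: proven Turing-complete
    }
}
```
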

Ambivalent about committing fully to programming on account of shifting focus to the implications of being situated in a cultural moment when the question whether to code arises, she places her faith in academic writing; this is followed up by O'Gorman, Bogost, and others who call for new forms of humanities activity.

(20) Rather than attempting to argue one side or the other of this controversial issue, I explore the implications of what it means to be situated at a cultural moment when the question remains undecidable – a moment, that is, when computation as means and as metaphor are inextricably entwined as a generative cultural dynamic.
(20) In The Second Self and Life on the Screen, Sherry Turkle shows compellingly that such feedback loops can dramatically influence how people perceive themselves and others.

This long stretch for an example hints at Kittler even though she formally argues against his reductivist interpretation of military-directed technological determinism; the world is full of the real effects of the Regime of Computation all over the present US built environment.

(20-21) A second kind of feedback loop emerges when belief in the Computational Universe is linked with an imagined future through which anticipation cycles to affect the present. A striking example of the Regime of Computation's ability to have real effects in the world, whatever its status in relation to the underlying physics, is the initiative to reorganize the U.S. military to pursue “network-centric warfare.”
(21-22) Anticipating a future in which code (a synecdoche for information) has become so fundamental that it may be regarded as ontological, these transformations take the computational vision and feed it back into the present to reorganize resources, institutional structures, and military capabilities.

Her contrast of Egan novels with Zizek illustrates extreme poles of the source of inspiration in science fiction.

(22) We will return to the entanglement of means and metaphor in chapter 9 through an analysis of Greg Egan's “subjective cosmology” trilogy: Quarantine, Permutation City, and Distress. . . . The novels will be contrasted with Slavoj Zizek's analysis of the symptom, which presupposes that such phantasmatic imaginations function as symptoms pointing toward repressed trauma and underlying psychopathologies.
(22) The third way in which the Computational Universe functions indeterminably as means and metaphor is through its current status as a provocative hypothesis that has neither won wide consensus nor been definitively disproved.
(23) Fredkin argues that the discrete nature of elementary particles indicates that the universe is discrete rather than continuous, digital rather than analog. . . . the universe is digital all the way down and, moreover, can be understood as software running on an unfathomable universal digital computer.
(25) At best, then, the claims of Fredkin and Wolfram are incomplete, especially with regard to the emergence of higher-order from lower-order complexity. This is where the views of Harold Morowitz and like-minded researchers become important, for they offer a way to think about emergence as a process that not only operates at a single level of complexity but also continues up a dynamical hierarchy of linked systems to produce complexities out of complexities.

Cannot help but think of tying dynamic ontology to points made by Bogost in Alien Phenomenology.

(25) Multileveled complex systems synthesized in this way are called “dynamical hierarchies” (sometimes, significantly, “dynamic ontology”), and the complexities they generate are potentially unlimited in scope and depth.
(27) Far from needing to presume or construct a separation between the observer and the observation – a premise necessary in the classical understanding of science to ensure objectivity – the computational perspective endorsed by Morowitz, Wolfram, and Fredkin has no need of such a stipulation. These researchers can account for the presence of the observer simply by introducing a component into the simulation that performs this function.
(27) In the Regime of Computation, code is understood as the discourse system that mirrors what happens in nature and that generates nature itself.
(28) In fact, no one to date has demonstrated clearly how complex dynamics in simulations can progress more than one level.
(28-29) As noted above, one weakness of Wolfram's argument is his underestimation of the importance of analog processes, and especially of the productive interplay between analog and digital processes.

Intermediation
(31) As the worldview of code assumes comparable importance to the worldviews of speech and writing, the problematics of interaction between them grow more complex and entangled. These complex and entangled interactions are what I call “intermediation,” a term suggested by Nicholas Gessler.
(31) An important aspect of intermediation is the recursivity implicit in the coproduction and coevolution of multiple causalities. . . . Although these feedback loops evolve over time and thus have a historical trajectory that arcs from one point to another, it is important not to make the mistake of privileging any one point as the primary locus of attention, which can easily result in flattening complex interactions back into linear causal chains.

It is still true that digital media invite simulacra, though other media still exhibit self-expressive details.

(31) A case in point is the current tendency to regard the computer as the ultimate solvent that is dissolving all other media into itself. Since sound, image, text, and their associated media (such as phonography, cinema, and books) can all be converted into digital code, many commentators, including Lev Manovich and Friedrich Kittler, have claimed that there is now only one medium, the digital computer.
(32) If anything, print readers relish all the more the media-specific effects of books precisely because they no longer take them for granted and have many other media experiences with which to compare them. . . . Recognizing entangled causalities and multiple feedback loops enables us to understand how media can converge into digitality and simultaneously diverge into a robust media ecology in which new media represent and are represented in old media, in a process that Jay Bolter and Richard Grusin have called “remediation.”

Compare intermediation versus remediation to Bogost on units versus objects.

(33) “Remediation” has the disadvantage of locating the starting point for the cycles in a particular locality and medium, whereas “intermediation” is more faithful to the spirit of multiple causality in emphasizing interactions among media. In addition, “remediation” (thanks to the excellent work Grusin and Bolter have done in positioning the term) now has the specific connotation of applying to immediate/hypermediate strategies. . . . Perhaps most importantly, “intermediation” also denotes mediating interfaces connecting humans with the intelligent machines that are our collaborators in making, storing, and transmitting informational processes and objects.

Media Technologies and Embodied Subjects
(33-34) To help situate this project, let us consider the contrasting approaches of Friedrich Kittler and Mark B. N. Hansen. Kittler's strategy for escaping from the confines of humanist discourse is to focus on media rather than subjects.
(34) Strenuously resisting Kittler's coercive rhetoric, Hansen, in New Philosophy for New Media, performs a violence of his own by attempting to reduce Kittler's argument to a linear causal chain that rests solely on the truth or falsity of Shannon's information theory. . . . Hansen instances Donald MacKay's alternative approach to information which was developed contemporaneously with Shannon's. In contrast to Shannon's separation of information and meaning, MacKay develops a concept of information that places the embodied receiver at the center of his theory.
(35) In contrast to Kittler, Hansen privileges embodiment as the locus of subjectivity, reading new media through its effects on embodied users and viewers.
(35) Notwithstanding their opposed viewpoints, Hansen and Kittler share a mode of argumentation that privileges one locus of the human/machine feedback loop at the expense of the other.
(36) A similar pattern emerges from the contrast between Espen Aarseth's Cybertext and Jerome McGann's Radiant Textuality.
(38) Whereas Aarseth faces forward and reads print literature through a matrix developed in the context of computer games, McGann faces backward and reads electronic literature through a matrix developed in the context of print literature.


2 Speech, Writing, Code
Three Worldviews
The Locus of Complexity

Good support for the significance of texts and technology; in the next section she explicitly connects programming and humanities, and notes the scope of the connection is limited to Saussure and Derrida.

(39) Now that the information age is well advanced, we urgently need nuanced analyses of the overlays and discontinuities of code with the legacy systems of speech and writing, so that we can understand how processes of signification change when speech and writing are coded into binary digits.
(40) Out of many possibilities, I have chosen to focus on Ferdinand de Saussure's view of speech and Jacques Derrida's grammatological view of writing partly because these theorists take systematic approaches to their subjects that make clear the larger conceptual issues. . . . a perspective that immediately concerns programming for digital computers but also includes the metaphysical implications of the Regime of Computation. . . . This project, then, is not meant as a general comparison of code with structuralism and deconstruction but as a more narrowly focused inquiry that takes up specifically Saussure and Derrida.
(40) Derrida's remarkably supple and complex writing notwithstanding, much of his analysis derives from a characteristic of writing that would likely spring to mind if we were asked to identify the principal way in which writing differs from speech. Writing, unlike speech (before recording technologies), is not confined to the event of its making.

Advantage and limit of computational perspective (also noted by Turkle) may be enriched by philosophy of computing, such as software studies and critical code studies?

(41) Where does the complexity reside that makes code (or computers, or cellular automata) seem adequate to represent a complex world? . . . The advantages of the computational view, for those who espouse it, is that emergence can be studied as a knowable and quantifiable phenomena, freed both from the mysteries of the Logos and the complexities of discursive explanations dense with ambiguities. One might observe, of course, that these characteristics also mark the limitations of a computational perspective.
(42) Rather, for me the “juice” (as Rodney Brooks calls it) comes from the complex dynamics generated when code interacts with speech and writing, interactions that include their mundane daily transactions and also the larger stakes implicit in the conflict and cooperation of their worldviews.

Saussurean Signs and Material Matters
(42-43) Derrida marshals arguments to insist that writing, far from being derivative as Saussure claims, in fact precedes speech; “the linguistic sign implies an original writing.” . . . The productive role that constraints play in the Regime of Computation, functioning to eliminate possible choices until only a few remain, is conspicuously absent in Saussure's theory. Instead, meaning emerges and is stabilized by differential relations between signs.
(43) In contrast to the erasure of materiality in Saussure, material constraints have historically been recognized as crucial in the development of computers, from John von Neumann in the 1950s agonizing over how to dissipate heat produced by vacuum tubes to present-day concerns that the limits of miniaturization are being approached with silicon-based chips.

Materiality matters; compare the sense of material constraints and limits to formant speech synthesis. Note her example of TTL voltage thresholds is not really applicable to the state of the art implied in her previous discussion of silicon-based chips.

(43) For code, then, the assumption that the sign is arbitrary must be qualified by material constraints that limit the ranges within which signs can operate meaningfully and acquire significance. . . . In the worldview of code, materiality matters.
(44) The advantage of defining an immaterial pattern as the signifier is obvious; through this move, he dispenses with having to deal with variations in pronunciation, dialect, and so on (although he does recognize differences in inflection, a point that Johanna Drucker uses to excavate from his theory a more robust sense of the materiality of the sign).

The apparent slippage to immaterial pattern by Saussure's rectification invites comparison to simulacra and learned Latin.

(44) Rectifying voltage fluctuations could be compared to Saussure's “rectification” of actual sounds into idealized sound images.
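
Worth making the TTL comparison literal. A small sketch (the 0.8 V and 2.0 V figures are the standard TTL input thresholds; the code itself is my illustration): continuous voltage fluctuations are rectified into the idealized signs 0 and 1, with a forbidden band in between where no sign exists at all.

```cpp
#include <iostream>
#include <optional>
#include <string>

// TTL-style rectification: continuous voltages are read as the idealized
// signs 0 and 1; values in the band between thresholds are meaningless --
// the material constraint that disciplines the signifier.
std::optional<int> rectify(double volts) {
    if (volts <= 0.8) return 0;        // TTL logic-low input threshold
    if (volts >= 2.0) return 1;        // TTL logic-high input threshold
    return std::nullopt;               // undefined region: no sign at all
}

int main() {
    for (double v : {0.2, 0.7, 1.4, 2.6, 4.9}) {
        auto bit = rectify(v);
        std::cout << v << " V -> "
                  << (bit ? std::to_string(*bit) : std::string("undefined")) << '\n';
    }
}
```
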

Crucial theoretical move by Hayles linking Saussure semiotics to program-based computer technology, which is an ontological position itself; endnote pays attention to compiled versus interpreted signifieds.

(45) In the context of code, then, what would count as signifier and signified? Given the importance of the binary base, I suggest that the signifiers be considered as voltages – a suggestion already implicit in Friedrich Kittler's argument that ultimately everything in a digital computer reduces to changes in voltages. The signifieds are then the interpretations that other layers of code give these voltages. Programming languages operating at higher levels translate this basic mechanic level of signification into commands that more closely resemble natural language. The translation from binary code into high-level languages, and from high-level languages back into binary code, must happen every time commands are compiled or interpreted, for voltages and the bit streams formed from them are all the machine can understand. . . . Hence the different levels of code consist of interlocking chains of signifiers and signifieds, with signifieds on one level becoming signifiers on another. Because all these operations depend on the ability of the machine to recognize the difference between one and zero, Saussure's premise that differences between signs make signification possible fits well with computer architecture.
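
Her interlocking chains of signifiers and signifieds can be demonstrated directly: one fixed bit pattern (at bottom, just voltages) signifies differently depending on which layer of code interprets it. A minimal C++ sketch (the example values are mine):

```cpp
#include <cstdint>
#include <cstring>
#include <iostream>

int main() {
    // One fixed bit pattern "in the machine" -- ultimately, just voltages.
    uint32_t bits = 0x42480000;

    // Layer 1: interpreted as an unsigned integer.
    std::cout << "as integer: " << bits << '\n';          // 1112014848

    // Layer 2: the same bits interpreted as an IEEE-754 float.
    float f;
    std::memcpy(&f, &bits, sizeof f);                     // reinterpret, don't convert
    std::cout << "as float:   " << f << '\n';             // 50.0

    // Layer 3: the same bits interpreted as raw bytes
    // (their order depends on the machine's endianness).
    unsigned char b[4];
    std::memcpy(b, &bits, sizeof b);
    std::cout << "as bytes:  ";
    for (unsigned char c : b) std::cout << ' ' << int(c);
    std::cout << '\n';
    // The signified at one layer (the number 50.0) is itself a signifier
    // for the next layer up (say, a temperature in some application).
}
```
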

Derrida's Differance and the Clarity of Code
(46) In the context of digital computers, even less tenable than ambiguity is the proposition that a signifier could be meaningful without reference to a signified.

Whatever; this is a tedious process to follow.

(47) Similarly, it makes no sense to talk about floating signifiers (Lacan's adaptation of Derrida's sliding signifier) because every change in voltage must be given an unambiguous interpretation, or the program is likely not to function as intended. Moreover, changes on one level of programming code must be exactly correlated with what is happening at all the other levels.

Advantages of citability and iterability only at OO level.

(48) Nor does code allow the infinite iterability and citation that Derrida associates with inscriptions, whereby any phrase, sentence, or paragraph can be lifted from one context and embedded in another. . . . Only at the high level of object-oriented languages such as C++ does code recuperate the advantages of citability and iterability (i.e., inheritance and polymorphism, in the discourse of programming language) and in this sense become “grammatological.”
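
Since she names inheritance and polymorphism as code's recovered citability and iterability, a minimal C++ sketch (class names mine): the base class's behavior is cited wholesale into new contexts, and the same utterance, speak(), re-signifies for each object, bound late at run time.

```cpp
#include <iostream>
#include <memory>
#include <vector>

// Inheritance: Greeting's definition is "cited" inside each derived class.
// Polymorphism: the same call, speak(), is iterable across contexts and
// takes on a different meaning in each one.
struct Greeting {
    virtual ~Greeting() = default;
    virtual void speak() const { std::cout << "hello\n"; }
};

struct FormalGreeting : Greeting {
    void speak() const override { std::cout << "good evening\n"; }
};

struct CodeworkGreeting : Greeting {            // a nod to MEZ-style creole
    void speak() const override { std::cout << "h3[LL]o_\n"; }
};

int main() {
    std::vector<std::unique_ptr<Greeting>> voices;
    voices.push_back(std::make_unique<Greeting>());
    voices.push_back(std::make_unique<FormalGreeting>());
    voices.push_back(std::make_unique<CodeworkGreeting>());
    for (const auto& v : voices)
        v->speak();   // one citation, three meanings -- each bound at run time
}
```
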

Because, remember, now we are just talking about Derrida listening to typing Bach at random. Check out Ullman as an exam reading list candidate to be processed by committee members, some taking responsibility for the interface with Derrida, Bogost, Heidegger, Ong, Zizek, Turkle, and so on.

(48) Ellen Ullman, a software engineer who has been a pioneer in a field largely dominated by men, has written movingly about the different worldviews of code and natural language as they relate to ambiguity and slippage.

Hard to pin down what she means describing Ullman's experience of programmer cognition.

(49) Even as she tried to deal with the cloud of language in which these concerns were expressed, her mind raced to translate the concerns into a list of logical requirements to which her programmers could respond. Acting as the bridge arcing between the floating signifiers of natural language and the rigidity of machine code, she felt acutely the strain of trying to reconcile these two very different views of the world.

Code versus language, alluding to Cicero, enumerations of perspectives among code, language, humans, machines (Galloway).

(50) Code that runs on a machine is performative in a much stronger sense than that attributed to language. . . . In Protocol, Alexander R. Galloway makes this point forcefully when he defines code as executable language.

Here is where the abundance generated by code, like loaves and fishes, is the unimaginable surplus of matter and energy emanating from the matter and energy of the milieu (virtual phenomenological field phenomena); where philosophy crosses code it includes free, open source objects, which I am trying to formally define, bucking Bogost's dislike of systems operations.

(52) Although code may inherit little or no baggage from classical metaphysics, it is permeated throughout with the politics and economics of capitalism, along with the embedded assumptions, resistant practices, and hegemonic reinscriptions associated with them. The open source movement testifies eloquently to the centrality of capitalist dynamics in the marketplace of code, even as it works to create an intellectual commons that operates according to the very different dynamics of a gift economy.

The Hierarchy of Code

A clear articulation of the difference between code operations that fetch and those that execute.

(53) Like speech, coding structures make use of what might be called the syntagmatic and paradigmatic, but in inverse relation to how they operate in speech systems. . . . The paradigmatic alternatives are encoded into the database and in this sense actually exist, whereas the syntagmatic is dynamically generated on the fly as choices are made that determine which items in the database will be used.
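
Her inversion is easy to stage: the paradigmatic alternatives actually exist, encoded in a database, while the syntagmatic chain is generated on the fly as choices are made. A sketch (vocabulary and selection rule mine, with a nod to the James story):

```cpp
#include <iostream>
#include <random>
#include <string>
#include <vector>

int main() {
    // The paradigmatic axis actually exists, encoded as a database of
    // alternatives; in speech these substitutions are only virtual.
    std::vector<std::vector<std::string>> paradigms = {
        {"the girl", "the telegraphist", "P. Burke"},
        {"encodes", "transmits", "alters"},
        {"the message", "the telegram", "the signal"},
    };

    // The syntagmatic chain is generated on the fly as choices are made.
    std::mt19937 rng(std::random_device{}());
    for (const auto& slot : paradigms) {
        std::uniform_int_distribution<size_t> pick(0, slot.size() - 1);
        std::cout << slot[pick(rng)] << ' ';
    }
    std::cout << '\n';   // e.g., "the girl alters the telegram"
}
```
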
(53) Flexibility and the resulting mobilization of narrative ambiguities at a high level depend upon rigidity and precision at a low level.

Expectation of hierarchies in reveal code dynamic.

(55) The “reveal code” dynamic helps to create expectations (conscious and preconscious) in which the layered hierarchical structure of the tower of language reinforces and is reinforced by the worldview of computation. . . . The more “natural” code comes to seem, the more plausible it is to conceptualize human thought as emerging from a machinic base of computational processes, a proposition explored in chapter 7.

Making Discrete and the Interpenetration of Code and Language
(56) The act of making discrete extends through multiple levels of scale, from the physical process of forming bit patterns up through a complex hierarchy in which programs are written to compile other programs. Understanding the practices through which this hierarchy is constructed, as well as the empowerments and limitations the hierarchy entails, is an important step in theorizing code in relation to speech and writing.
(57) To some extent, then, the technology functions like a rock strata, with the lower layers bearing the fossilized marks of technologies now extinct.
(57) In the progression from speech to writing to code, each successor regime introduces features not present in its predecessors.
(57) Whereas procedural languages conceptualize the program as a flow of modularized procedures (often diagrammed with a flowchart) that function as commands to the machine, object-oriented languages are modeled after natural languages and create a syntax using the equivalent of nouns (that is, objects) and verbs (processes in the system design).
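
A side-by-side sketch of the contrast (examples mine): procedural code as a flow of commands over passive data, object-oriented code as nouns that own their verbs.

```cpp
#include <iostream>
#include <string>

// Procedural style: a flow of commands operating on passive data.
void stamp(std::string& telegram) { telegram += " [STAMPED]"; }
void send(const std::string& telegram) { std::cout << telegram << '\n'; }

// Object-oriented style: the noun Telegram owns its verbs.
class Telegram {
    std::string text;
public:
    explicit Telegram(std::string t) : text(std::move(t)) {}
    void stamp() { text += " [STAMPED]"; }          // verb bound to its noun
    void send() const { std::cout << text << '\n'; }
};

int main() {
    std::string t1 = "procedural wire";             // command flow over data
    stamp(t1);
    send(t1);

    Telegram t2("object-oriented wire");            // a problem-domain noun
    t2.stamp();
    t2.send();
}
```
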

The Eckel book appears to be a trade publication more so than peer-reviewed scholarship, yet it will found our philosophy of computing implied by Hayles to make her arguments work, evoking a ruthless ethic for its sparse ontology; Hayles, meanwhile, provides another insightful example taken at the implementation level of interprogrammer discourse, the kind of philosophical debate around the lunch table of software developers nationwide. This could be an intense news feed for lunch time, for some to hear the lecture seminar of Lacan via text-to-speech synthesis to the dismay, perhaps, of Zizek, who now cedes popularity to the programs, the abyss to which Kittler alludes.

(57) A significant advantage of this mode of conceptualization, as Bruce Eckel explains in Thinking in C++, is that it allows programmers to conceptualize the solution in the same terms used to describe the problem.
(58) We can now see that object-oriented programs achieve their usefulness principally through the ways they anatomize the problems they are created to solve – that is, the ways in which they cut up the world.

The late binding discussion is a mind-expanding glimpse of the machine understanding of something we humans can also imagine.

(59) Late binding is part of what allows the objects to be self-contained with minimum interference with other objects.
(59) The point of this rather technical discussion is simple: there is no parallel to compiling in speech or writing, much less a distinction between compiling and run-time.
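
The compile/run distinction itself can be shown in a few lines (scenario mine): the same function is evaluated by the compiler, before the program even exists, and again by the running program on user input; late binding proper appears in the virtual calls of the earlier sketch. Nothing in speech or writing splits an utterance into these two moments.

```cpp
#include <iostream>

// square() is evaluated at compile time when fed a constant, and at run
// time when fed input -- the doubling Hayles says has no parallel in
// speech or writing.
constexpr long square(long n) { return n * n; }

int main() {
    constexpr long atCompileTime = square(12);   // folded into the binary
    static_assert(atCompileTime == 144);         // checked before the program exists

    long n;
    std::cout << "n? ";
    if (std::cin >> n)
        std::cout << square(n) << '\n';          // computed only at run time
}
```
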

Intermediation is where to meet intelligent machines, though our personal stance toward programming is also important; see above where Hayles establishes legitimate lines of argument.

(59) The importance of compiling (and interpreting) to digital technologies underscores the fact that new emphases emerge with code that, although not unknown in speech and writing, operate in ways specific to networked and programmable media. At the heart of this difference is the need to mediate between the natural languages native to human intelligence and the binary code native to intelligent machines.

Consider working code as an alternative to careless codework; her choice of C++ as a philosophically interesting programming language agrees with my conclusions.

(59-60) C++ is consciously modeled after natural language; once it came into wide use, it also affected how natural language is understood. . . . As high-level computer languages move closer to natural languages, the processes of intermediation by which each affects the other accelerate and intensify. Rita Raley has written on the relation between the spread of Global English and the interpenetration of programming languages with English syntax, grammar, and lexicon. In addition, the creative writing practices of “codework,” practiced by such artists as MEZ, Talan Memmott, Alan Sondheim, and others, mingle code and English in a pastiche that, by analogy with two natural languages that similarly intermingle, might be called a creole.

Chun argues software is ideology.

(60-61) Indeed, Wendy Hui Kyong Chun goes so far as to say that software is ideology, instancing Althusser's definition of ideology as “the representation of the subject's imaginary relationship to his or her real conditions of existence.” . . . As is true for other forms of ideology, the interpolation of the user into the machinic system does not require his or her conscious recognition of how he or she is being disciplined by the machine to become a certain kind of subject.

Why we cannot ignore code and why we need philosophies of computing; note those with deep understanding of code are computer programmers and engineers, so the very force demanded by the ethical stance arrived at through her arguments must arise from that which it is summoned to oppose, and FLOSS facilitates the emergence of hobbyists who may also gain this strongly sought understanding.

(61) This conclusion makes abundantly clear why we cannot afford to ignore code or allow it to remain the exclusive concern of computer programmers and engineers. Strategies can emerge from a deep understanding of code that can be used to resist and subvert hegemonic control by megacorporations; ideological critiques can explore the implications of code for cultural processes, a project already evident in Matthew Fuller's call, seconded by Matthew Kirschenbaum, for critical code studies; readings of seminal literary texts can explore the implications of code for human thought and agency, among other concerns. Code is not the enemy, any more than it is the savior. Rather code is increasingly positioned as language's pervasive partner.


3 The Dream of Information
Escape and Constraint in the Bodies of Three Fictions
The Economy of Information

Virtual bodies in books, not materiality of the media, is her focus.

(62) This chapter explores these intermediating cycles in the effects registered on bodies and subjectivities as they are represented in fictions from the close of the nineteenth century to three-quarters of the way through the twentieth century, roughly the period of time it took to go from the passive code of the telegraph to the executable code of the digital computer. At issue here are not so much the bodies of texts themselves – topics to which we will turn in the next set of chapters – but the bodies within texts and their relation to the human lifeworld as it is reconfigured by interpolating humans with machines that, as they become intelligent, increasingly interpenetrate and indeed constitute human bodies.

Had computing gone with Stallman rather than Gates, who knows what the past might have been like, imprinted by variants of the other cultural phenomena constituting that era (duration of reality production by media).

(62) Throughout this period, the dream of information beckoned as a realm of plenitude and infinite replenishment, in sharp contrast to what might be called the regime of scarcity.

But would it be better or worse under the non-FOSS regime, like TV in the UK?

(63) As I argued in How We Became Posthuman, assumptions that may not be immediately apparent link the conservation laws of thermodynamics with the formation of the liberal subject. . . . Rightly criticizing the rhetoric of free information, Markley points to the ecological and social costs involved in developing a global information network.
(64) The change from flat marks to hierarchical structures opens the possibility of interventions into the encoding/decoding chains. Because these interventions typically require humans to interact with communication technologies, the changed nature of signification ties back into the prostheses joining humans and machines. Signification, technology, and subjectivity coevolve.

Clearly Hayles serves to ground theory and outline future progress by recruiting philosophers from the pool of programmers and engineers; does she also intend that others already liberally inclined learn and practice programming for years as if they were professionals?

(65) The trajectory formed by these three fictions displays a clear pattern. First the dream of information is figured as an escape, but the more powerfully it exerts its presence as a viable place in which to live, the more it appears not as an escape at all but rather an arena in which the dynamics of domination and control can be played out in new ways. What changes is finally not the regime of scarcity but the subjects within and through whose bodies the informational codes are transmitted.

Code as Reinscription: “In the Cage”
(66) Information is not a presence or an absence and so does not operate within that dialectic. Rather, information emerges from a dialectic of pattern and randomness, signal and noise.
(68) This shift from the arbitrariness of language to the noise of code signals a change of emphasis from the limitations of language in producing meaning to the limitations of code in transmitting messages accurately. . . . Whereas deconstruction focuses on economies of supplementarity, coding theory focuses on techniques of supplementation that graft together organic components with communication technologies. Deconstruction implies a shift from the authority of a speaking subject to the instabilities of a writing subject, whereas coding theory implies a transformation from the writing subject to a posthuman collective of encoders/decoders.

Coding theory as the beyond of postmodernism that Turkle does not identify, in which Hayles operates to articulate a consequence of these themes intermingling in human culture.

(69) Thus in code what is available for readerly inspection is not so much the ambiguity of meaning – we never find out what the numbers in the telegram refer to – but places in the text where interventions in the coding chain occur, as when the girl alters the telegram.
(70) In opening the informational realm to view, exploring its dynamics and possibilities, and finally allowing it to be folded again into a regime based on conservation, James writes what might be called the prequel to the story of information in the twentieth century. . . . As the century progresses, information increasingly determines the constitution of subjects and, as we saw in chapter 1, the construction of reality itself. A writer who not only anticipated this development but wrote about it with incandescent intensity is Philip K. Dick.

Code as Transubstantiation: The Three Stigmata of Palmer Eldritch
(74) Posing as a dream of information that can satisfy the deepest desires of humans, the Chew-Z world reveals itself not as a refuge but as a rapacious dynamic that preys on the autonomy of the liberal subject and, indeed, on the autonomy of the world itself.
(76) The implicit threat to the liberal subject when possessive individualism no longer functions as a foundation for subjectivity is here realized through the science fiction trope of the alien. The irony, of course, is that this subversion has been achieved without realizing information's dream of endless plenitude.

Hayles reaches radical descriptions of subjectivity that science fiction appears to generate.

(79) Tiptree's “The Girl Who Was Plugged In” takes this step. The self becomes a message to be encoded and decoded, but the self the receiver decodes is never exactly the same self the sender encoded. The liberal subject, distributed between a privileged and stigmatized body connected by a noisy channel, is not so much lost as reconstituted as a dream of the machine. The focus thus shifts from how the self expresses its agency to questions of who controls the machine.

Code as (Re)incarnation: “The Girl Who Was Plugged In”
(80) The issue, then, is not which world is real, for Delphi's high-flying life and P. Burke's spartan physical existence are both located in real time and space. Rather, the focus is on the connections that make the two bodies into an integrated cybersystem – the kinds of discipline, surveillance, and punishment to which the bodies are subject, and the distribution of agency between these two different sites in the cybersystem.
(82) If the corporation can intervene in the channels to discipline the bodies, perhaps the bodies can also intervene to send messages the corporation has not authorized.

Code as Female Subjectivity:
What Can a Girl Sell When She Really Needs To?

(83) Another way to think about the trajectory formed by “In the Cage,” The Three Stigmata of Palmer Eldritch, and “The Girl Who Was Plugged In” is to trace the restricted ways in which the female characters can engage in commerce.
(85) Any claim to possessive individualism based on ownership of the self has thus been co-opted by a merger between corporate capitalism and communication technologies so potent that it operates in the intimate territory of nerve and muscle as well as global networks.
(85-86) Following the trope of prostitution and the locus of complexity through these stories reveals the continuing force of gender hierarchies, which do not disappear as possessive individualism collapses but rather continue to operate differentially for male and female characters. . . . What has changed is not the historical power difference between genders but the distribution of subjectivities in relation to the Regime of Computation. Information goes from being an imagined realm of plenitude in James's story, to a marked realm that interpenetrates reality in Dick's novel, to reality in Tiptree's fiction. . . . As Timothy Lenoir has suggested, Baudrillard's prediction of a hyperreality that will displace reality has proven too conservative to keep up with the transformative power of information technologies. . . . Finally, it is not scarcity and market relations that are transformed, but the subjects who are constrained and defined by how they participate in them.


Part 2 STORING
Print and Etext
4 Translating Media
From Print to Electronic Texts

Print bias in notions of textuality made manifest by examining the William Blake Archive.

(89) To explore these complexities, I propose to regard the transformation of a print document into an electronic text as a form of translation - “media translation” - which is inevitably also an act of interpretation. . . . The challenge is to specify, rigorously and precisely, what these gains and losses entail and especially what they reveal about presuppositions underlying reading and writing. My claim is that they show that our notions of textuality are shot through with assumptions specific to print, although they have not been generally recognized as such.
(90) The issues can be illustrated by the William Blake Archive, a magnificent Web site designed by three of our most distinguished Blake scholars and editors. . . . They thus declare implicitly their allegiance to an idea that Jerome McGann, among others, has been championing: the physical characteristics of a text – page size, font, gutters, leading, and so on – are “bibliographic codes,” signifying components that should be considered along with linguistic codes.

Navigational functions part of signifying structure.

(90-91) A moment's thought suffices to show that changing the navigational apparatus of a work changes the work. Translating the words on a scroll into a codex book, for example, radically alters how a reader encounters the work; by changing how the work means, such a move alters what it means. One of the insights electronic textuality makes inescapably clear is that navigational functionalities are not merely ways to access the work but part of a work's signifying structure.
(91) Great attention is paid to the relation of meaning to linguistic and bibliographic codes and almost none to the relation of meaning to digital codes. Matthew Kirschenbaum's call for a thorough rethinking of the “materiality of first generation objects” in electronic media is very much to the point.

What Is a Text?

Definition of text as abstract artistic entity.

(92) A work is an “abstract artistic entity,” the ideal construction toward which textual editors move by collating different editions and copies to arrive at their best guess for what the artistic creation should be (86). It is important to note that the work is ideal not in a Platonic sense, however, for it is understood to be the result of editorial assumptions that are subject to negotiation, challenge, community norms, and cultural presuppositions. . . . Gunder points out that the “work as such can never be accessed but through some kind of text, that is, through the specific sign system designated to manifest a particular work” (86). Texts, then, are abstract entities from which editors strive to excavate the work. . . . Only when we arrive at the lowest level of the textual hierarchy, the document, is the physical artifact seen as merging with the sign system as an abstract representation.
(93) Thinking of the text as “the order of words and punctuations” is as print-centric a definition as I can imagine, for it comes straight out of the printer's shop and the lineation of type as the means of production for the book.
(93-94) An even more serious objection to Shillingsburg's definition is its implicit assumption that “text” does not include such qualities as color, font size and shape, and page placement, not to mention such electronic-specific effects as animation, mouseovers, instantaneous linking, and so on.

TEI and OHCO; I experienced this underdetermination, implying interpretations of what a text is, while working on symposia.

(95) When texts are translated into electronic environments, the attempt to define a work as an immaterial verbal construct, already problematic for print, opens a Pandora's box of additional complexities and contradictions, which can be illustrated by debates within the community formulating the Text Encoding Initiative (TEI). The idea of TEI was to arrive at principles for coding print documents into electronic form that would preserve their essential features and, moreover, allow them to appear more or less the same in complex networked environments, regardless of platform, browser, and so on. To this end, the community (or rather, an influential contingent) arrived at the well-known principle of OHCO, the idea that a text can be encoded as an ordered hierarchy of content objects. As Allen Renear points out in his seminal analysis of this process, the importation of print into digital media requires implicit decisions about what a text is. Expanding on this point, Mats Dahlstrom, following Michael Sperberg-McQueen, observes that the markup of a text is “a theory of this text, and a general markup language is a general theory or conception of text.”
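
OHCO in data-structure terms is simply an ordered tree; a minimal sketch (element names mine, echoing TEI's style, with Blake for the sake of the Archive): encoding a document this way already enacts a theory of what a text is, since everything outside the hierarchy (ink, page, gutter) drops away.

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// OHCO in miniature: a text modeled as an ordered tree of content objects.
struct ContentObject {
    std::string type;                                      // e.g., "text", "lg", "l"
    std::string text;                                      // leaf content, if any
    std::vector<std::unique_ptr<ContentObject>> children;  // ordered!
};

void print(const ContentObject& node, int depth = 0) {
    std::cout << std::string(depth * 2, ' ')
              << '<' << node.type << "> " << node.text << '\n';
    for (const auto& child : node.children) print(*child, depth + 1);
}

int main() {
    ContentObject poem{"text", "", {}};
    auto stanza = std::make_unique<ContentObject>();
    stanza->type = "lg";                                   // TEI-style line group
    auto line = std::make_unique<ContentObject>();
    line->type = "l";
    line->text = "Tyger Tyger, burning bright";            // Blake, fittingly
    stanza->children.push_back(std::move(line));
    poem.children.push_back(std::move(stanza));
    print(poem);
}
```
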

The default theories of textuality built from underlying assumptions of practitioners.

(95) Although most of these researchers thought of themselves as practitioners rather than theorists, their decisions, as Renear points out, constituted a de facto theory of textuality that was reinforced by their tacit assumption that the “Platonic reality” of a text really is its existence as an ordered hierarchy of content objects.
(96) My interest in this controversy points in a different direction, for what strikes me is the extent to which all three positions – Platonist, pluralist, and antirealist – focus almost exclusively on linguistic codes, a focus that allows them to leave the document as a physical artifact out of consideration.
(96-97) Only if we attend to the interrelations of linguistic, bibliographic, and digital codes can we grasp the full implications of the transformations books undergo when they are translated into a digital medium.
(97) Since no print books can be completely encoded into digital media, we should think about correspondences rather than ontologies, entraining processes rather than isolated objects, and codes moving in coordinated fashion across representational media rather than mapping one object onto another.

Physicality, Materiality, and Embodied Textuality

McGann experiments in failure example of software that keeps revising versus static literary texts.

(97-98) Whether textual form should be stabilized is a question at the center of Jerome McGann's “experiments in failure,” which he discusses in Radiant Textuality. . . . the Web's remarkable flexibility and radically different instantiation of textuality also draw into question whether it is possible or desirable to converge on an ideal “work” at all.

Deformation as reading practice, emphasizing importance of doing and making.

(98) Instead he argues for the practice of what he calls “deformation,” a mode of reading that seeks to liberate from the text the strategies by which it goes in search of meaning. . . . Just as textual criticism has traditionally tried to converge on an ideal work, so hermeneutical criticism has tried to converge on an ideal meaning.
(98-99) This kind of argument opens the way for a disciplined inquiry into the differences in materiality between print and electronic textuality. . . . He emphasizes the importance of doing and making, suggesting that practical experience in electronic textuality is a crucial prerequisite for theorizing about it.
(99) So extensive and detailed are his redescriptions that one wonders if electronic text has any distinctive features of its own.

A key period in early history of new technology versus comparison with long history of previous medium.

(99-100) It is obviously inappropriate to compare a literary medium that has been in existence for fifteen years with print forms that have developed over half a millennium. A fairer comparison would be print literature produced from 1550 to 1565, when the conventions of print literature were still in their nascent stages, with the electronic literature produced from 1985 to 2000.
(100) The stubborn fact remains, however, that once ink is impressed on paper, it remains relatively stable and immovable.

Electronic texts defined as processes rather than objects.

(101) What, then, are these differences, and what are their implications for theories of textuality? Mats Dahlstrom tackles this question in his exploration of how notions of a scholarly edition might change with electronic textuality. . . . More fundamental is the fact that the text exists in dispersed fashion even when it is confined to a single machine. There are data files, programs that call and process the files, hardware functionalities that interpret or compile the programs, and so on. It takes all of these together to produce the electronic text. Omit any one of them, and the text literally cannot be produced. For this reason, it would be more accurate to call an electronic text a process than an object.
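
Dahlstrom's point can be staged in miniature (the file name and decoding choice are mine): data file, decoding program, and display routine must all run together for the text to appear; omit any one and there is literally no text.

```cpp
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>
#include <vector>

// The "text" exists nowhere as a finished object: stored bytes, a program
// that decodes them, and a display routine must all run for it to appear.
int main() {
    std::ifstream file("document.txt", std::ios::binary);   // layer 1: stored data
    if (!file) { std::cerr << "no data file, no text\n"; return 1; }

    std::vector<unsigned char> bytes((std::istreambuf_iterator<char>(file)),
                                     std::istreambuf_iterator<char>());

    std::string decoded(bytes.begin(), bytes.end());        // layer 2: decoding
    std::cout << decoded << '\n';                           // layer 3: display
    // Change any layer -- encoding, program, terminal -- and a different
    // text (or none at all) is produced: the text as process, not object.
}
```
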
(101) But we should not indulge in the logical confusion that results when it is assumed that the creation of the display – a process that happens only when the programs that create the text are activated – entails the same operations as a reader's cognitive processing.

Explore the intertwining of physicality and informational scheme, such as md5sums and other operationally unique identifying measures.

(102) In insisting further that electronic text is above all a pattern, Dahlstrom risks reinscribing the dematerialization so prominently on display in Shillingsburg's definition of “text” as a sequence of words and pauses. . . . Rather than stretch the fiction of dematerialization thinner and thinner, why not explore the possibilities of texts that thrive on the entwining of physicality with informational structure?
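
To make the md5sum idea concrete, a sketch using FNV-1a as a stand-in (MD5 proper would need an external library): any change to the bytes yields a different fingerprint, so the "same" text is operationally not self-identical across instantiations.

```cpp
#include <cstdint>
#include <iostream>
#include <string>

// FNV-1a, a simple stand-in for md5sum: a fingerprint of the exact bytes.
// Two instantiations of "the same" work -- a different encoding, an edited
// character -- hash differently: identity is tied to byte-level physicality.
uint64_t fnv1a(const std::string& bytes) {
    uint64_t h = 14695981039346656037ULL;       // FNV offset basis
    for (unsigned char c : bytes) {
        h ^= c;
        h *= 1099511628211ULL;                  // FNV prime
    }
    return h;
}

int main() {
    std::cout << std::hex
              << fnv1a("My mother was a computer.") << '\n'
              << fnv1a("My mother was a computer ") << '\n';  // one byte differs
}
```
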
(103) In this respect and many others, electronic texts are indeed not self-identical. As processes they exhibit sensitive dependence on temporal and spatial contexts, to say nothing of their absolute dependence on specific hardware and software configurations.
(103) What matters for understanding literature, however, is how the text creates possibilities for meaning by mobilizing certain aspects of its physicality.

Emergent materiality like McGann deformation; McKenzie broad definition of text.

(103-104) The following definition provides a way to think about texts as embodied entities without falling into the chaos of infinite difference: The materiality of an embodied text is the interaction of its physical characteristics with its signifying strategies. . . . Because materiality in this view is bound up with the text's content, it cannot be specified in advance, as if it existed independent of content. Rather, it is an emergent property. . . . McKenzie's definition of “text” includes “verbal, visual, oral and numeric data, in the form of maps, prints, and music, of archives of recorded sound, of films, videos, and any computer-stored information” (5).

Work as Assemblage
(104) In this account of embodied textuality, texts would spread out along a spectrum of similarity and difference, and clusters would emerge.

New types of work as assemblage; interesting examples follow.

(105) These changed senses of work, text, and document make it possible to see phenomena that are now obscured or made invisible by the reigning ideologies. For example, with the advent of the Web, communication pathways are established through which texts cycle in dynamic intermediation with one another, which leads to what might be called Work as Assemblage, a cluster of related texts that quote, comment upon, amplify, and otherwise intermediate one another.
(106) Going along with the idea of Work as Assemblage are changed constructions of subjectivity. . . . Perhaps now it is time to think about what kinds of textuality a dispersed, fragmented, and heterogeneous view of the subject might imply.
(106-107) An appropriate model may present itself in Gilles Deleuze and Felix Guattari's rhizomatic Body without Organs (BwO), a construction that in its constant deterritorialization and reterritorialization has no unified essence or identifiable center, only planes of consistency and lines of flight along which elements move according to the charge vectors of desire. . . . Rather than being bound into the straitjacket of a work possessing an immaterial essence that textual criticism strives to identify and stabilize, the WaA derives its energy from its ability to mutate and transform as it grows and shrinks, converges and disperses according to the desires of the loosely formed collectives that create it.

Programmed computer as author; contrast to division between print text and human reader as locus of decoding agency, hints at MSA (media-specific analysis).

(107) As this [Patchwork Girl] work emphasizes, with an electronic text the computer is also a writer, and the software programs it runs to produce the text as process and display also have complex and multiple authorship. . . . A robust account of materiality focusing on the recursive loops between physicality and textuality is essential to understanding the dynamics of the WaA.
(107) The game is to understand both print and electronic textuality more deeply through their similarities and differences.

Preprocessing of multiple layers: compare to Kittler on code, references to Loss Pequeño Glazier and John Cayley, and other layered, diachrony-in-synchrony process control models.

(108) The primary difference is the fact that an electronic text is generated through multiple layers of code processed by an intelligent machine before a human reader decodes the information. McGann argues that print texts are also coded, but this point relies on slippage between the narrow sense of code as it applies to computers and a more general sense of code as Roland Barthes envisions it, including codes of social decorum, polite conversation, and so on.
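
A toy sketch of such layering (my illustration; standard library calls only): the same sentence passes through several machine codes before anything a human reader can decode appears.

    import base64, gzip

    text = "My mother was a computer."
    utf8 = text.encode("utf-8")          # layer 1: characters to bytes
    packed = gzip.compress(utf8)         # layer 2: bytes to a compressed stream
    wire = base64.b64encode(packed)      # layer 3: stream to transmissible ASCII

    # A display is the end of the reverse cascade.
    restored = gzip.decompress(base64.b64decode(wire)).decode("utf-8")
    assert restored == text
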
(109) Nevertheless, the analogy with language translation can offer useful insights into the problems and possibilities that haunt media translation.

Intermediating between Language Translation and Media Translation: Implications for Textuality

Raley tower of programming languages, Weaver Tower of Anti-Babel, Busa common substratum, Chomsky, and a whole philosophical tradition.

(111) In context, [Warren] Weaver believed that the “great open basement” [of the “Tower of Anti-Babel”] would be a universal substratum common to all languages.

Nice image of commercial software.

(111) Van Lieshout observes, “You do not buy software, you rent a first draft, [in a] beta society that markets patches rather than engineer[s] innovation. There may be failures, but no product recalls. In . . . a genuine marketing triumph, the buyer has turned into an unpaid beta tester with the software automatically reporting errors to the developer” (1).

Language reduced to LCD before translation; a fortiori, automatic documentation systems further turn language into the streetcar everyone rides that Heidegger loathed.

(112) Thus language is driven to the lowest common denominator even before the translators receive it.
(112) Here the fragmentation of code that in programmable media translates into high-level flexibility operates, directly and indirectly, to fragment language and reduce its complexities to small pieces.
(112-113) Faced with these relentless practices that aim to produce only instrumental prose (at best), I can almost sympathize with the tack taken by Walter Benjamin in his famous essay, “The Task of the Translator.” . . . In this way, translation contributes uniquely to the literary enterprise, Benjamin suggests, by creating an
emergence, a glimpse of the pure language that could not be seen as clearly without the conjunction of the source and target languages that the translation performs.
(114) So mystical a vision can have its justification only in an intense desire to rescue literature, translation, and language itself from base and instrumental purposes.
(114) Positioned somewhere between Weaver and Benjamin is Jorge Luis Borges. Here we may refer profitably to the brilliant work Efrain Kristal has done on Borges's idea of translation.

Taking off from Borges, imagine the non-originality of transitory, rhizomatic constellations of source code revisions that constitute a given running instantiation of a software system as the machine embodiment of its human-readable source code.

(114-115) Borges's idea of “originals” as provocations to go in search of meaning fits well with the idea of Work as Assemblage, for like the restless workings of desire and lines of flight that trace territorializations and deterritorializations of the Body without Organs, texts in an assemblage intermediate one another without necessarily bestowing on any one text the privileged status of the “original.”
(115) How might Borges's perspective apply to media translations? . . . Like van Lieshout, she [Anne Mellor] suspects that if Blake were alive today, he would find authoring and production systems in programmable media entirely congenial to his vision.

Hayles provides a number of programs that generate random poems and paragraphs.

(115-116) Another example of this kind of intermediation is provided by Raymond Queneau's Cent mille milliards de poemes. . . . The entire corpus of possible poems can be generated electronically and presented to the user as a random series of novel productions, as Nicholas Gessler has done in his C++ simulation.
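
The combinatorics are simple to sketch (a hypothetical Python stand-in, not Gessler's C++): ten variants for each of a sonnet's fourteen lines yield 10^14 possible poems, of which any reading samples one.

    import random

    # Placeholder strings stand in for Queneau's actual lines.
    variants = [[f"line {i + 1}, variant {j + 1}" for j in range(10)]
                for i in range(14)]

    def random_sonnet():
        # Assemble one of the 10**14 possible sonnets at random.
        return "\n".join(random.choice(options) for options in variants)

    print(random_sonnet())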

Alternative aims to pure language, the hard AI dream of extracting information from the book form (human language).

(116) Like Borges's idea of translations as drafts circulating along with the original in a stream of provisional attempts, so here programs circulate as patchwork productions building on earlier ones and recycling code. . . . Rather than aiming at a pure language, code here recycles “original” language in random patterns that cross and recross the threshold of intelligibility, inviting the reader's projection into the echoic effects, as if infecting language with the random access memory of computer storage.


5 Performative Code and Figurative Language
Neal Stephenson's
Cryptonomicon
(117) This chapter looks at a work that remains in print but nevertheless bears within its body marks of its electronic composition. . . . Given present modes of book production, it is more accurate to view print as a particular form of output for electronic text than it is to regard print as a realm separate from digital media.

Stephenson as programmer and open source advocate writing science fiction, deliberate and unconscious.

(118) With an impressive background as a computer programmer, Stephenson does not employ the ubiquitous Windows or even Macintosh but interfaces with the computer through a Unix-based operating system. . . . Following this line of thought, we can assume that the clash of operating systems—including all that it implies about the nefarious corporate practices of Microsoft, the capitalistic greed that underlies its ruthless business practices, and the resistance to these practices by open source communities, particularly Unix and the related Linux—penetrated deeply into the electronic structure of this text [Cryptonomicon] in physical and material ways.
(119) These transformations involve dialectical attempts to mitigate the tensions inherent in the clashes of materiality and abstraction, code and language, hackers and corporate moguls, good profit and evil greed. Cycling through a series of dialectical juxtapositions, the narrative is unable to achieve a final resting point, a resolution stable enough to permit it to effect a rising action that would lead to a satisfying climax and clear denouement. The failure of the text to conform to these expectations signals that despite its traditional physical appearance and conventional narrative techniques, the literary corpus is riven by the writing technologies that produced it.

Math and Lizards: The Dialectic's First Terms

Tension in my notes strategy balanced by dynamic potential of the software to produce new information.

Performative code combines active vitality and conceptual power.

(120) An important assumption underlying code's supremacy is the idea that information can be extracted from its material substrate. Once the information is secured, the substrate can be safely discarded.
(120) As this example illustrates, controlling the flow of information becomes much more problematic when the complexities of coupling it with physical actions are taken into account.
(123) Next to shit, perhaps the most conspicuous instance of the extent to which the world resists algorithmization is sexuality.
(124) From this contradiction emerges the next phase of the dialectical transformation, as abstract code and animal appetite merge to create a third term:
performative code. . . . In this sense, it combines the active vitality connoted by animal appetite with the conceptual power of abstract code.

Compare analysis of Stephenson Cryptonomicon to Kittler on code.

(124) He comments that “it's become evident to me when I looked into the history of computers that they had this intimate relationship with cryptography going back a long way.” . . . Command Line, which as we have seen can be considered the nonfictional companion to Cryptonomicon, offers valuable insights into the conjunction between writing novels and writing code—insights that are essential to explicating the next phase of the dialectical transformation.

Commanding Cryptography

Relate epistemological transparency to control relationships with other people like Wells's Eloi and Morlocks; for Turkle it may relate to inception of the robotic moment.

(125) Working through analogy, personal anecdote, and technical exposition, Stephenson tries to persuade the educated public that Unix is a far superior operating system to either Macintosh or Windows. This superiority lies not only in Unix's efficiency and power, nor solely in its economy (it is available for little or nothing over the Web). Equally important is the fact that it allows the user to understand exactly what is happening as typed commands are compiled and executed by the machine.
(126) Such folks are “Eloi,” Stephenson suggests, in an allusion to H. G. Wells's classic story
The Time Machine. . . . At issue is pride, expertise, and, most importantly, control. Those who fail to understand the technology will inevitably be at the mercy of those who do. The implication is that those who choose Unix, even though it is more demanding technically, can escape from the category of the Eloi and transcend to Morlock status where the real power is.
(127) To entertain these questions is, of course, to suggest that figurative language is to cultural understanding as Windows is to code, a pernicious covering that conceals the truth of things. . . . Performative code makes machines do things, and we should be in control of our machines. But figurative language makes people do things, and to be persuasive, to be
effective, the writer must craft for his readers images that stick in the mind, narratives that compel through memorable scenes and psychological complexities, including paradox, contradiction, irony, and all the rest of the tricks that constitute a writer's trade.
(127-128) Stephenson's own explicit justification for figurative language focuses on myths rather than metaphors, or more accurately, myths as analogies. . . . As with
Snow Crash, connecting mythic figures with the contemporary action aims to bring about an ethical understanding of technology, with the result that principled action follows.
(128) Instead of language becoming like code, code becomes like figurative language, creating a deceptive surface that misleads the masses while the cognoscenti penetrate this screen of symbols to extract the meaningful messages hidden within. Cryptanalysis thus becomes like mathematics, revealing the essential information hidden in “cosmetic distractions” such as those Lawrence sees through with his X-ray mathematical vision.
(129-130) The Good Hacker reconciles the tensions between a code that honestly testifies to the interior operations of the machine and the deceptive power of metaphoric interfaces (and by implication, figurative language in general). . . . The conflation prepares for the next dialectical turn, which pits good hackers against evil deceivers.

The Brotherhood of Code
(131) The Brotherhood of Code injects into the World War II era the same kind of bonding through cryptological expertise that Stephenson in
Command Line identifies with the open source community.
(132) In its last phase, the dialectic attempts to purge contamination from the Good Hacker and separate it out into a figure of pure evil, whose banishment from the plot can then signal the triumph of the good technology of Athena over the bad technology of Ares. This purgative impulse is, of course, precisely contrary to the dialectic's normal operation of combining contraries. At this point, the dialectic can go no further. . . . Rather than work
through these dense oxymoronic knots, the text achieves a muted resolution, such as it is, by working with them.

Knotty Oxymorons: The Dialectic Exhausted
(133) As the chain of linguistic signifiers drifts toward nonsense, it resembles a cryptographic code whose hidden meaning can be decoded only by the cognoscenti.
(134-135) This heavy-handed indication of moral outrage indicates how desperately Stephenson wants to protect the Good Hacker against the greed that characterizes the evil deceivers, even at the cost of bad writing. . . . Stephenson's anxiety about protecting the Good Hacker from corruption finds its most extreme expression in the creation of the figure of pure evil, as if by isolating all impurities here he can ensure that they will not contaminate the good characters.

An Interpenetration of Technology and Text
(139) Rather, this narrative functions as a deep structure within the text, setting up a surface/depth dynamic that mimetically re-creates within the text the surface/depth relation between the screenic text displayed on the computer screen and the coding languages producing the text as a display the reader can access. Thus the narrative implicit in the dialectic reproduces the structural dynamics of the trading zones between code and natural language that digital technologies have made a pervasive feature of contemporary culture.
(139) Further marks of the digital technologies that produced the text can be seen in the way the surface narrative is assembled. As the cryptological and gold plots begin to weave together, it becomes increasingly clear that the novel is functioning as a kind of machine assembling a coherent story out of plot lines that have been fragmented and spliced into one another.

Interesting use of a Perl script in print, like writing about illegal virtual realities that play music still under copyright protection.

(141) In Cryptonomicon, Stephenson included an e-mail from Enoch Root to Randy in which Enoch gives him the Perl script for the Solitaire algorithm (480), leading to the delicious irony that this print book can be exported legally, whereas the code it inscribes with durable marks, if translated into electronic voltages, may not be legally exported. . . . The point of entangling the opposites so that they cannot be separated now emerges as a covert justification for figurative language, for effective communication in the digital realm requires an understanding both of machine-executable code and human-oriented metaphors.
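
The algorithm in question is Bruce Schneier's Solitaire cipher; a minimal Python sketch of its keystream (a reimplementation for illustration, not Stephenson's printed Perl) shows how a deck of 54 "cards" becomes a stream of letters.

    # Cards 1-52 keep their face value; 53 and 54 are jokers A and B,
    # both of which count as 53 when used as numbers.
    def value(card):
        return min(card, 53)

    def move_down(deck, card, n):
        # Move a joker down n places, wrapping to just below the top card.
        i = deck.index(card)
        deck.pop(i)
        j = (i + n) % len(deck)
        deck.insert(j if j else len(deck), card)

    def keystream(deck):
        while True:
            move_down(deck, 53, 1)                             # joker A down one
            move_down(deck, 54, 2)                             # joker B down two
            a, b = sorted(map(deck.index, (53, 54)))
            deck[:] = deck[b + 1:] + deck[a:b + 1] + deck[:a]  # triple cut
            v = value(deck[-1])
            deck[:] = deck[v:-1] + deck[:v] + deck[-1:]        # count cut
            out = deck[value(deck[0])]                         # output card
            if out < 53:                                       # jokers yield nothing
                yield (out - 1) % 26

    def encrypt(plain, deck):
        ks = keystream(deck)
        return "".join(chr((ord(c) - 65 + next(ks) + 1) % 26 + 65)
                       for c in plain.upper() if c.isalpha())

    # With an unkeyed, ordered deck, Schneier's published test vector
    # for plaintext AAAAAAAAAA is EXKYIZSGEH.
    print(encrypt("AAAAAAAAAA", list(range(1, 55))))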

Important insights for texts and technology incorporating code, contrasting her New Materialism with New Criticism.

(142) To probe these complexities, we require critical strategies that are attentive to the technologies producing texts as well as to the texts as linguistic/conceptual structures—that is to say, we require material ways of reading that recognize texts as more than sequences of words and spaces. Rather, they are artifacts whose materialities emerge from negotiations between their signifying structures and the technologies that produce them. Whereas the New Criticism of the mid-twentieth century isolated texts from political contexts and technological productions, the New Materialism I am advocating in this book and practicing in this chapter insists that technologies and texts be understood as mutually interpenetrating and constituting one another.


6 Flickering Connectivities in Shelley Jackson's Patchwork Girl
(143) I have chosen
Patchwork Girl for my tutor text because I think it remains one of the most interesting of electronic fictions and because it is deeply concerned with how digital media enact and express new kinds of subjectivity.

Subjectivity and the Legal Fictions of Copyright
(143-144) In his important book
Authors and Owners, Mark Rose shows that copyright did more than provide a legal basis for intellectual property. The discussion that swirled around copyright also solidified ideas about what counted as creativity, authorship, and proper literature.
(144) Thus a hierarchy of values emerged that placed at the ascendant end of the scale the disembodied, the creative, the masculine, and the writer who worked for glory; at the lower end of the scale were the embodied, the repetitive, the feminine, and the writer who worked for money.

Macpherson possessive individualism compares literary property to real estate for copyright purposes, creating complications for collaborative authorship and the subject as unity.

(145) The idea that a literary work is analogous to real estate facilitated the fitting together of arguments about copyright with the Lockean liberal philosophy that C. B. Macpherson has labeled possessive individualism.
(146) A literary tradition must precede an author's inscriptions for literature to be possible as such, yet this same appropriation and reworking of an existing tradition is said to produce “original” work. Arguments about literary property were persuasive in part because they fit together so well with prevailing notions of liberal subjectivity, but that same fit implied common blindnesses.
(146) The commitment to originality led to especially strained interpretations when the work was collaborative, for “originality” implied that the work resulted from the unique vision of one gifted individual, not from the joint efforts of a team of skilled craftsmen. Thus the legal fiction was invented that allowed an organization to become the “author,” a fiction routinely evoked to this day for films collaboratively produced by perhaps hundreds of cultural workers.
(147) In
Patchwork Girl, the unconscious of eighteenth-century texts becomes the ground and surface for the specificity of this electronic text, which delights in pointing out that it was created not by a fetishized unique imagination but by many actors working in collaboration, including the “vaporous machinery” of digital text.

Creating a Monster: Subject and Text
(149) The text mobilizes the specificity of the technology by incorporating the three-dimensionality of linked windows as a central metaphor for the fiction's own operations. Like the hypertext stacks, the monster will not be content to reside quiescent on the page but moves fluidly between the world represented on the pages of Mary Shelley's text and the three-dimensional world in which Mary Shelley lives as she writes the text. . . . In an interview with Rita Raley, Shelley Jackson remarked that “a radical text can't just depict monstrosity, but must be itself monstrous.”
(150) Here, however, the body is figured not as the
product of the immaterial work but a portal to it, thus inverting the usual hierarchy that puts mind first.
(150) Following this philosophy,
Patchwork Girl not only normalizes the subject-as-assemblage but also presents the subject-as-unity as a grotesque impossibility.

Suturing the (Textual) Body: Sewing and Writing in Storyspace
(151) As the unified subject is thus broken apart and reassembled as a multiplicity,
Patchwork Girl also highlights the technologies that make the textual body itself a multiplicity. To explore this point, consider how information moves across the interface of the CRT screen, compared with books. . . . Through its flickering nature, the text-as-image teaches the user that it is possible to bring about changes in the screenic text that would be impossible with print (changing fonts, colors, type sizes, formatting, etc.).
(152-153) As these comments suggest, much of the genius of
Patchwork Girl derives from Jackson's ability to exploit the idiosyncrasies of Storyspace for her own purposes. . . . From about 1987 through 1995, first-generation hypertext writers used Storyspace to create some of the first widely discussed literary hypertexts, including Michael Joyce's afternoon, Stuart Moulthrop's Victory Garden, and, of course, Patchwork Girl.
(153) Because of these limitations, someone intending to create a work for the Web would be much better off beginning with a Web authoring tool rather than trying to retrofit a work created in Storyspace for the Web.

Beyond Derrida, mobilizing media specific resources of electronic hypertext to enact subjectivities.

(153-154) The specificities of the software sharply distinguish her text from the print works on which she draws. Of course, print texts are also dispersed, in the sense that they cite other texts at the same time they transform those citations by embedding them in new contexts, as Derrida among others has taught us. Nonetheless, the specificity of an electronic hypertext like Patchwork Girl comes from the ways in which it mobilizes the resources (and restrictions) of the software and medium to enact subjectivities distributed in flexible and mutating ways across author, text, interface, and user.
(155-156) In her comprehensive survey of the status of the body in the Western philosophical tradition, Elizabeth Grosz has shown that there is a persistent tendency to assign to women the burden of corporeality, leaving men free to imagine themselves as disembodied minds—an observation that has been familiar to feminists at least since Simone de Beauvoir. . . . Whereas the disembodied text of the eighteenth century work went along with a parallel and reinforcing notion of the author as a disembodied face, in Jackson's text the emphasis on body and corporeality goes along with an embodied author and equally material text. . . . [quoting Jackson] “Bad writing is all flesh, and dirty flesh at that. . . . Hypertext is everything that for centuries has been damned by its association with the feminine” (“Stitch Bitch,” 534).
(157) Among Patchwork Girl's many subversions is its attack on the “originality” of the work.
(157-158) The surface of the text as image may look solid, this passage suggests, but the “vaporous machinery” generating it marks that solidity with the mutability and distributed cognition characteristic of flickering signifiers. In “Stitch Bitch,” Jackson argues that even the subject considered in itself is a site for distributed cognition.

Spawning Latour hybrids as category mistakes in Patchwork Girl if the materiality of the text is made a signifying component; identity in patches and scars?

(159) Making the physical appearance of the text a signifying component was improper because it suggested that the text could not be extracted from its physical form. According to this aesthetic, bodies can be represented within the text, but the body of the text should not mix with these representations. To do so is to engage in what Russell and Whitehead would later call a “category mistake”--an ontological error that risks, through its enactment of hybridity, spawning monstrous bodies on both sides of the textual divide.
(159) Composed of parts taken from other textual bodies (
Frankenstein and Frank Baum's Patchwork Girl of Oz, among others), this hypertext, like the monster's body, hints that it is most itself in the links and seams that join one part to another. . . . The user inscribes her subjectivity into this text by choosing which links to activate, which scars to trace. . . . Because these enactments take place through the agency of the computer, all these bodies—the monster, Mary Shelley, Shelley Jackson, the specificity of the electronic text, the active agency of the digital interface, and we the users—are made to participate in the mutating configurations of flickering signifiers.
(161) As
Dictionary of the Khazars has taught us (along with similar works), print texts may also have hypertext structures. Rather, Patchwork Girl could be only electronic text because the trace of the computer interface, penetrating deeply into its signifying structures, does more than mark the visible surface of the text; it becomes incorporated into the textual body.

Closure: Link, Lexia, and Memory

Exploring chronotopes of electronic fictions that are profoundly different from those of books.

(162) The chronotopes of electronic fictions function in profoundly different ways than the chronotopes of literary works conceived as books. Exploring this difference will open a window onto the connections that entwine the link and lexia together with simultaneity and sequence.
(162) Since the past and the future can be played out in any number of ways, the present moment, the lexia we are reading right now, carries an unusually intense sense of presence, all the more so because it is a smaller unit of narration than normally constitutes an episode.
(163) There thus arises a tension between the sequence of lexias chosen by the user, and the simultaneity of memory space in which all the lexias already exist.

Cyborg subjectivity in monstrous intermingling of ontological levels, including differences between computer and human memory.

(163) The interjection of simultaneity into the sequence of a user's choices makes clear why different ontological levels (character, writer, user) mingle so monstrously in this text. In the heart of the computer, which is to say at the deepest levels of machine code, the distinctions between character, writer, and user are coded into strings of ones and zeros, in a space where the text written by a human writer and a mouse-click made by a human user are coded in the same binary form as machine commands and computer programs. When the text represents this process (somewhat misleadingly) as a “merged molecular dance of simultaneity,” it mobilizes the specificity of the medium as an authorization for its own vision of cyborg subjectivity.
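
A toy illustration (the click payload layout is hypothetical): at this level a typed character and a mouse event are the same kind of thing, byte patterns.

    # A character as bits, and a made-up (x, y) click payload as bytes.
    char_bits = format(ord("a"), "08b")                        # '01100001'
    click = (120).to_bytes(2, "big") + (45).to_bytes(2, "big")
    print(char_bits, click.hex())
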
(164) Memory, then, converts simultaneity into sequence, and sequence into the continuity of a coherent past. But human memory, unlike computer memory, does not retain its contents indefinitely or even reliably. If human memory has gaps in it (a phenomenon alarmingly real to me as my salad days recede in the distance), then it becomes like atoms full of empty space, an apparent continuity riddled with holes.
(165) Jane Yellowlees
Douglas, writing on Michael Joyce's hypertext fiction afternoon, suggests that closure is achieved not when all the lexias have been read, but when the user learns enough about the central mystery to believe she understands it.


Part 3 TRANSMITTING
Analog and Digital
7 (Un)masking the Agent
Stanislaw Lem's “The Mask”

(171) Rather, here “transmitting” refers primarily to the mechanisms and processes by which informational patterns are transferred between analog consciousness and digital cognition, understanding the latter variously as located in the computer, in human nonconscious processes, and in digital simulations.

Human or machine could express agency; next comes subject.

(171-172) At issue are questions of cooperation and competition between conscious mind and aconscious coding, free will and programmed outcomes, gendered enculturation and the nongendered operation of algorithms, language and the nonlinguistic operation of code. . . . In light of these complex intermediations, let me advance a proposition: to count as a person, an entity must be able to exercise agency. Agency enables the subject to make choices, express intentions, perform actions. Scratch the surface of a person, and you find an agent; find an agent, and you are well on your way toward constituting a subject.

Determining the climate of opinion about the regime of computation by analysis of Deleuze, Guattari, and Lacan rather than judgment on the correctness of their theories, although she will criticize them.

(172) Influential cultural theorists, particularly Gilles Deleuze, Felix Guattari, and Jacques Lacan, also speculate on how the digital and analog interact in human cognition. . . . The point as I see it, however, is not to determine whether the theories are correct or incorrect but to understand their roles in helping to create a “climate of opinion,” as Raymond Williams calls it, in which the complex intermediations between the analog and digital become central to understanding constructions of subjectivity and agency.

The Machine within the Human
(173) Deleuze and Guattari define cellular automata somewhat inaccurately as “finite networks of automata in which communication runs from any neighbor to any other.” In fact, as we know, each cell samples only the cells immediately adjacent to it (or in some cases, the next nearest neighbors). By claiming for cellular automata a less rule-bound dynamic than they in fact possess, Deleuze and Guattari imply that any configuration whatever is possible, an idea they push to the extreme in their notions of “deterritorialization” and “reterritorialization.” Cellular automata fit Deleuze and Guattari's purpose because they are completely mechanistic, computational, and nonconscious but nevertheless display complex patterns that appear to evolve, grow, invade new territories, or decay and die out.
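
The corrected rule is easy to state in code. A minimal sketch (Wolfram's elementary Rule 110, my choice of example): each cell's next state depends on exactly three cells, its left neighbor, itself, and its right neighbor.

    RULE = 110  # an elementary cellular automaton rule

    def step(cells):
        # Each cell samples only itself and its two immediate neighbors.
        n = len(cells)
        return [(RULE >> ((cells[(i - 1) % n] << 2)
                          | (cells[i] << 1)
                          | cells[(i + 1) % n])) & 1
                for i in range(n)]

    row = [0] * 31
    row[15] = 1                          # a single live cell
    for _ in range(15):
        print("".join(".#"[c] for c in row))
        row = step(row)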

Criticism of Body without Organs for misinterpretation of rules of cellular automata.

(173-174) As a result of this rhetoric, the body becomes the Body without Organs, an assemblage rather than an organism, which does away with consciousness as the seat of coherent subjectivity. . . . The net effect of these rhetorical transmutations is to construct the Body without Organs as an infinite set of cellular automata whose computational rules are re-encoded as desire.
(174) In
A Thousand Plateaus, at the same time that humans take on attributes of computational media, machines acquire biological traits.
(174) In “Machinic Heterogenesis,” Guattari addresses this point by interpolating the human and mechanical into one another, arguing that the “mechanosphere . . . superimposes itself on the biosphere.”

Code could be unnamed signifying system Guattari tied to material process of flickering voltages.

(175) Obscurely expressed, the point here seems to be that semiotics has falsified the workings of language by interpreting it through structuralist oppositions that covertly smuggle in anthropomorphic thinking characteristic of conscious mind. The model for language should instead be machinic operations that do not need structural oppositions; these operations have available to them a materialistic level of signification in which representation is intertwined with material processes. Although the word “code” does not appear in Guattari's essay, it fits well with his vision of a signifying system that is tied directly to the material process of flickering voltages.

For Johnston, language begins in the mechanistic operations of the Lacanian unconscious.

(176) The key idea Lacan lifts from automata theory is the notion that inherent in symbol manipulation are certain structural relationships that can be used to program a Turing machine.
(176-177) By contrast, as John
Johnston shows, Lacan envisions language as beginning in the mechanistic operations of the unconscious, from which emerge the higher order processes of conscious thought.
(177) The net result of these feedback loops between artificial life forms and biological organisms has been to create a crisis of agency, a phenomenon described at length in my book
How We Became Posthuman. . . . Through these reconfigurations, Deleuze, Guattari, and Lacan use automata to challenge human agency, and in the process they configure automata as agents.
(177-178) The uncanny similarities between Wolfram's speculations and the theories of Deleuze, Guattari, and Lacan (developed in large part, if not entirely, independently of one another) illustrate how pervasive within the culture these human/machine dynamics have become.

The Human within the Machine
(178) “The Mask” begins with a threshold. On one side is a consciousness that names “the it that was I.”
(188) The entangling of meanings here is like the entangling of the female character's agency with her creator's will, so that the story can be understood to be
simultaneously about human agency and robotic programming, male authorship and female self-birthing, alien creature and ordinary human being.

Machine and Human Interpenetrating

Unconscious as machinic code rather than hopelessly anthropomorphic mirror of consciousness.

(191) In these complex reconfigurations of agency, the significance of envisioning the unconscious as a program rather than as a dark mirror of consciousness can scarcely be overstated, for it locates the hidden springs of action in the brute machinic operations of code. In this view, such visions of the unconscious as Freud's repressed Oedipal conflicts or Jung's collective archetypes seem hopelessly anthropomorphic, for they populate the unconscious with ideas comfortingly familiar to consciousness rather than with the much more alien operations of machinic code.
(191) Whether consciousness can ever emerge from a coded mechanism remains a matter of intense debate.

Importance of coding technology for humanism as role of unconscious in thought better appreciated, bolstering likelihood of emergent AI.

(191-192) Nevertheless, with the advent of emotional computing, evolutionary algorithms, and programs capable not only of learning but of reprogramming themselves (as in programmable gate arrays), it no longer seems fantastic that artificial minds may some day achieve self-awareness and even consciousness. . . . The central question, in other words, is no longer how we as rational creatures should act in full possession of free will and untrammeled agency. Rather, the issue is how consciousness evolves from and interacts with the underlying programs that operate analogously to the operations of code. Whether conceived as literal mechanism or instructive analogy, coding technology thus becomes central to understanding the human condition.
(192) In this view, agency—long identified with free will and rational mind—becomes partial in its efficacy, distributed in its location, mechanistic in its origin, and bound up at least as much with code as with natural language.


8 Simulating Narratives: What Virtual Creatures Can Teach Us
Evolving Virtual Creatures

(193-194) Even people (like me) who know perfectly well that they are watching visualizations of computer programs still inscribe the creatures into narratives of defeat and victory, cheering the winners, urging on the losers, laughing at the schlemiels. Much more is going on here than simple anthropomorphic projection. [Karl
Sims's] “Evolved Virtual Creatures” is a laboratory not only in evolution (its intended purpose), but also in the impact of distributed cognitive systems on traditional modes of description, analysis, and understanding. . . . The intermediations that take place across the screenic interface operate in both directions at once: we anthropomorphize the virtual creatures while they computationalize us.
(194) Compared to the world in which we live, the environment of “Evolved Virtual Creatures” is extremely simple, so simple that it can be described almost completely. . . . By the time we arrive at functions, the level at which Karl Sims discusses his design for “Evolved Virtual Creatures,” we have reached a point where the patterns created by the programmer become explicit. Instantiated in these patterns are the programmer's purposes in creating this particular hierarchy of materio-semiotic codes.
(194-195) Sims's design follows John Koza's proposal that evolutionary programs should take advantage of modular structures that can be repeated over and over to create more complex structures. . . . When the elements are grouped and mutated as modules, the spectrum of possible variations is reduced to a manageable level.
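
A schematic sketch of the modular principle (hypothetical parameters, not Sims's actual system): mutation duplicates, replaces, or reorders whole modules, so the spectrum of variations stays manageable while structures can still grow in complexity.

    import random

    def random_module():
        # Hypothetical module: (limb length, joint stiffness).
        return (random.uniform(0.1, 2.0), random.uniform(0.0, 1.0))

    def random_genotype():
        return [random_module() for _ in range(random.randint(2, 5))]

    def mutate(genome):
        # Mutation acts on whole modules: duplicate, replace, or reorder.
        g = list(genome)
        i = random.randrange(len(g))
        op = random.choice(("dup", "new", "swap"))
        if op == "dup":
            g.insert(i, g[i])
        elif op == "new":
            g[i] = random_module()
        elif len(g) > 1:
            j = random.randrange(len(g))
            g[i], g[j] = g[j], g[i]
        return g

    def fitness(genome):
        # Hypothetical stand-in for Sims's physics simulation: reward
        # total limb length, penalize floppy joints.
        return sum(l for l, _ in genome) - sum(1 - s for _, s in genome)

    population = [random_genotype() for _ in range(20)]
    for _ in range(50):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]
        population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]
    print(round(max(map(fitness, population)), 2))
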
(195) The next step moves from the design of individual creatures to a population of creatures.
(196) In much the same way that the recursive loops between program modules allow a creature's morphology and brain to coevolve, so recursive loops allow the designer's intent, the creatures, the virtual world, and the visualizations to coevolve into a narrative that viewers find humanly meaningful.

Evolving Narratives

Creation of narrative may be evolutionary adaptation allowing construction of models of how others and oneself feel and act (Argyros, Baron-Cohen).

(197) As Alex Argyros, among others, has suggested, the creation of narrative may itself be an evolutionary adaptation of remarkable importance. With their emphasis on causality, meaningful temporal sequence, and interrelation between behavior and environment, narratives allow us to construct models of how others may be feeling and acting, models that coevolve with our ongoing interior monologues describing and interpreting to ourselves our own feelings and behaviors. . . . As Baron-Cohen points out, autism is associated with an inability to construct narratives that will make sense of the behaviors of others. . . . Without the presuppositions embedded in narratives, most of the accomplishments of Homo sapiens could not have happened.
(198) First-order emergence, as discussed in chapter 1, is any behavior or property that cannot be found in either a system's individual components or their additive properties, but that arises, often unpredictably, from the
interaction of a system's components. Second-order emergence arises when a system develops a behavior that enhances its ability to develop adaptive behaviors—that is, when it evolves the capacity to evolve.

Ability of programmers to sense the operation of code like a musician reading a score, picked up in Electronic Literature with Cayley's transliteral morphing, in which nonprogrammers intuit algorithms by watching program operation, all epigenetic brain changes conjoined with machine operations, what Hayles refines from the unconscious to the technological nonconscious; temper this skilled sense with arguments by Chun (sourcery) and Tanaka-Ishii.

(198) No doubt an experienced programmer such as Karl Sims can look at a program's functions and “see” the morphologies and behaviors of his creatures with no more difficulty than an experienced reader of fiction can “see” Isabel Archer in Henry James's aptly entitled novel, The Portrait of a Lady. . . . When Sims chooses some of his creatures for visual rendering, he taps into this evolutionary history by creating pixilated images that, through culture and training as well as biologically determined capacities, we recognize as representations of three-dimensional creatures.
(199) It is surely no accident that in his evolutionary simulations Sims designs programs that can be “seen” as creatures striving after a goal and winning against competitors, for these are among the most canonical narratives in traditional accounts of evolutionary history (not to mention in Western capitalist society).

Computing the Human: Analog and Digital Subjects
(201) Attributes of the analog subject include, then, a depth model of subjectivity in which the most meaningful part of the self is seen to reside deep inside the body, and the self is further linked with units possessing a natural integrity of form and scale that must be preserved if the subject is to be maintained intact.
(202) As we saw in chapter 6, the legal fight to insure copyright, the cult of the author, print technology, and print culture worked hand in glove to create a depth model of subjectivity in which analog resemblances guaranteed that the surface of the page was matched by an imagined interior within the author, which evoked and also was produced by a similarly imagined interior in the reader.
(203) The mantra for such programs is “simple rules, complex behaviors,” which implies that the farther down into the system one goes, the less interesting it is. Note how this digital model differs from the analog subject, where depth implies a meaningful interiority.

Meaningful interiority of analog subject (down to the letter) versus less interesting depths of digital subject where emergent meaning depends on fragmentation.

(203) In traditional typesetting before the advent of computers, each letter in the alphabet was treated as a distinct unit; in speech, the corresponding phoneme also acts as an intact unit. In contrast are digital sampling techniques, where sound waves may be sampled some forty thousand times a second, digitally manipulated, and then recombined to produce the perception of smooth analog speech. In fact, emergence depends on such fragmentation, for it is only when the programs are broken into small pieces and recombined that unexpected adaptive behaviors can arise. Instead of a depth model of meaningful interiority, the digital subject manifests global behaviors that cannot be predicted by looking at the most basic levels of code with which the program starts.
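
A toy version of the sampling arithmetic (my illustration): at 40,000 samples per second, a hundredth of a second of a tone is 400 discrete numbers that can be cut into fragments and recombined at will.

    import math

    RATE = 40_000                        # samples per second, as in the text

    def sample(freq_hz, seconds):
        n = int(RATE * seconds)
        return [math.sin(2 * math.pi * freq_hz * i / RATE) for i in range(n)]

    tone = sample(440, 0.01)             # 400 samples of an A440 sine
    chunks = [tone[i:i + 50] for i in range(0, len(tone), 50)]
    remixed = [s for chunk in reversed(chunks) for s in chunk]
    print(len(tone), len(remixed))       # 400 400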

Scientific Realism and the Transmigration of Form

Oreo ontology; analog resemblance bounding digital layers; fragmentations and recombinations in otherwise deterministic digital.

(206-207) Inscription, then, is crucially important to the transformation of embodied reality into abstract forms. . . . Further developing the discussion in chapters 1 and 2 about the synergistic interactions between the digital and analog, I will here call this digital/analog structure the “Oreo,” for like the two black biscuits sandwiching a white filling between them, the initial and final analog representations connected with embodied materialities sandwich between them a digital middle where fragmentations and recombinations take place.
(207) An example of an Oreo structure is positron emission tomography, or PET images.
(208) Although we may think of the computer as the digital middle of the PET scan, it too has an analog bottom and, insofar as humans need to interact with its processes, an analog top as well. Wherever different embodied materialities are linked, analog resemblance is likely to enter the picture, for it is the dynamic that mediates between the noise of embodiment and the clarity of form.

Digital Creatures and Hybrid Subjectivity: From Form to Process
(210) The complexity the creatures display is not inherent in the binary code; rather, it is
produced as the program runs. . . . To bridge the gap between our narrative inscription of the creatures and the materio-semiotic apparatus producing them, I find it useful to think of them as processes rather than as bodies.
(211) Rather, the more profound change is from form to process, from preexisting bodies to embodied materialities that are linked to one another by complex combinations of processes based both in analog resemblances and coding relationships.
(213) The point, after all, is that [Jim Campbell's] “
I Have Never Read the Bible,” that is, the artist as a singular subject has not read it. Rather, “reading” here is a distributed activity taking place partly in the articulations of the artist, partly in the “voiced” text, partly in the Oreo structures of the scanner, computer, and synthesizer, and partly in the perceptions of the viewer who not only makes words out of the voiced letters but also makes meaning out of her interpolation into this distributed cognitive environment.
(213) I think, therefore I connect with all the other cognizers in my environment, human and nonhuman, including both the dynamic processes that are running right now as you decode these letters and all the dynamic processes that have run in the past and congealed to create this paper, this ink, this old language made of nouns and verbs that I am trying to fashion to new purposes that will allow you to see my body, your body, the bodies of the virtual creatures, not as nouns that enact verbs, but as dynamic intermediations that weave together the embodied materialities of diverse life forms to create richly complex distributed cognitions.


9 Subjective Cosmology and the Regime of Computation
Intermediation in Greg Egan's Fiction

(214) The trajectory of the previous chapters has arced toward the conclusion that human bodies, cultures, and artifacts are becoming increasingly entwined with intelligent machines. . . . Egan's fiction, pushing the possibilities for the Computational Universe to extremes, invites us to think about how far the entanglement of humans and computers can or should go.

The Legacy of Alan Turing
(214-215) First, many researchers considered Turing's proposal an invitation to develop intelligent machines, and within a decade artificial intelligence had become a flourishing research field. . . . As the cognitions that intelligent machines can perform have deepened and broadened, we can exclude what they do from our definition of “thinking” only if we narrow the range of what counts as thinking so significantly that it becomes questionable whether many humans can think.
(215) A second development from Turing's paper was to set off debates concerning what we mean when we use words like “thinking,” “mind,” and “alive.”
(216) These two dynamics—the continuing development of intelligent machines and the shifting meanings of key terms—work together to create a complex field of interactions in which humans and intelligent machines mutually constitute each other. . . .
what we make and what (we think) we are coevolve. The parenthesis in the aphorism marks a crucial ambiguity, a doubleness indicating that changes in cultural attitudes, in the physical and technological makeup of humans and machines, and in the material conditions of existence develop in tandem.
(216-217) The bold vision that Wolfram, Fredkin, Morowitz, and others espouse of a Universal Computer generating physical reality is meanwhile circulating in cultural arenas, as Greg Egan's novels, along with much else, testify.
(217) If computation generates physical reality at the subatomic level, then one can claim that in this sense cognition is computational, even while conceding differences in embodiment and the integral relation between embodiment and human cognition. The complexity of these dynamics simultaneously interacting with one another is intermediation in the broadest sense—claims about physical reality, cultural attitudes, and technological developments coevolving in relationships of synergy, contestation, competition, and cooperation.
(217) These concerns include the way code modifies our understanding of speech and writing systems, the interaction of print and electronic textuality, and the transformation of subjectivities as human and computer cognitions dynamically alter the meaning of thinking and mind. The crucial point to which I keep returning is the necessity to think these dynamics together, which requires recognizing that multiple causalities simultaneously interact with one another as both means and metaphors.

Coevolving minds and machines, transformation of the understanding of the nature of reality; Bogost on objects.

(218) Indeed, since in many ways twentieth-century cybernetics prepared the way for the Regime of Computation, it would not be entirely inaccurate to name the contemporary emphasis on emergence “third-order cybernetics” because it subsumes and transforms the reflexivity characteristic of second-order cybernetics. . . . The Regime of Computation presumes and requires materiality at the same time that it transforms our understanding of the nature of materiality.
(218) Having presented a number of case studies showing that the complex dynamics of making, storing, and transmitting are changing contemporary ideas about language, textuality, and cognition, I turn in this final chapter to test the limits of the Regime of Computation. Egan's novels are particularly rich resources in this regard because he envisions the human/computer connection not as a question of technology (shades of Heidegger) but as an ontological inquiry into the relation of humans to the universe. In this “subjective cosmology” trilogy—Quarantine, Permutation City, and Distress—each of the novels speculates that there are deep connections between human consciousness and the computational processes that generate the universe.

Computational Universe cultural metaphor as symptom, leading to Zizek.

(218-219) For him, intellectual honesty includes the willingness both to follow an idea through to its logical conclusion and to look hard at the differences between what we wish were true and what is actually true. . . . What we cannot escape, in Egan's view, is the materiality of a physical universe that constitutes us as physical beings even as our participatory understanding co-constitutes it. . . . I begin the engagement by positioning the Computational Universe not only as a metaphor but as a pathological metaphor, that is to say, as a symptom.

Computation as Symptom
(219) The appropriation of computation as a cultural metaphor is an instance of a more general dynamic in which cultural, historical, and linguistic presuppositions, invisible to someone because he simply assumes them to be true, constitute the framework in which problems are constructed and judgments made. This dynamic has much in common with the symptom. Useful for explicating its psychological aspects is Slavoj
Zizek's Enjoy Your Symptom!
(219-220) The mechanism, then, involves reasoning backward from one's present position and seeing prior contingent events as constituting a necessary and inevitable teleological progression to that point. . . . Yet like a psychoanalytic symptom that indicates not a supposed physical ailment but repressed psychological trauma, the letter reaches its destination as soon as it is sent because the message points back to the sender rather than to a supposed other.
(220) What letter does Wolfram send, and how does it function as a symptom? . . . The letter arrives at its destination when we understand the teleological illusions that complete the (short) circuit between Wolfram's interpretation of complex systems and the cultural context of computation in which he is embedded.
(220) One might imagine a computational version of the anthropic principle using reasoning similar to Richard Dawkins's argument in
The Selfish Gene. . . . Humans, in this view, are the computer's way to make more computers.

Compare the surplus always present because there is no metalanguage to Kittler on media studies always involving media.

(221) Zizek further associates the symptom with the observation that “there is no metalanguage” (12). He draws a connection between teleological reasoning and an insight central to quantum mechanics and contemporary science studies: we always participate in what we observe. . . . For Zizek, the reflexive entangling of subject and object takes the form of a certain excess or surplus, manifesting itself when the sender of a letter “always says more than he 'intended to say'” (14).

Haraway situated knowledge, blind spot better explanatory power than Zizek who is stuck on death drive, leading to study of Permutation City.

(221) In terms less oriented to psychology, the surplus reveals that the sender is implicated in the message, in the sense that the message is never objectively external to his perspective but is formed within and through that perspective, a dynamic central to Donna Haraway's concept of situated knowledge.
(221-222) In its broadest conceptual form, the “blind spot” results not from a specific failure of vision but from the inevitable partiality of the viewer's perspective. . . . For Niklas Luhmann, this blindness is implicit in the cut that distinguishes a system from its environment. . . . Paradoxes of representations that contain within themselves the frames enclosing them have been explored extensively by Douglas R. Hofstadter in
Godel, Escher, Bach. Their pertinence here is their association with the symptom and the mechanism of teleological illusion.
(222) For Zizek, death is the ultimate destination of the letter. . . . By contrast, Egan views death as a consequence of our biological embodiment and speculates that it may become optional when technological advances allow human consciousness to be simulated within computational media. . . . I will argue, however, that his novels are ambiguously coded, and thus can be read both as technophilic extrapolations of the Regime of Computation and, more subtly, as critiques that interpret the Computational Regime as a symptom of our present cultural condition.

Permutation City: The Letter Arrives as Death and Simulation
(223) Fifth in a line of Copies, all of whom “baled out” within fifteen minutes of (re)gaining consciousness, Paul has no choice but to perform the experiments that his original has preestablished.

Functionalist epiphenomena versus embodiment; short times and deadlines in the discussion of Permutation City.

(223) The experiments are designed to test the nature of consciousness and, specifically, its relation to identity. They suggest that our sense of continuous selfhood is an illusion—that we are never who we think we are. . . . Data can be stored randomly throughout computer memory and still assembled in the correct order by the program, so dispersion in computational space has no relation to the simulation's continuity. Moreover, the simulation's outcome does not depend on processor speed.
(224) In effect, then, Paul recognizes the symptomatic mechanism of putting together contingent events so that they seem to reach a teleological end point. However, Paul's interpretation of this mechanism profoundly subverts the symptom's meaning in psychoanalytic theory. Paul's recognition of the radical contingency of human identity does not signal that he has finally uncovered the short circuit the symptom creates and, by implication, the trauma it covers over. On the contrary, the teleological illusion is transformed so that it now represents the ordinary “sane” belief that human consciousness is continuous.
(225) He has the
illusion of possessing a continuous identity, but the dust hypothesis dictates that we understand this identity as the result of tracing one contingent path through infinitely many branching points.
(225) A similar idea is explored in
Quarantine, Egan's novel preceding Permutation City. Here the mechanism is not a Computational Universe but the “many worlds” hypothesis of quantum theory.
(226-227) As readers, we become aware of this process through discrepancies that reveal that the self narrating a given section is not the same as the self narrating the next one. . . . By the standards of the smeared universe, human consciousness is pathological by its very nature, as the title
Quarantine suggests.

Bogost alien phenomenology.

(227) Egan's story suggests that recognizing the difference between the human teleological illusion and cosmic indeterminacy does indeed induce trauma, but with two crucial differences from Zizek's version. First, this is not the psychopathology of a single individual but the trauma of an entire world; and second, the pathological state for humans is revealed to be the uncollapsed, smeared state that is normal for the rest of the universe.
(228) This dispersion further suggests that the novel itself could exist in countless permutations, of which we know only one. Egan thus carries over into this work
Quarantine's subtle implication that the coherence of the narrator's voice amounts to a teleological illusion created by the reader.
(228) In
Permutation City, the reflexive entwining of frame and picture that Zizek associates with the teleological illusion breaks into visibility through Egan's equivocation on a central point, namely, the position of consciousness within the simulation.
(229) In supposing that a simulation can begin to run on the Universal Computer from the force of its internal coherence, Egan goes beyond Wolfram in making reality and simulation permeable to one another.
(229) Having the simulation leap out of the computer to run on the Universal Computer opens Durham's project to a critique of the kind Robert
Markley undertakes when he contrasts the “freedom” of cyberspace with its cost in economic and social resources.
(229-230) So convinced is Durham that this launch can become reality that after completing the send-off and spending a night of disastrous sex with Maria, he commits suicide in a disembowelment that metaphorically resembles birth. . . . That Durham now finds “real” life not worth living ironically mirrors and inverts his Copy's earlier attempt at suicide. Thus the letter has circulated through the faux Durham to arrive at the original Paul Durham.
(230) In another permutation of this logic, the third letter arrives at its destination when a Copy chooses to accept himself as the addressee, even though it was sent by his original.
(232) Their triumph over Permutation City functions to distinguish metaphors that obscure the material basis for existence from those that illuminate it (a distinction, as we have seen, crucial to Neal Stephenson's Cryptonomicon). Through the Lambertians, then, the novel enacts a cautionary tale about confusing metaphor and means, thus performing another resistance to its premises.
(232) If in one sense Permutation City supports Wolfram's thesis by deconstructing the distinction between simulation and reality, in another sense it reveals the power of simulation to determine its own future, independent of what its creator intends or desires. Juxtaposed with Wolfram's claim that he alone deserves credit for the “new kind of science,” Permutation City implicitly critiques the creator's ability to lay claim to his creations. Precisely because Egan takes the Computational Regime seriously as the means by which reality is produced, he refuses to confine it within the preconceptions of its creator, whatever those might be.

Distress and the Participatory Universe
(233) The reasoning extends to the breaking point those arguments advanced by constructivist critics within science studies that laws of nature do not exist until they are discovered. Most critics acknowledge that something exists, but they argue that the ways in which that something is articulated and understood deeply affect how it will be integrated into existing frameworks of knowledge, transforming them in the process.
(234) This version goes further: erase history, start with a single person (called the Keystone by those who take the theory seriously), and add the requirement that the Keystone bring history into being (instead of the idea that history brings intelligent life into being).
(234) Translated into psychoanalytic terms, the belief in the Keystone combines scapegoating with a strong teleological illusion, identifying as the letter's addressee a consciousness uniquely invested with cosmic powers of creation and uncreation.
(236-237) Whereas Paul's suicide is a renunciation of embodiment, Andrew's extraction of his software is connected with his realization that he is his body. As he pulls the coils of optic fiber from his belly, the scene functions as if bringing literally into bloody view the symbolic umbilical cord that Zizek identifies with the place where the frame enters the picture.
(238) the TOE [Theory of Everything] ends up being created not by a single Keystone but by millions of people working cooperatively together.
(238) In conveying this resulting transformation, Egan strives to represent a world that is at once infused with human meaning and objectively true; he calls this world the “participatory universe.”

Investing everything in a theory of the unconscious, as Zizek does, is parochial and holds the future hostage to present, local conditions: consider the conclusions of Reality of the Virtual.

(239) The friction between struggling with embodiment and Egan's belief in a postbiological future can be seen as an admirable commitment on his part to look hard at his own premises. . . . Egan would almost certainly find Zizek's psychoanalytical approach distressingly parochial. . . . [quoting an interview] “It might take 50 years, or it might take 500, but eventually we're going to have unlimited control over whatever physical substrate is 'executing' our minds, and I'm trying to map out some of the benefits and some of the dangers of that.” To invest everything in a theory of the unconscious based on our physical and mental structures, as Zizek does, would in Egan's view be presumptuous because it holds the entire span of the far future hostage to our present local conditions.

What about transformations of literacy and creativity, such as evaluating the future imagined in Caprica, where the virtual world is constituted indirectly from a lifetime's accrual by the technological nonconscious?

(240) It seems certain that the next century will see increasing convergences between humans and intelligent machines through such developments as implants, pervasive computing, robots with ever more sophisticated sensors and actuators, quantum computing, and the increasing convergence of biology and computation through nanotechnology and other means.


Epilogue
Recursion and Emergence

(241) However necessary his [Bacon's] strategic separation of the subjective and objective may have been at a time when alchemy flourished and astrology was a state science, it left a damaging legacy by associating science with mastery, control, and domination, underwritten by the premise that one can act without simultaneously being acted upon.
(241) Yet even the most insightful and reflective of the cyberneticians stopped short of seeing that reflexivity could do more than turn back on itself to create autopoietic systems that continually produce and reproduce their organization.

Always a job for digital humanists; danger of entanglement in consumer uses of simulations.

(242) The crucial question with which this book has been concerned is how the “new kind of science” that underwrites the Regime of Computation can serve to deepen our understanding of what it means to be in the world rather than apart from it, comaker rather than dominator, participants in the complex dynamics that connect “what we make” and “what (we think) we are.” Amid the uncertainties, potentialities, and dangers created by the Regime of Computation, simulations—computational and narrative—can serve as potent resources with which to explore and understand the entanglement of language with code, the traditional medium of print with electronic textuality, and subjectivity with computation.
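The “new kind of science” is Wolfram's, whose central exhibit is the elementary cellular automaton: a one-byte rule that, run as computation, generates structure complex enough to support universal computation (Rule 110 is the canonical case). A minimal sketch of the mechanism, offered as illustration rather than anything drawn from Hayles's text:

```python
RULE = 110  # the eight bits of the rule number give the next cell
            # value for each of the eight 3-cell neighborhoods

def evolve(row):
    """One synchronous update of an elementary cellular automaton
    on a ring (wrap-around boundary)."""
    n = len(row)
    return [
        (RULE >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
        for i in range(n)
    ]

# From a single live cell, a rule that fits in one byte produces
# the intricate, non-repeating structure Wolfram's argument rests on.
row = [0] * 79
row[-1] = 1
for _ in range(40):
    print("".join("#" if cell else "." for cell in row))
    row = evolve(row)
```

Running the script prints the familiar triangular cascade of Rule 110, a small concrete instance of simple rules yielding emergent complexity.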
(242) Rather than attempt to police these boundaries, we should strive to understand the materially specific ways in which flows across borders create complex dynamics of intermediation.
(243) If we interpret the relations of humans and intelligent machines only within this paradigm, the underlying structures of domination and control continue to dictate the terms of engagement.
(243) This book has focused on narratives of a different kind that offer ways out of the subject/object divide. . . . In my view, an essential component of coming to terms with the ethical implications of intelligent machines is recognizing the mutuality of our interactions with them, the complex dynamics through which they create us even as we create them.

A profession of faith in being already posthuman, and ethical trajectory to not be as dominating as Bacon.

(243) Encountering intelligent machines from this perspective enables me to see that they are neither objects to dominate nor subjects threatening to dominate me. Rather, they are embodied entities instantiating processes that interact with the processes that I instantiate as an embodied human subject. The experience of interacting with them changes me incrementally, so the person who emerges from the encounter is not exactly the same person who began it. What I think of as my human legacy—the language I speak, the books I read, the digital art and information I peruse, the biology I inherit from eons of evolutionary dynamics, the consciousness that generates my sense of identity—is already affected by the intermediating dynamics of my interactions with intelligent machines, and will surely be transformed even more deeply in the decades to come. The challenge, as I see it, is to refuse to inscribe these interactions in structures of domination and instead to seek out understandings that recognize and enact the complex mutuality of the interactions.



Hayles, N. Katherine. My Mother Was a Computer: Digital Subjects and Literary Texts. Chicago: University of Chicago Press, 2005. Print.