Notes for N. Katherine Hayles How We Think: Digital Media and Contemporary Technogenesis

Key concepts: productive theory.


Related theorists: Bolter, Sterne.


Here, like all thinkers and writers, she is staking out terms for legitimacy in scholarly discourse, gambling that those terms will still be relevant in future thought.

1
How We Think
Digital Media and Contemporary Technogenesis

Extended cognition (Clark); this deemphasis subordinates the plurality in the singular how, we, and think.

(3) The more one works with digital technologies, the more one comes to appreciate the capacity of networked and programmable machines to carry out sophisticated cognitive tasks, and the more the keyboard comes to seem an extension of one's thoughts rather than an external device on which one types. Embodiment then takes the form of extended cognition, in which human agency and thought are enmeshed within larger networks that extend beyond the desktop computer into the environment.

Low readership of scholarly work, compared to audiences in the tens of thousands that include those who play pinball machines and the machines themselves (supporting impossibly uncomputable arrangements).

(4) If influence and audience were considered, one might make a strong argument for comparing well-written, well-researched blogs to print books and articles that have audiences in the dozens or low hundreds—if that.

Comparative Media Studies exemplifies collaboration among scholars and students.

(4-5) Graphics, animation, design, video, and sound acquire argumentative force and become part of the research's quest for meaning. As a scholar confronts these issues, sooner or later she will likely encounter the limits of her own knowledge and skills and recognize the need—indeed, the necessity—for collaboration.
(5) Working collaboratively, the digitally based scholar is apt to enlist students in the project, and this leads quickly to conceptualizing courses in which web projects constitute an integral part of the work. Now the changes radiate out from an individual research project into curricular transformation and, not coincidentally, into different physical arrangements of instruction and research space.
(7) Needed are approaches that can locate digital work within print traditions, and print traditions within digital media, without obscuring or failing to account for the differences between them. One such approach is advocated here: it goes by the name of
Comparative Media Studies.

State of the art of Comparative Media Studies; imagine this as our task: to bring critique, including ideology critique, to software, released from assumption by distribution into FOSS (through fossification).

Examples of Comparative Media Studies cross into Software Studies, Critical Code Studies and Platform Studies.

(7-8) Examples of Comparative Media Studies include research that combines print and digital literary productions, such as Matthew Kirschenbaum's (2007) concepts of formal and forensic materiality, Loss Glazier's (2008) work on experimental poetics, John Cayley (2004, 2002) on letters and bits, and Stephanie Strickland (Strickland 2002; Strickland and Lawson 2002) on works that have both print and digital manifestations. Other examples are theoretical approaches that combine continental philosophy with New Media content, such as Mark Hansen's New Philosophy for New Media (2006b). Still others are provided by the MIT series on platform studies, codirected by Nick Montfort and Ian Bogost (Montfort and Bogost 2009), which aims to locate specific effects in the affordances and constraints of media platforms such as the Atari 2600 video game system, in which the techniques of close reading are applied to code and video display rather than text. Also in this grouping are critical code studies, initiated by Wendy Hui Kyong Chun (2008, 2011) and Mark Marino (2006) among others, that bring ideology critique to the rhetoric, form, and procedures of software. . . . Diverse as these projects are, they share an assumption that techniques, knowledges, and theories developed within print traditions can synergistically combine with digital productions to produce and catalyze new kinds of knowledge.

Or ignore it; do not bother including in program design the new cosmic great shrug-off of the human in the Internet far in the future, after all copyrights expire, as the term describing the end, kind of like the end of thirty-two-bit Unix and Unix-like time; or at least mark the point where it restarts, crossing over with the temporary inversion of the temporal relations implied in their digital storage, production, and run-time existence. A sketch of that rollover follows.
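A minimal sketch, my own illustration rather than anything in Hayles: a signed 32-bit counter of seconds since the 1970 epoch overflows in January 2038 and wraps to 1901, marking both the "end" of thirty-two-bit Unix time and the point where it restarts.

```python
# The 2038 rollover of signed 32-bit Unix time (negative timestamps
# may not resolve on all platforms).
from datetime import datetime, timezone

EPOCH_MAX = 2**31 - 1   # largest value of a signed 32-bit second counter
EPOCH_MIN = -2**31      # value the counter wraps to on overflow

end = datetime.fromtimestamp(EPOCH_MAX, tz=timezone.utc)
restart = datetime.fromtimestamp(EPOCH_MIN, tz=timezone.utc)

print(end)      # 2038-01-19 03:14:07+00:00 -- the "end" of 32-bit time
print(restart)  # 1901-12-13 20:45:52+00:00 -- where it "restarts"
```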

(8) On a pedagogical level, Comparative Media Studies implies course designs that strive to break the transparency of print and denaturalize it by comparing it with other media forms.

Project-based research; I am calling for marking out a significant place, space, proportion, duration, support, compatibility, interoperability, real-time reliance, and so on.

(9) The implications of moving from content orientation to problem orientation are profound. Project-based research, typical of work in the Digital Humanities, joins theory and practice through the productive work of making. . . . The challenges of production complicate and extend the traditional challenges of reading and writing well, adding other dimensions of software utilization, analytical and statistical tools, database designs, and other modalities intrinsic to work in digital media.
(10) One way into the complexities is to track the evolution of the Digital Humanities, the site within the humanities where the changes are most apparent and, arguably, most disruptive to the status quo.

Technogenesis is the new theory of evolution.

(11) Another way is through the concept of technogenesis, the idea that humans and technics have coevolved together.

So will there be a considerable proportion of humanities practice working in code? This is where I situate the ontological assumptions argument.

(11) As digital media, including networked and programmable desktop stations, mobile devices, and other computational media embedded in the environment, become more pervasive, they push us in the direction of faster communication, more intense and varied information streams, more integration of humans and intelligent machines, and more interactions of language with code.

I, coming from technology, will present the programming perspective to complement Comparative Media Studies.

(11) A Comparative Media Studies perspective can result in courses and curricula that recognize all three reading modalities—close, hyper-, and machine—and prepare students to understand the limitations and affordances of each.
(12) If we think about humanities research and teaching as problems in design (i.e., moving from content orientation to problem orientation), then Brooks's advice suggests that for collaborative teams working together to craft projects and curricula in digital media, it is crucial for the partners to recognize the importance of human attention as a limiting/enabling factor, both as a design strategy and as a conceptual framework for theoretical work.

Alien temporality: consider differences in trajectories toward future activity (philosophical production) in Turkle and Hayles, both reporting on the situation, the state of the art.

(13) Grasping the complex ways in which the time scales of human cognition interact with those of intelligent machines requires a theoretical framework in which objects are seen not as static entities that, once created, remain the same throughout time but rather are understood as constantly changing assemblages in which inequalities and inefficiencies in their operations drive them toward breakdown, disruption, innovation, and change. Objects in this view are more like technical individuals enmeshed in networks of social, economic, and technological relations, some of which are human, some nonhuman. Among those who have theorized technical objects in this way are Gilbert Simondon, Adrian Mackenzie, Bruno Latour, and Matthew Fuller.
(14) The realization that neural plasticity happens at many levels, including unconscious perceptions, makes technogenesis a potent site for constructive interventions in the humanities as they increasingly turn to digital technologies.

Any connection of Lefebvre to the interests and work of Janz?

(14) At least as far back as Henri Lefebvre's The Production of Space ([1974] 1992), contemporary geographers have thought about space not in static Cartesian terms (which Lefebvre calls represented or conceived space) but as produced through networks of social interactions.
(15) The inclusion of databases in spatial history projects has opened the door to new strategies that, rather than using narrative as their primary mode of explication, allow flexible interactions between different layers and overlays.
(15) Chapter 7 explores Steven Hall's distributed literary system that has as its main component the print novel
The Raw Shark Texts: A Novel ([2007] 2008a). In depicting a posthuman subjectivity that has transformed into a huge database capable of evacuating individual subjectivities and turning them into “node bodies,” the text performs a critique of postindustrial knowledge work as analyzed by Alan Liu (2008b).

The tapoc and journal systems present both the thought and the autobiographical narrative of an American Socrates that could be printed as an on-demand book and boxed in pinball machine cabinets, in the oddest story of a limited liability corporation distribution system business plan ever proposed.

(16) In an era when databases are perhaps the dominant cultural form, it is no surprise that writers are on the one hand resisting databases, as The Raw Shark Texts (2008e) does, and on the other hand experimenting with ways to combine narrative and database into new kinds of literature, as does Mark Z. Danielewski's Only Revolutions (2007b).
(17) The position taken throughout this book is that all cognition is embodied, which is to say that for humans, it exists throughout the body, not only in the neocortex. Moreover, it extends beyond the body's boundaries in ways that challenge our ability to say where or even if cognitive networks end.

Compare to conclusion of McGann Radiant Textuality.

(17) Making the case for technogenesis as a site for constructive interventions, this book performs the three reading strategies discussed in chapter 3 of close, hyper-, and machine reading. . . . Finally, the coda to chapter 8, written in collaboration with Allen Riddell, presents results from our machine reading of Only Revolutions. Combining close, hyper-, and machine reading with a focus on technogenesis, the book is meant as a proof of concept of the potential of Comparative Media Studies not only in its arguments but also in the methodologies it instantiates and the interpretive strategies it employs.


FIRST INTERLUDE
Practices and Processes in Digital Media


2
The Digital Humanities
Engaging the Issues
(24) Through narrated experiences, sketched contexts, subtle nuances, and implicit conclusions, the interviews reveal the ways in which the Digital Humanities are transforming assumptions. The themes that emerged can be grouped under the following rubrics: scale, critical/productive theory, collaboration, databases, multimodal scholarship, scope, and future trajectories.

Defining the Field
(25) Although some practitioners continue to prefer “humanities computing,” for [Stephen]
Ramsay and his colleagues, “Digital Humanities” was meant to signal that the field had emerged from the low-prestige status of a support service into a genuinely intellectual endeavor with its own professional practices, rigorous standards, and exciting theoretical explorations.

Second DH wave goes beyond text-based practices to multimodal platforms.

(25) A decade later, the term is morphing again as some scholars advocate a turn from a primary focus on text encoding, analysis, and searching to multimedia practices that explore the fusion of text-based humanities with film, sound, animation, graphics, and other multimodal practices across real, mixed, and virtual reality platforms.
(25-26) [quoting John
Unsworth's “Manifesto”] The first wave of digital humanities work was quantitative, mobilizing the search and retrieval powers of the database, automating corpus linguistics, stacking hypercards into critical arrays. The second wave is qualitative, interpretive, experiential, emotive, generative in character.
(26) From a very different perspective, Johanna
Drucker (2009) argues that the Digital Humanities have been co-opted by a computational perspective inherited from computer science, betraying the humanistic tradition of critical interpretation.
(27) Digital Humanities as a diverse field of practices associated with computational techniques and reaching beyond print in its modes of inquiry, research, publication, and dissemination.

Scale Matters
(28) Scale changes not only the quantities of texts that can be interrogated but also the contexts and contents of the questions.
(28) If one can perform “distant reading” without perusing a single primary text, then a small step leads to Timothy
Lenoir's claim (2008a) that machine algorithms may also count as “reading.”
(29) The controversies around “reading” suggest it is a pivotal term because its various uses are undergirded by different philosophical commitments. At one end of the spectrum, “reading” in the Traditional Humanities connotes sophisticated interpretations achieved through long years of scholarly study and immersion in primary texts. At the other end, “reading” implies a model that backgrounds human interpretation in favor of algorithms employing a minimum of assumptions about what results will prove interesting or important.

Machine reading and big data point toward posthuman scholarship.

(30) The unsettling implications of “machine reading” can be construed as pointing toward a posthuman mode of scholarship in which human interpretation takes a backseat to algorithmic processes.
(31) The tension between algorithmic analysis and hermeneutic close reading should not be overstated.

Productive/Critical Theory

Bolter's productive theory, exemplified by creative visualization of data from machine queries to discover patterns leading to interpretation.

(31) A different kind of theory emerges when the focus shifts to the digital tools used to analyze texts and convey results. Jay David Bolter (2008) suggests the possibility of “productive theory,” which he envisions as a “codified set of practices.”
(32) Thus two dynamics are at work: one in which the Digital Humanities are moving forward to open up new areas of exploration, and another in which they are engaged in a recursive feedback loop with the Traditional Humanities.
(33) Machine queries frequently yield masses of information that are incomprehensible when presented as tables or databases of results. Visualization helps sort the information and make patterns visible. Once the patterns can be discerned, the work of interpretation can begin.
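To make this pipeline concrete, a minimal sketch of my own (the corpus and terms are invented, not drawn from Hayles): a machine query counts term frequencies, and a crude text visualization makes the pattern discernible so interpretation can begin.

```python
# Machine query: term frequencies across a tiny invented corpus,
# followed by a text bar chart standing in for richer visualization.
from collections import Counter
import re

corpus = {
    "doc_a": "the telegraph annihilated neither time nor space",
    "doc_b": "the telegraph reconfigured time and space and labor",
    "doc_c": "time space and the rhythms of communication changed",
}

counts = Counter()
for text in corpus.values():
    counts.update(re.findall(r"[a-z]+", text.lower()))

# Visualization makes the pattern visible; interpretation takes over here.
for term, n in counts.most_common(6):
    print(f"{term:<13} {'#' * n}")
```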

Collaboration

Need for disciplinary collaboration for conceptualization and implementation of projects.

(34) Implementing such projects requires diverse skills, including traditional scholarship as well as programming, graphic design, interface engineering, sonic art, and other humanistic, artistic, and technical skills.
(35) Conceptualization is intimately tied in with implementation, design decisions often have theoretical consequences, algorithms embody reasoning, and navigation carries interpretive weight, so the humanities scholar, graphic designer, and programmer work best when they are in continuous and respectful communication with one another.

Big Humanities projects can involve collaborations of students and amateurs, Wikipedia being the obvious example; considered as long-term, public projects, they outlive the individual director who launched the idea; however, this has an impact on tenure and promotion mechanics.

(35) As a consequence of requiring a clear infrastructure within which diverse kinds of contributions can be made, “Big Humanities” projects make possible meaningful contributions from students, even as undergraduates.
(36) For the first time in human history, worldwide collaborations can arise between expert scholars and expert amateurs.
(36) An example is the
Clergy of the Church of England Database directed by Arthur Burns, in which volunteers collected data and, using laptops and software provided by the project, entered them into a database. . . . This kind of model could significantly improve the standing of the humanities with the general public.
(37) At the same time, as collaborative work becomes more common throughout the Digital Humanities, tenure and promotion committees will need to develop guidelines and criteria for evaluating collaborative work and digital projects published online.

Databases
(37) While scale and collaboration transform the conditions under which research is produced, digital tools affect research both at the macro level of conceptualization and the micro level of fashioning individual sentences and paragraphs.
(38) Another advantage of databases is the ability to craft different kinds of interfaces, depending on what users are likely to find useful or scholars want to convey.
(39) The collaborations that databases make possible extend to new kinds of relationships between a project's designer and her interlocutors.
(39) The emphasis on databases in Digital Humanities projects shifts the emphasis from argumentation—a rhetorical form that historically has foregrounded context, crafted prose, logical relationships, and audience response—to data elements embedded in forms in which the structure and parameters embody significant implications.

Multimodal Scholarship
(40) In addition to database structures and collaborative teams, the Digital Humanities also make use of a full range of visual images, graphics, animations, and other digital effects. In best-practice projects, these have emotional force as well as conceptual coherence.
(40-41) In light of such developments, Timothy
Lenoir (2008a) draws the conclusion that the Digital Humanities' central focus should be on developing, incorporating, and creating the media appropriate for their projects. “We make media,” Lenoir proclaims. “That's what we do.” . . . The project [Virtual Peace] was funded by a $250,000 MacArthur grant; the inclusion of commercial programmers indicates that large projects such as this require outside funding, either from corporate sponsors or foundations and granting agencies.
(41) As [Anne]
Balsamo argues in Designing Culture: The Technological Imagination at Work (2011), humanities scholars should seize the initiative and become involved in helping to develop the tools our profession needs.

Code

Enumeration of Digital Humanities centers; bridging the gap from the lost generation of coders to McCarty's humanities coders.

(43) Given the double demand for expertise in a humanistic field of inquiry and in computer languages and protocols, many scholars feel under pressure and wonder if they are up to the task. . . . In the future, academic programs such as Georgia Tech's computational media and the humanities computing majors at King's College may produce scholars fluent both in code and the Traditional Humanities. In the meantime, many scholars working in the field are self-taught, while others extend their reach through close and deep collaborations with technical staff and professionals in design, programming, etc.

Future Trajectories
(44) Nevertheless, new factors suggest a critical mass has been reached. Foremost is the establishment of robust Digital Humanities centers at the University of Maryland; King's College London; the University of Nebraska; the University of Texas; the University of California, Santa Barbara; the University of California, Los Angeles; and many other institutions.
(44-45) A concurrent development is the marked increase in the number of scholarly programs offering majors, graduate degrees, and certificate programs in the Digital Humanities, with a corresponding growth in the numbers of students involved in the field. Willard
McCarty (2009) extrapolates from this development to see a future in which humanities scholars are also fluent in code and can “actually make things work.”
(45) Among my interviewees, scholars with administrative responsibilities for program development typically had thought most about future trajectories and were most emphatic about the transformative potential of digital technologies.

Two Strategies for the Digital Humanities: Assimilation and Distinction
(46-47) CCH [Center for Computing in the Humanities at King's College London] arguably boasts the most extensive curricula and the most developed program of Digital Humanities in the world. I want to understand how this robust program was built and why it continues to flourish.
(49) A common problem many of these researchers encountered was having the complexity, extent, and achievement of their digital projects appropriately recognized and reviewed. This was in part because there are only a few people with the experience in digital research to properly evaluate the projects, and in part because other evaluators, experts in the field but unused to digital projects, did not have an accurate understanding of how much was involved.
(49) The group also noted the gap between the many that use their projects and the few that want to go on to create digital projects themselves.
(50) To explore the advantages and limitations of a distinction strategy, I take as my example the LCC [Literature, Culture, and Communication] at Georgia Tech, with which I have long-standing ties.

Examples of CCH and LCC as model DH programs emphasizing strategies of extensiveness and distinction.

(51) Whereas CCH works mostly with art and cultural institutions, LCC has corporate as well as nonprofit partners and places many of its graduates in for-profit enterprises.
(52) The distinction approach, as it is implemented at LCC and elsewhere, aims to create cutting-edge research and pedagogy specifically in digital media. To a significant degree, it is envisioning the future as it may take shape in a convergence culture in which TV, the web, computer games, cell phones, and other mobile devices are all interlinked and deliver cultural content across as well as within these different media.
(53) The challenge for such programs is to find ways to incorporate the insights of the Traditional Humanities, especially poststructuralist theory and gender, ethnic, and race studies, into practice-based research focusing primarily on the acquisition and deployment of technical skills.


3
How We Read
Close, Hyper, Machine

Theory of embodied cognition.

(55) This chapter looks at the other side of the coin, how digital media are affecting the practices in which our students are engaged, especially reading of digital versus print materials, with attention to the implications of these changes for pedagogy. Since the nature of cognition is centrally involved in these issues, this chapter also begins to develop a theory of embodied cognition encompassing conscious, unconscious, and nonconscious processes that will be crucial to the arguments of this and subsequent chapters.
(56) The crucial questions are these: how to convert the increased digital reading into increased reading ability, and how to make effective bridges between digital reading and the literacy traditionally associated with print.
(57) While literary studies continues to teach close reading to students, it does less well in exploiting the trend toward the digital. Students read incessantly in digital media and write in it as well, but only infrequently are they encouraged to do so in literature classes or in environments that encourage the transfer of print reading abilities to digital and vice versa.

Close Reading and Disciplinary Identity
(58) Literary scholars generally think they know what is meant by close reading, but looked at more closely, it proves not so easy to define or exemplify.
(59) In a special issue of
Representations titled “The Way We Read Now,” Best and Marcus launch a frontal assault on symptomatic reading as it was inaugurated by Fredric Jameson's immensely influential The Political Unconscious (1981).
(59) After more than two decades of symptomatic reading, however, many literary scholars are not finding it a productive practice, perhaps because (like many deconstructive readings) its results have begun to seem formulaic, leading to predictable conclusions rather than compelling insights.

Digital and Print Literacies
(60) No doubt those who already read well will take classes based on close reading and benefit from them, but what about others whose print reading skills are not as highly developed? To reach them, we must start close to where they are, rather than where we imagine or hope they might be.

Zone of proximal development for improving reading skills should also be relevant for digital humanists learning programming.

(60) This principle was codified by the Belarusian psychologist L.S. Vygotsky in the 1930s as the “zone of proximal development.” . . . More recent work on “scaffolding” and on the “zone of reflective capacity” extends the idea and amplifies it with specific learning strategies.

Hyper reading attributes by Sosnoski.

(61) James Sosnoski (1999) presciently introduced the concept of hyper reading, which he defined as “reader-directed, screen-based, computer-assisted reading” (167). Examples include search queries (as in a Google search), filtering by keywords, skimming, hyperlinking, “pecking” (pulling out a few items from a longer text), and fragmenting. Updating his model, we may add juxtaposing, as when several open windows allow one to read across several texts, and scanning, as when one reads rapidly through a blog to identify items of interest. There is considerable evidence that hyper reading differs significantly from typical print reading, and moreover that hyper reading stimulates different brain functions than print reading.
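As a sketch of what two of Sosnoski's hyper-reading moves might look like when delegated to a machine (my illustration; the texts and keyword are invented), filtering retains only documents containing a keyword, and "pecking" pulls out fragments around it:

```python
# Hyper reading as code: keyword filtering and "pecking" across texts.
texts = {
    "blog_post": "Distant reading lets algorithms survey thousands of novels.",
    "article":   "Close reading rewards long immersion in a single text.",
    "forum":     "Skimming and scanning dominate reading on the web.",
}

keyword = "reading"

# Filtering: keep only texts containing the keyword.
hits = {name: t for name, t in texts.items() if keyword in t.lower()}

# Pecking: pull out just the fragments around the keyword.
for name, t in hits.items():
    for frag in t.split("."):
        if keyword in frag.lower():
            print(f"{name}: ...{frag.strip()}...")
```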
(62) In digital environments, hyper reading has become a necessity. It enables a reader quickly to construct landscapes of associated research fields and subfields; it shows ranges of possibilities; it identifies texts and passages most relevant to a given query; and it easily juxtaposes many different texts and passages.
(62) Much of this evidence is summarized by Nicholas
Carr in The Shallows: What the Internet Is Doing to Our Brains (2012). More judicious than Bauerlein, he readily admits that web reading has enormously increased the scope of information available, from global politics to scholarly debates. He worries, however, that hyper reading leads to changes in brain function that make sustained concentration more difficult, leaving us in a constant state of distraction in which no problem can be explored for very long before our need for continuous stimulation kicks in and we check e-mail, scan blogs, message someone, or check our RSS feeds.

Ways hyper reading negatively changes brain functions.

(63) Among these are hyperlinks that draw attention away from the linear flow of an article, very short forms such as tweets that encourage distracted forms of reading, small habitual actions such as clicking and navigating that increase the cognitive load, and, most pervasively, the enormous amount of material to be read, leading to the desire to skim everything because there is far too much material to pay close attention to anything for very long.

Reading on the Web
(63) Several studies have shown that, contrary to the claims of early hypertext enthusiasts such as George
Landow, hyperlinks tend to degrade comprehension rather than enhance it.

Working memory load affected by hypertext and web reading.

(64) The small distractions involved with hypertext and web reading—clicking on links, navigating a page, scrolling down or up, and so on—increase the cognitive load on working memory and thereby reduce the amount of new material it can hold. With linear reading, by contrast, the cognitive load is at a minimum, precisely because eye movements are more routine and fewer decisions need to be made about how to read the material and in what order.
(64) Supplementing this research are other studies showing that small habitual actions, repeated over and over, are extraordinarily effective in creating new neural pathways.
(66) Current evidence suggests that we are now in a new phase of the dance between
epigenetic changes in brain function and the evolution of new reading and writing modalities on the web.
(66) How valid is this conclusion? Although Carr's book is replete with many different kinds of studies, we should be cautious about taking his conclusions at face value. For example, in the fMRI study done by Small and his colleagues, many factors might skew the results.

The Importance of Anecdotal Evidence

Human-assisted computer reading.

(70) The formulation alerts us to a third component of contemporary reading practices: human-assisted computer reading, that is, computer algorithms used to analyze patterns in large textual corpora where size makes human reading of the entirety impossible.
(72) As we saw in chapter 2, the line between (human) interpretation and (machine) pattern recognition is a very porous boundary, with each interacting with the other. As demonstrated through many examples there, hypotheses about meaning help shape the design of computer algorithms (the “background theory” referred to above), and the results of algorithmic analyses refine, extend, and occasionally challenge intuitions about meaning that form the starting point for algorithmic design.
(73) Close and hyper reading operate synergistically when hyper reading is used to identify passages or to home in on a few texts of interest, whereupon close reading takes over. As Guillory observes, skimming and scanning here alternate with in-depth reading and interpretation (2008). Hyper reading overlaps with machine reading in identifying patterns.
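A keyword-in-context (KWIC) concordance is one simple mechanism for this handoff; the sketch below is my illustration, not an example from Hayles or Guillory: machine reading locates the passages, and close reading takes over on the output.

```python
# KWIC concordance: machine pattern-finding that hands passages to a
# human reader for close reading.
def kwic(text, keyword, window=3):
    words = text.split()
    for i, w in enumerate(words):
        if w.lower().strip(".,;") == keyword:
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + 1:i + 1 + window])
            yield f"{left:>30} [{w}] {right}"

sample = ("Attention is an essential component of technical change, "
          "for attention creates the context for innovation.")
for line in kwic(sample, "attention"):
    print(line)
```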
(74) In general, the different distributions among pattern, meaning, and context provide ways to think about interrelations among close, hyper, and machine reading.
(74) The larger point is that close, hyper, and machine reading each have distinctive advantages and limitations; nevertheless, they also overlap and can be made to interact synergistically with one another.

Synergies between Close, Hyper, and Machine Reading

Liu Literature+ teaching approach.

(75) Starting from a traditional humanistic basis in literature, Alan Liu in the English Department at the University of California, Santa Barbara, has been teaching undergraduate and graduate courses that he calls “Literature+,” which adopt as a pedagogical method the interdisciplinarity facilitated by digital media. . . . Starting with close reading, he encourages students to compare it with methodologies in other fields, including the sciences and engineering. He also has constructed a “Toy Chest” on his website that includes links to software packages enabling students with little or no programming experience to create different modes of representation of literary texts.
(76) Linking traditional literary reading skills with digital encoding and analysis, the Literature+ approach strengthens the ability to understand complex literature at the same time it encourages students to think reflectively on digital capabilities.

Manovich cultural analytics apply big data sets and methods to cultural objects.

(76) Lev Manovich's “Cultural Analytics” (2007) is a series of projects that start from the premise that algorithmic analyses of large data sets (up to several terabytes in size), originally developed for work in the sciences and social sciences, should be applied to cultural objects, including the analysis of real-time data flows.

Machine reading examples are mostly visual; add ensoniment and perhaps eventually machine listening.

(78) As Manovich says about cultural analytics and Moretti proclaims about distant reading, machine analysis opens the door to new kinds of discoveries that were not possible before and that can surprise and intrigue scholars accustomed to the delights of close reading.


SECOND INTERLUDE
The Complexities of Contemporary Technogenesis


4
Tech-TOC
Complex Temporalities and Contemporary Technogenesis

(85) This chapter advances a theoretical framework in which technical objects are also seen in evolutionary terms as repositories of change and, more profoundly, as agents and manifestations of complex temporalities.

Does the discussion of temporality with respect to objects exemplify a complexity where hermeneutic phenomenology falls short?

(86) What would it mean to talk about an object's experience of time, and what implications would flow from this view of objecthood?
(86) Working through
Simondon's concepts as well as their elaboration by such contemporary theorists as Adrian MacKenzie, Mark Hansen, Bernard Stiegler, and Bruno Latour, I will discuss the view that technical objects embody complex temporalities enfolding past into present, present into future.
(86) Working from previous discussions of human-centered change, this chapter combines with them the object-centered view, thus putting into place the other half necessary to launch the full technogenetic spiral.
(87) To explore the ways in which duration and spatialized temporality create fields of contention seminal to human cultures, I turn to an analysis of
TOC: A New Media Novel (2009), a multimodal electronic novel by Steve Tomasula (with design by Stephen Farrell). . . . Composed on computers and played on them, TOC explores its conditions of possibility in ways that perform as well as demonstrate the interpenetration of living and technical beings, processes in which complex temporalities play central roles.

Technics and Complex Temporalities

Simondon technical object categories: elements, individuals, ensembles; concretization the motive force for change.

(87-88) Technics, in Simondon's view, is the study of how technical objects emerge, solidify, disassemble, and evolve. In his exposition, technical objects comprise three different categories: technical elements, for example, the head of a stone ax; technical individuals, for example, the compound tool formed by the head, bindings, and shaft of a stone ax; and technical ensembles, in the case of an ax, the flint piece used to knap the stone head, the fabrication of the bindings from animal material, and the tools used to shape the wood shaft, as well as the toolmaker who crafts this compound tool. . . . The ability of technical elements to “travel” is, in Simondon's view, a principal factor in technological change.
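One way I find it helpful to grasp this taxonomy is to model it as data structures; this is my gloss, not Simondon's formalism, using the stone ax example from the quotation:

```python
# An illustrative data model of Simondon's three categories.
from dataclasses import dataclass, field

@dataclass
class TechnicalElement:        # e.g., the head of a stone ax
    name: str

@dataclass
class TechnicalIndividual:     # e.g., the compound tool: head + bindings + shaft
    name: str
    elements: list[TechnicalElement] = field(default_factory=list)

@dataclass
class TechnicalEnsemble:       # e.g., knapping flint, binding material, toolmaker
    name: str
    individuals: list[TechnicalIndividual] = field(default_factory=list)
    supports: list[str] = field(default_factory=list)

ax = TechnicalIndividual("stone ax", [TechnicalElement("head"),
                                      TechnicalElement("bindings"),
                                      TechnicalElement("shaft")])
workshop = TechnicalEnsemble("ax making", [ax],
                             ["flint knapping piece", "toolmaker"])
print(workshop)
```

Elements "travel" in this model by being reused in new individuals, which is where Simondon locates a principal factor of technological change.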
(88) The motive force for technological change is the increasing tendency toward what Simondon calls “
concretization,” innovations that resolve conflicting requirements within the milieu in which a technical individual operates.
(88-89) Concretization, then, is at the opposite end of the spectrum from abstraction, for it integrates conflicting requirements into multipurpose solutions that enfold them together into intrinsic and necessary circular causalities. Simondon correlates the amount of concretization of a technical individual with its technicity.
(89) The conflicting requirements of a technical individual that is substantially abstract constitute, in Simondon's view, a potential for innovation or, in Deleuzian terms, a repository of virtuality that invites transformation. Technical individuals represent, in Adrian MacKenzie's phrase, metastabilities, that is, provisional solutions of problems whose underlying dynamics push the technical object toward further evolution. . . . Temporality is something that not only happens within them but also is carried on by them in a constant dance of temporary stabilizations amid continuing innovations.

Folding of time, skeuomorphs, Stiegler tertiary retention.

(89) This dynamic has implications for the “folding of time,” a phenomenon Bruno Latour (1994) identifies as crucial to understanding technological change. . . . In this way, the future is already preadopted in the present (future roads in present cars), while the present carries along with it the marks of the past, for example in the metal ax head that carries in its edge the imprint of the technical ensemble that tempered it (in older eras, this would include a blacksmith, forge, hammer, anvil, bucket of water, etc.). On a smaller scale, the past is enfolded into the present through skeuomorphs, details that were previously functional but have lost their functionality in a new technical ensemble.
(90) When devices are created that make these enfoldings explicit, for example in the audio and video recording devices that Bernard Stiegler discusses, the biological capacity for memory (which can be seen as an evolutionary adaptation to carry the past into the present) is exteriorized, creating the possibility, through technics, for a person to experience through complex temporality something that never was experienced as a firsthand event, a possibility Stiegler calls
tertiary retention (or tertiary memory).

The Coevolution of Humans and Tools
(90-91) The constructive role of tools in human evolution involves cognitive as well as muscular and skeletal changes. Stanley Ambrose (2001), for example, has linked the fabrication of compound tools (tools with more than one part, such as a stone ax) to the rapid increase in Broca's area in the brain and the consequent expansion and development of language.
(91) the constructive role of attention in fabricating tools and creating technical ensembles.

Materiality as human-technical hybrid based on not just perception but attention.

(91) Materiality is unlike physicality in being an emergent property. It cannot be specified in advance, as though it existed ontologically as a discrete entity. Requiring acts of human attentive focus on physical properties, materiality is a human-technical hybrid.
(92) An embedded cognitive approach, typified by the work of anthropologist Edwin Hutchins (1996), emphasizes the environment as crucial scaffolding and support for human cognition.
(93) The differences between EXTENDED and BRAINBOUND are clear, with the neurological, experimental, and anecdotal evidence overwhelmingly favoring the former over the latter.

Embedded versus extended cognition; significant role played by unconscious, reversing Descartes.

(93-94) Whereas the embedded approach emphasizes human cognition at the center of self-organizing systems that support it, the extended model tends to place the emphasis on the cognitive system as a whole and its enrollment of human cognition as a part of it. . . . Recent work across a range of fields interested in this relation—neuroscience, psychology, cognitive science, and others—indicates that the unconscious plays a much larger role than had previously been thought in determining goals, setting priorities, and other activities normally associated with consciousness.
(95) In a startling reversal of Descartes, they [Dijksterhuis, Aarts, and Smith] propose that thought itself is mostly unconscious. . . . The senses can handle about 11 million bits per second, with about 10 million bits per second coming from the visual system. Consciousness, by contrast, can handle dramatically fewer bits per second.
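Back-of-envelope arithmetic on the quoted figures; note that the passage gives no number for consciousness, so the conscious bandwidth below is an assumed order of magnitude (tens of bits per second, as commonly cited in this literature), used for illustration only.

```python
# Sensory vs. conscious bandwidth, from the quoted figures.
senses_bps = 11_000_000    # from the quotation
visual_bps = 10_000_000    # from the quotation
conscious_bps = 50         # ASSUMED order of magnitude, not from the text

print(visual_bps / senses_bps)       # ~0.91: vision dominates sensory intake
print(senses_bps // conscious_bps)   # ~220,000x: the "dramatically fewer" gap
```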

The Technological Unconscious

New phenomenality via the technological, adaptive unconscious; compare the subject (cognitive-embodied processes) built with the adaptive unconscious to Derrida's archive, and reconsider his question whether psychoanalysis would have evolved differently had there been email.

(96-97) Nigel Thrift (2005) argues that contemporary technical infrastructures, especially networked and programmable machines, are catalyzing a shift in the technological unconscious, that is, the actions, expectations, and anticipations that have become so habitual they are “automatized,” sinking below conscious awareness while still being integrated into bodily routines carried on without conscious awareness. . . . Both time and space are divided into smaller and smaller intervals and coordinated with locations and mobile addresses of products and people, resulting in “a new kind of phenomenality of position and juxtaposition.” . . . Consequently, mobility and universally coordinated time subtly shifts what is seen as human. . . . “human” in developed countries now means (for those who have access) cognitive capacities that extend into the environment, tap into virtually limitless memory storage, navigate effortlessly by GPS, and communicate in seconds with anyone anywhere in the world (who also has access).
(97) These developments hint at a dynamic interplay between the kinds of environmental stimuli created in information-intensive environments and the adaptive potential of cognitive faculties in concert with them.
(97-98) Because the
adaptive unconscious interacts flexibly and dynamically with the environment (i.e., through the technological unconscious), there is a mediated relationship between attention and the environment much broader and more inclusive than focused attention itself allows.
(98) I alter Clark's formulation slightly so that epistemic actions, as I use the term, are understood to modify
both the environment and cognitive-embodied processes that adapt to make use of those changes. Among the epistemic changes in the last fifty years in developed countries such as the United States are dramatic increases in the use and pacing of media, including the web, television, and films; networked and programmable machines that extend into the environment, including PDAs, cell phones, GPS devices, and other mobile technologies; and the interconnection, data scraping, and accessibility of databases through a wide variety of increasingly powerful desktop machines as well as such ubiquitous technologies such as RFID tags, often coupled autonomously with sensors and actuators. In short, the variety, pervasiveness, and intensity of information streams have brought about major changes in built environments in the United States and comparably developed societies in the last half century. We would expect, then, that conscious mechanisms of attention and those undergirding the adaptive unconscious have changed as well.

Synaptogenetic trends like hyper attention seem inevitable; is there a limit (think of Ihde)?

(100) Far from cause for alarm, synaptogenesis can be seen as a marvelous evolutionary adaptation, for it enables every human brain to be reengineered from birth on to fit into its environment. . . . The clear implication is that children who grow up in information-intensive environments will literally have brains wired differently than children who grow up in other kinds of cultures and situations. The shift toward hyper attention is an indication of the direction in which contemporary neural plasticity is moving in developed countries.

Malabou plasticity versus flexibility as preferred comportment having potential for resistance, not just passive accommodation.

(101) Malabou (2005) notes what has struck many critics (for example, A. Liu [2004] and Galloway and Thacker [2007]), that contemporary models of neuronal functioning, which emphasize networks of neurons rather than a single neuron and that see plasticity as an important, lifelong attribute of brain function and morphology, bear an uncanny resemblance to contemporary global capitalism, which also works through networks and requires continuous rearranging and repurposing of objects and people. . . . Her strategy is to distinguish sharply between flexibility and plasticity; whereas flexibility is all about passive accommodation to the New World Order, plasticity has the potential for resistance and reconfiguration.

Sociometer and somameter examples of cybernetic devices for transducing unconscious and nonconscious perceptions into awareness.

(102) From a technogenetic perspective, the holistic nature of human response to the environment, including conscious, unconscious, and nonconscious awareness, suggests the possibility of fabricating devices that can use unconscious and nonconscious perceptions in ways that make their awareness available to consciousness.
(103) The practical goals achieved by these research programs vividly demonstrate that plasticity provides not only the grounds for a philosophical call for action but a potent resource for constructive interventions through human-digital media hybridity.

Complex Temporalities in Living and Technical Beings

Attention essential component of technical change.

(103-104) Weaving together the strands of the argument so far, I propose that attention is an essential component of technical change (although undertheorized in Simondon's account), for it creates from a background of technical ensembles some aspect of their physical characteristics upon which to focus, thus bringing into existence a new materiality that then becomes the context for technological innovation. Attention is not, however, removed or apart from the technological changes it brings about. Rather, it is engaged in a feedback loop with the technological environment within which it operates through unconscious and nonconscious processes that affect not only the background from which attention selects but also the mechanisms of selection themselves.
(104) In this sense too, the computer instantiates multiple, interacting, and complex temporalities, from microsecond processes up to perceptible delays.
(104) Humans too embody multiple temporalities.

Do multiple temporalities confound phenomenology as Bogost would say?

(105) The point at which computer processes become perceptible is certainly not a single value; subliminal perception and adaptive unconsciousness play roles in our interactions with the computer, along with conscious experience. . . . To a greater or lesser extent, we are all moving toward the hyper attention end of the spectrum, some faster than others.

Suggests hyper attention occurs within technological objects and innovation processes.

(105-106) Going along with the feedback loops between the individual user and networked and programmable machines are cycles of technical innovation. . . . Beta versions are now often final versions. . . . The unresolved background created by these practices may be seen as the technical equivalent to hyper attention, which is both produced by and helps to produce the cycles of technical innovation that result in faster and faster changes, all moving in the direction of increasing the information density of the environment.

Compare technotext TOC example to role played by Yerushalmi book for Derrida in Archive Fever.

(106) This, then, is the context in which Steve Tomasula's electronic multimodal novel TOC was created. TOC is, in a term I have used elsewhere, a technotext (Hayles 2002): it embodies in its material instantiation the complex temporalities that also constitute the major themes of its narrations. . . . Heterogeneous in form, bearing the marks of ruptures created when some collaborators left the scene and others arrived, TOC can be understood as a metonym for the gap between the neuronal protoself and the narrated self. It explores the relation between human bodies and the creation and development of networked and programmable machines, with both living and technical beings instantiating and embodying complex temporalities that refuse to be smoothly integrated into the rational and unitary scheme of a clock ticking. It thus simultaneously testifies to and resists the “spirit of capitalism” in the era of globalization that Catherine Malabou urges is the crucial problem of our time—and our temporalities.

Modeling TOC
(107) Through its interfaces,
TOC offers a variety of temporal regimes, a spectrum of possibilities enacted in different ways in its content.

Capturing Time
(111) Central to attempts to capture time is spatialization.
(114) The paradox points to a deeper realization: time as an abstraction can be manipulated, measured, and quantified, as in a mathematical equation; time as a process is more protean, associated with Bergson's duration rather than time in its spatialized aspects.
(115) Concealed within its illogic, however, is a powerful insight: humans construct time through measuring devices, but these measuring devices also construct humans through the regulation of temporal processes. The resulting human-technical hybridization in effect conflates spatialized time with temporal duration (fig. 4.4).

Time and the Socius
(118) Whereas Simondon's theory of technics has much to say about nonhuman technical beings,
TOC has more to reveal about human desires, metaphors, and socius than it does about technical objects.
(120) Lacking the budget that even a modest film would have and requiring expertise across a wide range of technical skills,
TOC is something of a patchwork, with different collaborators influencing the work in different directions at different times. Its aesthetic heterogeneity points to the difficulties writers face when they attempt to move from a print medium, in which they operate either as sole author or with one or two close collaborators, to multimodal projects involving expertise in programming, music and sound, animation, graphic design, visual images, and verbal narration, to name some of the skills required.
(120)
TOC gestures toward a new regime, then, in which artistic creation is happening under very different conditions than for print literature. . . . I prefer to situate it in a context that Simondon associates with virtuality—a work that is more abstract than concretized and that consequently possesses a large reservoir of possibilities that may be actualized as the technological milieu progresses and as technical objects proliferate, exploring ways to achieve greater integration. Or perhaps not. . . . In newer technical milieu, changing so fast that a generation may consist of only two or three years, the provisional metastability of technical individuals may become even less stable, so that it is more accurate to speak of continuous transformation than metastability at all.


5
Technogenesis in Action
Telegraph Code Books and the Place of the Human

(123) A case study that brings in more of the connections between epigenetic changes in human biology, technological innovations, cultural imaginaries, and linguistic, social, and economic changes would be useful.
(123-124) My candidate is the first globally pervasive binary signaling system, the telegraph. . . . Extending these analyses, my focus in this chapter is on the inscription technology that grew parasitically alongside the monopolistic pricing strategies of telegraph companies: telegraph code books. Constructed under the bywords “economy,” “secrecy,” and “simplicity,” telegraph code books matched phrases and words with code letters or numbers. The idea was to use a single code word instead of an entire phrase, thus saving money by serving as an information compression technology.
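A minimal sketch of the compression economics, with an invented two-entry code book (the code words and phrases here are made up, though typical of the genre): one billable code word replaces a whole billable phrase.

```python
# Code-book compression: one code word stands in for an entire phrase.
CODEBOOK = {
    "AVOCET": "arrived safely, cargo undamaged",
    "BUNTING": "market falling, sell at best obtainable price",
}
DECODE = {phrase: word for word, phrase in CODEBOOK.items()}

def encode(message: str) -> str:
    """Replace any phrase found in the code book with its code word."""
    for phrase, word in DECODE.items():
        message = message.replace(phrase, word)
    return message

plain = "market falling, sell at best obtainable price"
coded = encode(plain)
print(coded)                                            # BUNTING
print(len(plain.split()), "->", len(coded.split()), "billable words")
```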

Studying telegraph code books as an example of DH practice, somewhat orthogonal to Sterne's study of listening practices and Misa's study of the telegraph.

(124) These remnants of a once flourishing industry reveal the subtle ways in which code books affected assumptions and practices during the hundred years they were in use. These effects may be parsed through three sets of dynamic interactions: bodies and information; code and language; and messages and the technological unconscious. . . . In this sense telegraphy was prologue to the ideological and material struggle between dematerialized information and resistant bodies characteristic of the globalization era, a tension explored below in the section on information and the cultural imaginaries of bodies.
(124) The interaction between code and language shows a steady movement away from a human-centric view of code toward a machine centric view, thus anticipating the development of full-fledged machine codes with the digital computer.
(125) Virtually every major industry had code books dedicated to its needs, including banks, railroads, stock trading, hospitality, cotton, iron, shipping, rubber, and a host of others, as well as the military. They reveal changing assumptions about everyday life that the telegraph initiated as people began to realize that fast message transmission implied different rhythms and modes of life. Thus the telegraph code books bear witness to shifts in the technological unconscious that telegraphy brought about.

Telegraphy brought shifts in technological unconscious.

(125) Among the sedimented structures in the technological unconscious is the dream of a universal language.

Bodies and Information

Role of monopoly capitalism in telegraph reconfiguring time and space.

(126) Time and space were not, common wisdom to the contrary, annihilated by the telegraph, but they were reconfigured.
(127) Message transmission was thus dependent on multiple functionalities, most of which lay outside the control—or even the knowledge—of the individual consumer. In this sense, telegraphy anticipated the message transmission systems of the Internet, in which an individual user may have no knowledge or awareness of the complex pathways that information packets take en route to their destinations.
(128) Although no scientific data exist on the changes sound receiving made in neural functioning, we may reasonably infer that it brought about long-lasting changes in brain activation patterns, as this anecdote [about woman hearing Morse code everywhere] suggests.

Disciplining the body, enrolling human subjects into technocratic regimes; Sterne connection, though the electric telegraph could never realize the dream, sought by cybernetics, of eliminating the man in the middle for sending and receiving skills.

(129) Disciplining the body in this way was one of the many practices that made telegram writing an inscription technology enrolling human subjects into technocratic regimes characterized by specialized technical skills, large capital investments, monopolistic control of communication channels, and deferrals and interventions beyond the ken of the individual telegram writer and receiver.
(131) The goal articulated during the mid-twentieth century Macy conferences of eliminating “the man in the middle” was never possible with the electric telegraph, making the technology intrinsically bound to the place of the human.

Code and Language
(132) In addition to inscribing messages likely to be sent, the code books reveal ways of thinking that tended to propagate through predetermined words and phrases.
(133) Saussure's insight that the association between word and thing is arbitrary finds an antecedent of sorts in telegraph code books.
(135) What is true of all language from a deconstructive point of view is literalized and magnified in telegraphy through temporal and spatial dispersions and the consequent uncertainties and ambiguities they introduced.
(139) These prophylactic measures had a consequential outcome: because of these constraints, natural-language code words were no longer sufficient to match all the desired phrases, so some authors began to invent artificial code words.
(139) This opened a floodgate of innovation, which had the effect of moving code from natural language to algorithmically generated code groups.
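As a sketch of what algorithmic generation might look like (my assumption, not the chapter's text), the snippet below builds code groups that differ pairwise in at least two letters, a convention code compilers used so that a single-letter mutilation in transmission remained detectable:

```python
# Greedy generation of artificial code groups with a minimum
# pairwise difference of two letters.
from itertools import product

def differs_enough(a: str, b: str, minimum: int = 2) -> bool:
    return sum(x != y for x, y in zip(a, b)) >= minimum

code_words: list[str] = []
for letters in product("ABCDE", repeat=5):   # tiny alphabet for the demo
    candidate = "".join(letters)
    if all(differs_enough(candidate, w) for w in code_words):
        code_words.append(candidate)
    if len(code_words) == 8:
        break

print(code_words)
```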
(140) The list of languages considered “natural,” noted above, gives the Western colonial powers privileged status, indicating one of the ways in which international politics became entangled with the construction of codes—along, of course, with regulations settling telegram tariffs, negotiations over rights to send messages on the lines of another company, and other capitalistic concerns.

Procedural calculation replaces memory associations from the lifeworld, realizing Saussure's proposition about arbitrary semiotic relations.

(142) The progression from natural language to artificial code groups, from code words drawn from the compiler's memory associations to codes algorithmically constructed, traces a path in which code that draws directly on the lifeworld of ordinary experience gives way to code calculated procedurally.
(146) As we have seen, sound receiving was a difficult skill to learn, so a trade-off was beginning to take shape: the skills required of the operator decreased as the machines grew more complex, while the skills required to produce and maintain the machines increased accordingly.
(146) Following the same kind of trajectory as the transition from sound receiving to Teletyping, fewer sending and receiving skills were located in humans, and more were located in the machines.

Information and Cultural Imaginaries of the Body

Technogenetic spiral includes anticipatory models that work like the technology, such as the mysterious nature of electricity, now computational metaphors, a flip side of skeuomorphs.

(147) The seminal importance of metaphoric connections between telegraphy and neuronal science suggests another way in which the technogenetic spiral works: models of the nervous system provide clues for technological innovation, and technological innovation encourages the adoption of models that work like the technology. . . . In the case of telegraphy, one of the gaps that allowed constructive intervention was the “mysterious” nature of electricity.

Messages and the Technological Unconscious
(152) Particularly relevant in this regard are plaintext phrases that reveal how the telegraph was changing expectations and assumptions.
(153) Other code books remind us of the historical circumstances under which disciplinary functions were carried out.
(153) The perils of shipping are vividly inscribed in codes devoted to that trade.
(154-155) The telegraph, with its capability of fast message transmission, had a significant impact not only on civilian matters but also on military strategies and commands.
(157) Perhaps only a small percentage were neurologically affected, principally telegraphers and clerks in businesses who routinely sent and received telegrams. Nevertheless, their physico-cognitive adaptations played important roles in making the emergent technologies feasible. Partly as a result of their work, much wider effects were transmitted via the technological unconscious as business practices, military strategies, personal finances, and a host of other everyday concerns were transformed with the expectation of fast communication and the virtualization of commodities and money into information.
(158) In a nice irony, the computer, lineal descendant of the telegraph, provides the functionality that makes access to the historical information contained in telegraph code books easily available: the great-grandson brings the great-grandfather back to life.

Code and the Dream of a Universal Language
(158) Telegraph code books, in addition to offering secrecy, also reflect a desire to create a mode of communication capable of moving with frictionless ease between cultures, languages, and time periods.

New concept of layers of codes and languages; Raley Tower of Languages.

(160) A revolution in language practice, with important theoretical implications, occurred when the conception changed from thinking about encoding/decoding as moving across the printed page to thinking of it as moving up or down between different layers of codes and languages. . . . Positioned at the bottom layer, binary code became the universal symbol code into which all other languages, as well as images and sound, could be translated.
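The bottom layer is easy to exhibit. A trivial Python sketch showing a string translated down into binary (the bytes of an image or sound file would pass through the same layer):

```python
# Moving down the tower: text -> UTF-8 bytes -> bits.
message = "Tower of Languages"
bits = " ".join(f"{byte:08b}" for byte in message.encode("utf-8"))
print(bits)  # 01010100 01101111 01110111 ...
```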
(161) The interactions of code and universal language in the twentieth century are located squarely in cryptography.

Code as universal language privileges English.

(161) (endnote 16) Positioning code as a universal language not only erases cultural specificities but also creates a situation in which a dominant language (English) tends to suppress the cultural views and assumptions embedded in nonalphabetic or “minor” languages.
(162) As we have seen, this vision of code as the subsurface lingua franca was realized in a different sense in the development of computer code, especially the layered structures of different software and scripting languages that Rita Raley (2006) has aptly called the “Tower of Languages.”

The World's Most Bizarre Telegraph Code Book
(163) The [Andrew Hallner] Scientific Dial Primer represents a transition point insofar as its form harkens back to nineteenth-century code books, while its ambitions leap forward to the era of globalized networks of programmable and networked machines.
(169-170) This strangest of telegraph code books shows that the dream of a universal language and “neutral” scientific code is fraught with ideological, political, economic, and social issues that it cannot successfully resolve. . . . This is one version of the tension between narrative, with its traditional associations with human agency and theories of mind, and databases as decontextualized atomized information.


THIRD INTERLUDE
Narrative and Database: Digital Media as Forms

(171-172) One of the ways in which databases are changing humanistic practices is through GPS and GIS technologies, especially in such fields as spatial history. Whereas qualitative history has traditionally relied on narrative for its analysis, spatial history draws from databases not only to express but also to interrogate historical change. Chapter 6 compares the processual view of space, advocated by geographer Doreen Massey, to the requirements of spatial history projects such as those at the Stanford Spatial History Project and elsewhere. . . . Since “lively” space presumes the integration of time and space, one possibility is to represent movement (as distinct from static dates) through object-oriented databases.
(173) Whereas RST [The Raw Shark Texts: A Novel] posits narrative in opposition to database, OR [Only Revolutions] incorporates database-like information into its literary structure. These two experimental texts thus show the effects of database culture on contemporary literature and speak to the different kinds of responses that contemporary literary texts have to the putative decline of narrative and rise of databases.


6
Narrative and Database
Spatial History and the Limits of Symbiosis

(175) In this imagined combat between narrative and database, database plays the role of the Ebola virus, whose voracious spread narrative is helpless to resist.

Database and narrative symbionts, not combatants.

(176) Rather than being natural enemies, narrative and database are more appropriately seen as natural symbionts.

The Different Worldviews of Narrative and Database
(176) By far the most pervasive form of database is the relational, which has almost entirely replaced the older hierarchical, tree, and network models and continues to hold sway against the newer object-oriented models.
(177) A database is said to be self-describing because its user does not need to go outside the database to see what it contains. . . . The database's self-description is crucial to being able to query it with set-theoretic operations, which require a formally closed logical system upon which to operate.
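Self-description in practice, sketched with SQLite from Python (the telegrams table is an invented example): the schema can be queried from inside the database itself, which is what makes formally closed, set-theoretic queries possible.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE telegrams (id INTEGER PRIMARY KEY, sender TEXT, body TEXT)"
)

# The database describes itself: no outside documentation is needed
# to learn what it contains.
for column in conn.execute("PRAGMA table_info(telegrams)"):
    print(column)  # (cid, name, type, notnull, default_value, pk)
```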
(177-178) The self-describing nature of database provides a strong contrast with narrative, which always contains more than indicated by a table of contents or a list of chapter contents.
(179) Narratives gesture toward the inexplicable, the unspeakable, the ineffable, whereas databases rely on enumeration, requiring explicit articulation of attributes and data values.
(179) As such, narrative modes are deeply influenced by the evolutionary needs of humans negotiating unpredictable three-dimensional environments populated by diverse autonomous agents. As Mark Turner has argued in The Literary Mind: The Origins of Thought and Language (1998), stories are central in the development of human cognition.
(180) These structures imply that the primary purpose of narrative is to search for meaning, making narrative an essential technology for humans, who can arguably be defined as meaning-seeking animals.
(180) For databases, the reverse is true: the paradigmatic possibilities are actually present in the columns and rows, while the syntagmatic progress of choices concatenated into linear sequences by SQL commands is only virtually present (Manovich, 2002: 228).
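Manovich's reversal can be sketched in a few lines of Python and SQL (the events table is hypothetical): the paradigm, every stored alternative, sits explicitly in the rows, while a syntagmatic sequence comes into being only when a query concatenates and orders them.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (year INTEGER, actor TEXT, action TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [(1844, "Morse", "sends the first public telegraph message"),
                  (1858, "Field", "completes the first Atlantic cable"),
                  (1876, "Bell", "patents the telephone")])

# The rows are the actually present paradigm; this SELECT realizes
# one virtual syntagm among many possible orderings.
for year, actor, action in conn.execute(
        "SELECT year, actor, action FROM events ORDER BY year"):
    print(f"In {year}, {actor} {action}.")
```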
(181) Wherever one looks, narratives surface, as ubiquitous in everyday culture as dust mites.
(181) What has changed in the information-intensive milieu of the twenty-first century is the position narrative occupies in the culture.
(182) The constant expansion of new data accounts for an important advantage that relational databases have over narratives, for new data elements can be added to existing databases without disrupting their order. . . . Narrative in this respect operates quite differently. Sensitively dependent on the order in which information is revealed, narrative cannot in general accommodate the addition of new elements without, in effect, telling a different story.
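The asymmetry is easy to demonstrate (again with an invented table): a relational database absorbs a new attribute without disturbing existing records, whereas a narrative that gains a new element becomes, in effect, a different story.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (year INTEGER, action TEXT)")
conn.execute("INSERT INTO events VALUES (1844, 'first public telegraph message')")

# A new data element joins without disrupting the existing order:
# old rows simply read NULL in the new column, and prior queries still run.
conn.execute("ALTER TABLE events ADD COLUMN location TEXT")
print(conn.execute("SELECT * FROM events").fetchone())
# -> (1844, 'first public telegraph message', None)
```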
(183) No longer singular, narratives remain the necessary others to database's ontology, the perspectives that invest the formal logic of database operations with human meanings and gesture toward the unknown hovering beyond the brink of what can be classified and enumerated.

Spatial History: A Field of Transition

Roots of a Dream
(185) The key here is conceptualizing place not as a fixed site with stable boundaries but rather as a dynamic set of interrelations in constant interaction with the wider world, which nevertheless take their flavor and energy from the locality they help to define.
(186) Making space “lively,” then, implies not only that it is emergent but also that multiple temporal trajectories inhabit it in diverse localities across the globe.

Tensions between the Dream and Spatial History Projects

Assumptions replicate through temes, technical objects, forming conventions of software (Manovich).

(188-189) What comes into view with this observation are the crucial differences between discursive research of the kind Massey and other geographers practice, and the database projects of the Stanford Spatial History Project and similar database-driven projects. . . . Memes do not allow new insights to emerge as a result of establishing correlations between existing databases and new datasets; rather, they serve as catalysts for reframing and reconceptualizing questions that are given specific contexts by the writer appropriating the memes for his or her own purposes. In this sense, memes contrast with what Susan Blackmore (2010) calls temes, ideas that replicate through technical objects rather than human brains. . . . When databases function as the infrastructure for other projects, as do the georeferenced USGS quads in the Western Railroads project, they can carry implicit assumptions into any number of other projects, in particular the assumption that spatial practices are built on top of absolute space.

The Cultural and Historical Specificity of Interrelationality
(191) Massey argues that making interrelationality a key feature of space will ensure that the modernist trajectory of progress she criticizes will not be allowed to dominate.
(191-192) The assumption of a universal trajectory of “progress” through time is disrupted by representing historical progression as a succession of layers rather than a linear time line.

Space and Temporality

Contrasts between relational and object-oriented databases exhibit procedural rhetorics and world models.

(192-193) In theory, databases are models of the world, and relational and object-oriented databases conceive of the world (or better, assume and instantiate world versions through their procedures) in fundamentally different ways. Relational database representations imply that the world is made up of atomized data elements (or records) stored in separate tables, with similar data types grouped together.
(193) Object-oriented databases, by contrast, divide the world into abstract entities called classes.
(193) The object approach is navigational, whereas the relational approach is declarative.
(193) These differences notwithstanding, the contrast should not be overstated.
(195) The kinds of relationality that can be represented within a class's functionalities and therefore between classes are more flexible than the kinds of relations that can be represented through declarative commands operating on relational databases, an especially important consideration when temporal events are being represented.
(196) The contrast between relational and object-oriented databases shows a continuing evolution in software that moves in response to the desire of historians to incorporate temporality as movement rather than as static dates.
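A rough Python sketch of the two worldviews, with railroad data invented for illustration (loosely echoing the Western Railroads project mentioned above): the relational approach declares what it wants over rows of static dates, while the object approach navigates to an entity whose attached behavior can model movement through time.

```python
import sqlite3
from dataclasses import dataclass, field

# Declarative, relational worldview: atomized records, static dates.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (train TEXT, year INTEGER, milepost REAL)")
conn.executemany("INSERT INTO positions VALUES (?, ?, ?)",
                 [("CP-1", 1866, 94.0), ("CP-1", 1867, 138.0)])
rows = conn.execute(
    "SELECT year, milepost FROM positions WHERE train = 'CP-1' ORDER BY year"
).fetchall()

# Navigational, object worldview: the program holds an entity and follows it;
# methods on the class can represent movement rather than dates alone.
@dataclass
class Train:
    name: str
    track: dict = field(default_factory=dict)  # year -> milepost

    def advance(self, year, milepost):
        self.track[year] = milepost

    def position_in(self, year):
        return self.track.get(year)

cp1 = Train("CP-1")
cp1.advance(1866, 94.0)
cp1.advance(1867, 138.0)
print(rows, cp1.position_in(1867))
```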
(197) Harris, Rouse, and Bergeron propose that “only 20 percent of GIS functionality is sufficient to garner 80 percent of the geospatial humanities benefits” (2010:130). They interpret this to mean that rather than the cumbersome functionality and steep learning curve of a standard GIS data system, 80 percent of its benefits can be gathered from ubiquitous systems such as Google Maps and Google Earth, for which users can provide local information and multimedia content.

Like Feynman's notes, doing research by producing visualizations and maps, or, as I argue, working code.

(197) [quoting Richard White] visualization and spatial history are not about producing illustrations or maps to communicate things that you have discovered by other means. It is a means of doing research; it generates questions that might otherwise go unasked; it reveals historical relations that might otherwise go unnoticed, and it undermines, or substantiates, stories upon which we build our own versions of the past. (2010: para. 36).


7
Transcendent Data and Transmedia Narrative
Steven Hall's The Raw Shark Texts

Hall Raw Shark Texts tutor text for contemplating databases.

(200) The creepiness of knowing that one's everyday transactions depend on invisible databases suggests an obvious point of intervention for contemporary novels: subverting the dominance of databases and reasserting the priority of narrative fictions. Steven Hall's remarkable first novel, The Raw Shark Texts, creates an imaginative world that performs the power of written words and reveals the dangers of database structures.

The Emergence of Database Interoperability

Separation of content from instantiation and presentation.

(200) Aside from issues of surveillance, secrecy, and access, databases also raise serious questions, as Parton suggests (2008), about the kinds of knowledge that are lost because they cannot fit into standardized formats. Alan Liu (2008b) identifies the crucial move in this standardization as “the separation of content from material instantiation or formal presentation.”
(201) First, management became distributed into different functions and automated with the transition from human managers to database software. Second, content became separated not only from presentational form but also from material instantiation.
(201) The move to metamanagement is accompanied by the development of standards, and governing them, standards of standards. . . . This implies that the web designer becomes a “cursor point” drawing on databases that remain out of sight and whose structures may be unknown to him.
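Liu's "separation of content from material instantiation or formal presentation" is the everyday logic of databases and templates. A toy Python sketch (record contents invented) shows one data element rendered into two presentational forms while the content itself never changes:

```python
# One record, two instantiations: the content is indifferent to its form.
record = {"sender": "Sam", "year": 1863, "body": "Meet me at the crossroads"}

as_html = f"<p><b>{record['sender']}</b> ({record['year']}): {record['body']}</p>"
as_text = "{sender} | {year} | {body}".format(**record)

print(as_html)
print(as_text)
```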
(202) Whereas data elements must be atomized for databases to function efficiently, narrative fiction embeds “data elements” (if we can speak of such) in richly contextualized environments of the phrase, sentence, paragraph, section, and fiction as a whole. Each part of this ascending/descending hierarchy depends on all the other parts for its significance and meaning, in feedback and feedforward loops called the hermeneutic circle. Another aspect of contextualization is the speaker (or speakers), so that every narrative utterance is also characterized by viewpoint, personality, and so forth. Moreover, narrative fictions are conveyed through the material instantiations of media, whether print, digital, or audio. Unlike database records, which can be stored in one format and imported into an entirely different milieu without changing their significance, narratives are entirely dependent on the media that carry them.

Two Different Kinds of Villains

The Dangerous Delights of Immersive Fiction
(208-209) Recall Henri Lefebvre's proclamation ([1974] 1992) that social practices construct social spaces; narrative fiction is a discursive practice that at once represents and participates in the construction of social spaces. Through its poetic language, it forges relationships between the syntactic/grammatical/semantic structures of language and the creation of a social space in which humans move and live, a power that may be used for good or ill.
(209) Countering the dangers of immersive fiction is the prophylaxis of decoding.
(211-212) In contrast to the dematerialization that occurs when data are separated from their original instantiations, entered as database records, and reinstantiated as electronic bits, inscriptions on paper maintain a visible, even gritty materiality capable of interacting directly with human bodies.

The Raw Shark Texts as a Distributed Media System
(212) At the center of the system is the published novel, but other fragments are scattered across diverse media, including short texts at various Internet sites, translations of the book in other languages, and occasional physical sites. . . . Here the quest for understanding the text, even on the literal level of assembling all its pieces, requires the resources of crowd sourcing and collaborative contributions.

New dynamic of language: a feedback loop of continuous reciprocal causality that differs from Saussure and Lacan.

(216) What if language, instead of sliding along a chain of signifiers, were able to create a feedback loop of continuous reciprocal causality such that the mark and concept coconstituted each other? Such a dynamic would differ from Saussure, because there would be no theoretical distance between mark and concept; it would also differ from Lacan, because the signified not only reenters the picture but is inextricably entwined with the signifier.

Interesting consideration of null values as anathema for database data.

(218) As we have seen, in the fields of relational databases, undetermined or unknown (null) values are anathema, for when a null value is concatenated with others, it renders them null as well. In narrative, by contrast, undecidables enrich the text's ambiguities and make the reading experience more compelling.
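The contagion of the null is easy to verify with SQLite from Python; concatenating a known value with an unknown one yields the unknown:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The unknown swallows the known: NULL concatenated with anything is NULL.
print(conn.execute("SELECT 'arrived at ' || NULL").fetchone())  # (None,)
```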
(218-219) Narrative, language, and the human brain are coadapted to one another. The requirements for the spread of postindustrial knowledge that Alan Liu identifies—transformability, autonomous mobility, and automation—point to the emergence of machine intelligence and its growing importance in postindustrial knowledge work. It is not human cognition as such that requires these attributes but rather machines that communicate with other machines (as well as humans).
(219) Walking around with Pleistocene brains but increasingly immersed in intelligent environments in which most of the traffic goes between machines rather than between machines and humans, contemporary subjects are caught between their biological inheritance and their technological hybridity.
RST imagines a (post)human future in which writing, language, books, and narratives remain crucially important. With the passing of the Age of Print, books and other written documents are relieved of the burden of being the default medium of communication for twentieth- and twenty-first-century societies. Now they can kick up their heels and rejoice in what they, and they alone among the panoply of contemporary media, do uniquely well: tell stories in which writing is not just the medium of communication but the material basis for a future in which humans, as ancient as their biology and as contemporary as their technology, can find a home.


8
Mapping Time, Charting Data
The Spatial Aesthetic of Mark Z. Danielewski's Only Revolutions

Danielewski Only Revolutions tutor text for exploring role of spatiality in literary texts.

(221) In responding to the overwhelming amounts of data inundating developed societies in the contemporary era, Mark Z. Danielewski has launched a bold experimental novel, Only Revolutions, that interrogates the datasphere by accentuating and expanding the role of spatiality in a literary text. In this sense, it displays the effects of data not only at the diegetic level of the narrative but also in the material form of the print codex itself. Among the transformations and deformations the text implements is a profound shift from narrative as a temporal trajectory to a topographic plane upon which a wide variety of interactions and permutations are staged.

Spatial Form and Information Multiplicity
(223) Whereas [Joseph] Frank focused on the writer's subjectivity, [John] Johnston, following Deleuze and Kittler, argues that a subject-centered view cannot account for the viral properties of exponentially expanding information. . . . The aesthetic unity Frank saw as the principal feature of spatial form now dissolves in the acid bath of information multiplicity.
(223) From unity to assemblage, from subjects who create/apprehend patterns to assemblages that create dispersed subjectivities, from cultural generalizations to technical media as causal agents: these transformations mark the deterritorialized spatial dynamic instantiated and reflected in novels of information multiplicity and media assemblages.
(223) OR simply assumes the information explosion that Johnston saw as a formative force on contemporary literature. Information has migrated from a foreground figure where it functioned as a causative agent to the background where it forms part of the work's texture. Whereas Johnston believed that the excess of information could never be contained or halted, OR puts information excess into tension with an elaborate set of constraints.
(224) The topographic dimensions are put into play by the configurations of page space and constraints that govern the text's operations. The two narratives center on the “forever sixteen” lovers Sam and Hailey, respectively, with each narrative physically placed upside down on the page and back to front to the other. The turn required to go from one to the other is accompanied by a change in the color coding. . . . The publishers (ventriloquized by Danielewski) recommend that the reader perform the switching operation in units of eight pages of one narrative, then eight pages of the other. This reading practice, which I will call the octet, means that the reader is constantly performing “revolutions” in which the physical book turns 360 degrees (front to back, up to down) each time an octet cycle is completed.
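The octet protocol is procedural enough to state as code. A small Python generator (using narrative-internal page numbers, a simplification of the book's actual pagination) makes the alternation explicit:

```python
def octet_reading_order(pages_per_narrative=360, octet=8):
    """Yield (narrator, page) pairs: eight pages of Sam, a revolution of the
    book, eight pages of Hailey, and so on through both narratives."""
    start = 1
    while start <= pages_per_narrative:
        for narrator in ("Sam", "Hailey"):
            for page in range(start, min(start + octet, pages_per_narrative + 1)):
                yield narrator, page
        start += octet

print(list(octet_reading_order())[:10])  # Sam 1-8, then Hailey's octet begins
```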

Narrative, Database, Constraint
(224) I identify four different kinds of data arrangements relevant to OR, each with its own constraints and aesthetic possibilities.
(228) The octet is only the beginning of the text's topographical complexity. In addition to the eightfold reading paths on each page spread are other possibilities that emerge from considering the text as a volumetric space, including alternating between narrators for each page and other reading protocols. Portela has diagrammed many of these possibilities, and his analysis reinforces Hansen's emphasis on embodied practices as essential for actualizing (in Hansen's term, “concretising”) the text. The multiple reading paths, Portela argues, turn the text “into a machine for revealing the mechanisms that make the production of meaning possible” (2011:8), which we may envision (following the opening discussion) as paradigmatic and syntagmatic variations.
(229) Portela further draws an analogy between the text's high dimensionality (my term, not his) as a material artifact and the digitality of alphabetic languages, the very large number of ways in which twenty-six letters may be combined to create different permutations, as well as the ways in which symbols circulate through a culture in different variations and permutations.
(229) Intrinsic to creating the text's complex topography are the chronological entries or “chronomosaics.” Written in epigrammatic style, they merely gesture toward the events they reference.
(230) A complete exploration of the connections between the narratives and chronomosaics would require researching thousands of factoids, a nearly impossible (and certainly tedious) task. In their multiplicity, they gesture toward a vast ocean of data, even as the text's topography puts severe constraints on the brief entries that, as a group, perform as synecdoches for an inexpressible whole.
(230) Further emphasizing the vast store of data available on the Internet are the correlations between the diction and slang of the narratives and the chronomosaics; at every point, the characters' colloquial language is appropriate to the period.
(231) Instead, what emerges is a spacetime within whose high dimensionality and complex topography the personal merges with the mythic in the narratives, while in the chronologies, the individual merges with the collective, and the national with the transnational.

What Cannot Be Said
(231) Along with the combinatoric possibilities constituted by the physical and conceptual configuration of page space, an arguably even more important set of constraints is articulated by the endpapers.
(232) Consider the inversions. House of Leaves is a large prose hypertext; OR is a tightly constrained poem. House of Leaves uses footnotes to create multiple reading paths, whereas OR uses topographical complexity that works through concatenations rather than links.
(232-233) Beyond these general mirror symmetries is the elaborate set of constraints articulated by the ovals, ellipses, circles, and other “revolutionary” patterns on the endpapers. Each topographic form articulates an ideational cluster. . . . The mirror concordance thus functions as a kind of anticoncordance, indicating words and concepts forbidden in Only Revolutions. These metonymic remnants from House of Leaves, relegated to the paratextual location of the endpapers and further obscured by appearing as mirror writing, are the paradigmatic equivalents that define the words present in the text by their absences from it (figs. 8.2. and 8.3).
(235-236) Moreover, the writing-down system, as Johnston calls it (a la Kittler), includes all of the affordances of the computer, from the Photoshop “reverse image” function that presumably created the mirror writing of the endpapers to the word-count function that was undoubtedly used to create the specified quantities of text on each quadrant, page, and page spread. . . . In sum, digital-inscription media and the de-differentiation they are presently undergoing can be erased from OR precisely because they are omnipresent in its writing practices.
(236) Cooperating in the authorial project are the software programs, network functionalities, and hardware that provided sophisticated cognitive capabilities, including access to databases and search algorithms. Networked and programmable machines are here much more than a technology the author uses to inscribe preexisting thoughts. They actively participate in the composition process, defining a range of possibilities as well as locating specific terms that appear in the text. The author function is distributed, then, through the writing-down system that includes both human and nonhuman actors.

Affect and Language
(241-242) OR suggests that narrative and its associated temporalities have not gone into decline as a cultural form, as Lev Manovich predicts. Rather, they have hybridized with data and spatiality to create new possibilities for novels in the age of information. As the book turns, and turns again, the title of this extraordinary work broadens its connotations to encompass the dynamic of renewal that, even as it obliterates traditional novelistic form, institutes a new typographical ordering based in digital technologies—revolutionary indeed.


Coda: Machine Reading Only Revolutions
By N. Katherine Hayles and Allen Beye Riddell

Reverse engineering the print text by transcoding it into digital media, as digital humanists do with natively print literary works; surprising that the entire text was hand coded, in part because doing so may test the limits of copyright fair use.

(242) Our first step was to hand code the entire text and import it into a database, with special categories for cars, plants, animals, minerals, and place-names, and with every word indicated as originating with Sam or Hailey's narratives, respectively. With the place-names identified, we then overlaid them onto a Google map.
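Their overlay could be approximated today in a few lines of Python with the folium mapping library; the records and coordinates below are hypothetical stand-ins for the hand-coded database, whose real place-names would be resolved through a gazetteer:

```python
import folium  # third-party: pip install folium

# Hypothetical sample of hand-coded records: (place-name, narrator, lat, lon).
places = [
    ("New Orleans", "Sam", 29.95, -90.07),
    ("St. Louis", "Hailey", 38.63, -90.20),
]

m = folium.Map(location=[38.0, -95.0], zoom_start=4)
for name, narrator, lat, lon in places:
    folium.Marker([lat, lon], popup=f"{name} ({narrator})").add_to(m)
m.save("only_revolutions_places.html")  # interactive map, one marker per place
```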
(244) Further insight is gained by comparing the word frequency of Hailey's narrative with that of Sam's. The extensive parallels between the two narratives are borne out by the correspondence between paired frequency counts.
(245) Perhaps the most illuminating discovery comes from a hint contained in the “spoiler” poster published in Revue Inculte 14 (Danielewski 2007d), a large colored print object measuring about two feet by three feet, constructed from notes that Danielewski prepared for his French translator.

A hint in the French-translation spoiler poster about a list of excluded words led to statistical analysis of the English text against the Brown corpus.

(245-246) Informative as the poster is about narrative structures, word choices, and chronologies, it is reticent on one point: included is a large column labeled “Nix List” that has been blanked out, suggesting that Danielewski provided his French translator not just with conceptual clusters but specific words that he wanted not to appear in the translated text. This presents an intriguing problem: how do you find the words that are not there? Our solution is to compare the word frequencies in OR with the Brown corpus, a database of one million words carefully selected to be statistically representative of twentieth-century American prose. To assist our comparison, we calculated a chi-square statistic for each word in OR, which provides a rough measure of how noticeable the difference is between the observed frequencies in OR and the Brown corpus.
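A sketch of the statistic as described, assuming tokenized word lists (the variable names and the smoothing constant are mine): for each word, compare its observed count in OR against the count expected from its rate in the Brown corpus; words prominent in Brown but absent from OR score high and become Nix List candidates.

```python
from collections import Counter
from nltk.corpus import brown  # requires nltk and a one-time nltk.download('brown')

def chi_square_by_word(target_tokens, reference_tokens, smoothing=0.5):
    """Rough per-word chi-square: (observed - expected)^2 / expected, where
    expected is the count a word would show in the target corpus if it
    occurred at its reference-corpus rate (smoothed to avoid division by zero)."""
    t, r = Counter(target_tokens), Counter(reference_tokens)
    t_total, r_total = sum(t.values()), sum(r.values())
    scores = {}
    for word in set(t) | set(r):
        observed = t.get(word, 0)
        expected = (r.get(word, 0) + smoothing) / r_total * t_total
        scores[word] = (observed - expected) ** 2 / expected
    return scores

# or_tokens is a hypothetical name for the hand-coded transcription of OR;
# words common in Brown but absent from OR (observed == 0) score high --
# candidates for the blanked-out "Nix List."
# scores = chi_square_by_word(or_tokens, [w.lower() for w in brown.words()])
```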
(246) We conjecture that or is forbidden because it is the acronym that Danielewski (and others) typically use for OR. . . . Thus self-reflexivity has been banished from the semantic register and displaced onto the topographic, another indication of how important the spatial aesthetic is to this text.

Only Revolutions an example of technogenesis redefining the print codex as a digital technology, manifesting aesthetic, neurocognitive and technical implications.

The cultural moment on the verge of the “dumbest generation,” embodied in digital works like TOC, RST, and OR, shows the need for renewed interest in print traditions mixed with the technical sensibility of Comparative Media Studies and Big Humanities.

As Hayles brilliantly interprets electronic literature, if we take our pulse from its expression in media, following Kittler and others in a sort of psychoanalysis of the technological nonconscious, we easily conclude that the dumbest generation is the Nietzschean last man and the childish consumer Horkheimer and Adorno decry, as well as the absorption of calculative evil from the Hollerith generations: the everyday loser who becomes the model human, feeding a banality of stupidity that is nonetheless capable of evil; if only the problem were degenerate skilled programmers rather than zombie hordes of casual gamers and cow clickers.

(247) With this coda, How We Think: Digital Media and Contemporary Technogenesis concludes with an instance of technogenesis redefining the codex as a digital technology that, in cycles of continuous reciprocal causation, both influences and is influenced by the functionalities of networked and programmable machines. To grasp fully the dynamic now in play between print forms and digital technologies, we must consider them as mutually participating in the same media ecology.
(247) The rich conceptualizations and intricate patterns of TOC, RST, and OR show that technogenesis has a strong aesthetic dimension as well as neurocognitive and technical implications. They demonstrate that in this cultural moment fraught with anxieties about the future, fears for the state of the humanities, and prognostications about the “dumbest generation,” remarkable literary works emerge that can catalyze audiences across the generations. These works vividly show that the humanities, as well as our society generally, are experiencing a renewed sense of the richness of print traditions even as they also begin to exploit the possibilities of the digital regime. In my view the humanities, far from being in crisis, have never seemed so vital.


Hayles, N. Katherine. How We Think: Digital Media and Contemporary Technogenesis. Chicago: The University of Chicago Press, 2012. Print.