Notes for N. Katherine Hayles, Electronic Literature
Key concepts: broken code, code work, computer-mediated text, computational, electronic literature, embodiment, genetic algorithm, global microsociality, intermediation, literary, machine cognition, multimodality, processual work, recursive feedback loops, subjectivity, subvocalization, synaptogenesis, technological nonconscious, transliteral morphs.
Presents genres of electronic literature, representative artists, works, and theorists in far more detail than Bolter while making most of the same points, including preservation issues and FOSS. Her investigation of the effects on subjectivity and her suggestions for fruitful research programs are more valuable to my work. The transformation of bodily experience in reading may in turn reshape the mind. Intermediation builds the cyborg, frames realistic computer cognition as an alternative to traditional AI goals, and foregrounds detailed, reflexive study of both human and machine cultures and sciences. Contemporary literature is computational, with the goal of creating “recursive feedback loops among embodied practice, tacit knowledge, and explicit articulation” (131). Connect the symposia project to digital humanities experimentation in auditory virtual realities: philosophizing by programming. Escape Hayles's interpretation of the fate of authorial voices by positing that programming also produces one's memorable texts, narratives, and voices.
Related theorists: Aarseth, Ambrose, Baldwin, Bogost, Bolter, Brown, Bruegger, Bush, Cayley, Cetina, Clark, Cramer, Damasio, Danielewski, Dennett, Derrida, Fredkin, Gallagher, Galloway, Hansen, Hofstadter, Stephen Johnson, Michael Joyce, Kirschenbaum, Kittler, Landow, Liu, Mackenzie, MacKay, Manovich, McGann, Memmott, Mencia, Montfort, Morowitz, Morrissey, Moulthrop, Poundstone, Raley, Rumsfeld, Shannon, Brian Cantwell Smith, Stefans, Thrift, Ulmer, Wardrip-Fruin.
CHAPTER ONE
Electronic Literature: What Is It?
Excludes digitized print literature, but what about digitizations of print literature mediated by programs whose source code falls within Hayles's conception of EL? She implies that source code can be considered and interpreted as a part of EL: imagine creating source code containing verbatim digitizations of print literature, such as ancient Greek and Latin texts beyond the grasp of any copyright, patent, trademark, or other type of law (law of the kind that put Socrates to death: not the biochemical law of the poison he presumably drank, but the state, government, body politic, collective consciousness). And what about custom code consuming print literature?
The ELO formulation does not entail a universal exclusion based on the specific example of the most basic digitization of print texts, which we would all agree with Hayles does not rise to being sufficiently literary (a term she will soon introduce). So despite the harsh conjoining exclusion, Hayles has really opened the door to electronic texts that are powered, in part, by exact digitizations of what are commonly conceived as the authoritative and canonical originals, all of which, if in the public domain, can be cited; in fact she says as much on page 84.
The next thing to consider is the question of style and whether early versions of programs should be preserved, for we do not have a fixed number to consider as we do with ancient texts: can you imagine an index of combinations of key phrases such as electronic literature and code work, a sort of phasor (a degree beyond vector in physics, and another word I used often in my old notes, referred to a few sentences ago) serving as a useful hyperlink to distinguish it from all the possible combinations, most of which are nonsense?
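To make the proposal concrete, here is a minimal, hypothetical sketch (my own illustration, nothing Hayles describes): a verbatim public-domain fragment embedded in source code, so that the program both carries and performs the print text. The fragment is the well-known line attributed to Heraclitus; the processing is invented.

    # A public-domain digitization carried inside executable source:
    # the program stores the print text verbatim and also mediates it.
    FRAGMENT = "πάντα ῥεῖ"  # "everything flows," attributed to Heraclitus

    def perform(text):
        # A trivial transformation stands in for richer programmatic
        # mediation of the inherited print text.
        return " / ".join(text.split())

    if __name__ == "__main__":
        print(FRAGMENT)           # the text as artifact
        print(perform(FRAGMENT))  # the text as performance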
Electronic literature defined as digitally born hybrid literary creative artworks.
(3) Electronic literature, generally considered to exclude print literature that has been digitized, is by contrast “digital born,” a first-generation digital object created on a computer and (usually) meant to be read on a computer. . . . The [Electronic Literature Organization] committee's formulation reads: “work with an important literary aspect that takes advantage of the capabilities and contexts provided by the stand-alone or networked computer.”
(4) Hybrid by nature, it comprises a “trading zone” (as Peter Galison calls it in a different context) in which different vocabularies, expertises, and expectations come together to see what might emerge from their intercourse. . . . I propose “the literary” for this purpose, defining it as creative artworks that interrogate the histories, contexts, and productions of literature, including as well the verbal art of literature proper.
GENRES OF ELECTRONIC LITERATURE
(5) The immediacy of code
to the text's performance is fundamental to understanding electronic
literature, especially to appreciating its specificity as a literary
and technical production.
(6) Readers with only a slight
familiarity with the field, however, will probably identify it first
with hypertext fiction characterized by linking structures, such as
Michael Joyce's afternoon: a story, Stuart Moulthrop's Victory
Garden, and Shelly Jackson's Patchwork Girl. . . .
Although Storyspace continues to be used to produce interesting new
works, it has been eclipsed as the primary Web authoring tool for
electronic literature.
(6-7) Whereas early works tended to be
blocks of text (traditionally called “lexia”) with limited
graphics, animation, colors, and sound, later works make much fuller
use of the multimodal capabilities of the Web; while the hypertext
link is considered the distinguishing feature of the earlier works,
later works use a wide variety of navigation schemes and interface
metaphors that tend to deemphasize the links as such.
(7)
hypertext fictions also mutated into a range of hybrid forms,
including narratives that emerge from a collection of data
repositories such as M.D. Coverley's Califia.
(8)
Paraphrasing Markku Eskelinen's elegant formulation, we may say that
with games the user interprets in order to configure, whereas in works
whose primary interest is narrative, the user configures in order to
interpret. . . . In his pioneering study [Twisty Little Passages],
[Nick] Montfort characterizes the essential elements of the
form as consisting of a parser (the computer program that understands
and replies to the interactor's inputs) and a simulated world within
which the action takes place.
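A minimal sketch in Python of the two elements Montfort names, a parser and a simulated world; the rooms and commands here are invented for illustration, not taken from any actual work of interactive fiction.

    # Simulated world: rooms, exits, descriptions (invented examples).
    world = {
        "hall": {"north": "library", "description": "A bare hall."},
        "library": {"south": "hall", "description": "Shelves of unread code."},
    }
    location = "hall"

    def parse(command):
        """Map the interactor's input onto an action in the simulated world."""
        global location
        verb, *rest = command.lower().split()
        if verb == "go" and rest and rest[0] in world[location]:
            location = world[location][rest[0]]
            return world[location]["description"]
        if verb == "look":
            return world[location]["description"]
        return "I don't understand that."

    print(parse("go north"))  # -> Shelves of unread code.
    print(parse("look"))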
(10) While works like [Donna
Leishman's] Deviant use perspective to create the impression
of a three-dimensional space, the image itself does not incorporate
the possibility of mobile interactivity along the Z-axis.
(10) One
kind of strategy, evident in Ted Warnell's intricately programmed
JavaScript work TLT vs. LL, is to move from the word as the
unit of signification to the letter.
(11) David Knoebel's
exquisitely choreographed “Heart Pole,” from his collection
“Click Poetry,” features a circular globe of words, with two
rings spinning at 90 degrees from one another, “moment to moment”
and “mind absorbing.”
(11) The next move is from imaging three
dimensions interactively on the screen to immersion in actual
three-dimensional spaces. As computers have moved off the desktop and
into the environment, other varieties of electronic literature have
emerged.
(12) The complements to site-specific mobile works, which
foreground the user's ability to integrate real-world locations with
virtual narratives, are site-specific installations in which the
locale is stationary, such as a CAVE virtual reality projection room
or gallery site.
(12) Pioneering the CAVE as a site for
interactive literature is the creative writing program at Brown
University spearheaded by Robert Coover, himself an internationally
known writer of experimental literature.
Transformation of bodily experience through new reading modes that are kinesthetic, haptic, and proprioceptively vivid may in turn reshape the mind.
(13) the work has redefined what it means to read, so that reading
becomes, as Rita Raley has pointed out, a kinesthetic,
haptic, and proprioceptively vivid experience, involving not just
the cerebral activity of decoding but bodily interactions with the
words as perceived objects moving in space.
(13) the “page” is
transformed into a complex topology that rapidly transforms from a
stable surface into a “playable” space in which she is an active
participant.
(14) CAVE equipment, costing upward of a million
dollars and depending on an array of powerful networked computers and
other equipment, is typically found only in Research 1 universities
and other elite research sites. . . . Of the few institutions that
have this high-tech resource, even fewer are willing to allocate
precious time and computational resources to creative writers.
(15)
Like the CAVE productions, interactive dramas are often site
specific, performed for live audiences in gallery spaces in
combination with present and/or remote actors.
(16) Interactive
drama can also be performed online.
(16) How to maintain such
conventional narrative devices as rising tension, conflict, and
denouement in interactive forms where the user determines sequence
continues to pose formidable problems for writers of electronic
literature, especially narrative fiction.
(17) the constraints and
possibilities of the medium have encouraged many writers to turn to
nonnarrative forms or to experiment with forms in which narratives
are combined with randomizing algorithms.
(18) The combination of
English and Spanish vocabularies and the gorgeous images from Latin
American locations [in Glazier's White-Faced Bromeliads on 20
Hectares] further suggest compelling connections between the
spread of networked and programmable media and the transnational
politics in which other languages contest and cooperate with
English's hegemonic position in programming languages and, arguably,
in digital art as well.
(18) Philippe Bootz has powerfully
theorized generative texts, along with other varieties of electronic
literature, in his functional model that makes clear distinctions
between the writer's field, the text's field, and the reader's field,
pointing out several important implications inherent in the
separation between these fields, including the fact that electronic
literature introduces temporal and logical divisions between the
writer and reader different from those enforced by print.
(19)
Naming such works [Noah Wardrip-Fruin's Regime Change and News
Reader] “instruments” implies that one can learn to play
them, gaining expertise as experience yields an intuitive
understanding of how the algorithm works.
(20) As Andrews, Kearns,
and Wardrip-Fruin acknowledge, these works are indebted to William
Burroughs's notion of the “cut-up” and “fold-in.” They cite
as theoretical precedent Burroughs's idea that randomization is a way
to break the hold of the viral word and liberate resistances latent
in language by freeing it from linear syntax and coherent narrative.
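Since the cut-up is itself an algorithm, a minimal Python sketch makes the point; the source lines are invented placeholders, and the shuffle stands in for Burroughs's scissors.

    import random

    lines = [
        "the viral word repeats itself",
        "resistance is latent in language",
        "the screen remembers nothing",
    ]

    def cut_up(lines, seed=None):
        # Break linear syntax by shuffling word-level fragments.
        rng = random.Random(seed)
        fragments = [frag for line in lines for frag in line.split()]
        rng.shuffle(fragments)
        return " ".join(fragments)

    print(cut_up(lines))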
Code work ranges from machine readable and executable to broken code.
(20-21) “Code work,” a phrase associated with such writers as Alan Sondheim, MEZ (Mary Ann Breeze), and Talan Memmott and with critics such as Florian Cramer, Rita Raley, and Matthew Fuller, names a linguistic practice in which English (or some other natural language) is hybridized with programming expressions to create a creole evocative for human readers, especially those familiar with the denotations of programming languages. “Code work” in its purest form is machine readable and executable, such as Perl poems that literally have two addressees, human and intelligent machines. More typical are creoles using “broken code,” code that cannot actually be executed but that uses programming punctuations and expressions to evoke connotations appropriate to the linguistic signifiers.
Among those languages are programming languages as well as natural languages, breaking Ong's exclusion of them in Orality and Literacy on the grounds that they could never be natural languages; the door opened by the ready supply of ideological constants (a term I used many years ago when groping toward the vision now much clearer) leads to the idea of code work addressable by both humans and intelligent machines.
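A sketch of code work in its executable form, after the Perl-poem idea but written in Python; the poem and its names are my invention, offered only to show a text with two addressees, one human and one machine.

    memory = ["a voice", "a mark", "a screen"]

    def forget(trace):
        """Erasure is also an operation."""
        return trace.replace("a ", "no ")

    for trace in memory:
        print(forget(trace))
    # the machine reads: iteration, substitution, output
    # the human reads: no voice, no mark, no screen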
(21)
The conjunction of language with code has stimulated experiments in
the formation and collaboration of different kinds of
languages.
(22-23) The multimodality of digital art works challenges writers, users, and critics to bring together diverse expertise and interpretive traditions so that the aesthetic strategies and possibilities of electronic literature may be fully understood. . . . when a work is reconceived to take advantage of the behavioral, visual, and/or sonic capabilities of the Web, the result is not just a Web “version” but an entirely different artistic production that should be evaluated in its own terms with a critical approach fully attentive to the specificity of the medium.
Shift from literacy to electracy necessitates new critical practices such as Kirschenbaum's digital forensics and Aarseth's ergodic reading; suggest that the subdivisions of forensic and formal materiality cross in the articulation of technological concretizations.
(23)
the computational media intrinsic to electronic textuality have
necessitated new kinds of critical practice, a shift from literacy to
what Gregory L. Ulmer calls “electracy.”
(24-25) Exemplifying this kind of critical practice is Matthew Kirschenbaum's Mechanisms: New Media and Forensic Textuality. . . . He parses the materiality of digital media as consisting of two interrelated and interacting aspects: forensic materiality and formal materiality. Whereas forensic materiality is grounded in the physical properties of the hardware - how the computer writes and reads bit patterns, which in turn correlate to voltage differences - formal materiality consists of the “procedural friction or perceived difference . . . as the user shifts from one set of software logics to another” (ms. 27). Using the important distinction that Espen J. Aarseth drew in Cybertext: Perspectives on Ergodic Literature between scriptons (“strings as they appear to readers”) and textons (“strings as they exist in the text”) (62), Kirschenbaum pioneers in Mechanisms a methodology that connects the deep print reading strategies already in effect with scriptons (letters on the page, in this instance) to the textons (here the code generating the screenic surface). He thus opens the way for a mode of criticism that recognizes the specificity of networked and programmable media without sacrificing the interpretive strategies evolved with and through print.
Here is a slip by Hayles and her proofreaders, friends, and editors.
(25) In “Writing the Virtual: Eleven Dimensions of E-Poetry,” she focuses on the ways in which E-poetry achieves dynamism, leading her to coin the neologism “poietics” (from “poetry” and “poiisis,” the Greek (sic) word for “making”).
(26) Any
work that uses algorithmic randomizers to generate text relies to a
great or lesser extent on the surprising and occasionally witty
juxtapositions created by these techniques. It should be noted that
algorithmic procedures are not unique to networked and programmable
media.
(27) The Demon, Stefans notes, is involved in a two-way
collaboration: between the programmer who works with the limitations
and possibilities of a computer language to create the program, and
between the user and the computer when the computer poem is read and
interpreted.
(27) The collaboration between the creative
imagination of the (human) writer and the constraints and
possibilities of software is the topic of Ian Bogost's Unit
Operations: An Approach to Videogame Criticism, in which he
develops an extended analogy between the unit operations of
object-oriented programming and a literary approach that explores the
open, flexible, and reconfigurable systems that emerge from the
relations between units.
Bogost's unit operations approach takes programming languages and practices into account.
(28) As Bogost's approach suggests, taking programming languages and
practices into account can open productive approaches to electronic
literature, as well as other digital and nondigital forms. The
influence of software is especially obvious in the genre of the Flash
poem, characterized by sequential screens that typically progress
with minimal or no user intervention.
(30) Hypertext fiction,
network fiction, interactive fiction, locative narratives,
installation pieces, “codework,” generative art, and the Flash
poem are by no means an exhaustive inventory of the forms of
electronic literature, but they are sufficient . . .
ELECTRONIC LITERATURE IS NOT PRINT
(31) Early hypertext
theorists, notably George Landow and Jay David Bolter,
stressed the importance of the hyperlink as electronic literature's
distinguishing feature, extrapolating from the reader's ability to
choose which link to follow to make extravagant claims about hypertext
as a liberatory mode that would dramatically transform reading and
writing and, by implication, settings where these activities are
important, such as the literature classroom.
The limitations of current technologies reflected in state-of-the-art designs make electronic literature appear much more restricting than the codex (book) form of literature, overshadowing its unique capability to reform itself dynamically in response to the reader; whereas following hyperlinks may have a print correlate, this property is unique.
(31-32) Compared to the flexibility offered by the codex, which allows the reader complete freedom to skip around, go backward as well as forward, and open the book wherever she pleases, the looping structures of electronic hypertexts and the resulting repetition forced on the reader/user make these works by comparison more rather than less coercive.
Being dynamically reconfigurable and being deconstructive are two different things.
(32) In conflating hypertext with the difficult and productive
aporias of deconstructive analysis, these theorists failed to do
justice to the nuanced operations of works performed in electronic
media or to the complexities of deconstructive philosophy.
(33)
Rather than circumscribe electronic literature within print
assumptions, Aarseth swept the board clean by positing a new category
of “ergodic literature,” texts in which “non-trivial effort is
required to allow the reader to traverse the text” (1).
(33)
Markku Eskelinen's work, particularly “Six Problems in Search of a
Solution: The Challenge of Cybertext Theory and Ludology to Literary
Theory,” further challenges traditional narratology as an adequate
model for understanding ergodic textuality, making clear the need to
develop frameworks that can adequately take into account the expanded
opportunities for textual innovations in digital media.
(34)
Similar ground clearing was undertaken by Lev Manovich in his
influential The Language of New Media. . . . Although it is
too simplistic to posit these “layers” as distinct phenomena
(because they are in constant interaction and recursive feedback with
one another), the idea of transcoding nevertheless makes the crucial
point that computation has become a powerful means by which
preconscious assumptions move from such traditional cultural
transmission vehicles as political rhetoric, religious and other
rituals, gestures and postures, literary narratives, historical
accounts, and other purveyors of ideology into the material
operations of computational devices.
Galloway's emphasis on code as the only executable language means any study of electronic literature may, or ought to, include some analysis of the source code and enframing technologies, so the implications of the availability of source code are obvious here; this additionally raises questions: where is the boundary between the source code of the work and the surrounding operating environment, and what is the status of database records and other ephemera of the running code?
(35) Alexander Galloway in Protocol puts the case succinctly: “Code is the only language that is executable” (emphasis in original). Unlike a print book, electronic text literally cannot be accessed without running the code. Critics and scholars of digital art and literature should therefore properly consider the source code to be part of the work, a position underscored by authors who embed in the code information or interpretive comments crucial to understanding the work.
McGann argues print texts have always used markup languages.
(35) [Jerome McGann] turns this perspective on its head in Radiant Textuality: Literature after the World Wide Web by arguing that print texts also use markup language, for example, paragraphing, italics, indentation, line breaks, and so forth.
(36-37) Complementing studies focusing on the materiality
of digital media are analyses that consider the embodied cultural,
social, and ideological contexts in which computation takes place. .
. . Much as the novel both gave voice to and helped to create the
liberal humanist subject in the seventeenth and eighteenth centuries,
so contemporary electronic literature is both reflecting and enacting
a new kind of subjectivity characterized by distributed cognition,
networked agency that includes human and non-human actors, and fluid
boundaries dispersed over actual and virtual locations.
(37-39) How and in what ways it should engage with these commercial interests is discussed in Alan Liu's magisterial work The Laws of Cool: Knowledge Work and the Culture of Information. . . . Realizing this broader possibility requires that we understand electronic literature not only as an artistic practice (although it is that, of course), but also as a site for negotiations between diverse constituencies and different kinds of expertise.
This choice of wording suggests that a more sensitive study of free, open source cultural movements can expand the perspective taken by Mackenzie and/or Hayles.
(39) Among these constituencies are theorists and researchers interested in the larger effects of network culture. . . . Adrian Mackenzie's Cutting Code: Software as Sociality studies software as collaborative social practice and cultural process. . . . electronic literature is evolving within complex social and economic networks that include the development of commercial software, the competing philosophies of open source freeware and shareware, the economics and geopolitical terrain of the internet and World Wide Web, and a host of other factors that directly influence how electronic literature is created and stored, sold or given away, preserved or allowed to decline into obsolescence.
PRESERVATION, ARCHIVING, AND DISSEMINATION
(39) whereas
books printed on good quality paper can endure for centuries,
electronic literature routinely becomes unplayable (and hence
unreadable) after a decade or even less. The problem exists for both
software and hardware.
(40) The Electronic Literature Organization
has taken a proactive approach to this crucial problem with the
Preservation, Archiving and Dissemination Initiative (PAD). Part of
that initiative is realized in the Electronic Literature
Collection, volume 1, co-edited by Nick Montfort, Scott Rettberg,
Stephanie Strickland, and me, featuring sixty works of recent
electronic literature and other scholarly resources.
Considerations for creating long-lasting elit.
(41) [Montfort and Wardrip-Fruin's] “Acid-Free Bits”
offers advice to authors to help them “find ways to create
long-lasting elit, ways that fit their practice and goals” (3). The
recommendations include preferring open systems to closed systems,
choosing community-directed systems over corporate driven systems,
adhering to good programming practices by supplying comments and
consolidating code, and preferring plain-text to binary formats and
cross-platform options to single-system options.
(41) More encompassing, and even more visionary, is the proposal in “Born-Again Bits” for the “X-Literature Initiative.” The basic premise is that XML (Extensible Markup Language) will continue to be the most robust and widespread form of Web markup language into the foreseeable future.
(42) The X-Literature Initiative makes
startlingly clear that the formation we know as “literature” is a
complex web of activities that includes much more than conventional
images of writing and reading. Also involved are technologies,
cultural and economic mechanisms, habits and predispositions,
networks of producers and consumers, professional societies and their
funding possibilities, canons and anthologies designed to promote and
facilitate teaching and learning activities, and a host of other
factors.
CHAPTER TWO
Intermediation: From Page to Screen
DYNAMIC HETERARCHIES AND FLUID ANALOGIES
(44)
I focus here on two central conceptual clusters to develop the idea
of intermediation: dynamic heterarchies and fluid analogies as
embodied in multiagent computer programs, and the interpretive
processes that give meaning to information.
(45) One proposal is
“intermediation,” a term I have adopted from Nicholas
Gessler, whereby a first-level emergent pattern is captured in
another medium and re-represented with the primitives of the new
medium, which leads to an emergent result captured in turn by yet
another medium, and so forth. The result is what researchers in
artificial life call a “dynamic hierarchy,” a multi-tiered system
in which feedback and feedforward loops tie the system together
through continuing interactions circulating throughout the
hierarchy.
(45) The potential of this idea to explain multilevel complexity is the subject of Harold Morowitz's The Emergence of Everything: How the World Became Complex.
(46)
digital and analog processes together perform in more complex ways
than the digital alone. . . . They [analog processes] excel in
transferring information from one medium to another through
morphological resemblance, and the complexity of continuous variation
allows them to encode information in more diverse ways than digital
encodings.
Humans and computers as dynamic partners bound together by intermediating dynamics becomes Hayles's model of the cyborg.
(47) Now let us make a speculative leap and consider the human and
the digital computer as partners in a dynamic heterarchy bound
together by intermediating dynamics.
(47) citizens in
technologically developed societies, and young people in particular,
are literally being reengineered through their interactions with
computational devices.
(48-49) In Fluid Concepts and Creative
Analogies: Computer Models of the Fundamental Mechanisms of Thought,
Hofstadter details this research. His mantra, “Cognition is
recognition,” nicely summarizes his conclusion that cognition is
built on the ability to recognize patterns and extrapolate from them
to analogies (pattern A is like pattern B).
(51) Because
literature is not limited to factual recreation but rather works
through metaphor, evocation, and analogy, it specializes in the
qualities that programs like Jumbo and Copycat are designed to
perform.
(51-52) the programs function as components in an
adaptive system bound together with humans through intermediating
dynamics, the results of which are emergent realizations. . . . the
literal/metaphoric binary becomes a spectrum along which a variety of
programs can be placed, depending on their cognitive capacities and
the ways in which the patterns they generate and/or recognize are
structurally coupled with humans.
(52) The idea of considering
meaning making as a spectrum of possibilities with recursive loops
entangling different positions along the spectrum has been catalyzed
by Edward Fredkin's recent proposal that “the meaning of
information is given by the process that interprets it” (my
emphasis), for example, an MP3 player that interprets a digital file
to produce audible sound. The elegance of the concept is that it
applies equally well to human and nonhuman cognizers.
(53) For the
MP3 player, “aboutness” has to do with the relation it constructs
between the digital file and the production of sound waves. For the
music sophisticate, “aboutness” may include a detailed knowledge
of Beethoven's work, the context in which it was written and
performed, historical changes in orchestral instrumentation, and so
on.
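Fredkin's point can be rendered as a toy example (my gloss, not Hayles's): the same bytes acquire different “aboutness” under different interpreting processes.

    import struct

    data = b"\x42\x4c\x55\x45"  # four bytes, meaningless in themselves

    as_text = data.decode("ascii")           # a reading process: "BLUE"
    as_int = int.from_bytes(data, "big")     # an arithmetic process: some integer
    as_float = struct.unpack(">f", data)[0]  # an IEEE-754 process: some float

    # One bit pattern, three meanings, each given by its interpreter.
    print(as_text, as_int, as_float)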
(53) [Dennett's thought experiments demonstrate] that
in a certain sense human intentionality too is an artifact that must
ultimately have emerged from the subcognitive processes responsible
for the evolution of humans as a species.
(54-55) Fredkin's
concept also can potentially heal the breach between meaning and
information that was inscribed into information theory when Claude
Shannon defined information as a probability function. . . .
The divorce between information and meaning was necessary, in
Shannon's view, because he saw no way to reliably quantify
information as long as it remained context dependent, because its
quantification would change every time it was introduced into a new
context, a situation calculated to drive electrical engineers mad.
Nevertheless, the probability functions in Shannon's formulations
necessarily implied processes that were context dependent in a
certain sense - specifically, the context of assessing them in
relation to all possible messages that could be sent by those message
elements. The difficulty was that there seemed to be no way to
connect this relatively humble sense of context to the multilayered,
multifaceted contexts ordinarily associated with high-level meanings
(for example, interpretations of Beethoven's Fifth). Fredkin's
formulation overcomes this difficulty by defining meaning through the
processes that interpret information, all the way from binary code to
high-level thinking.
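For reference, Shannon's context-independent quantification is just a probability function over message elements, H = -Σ p log2 p; the toy distribution below is invented.

    from math import log2

    probabilities = {"e": 0.5, "t": 0.25, "a": 0.125, "o": 0.125}

    H = -sum(p * log2(p) for p in probabilities.values())
    print(f"{H:.3f} bits per symbol")  # 1.750 bits for this toy ensemble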
(55) Putting Shannon's mechanistic model together with MacKay's embodied model makes sense when we see
higher-order meanings emerging from recursive lower level
subcognitive processes, as MacKay emphasizes when he highlights
“visceral responses and hormonal secretions and what have you.”
Like humans, intelligent machines also have multiple layers of
processes, from ones and zeros to sophisticated acts of reasoning and
inference.
Definition of computer cognition as execution and performance of a work.
(56-57) In electronic literature, this dynamic is evoked when the
text performs actions that bind together author and program, player
and computer, into a complex system characterized by intermediating
dynamics. The computer's performance builds high-level responses out
of low-level processes that interpret binary code. These performances
elicit emergent complexity in the player, whose cognitions likewise
build up from low-level thoughts that possess much more powerful
input to high-level thoughts than the computer does, but that
nevertheless are bound together with the computer's subcognitive
processes through intermediating dynamics. The cycle operates as well
in the writing phase of electronic literature. When a
programmer/writer creates an executable file, the process reengineers
the writer's perceptual and cognitive systems as she works with the
medium's possibilities. . . . The result is a meta-analogy: as
human cognition is to the creation and consumption of the work, so
computer cognition is to its execution and performance.
(57-58)
The book is like a computer program in that it is a technology
designed to change the perceptual and cognitive states of a reader. .
. . “Recombinant flux,” as the aesthetic of such works [that is,
electronic texts] is called, gives a much stronger impression of
agency than does a book.
(58) this knowledge is carried forward
into the new medium typically by trying to replicate the earlier
medium's effects within the new medium's specificities.
(59) In My
Mother Was a Computer: Digital Subjects and Literary Texts, I
explored intermediation by taking three different analytical cuts,
focusing on the dynamics between print and electronic textuality,
code and language, and analog and digital processes.
FROM PAGE TO SCREEN: MICHAEL JOYCE'S AFTERNOON: A STORY AND TWELVE BLUE
(60) That evolution is richly evident in the contrast between Michael Joyce's seminal first-generation hypertext afternoon: a story and his later Web work Twelve Blue.
(62-63) The technique of conflicting plot lines is of course not original with Michael Joyce. . . . Comparing the two works reveals how print-centric afternoon is, notwithstanding its implementation in an electronic medium. . . . Although the reader can choose what lexias to follow, this interaction is so circumscribed that most readers will not have the sense of being able to play the work - hence my repeated use here of the term “reader” rather than “player.”
Example of processual work is Twelve Blue.
(63) In Twelve Blue, by contrast, playing is one of the central metaphors. . . . Compared with afternoon, Twelve Blue is a much more processual work. Its central inspiration is not the page but rather the flow of surfing the Web.
(63-64) Twelve Blue's epigraph, taken from William H. Gass's On Being Blue: A Philosophical Inquiry, signals that the strategy will be to follow trails of associations, as Gass says, “the way lint collects. The mind does that” (7). . . . The second, less explicit, intertext is Vannevar Bush's seminal essay “As We May Think,” in which he argues that the mind thinks not in linear sequences but in associational links, a cognitive mode he sought to instantiate in his mechanical Memex, often regarded as a precursor to electronic hypertext. . . . [Twelve Blue] instantiates associational thinking and evokes it for the player, who must in a certain sense yield to this cognitive mode to understand the work (to say nothing of enjoying it). . . . Like sensual lovemaking, the richness of Twelve Blue takes time to develop and cannot be rushed.
(69) As Anthony Enns points out in his reading of Twelve Blue, this work challenges Frank Kermode's criterion for “the sense of an ending” that helps us make sense of the world by establishing a correlation between the finitude of human life and the progression through a beginning, middle, and end characteristic of many print narratives. . . . I would argue rather that Twelve Blue makes a different kind of sense, one in which life and death exist on a continuum with flowing and indeterminate boundaries.
Ulmer's electracy represents a shift from print-bound alphabetic language to Web syntheses of image and text, comparable to a reversal of the prior historical shift from the lyric poem to the novel.
(70) Gregory Ulmer relates it to the shift from a novel-based aesthetic to a poetics akin to the lyric poem. He also relates it to a change from literacy to “electracy,” arguing that its logic has more in common with the ways in which image and text come together on the Web than to the linearity of alphabetic language bound in a print book. . . . The leap from afternoon to Twelve Blue demonstrates the ways in which the experience of the Web, joining with the subcognitive ground of intelligent machines, provides the inspiration for the intermediating dynamics through which the literary work creates emergent complexity.
MARIA MENCIA: TRANSFORMING THE RELATION BETWEEN SOUND AND MARK
An excellent articulation by Mencia of the hegemonic computational process of reading that produces virtual realities; go back to this in more detail.
(71)
In “Methodology,” Mencia comments that she is particularly
interested in the “exploration of visuality, orality and the
semantic/'non semantic' meaning of language.” On the strength of
her graduate work in English philology, she is well positioned to
explore what happens when the phone and phoneme are detached from
their customary locations within morphemes and begin to circulate
through digital media into other configurations, other ways of
mobilizing conjunctions of marks and sounds. . . . With traditional
print literature, long habituation causes visuality (perception of
the mark) to flow automatically into subvocalization (cognitive
decoding) that in turn is converted by the “mind's eye” into the
reader's impression that the words on the page give way to a scene
she can watch as the characters speak, act, and interact.
(73) [In Birds Singing Other Birds' Songs] the human is in-mixed with nonhuman life forms to create hybrid entities that represent the conjunction of human and nonhuman ways of knowing.
(74) The analogy-between-analogies suggests that media
transformations are like the dynamic interchanges between different
kinds of cognizers, thus revealing a deep structure of intermediation
that encompasses the history of media forms as well as the emergent
complexities of interactions between humans, animals, and networked
and programmable machines.
RUPTURING THE PAGE: THE JEW'S DAUGHTER
(74-76)
The entire work exists as a single screen of text. . . . When the
player mouses over the blue letters, some part of the text, moving
faster than the eye can catch, is replaced. Reading thus necessarily
proceeds as rereading and remembering, for to locate the new portion
of the page the reader must recall the screen's previous
instantiation while scanning to identify the new portion, the
injection of which creates a new context for the remaining
text.
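The replacement dynamic is easy to model; a minimal Python sketch (invented text, not Morrissey's) shows why reading becomes rereading: each interaction silently swaps one span of an otherwise stable page.

    page = ["She said", "it was late", "and the snow", "kept falling"]
    revisions = {"it was late": "he had written", "kept falling": "was a letter"}

    def mouse_over(page, revisions):
        # Replace the first span that has a pending revision, faster
        # than the eye can catch; the rest of the page is unchanged.
        for i, span in enumerate(page):
            if span in revisions:
                page[i] = revisions.pop(span)
                break
        return page

    print(" ".join(mouse_over(page, revisions)))
    print(" ".join(mouse_over(page, revisions)))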
(78-79) The “stickiness” of phrases that can ambiguously attach to different sentences and phrases also enacts a difference between modernist stream of consciousness and the kind of awareness represented in The Jew's Daughter. . . . narration is both belated and premature, early and late.
(79)
Taken as a representation of consciousness, the kind of awareness
performed here is not a continuous coherent stream but rather
multilayered shifting strata dynamically in motion relative to one
another.
Dennett's Multiple Drafts Model is a very rich but complicated reinterpretation of consciousness as an epiphenomenon in which there is no self, only an illusion of one.
(79) This kind of interaction is very similar to the “Multiple Drafts Model” that Daniel C. Dennett, in Consciousness Explained, argues best explains the nature of consciousness. Dennett proposes that consciousness is not the manifestation of a single coherent self synthesizing different inputs (characterized as the “Cartesian Theater,” the stage on which representations are played out and viewed by a central self); rather, interacting brain processes, operating with varying temporal dynamics and different neural/perceptional inputs, are consciousness. . . . To explain the subjective impression of possessing a central self, Dennett argues that the self is not synonymous with consciousness as such. Rather, the illusion of self is created through an internal monologue that does not so much issue from a central self as give the impression a central self exists.
(80) Seen in this perspective, The Jew's Daughter recapitulates the temporal and spatial discontinuities constitutive of consciousness through the (inter)mediation of computer software and hardware.
(81) Without knowing anything about The Jew's Daughter, Dennett sets up the comparison between human and machine cognition by likening the subcognitive agents from which consciousness emerges, and the even simpler processes that underlie them, to mechanical programs that could theoretically be duplicated in a computer.
Genetic algorithm as complex adaptive system involving player choices.
(82-83) The Error Engine, a collaborative work co-authored by Judd Morrissey, Lori Talley, and computer scientist Lutz Hamel, carries the implications of The Jew's Daughter to another level by functioning as an adaptive narrative engine that initiates a coevolutionary dynamic between writer, machine, and player. . . . In the next instantiation of the program, not yet implemented, the authors envision an algorithm whose selection criteria can itself evolve in relation to the player's choices. Such a program would deserve to be called a “genetic algorithm,” a complex adaptive system in which the user's choices and the algorithm responding to those choices coevolve together. . . . In this sense intermediating dynamics, whereby recursive feedback loops operate through the differently embodied entities of the computer and human, become an explicit part of the work's design, performance, and interpretation. Adaptive coevolution implies that real biological changes take place in the player's neuronal structure that result in emergent complexity, expressed as a growing understanding of the work's dynamics, thematics, and functional capabilities; these in turn change and evolve in interaction with the player's choices.
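A schematic sketch of the coevolutionary idea, not the Morrissey/Talley/Hamel engine: candidate fragment sequences evolve under a fitness function driven by accumulated player choices. All data and scoring are invented.

    import random

    FRAGMENTS = ["blue thread", "lost letter", "error engine", "twelve rivers"]

    def fitness(sequence, player_history):
        # Reward sequences resembling what the player has chosen before.
        return sum(f in player_history for f in sequence)

    def mutate(sequence):
        s = list(sequence)
        s[random.randrange(len(s))] = random.choice(FRAGMENTS)
        return s

    def evolve(player_history, generations=50, pop_size=20):
        population = [[random.choice(FRAGMENTS) for _ in range(4)]
                      for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=lambda s: fitness(s, player_history),
                            reverse=True)
            survivors = population[: pop_size // 2]                  # selection
            population = survivors + [mutate(s) for s in survivors]  # variation
        return population[0]

    print(evolve(player_history={"blue thread", "error engine"}))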
Undeniable influence of computation on the critical framework of contemporary literature; gratuitous reference to Phaedrus in the closed-loop feedback difference between electronic and print literature.
(83)
Certainly print literature changes a reader's perceptions, but the
loop is not closed because the words on the page do not literally
change in response to the user's perceptions. . . . To fully take
this reflexivity into account requires understanding the computer's
cascading interpretive processes and procedures, its possibilities,
limitations, and functionalities as a subcognitive agent, as well as
its operations within networked and programmable media considered as
distributed cognitive systems. . . . Whatever limitations
intermediation as a theory may have, its virtue as a critical
framework is that it introduces
computation into the picture at a fundamental level,
making it not an optional add-on but a foundational premise from
which to launch further interrogation.
(84)
No less than print literature, literary criticism is affected because
digital media are increasingly essential to it, limited not just to
word processing but also to how critics now access legacy works
through digital archives, electronic editions, hypermedia
reinstantiations, and so forth.
(85) Contemporary literature, and even more so the literary that extends and enfolds it, is computational.
CHAPTER THREE
Contexts for Electronic Literature: The Body and the Machine
(87) The context of networked and
programmable media from which electronic literature springs is part
of a rapidly developing mediascape transforming how citizens of
developed countries do business, conduct their social lives,
communicate with each other, and perhaps most significantly, how they
construct themselves as contemporary subjects. . . . The stakes are
nothing less than whether the embodied human becomes the center for
humanistic inquiry within which digital media can be understood, or
whether media provide the context and ground for configuring and
disciplining the body.
(88) I argue that both the body and
machinic orientations work through strategic erasures. A fuller
understanding of our contemporary situation requires the articulation
of a third position focusing on the dynamics entwining body and
machine together. . . . Most importantly, it empowers electronic
literature so that it not only reflects but reflects upon the
media from which it springs.
THE EPOCH OF TECHNICAL MEDIA
Kittler as the key theorist of technical media autonomously determining subjectivity.
(88-89) No theorist has done more to advance the idea of technical media as an autonomous force determining subjectivity than Friedrich A. Kittler. . . . Influenced by Foucault rhetorically as well as methodologically, Kittler departs from him in focusing not on discourse networks understood as written documents, but rather on the modes of technology essential to their production, storage, and transmission.
Subvocalized voice replaces material grapheme with hallucination.
(89) Literature acts on the body
but only within the horizon of the medium's technical capabilities.
Especially important in this regard, Kittler argues, was the
development of the phonetic method of reading, introduced in Germany
by Heinrich Stephani around 1800. The phonetic method transformed
the mark into sound, erasing the materiality of the grapheme and
substituting instead a subvocalized voice.
(90) [from Gramophone, Film, Typewriter] So-called Man is split up into physiology and information technology. (intro, 16)
(90) With the formation of a new kind of subject, the voice of
Mother/Nature ceases to spring forth from the page in a kind of
hallucination.
(91) In his excellent foreword to Discourse Networks 1800/1900,
David E. Wellbery calls this the “presupposition
of exteriority” (DN, xii). . . the crucial move of making social
formations interior to media conditions is deeply flawed. . . .
Although Kittler's presupposition is fruitful as a theoretical
provocation, leading to the innovative analyses that make his work
exciting, it cannot triumph as a theoretical imperative because it
depends on a partial and incomplete account of how media technologies
interact with social and cultural dynamics.
(91-92) One indication
of this partiality is the inability of Kittlerian media theory to
explain how media change comes about (as has often been noted, this
is also a weakness of Foucault's theory of epistemes). In a
perceptive article, Geoffrey Winthrop-Young argues that in Kittler's
analyses, war performs as the driving force for media
transformation.
(93) clearly the real problem is that media alone
cannot possibly account for all the complex factors that go into
creating national military conflicts. What is true for war is true
for any dynamic evolution of complex social systems; media
transformations alone are not sufficient.
(93) To be fair to the
Kittlerian viewpoint, I have chosen a site where media conditions are
unusually strong in determining the interactions that take place
within it - namely, the elite world of global finance. The media
conditions that prevail here are characteristic of the contemporary
period, in that the differentiation between data streams marking
early twentieth-century media transformations has undergone
integration. . . . Contemporary de-differentiation crucially depends
on digital media's ability to represent all kinds of data – text,
images, sound, video – with the binary symbolization of “one”
and “zero.”
MEDIA CONDITIONS FOR GLOBAL FINANCE: WHY MEDIA THEORY IS NOT SUFFICIENT
Ethnographic studies of international currency traders reveal global microsociality and thus undeniable cultural influence beyond media-technological determination; a complication of Kittler and also of Castells.
(94) Among important recent work on
global finance are the ethnographic studies of international currency
traders by Karin Knorr Cetina and
Urs Bruegger. . . .
In brief, this is money at its most virtual, moving around the globe
in nearly instantaneous electronic exchanges and reflecting rate
fluctuations sensitively dependent on a wide variety of fast-changing
economic, social, and political factors.
(94-95) Knorr Cetina and
Bruegger propose the theoretical concept of global
microsociality. . . . Global
microsociality represents a new kind of phenomenon possible only with
advanced communication technologies allowing for nearly instantaneous
exchanges between geographically distant locations; compared to the
telephone and teletype, the quantitative differences are so great as
to amount to qualitative change. . . . Inflected by the dynamics of
global economics, the traders nevertheless operate within microsocial
dynamics – hence the necessity for global microsociality.
(95)
Recapitulating within their sensoria the media differentiation into
separate data flows, the traders develop a form of parallel
processing through a division of sensory inputs, using phones to take
orders from brokers through the audio channel and the screens to take
in visual data and conduct trades electronically. The environment,
however, is dominated by the screens.
(95-96) the effect that
dominates is watching time unfold. . . . As new events appear over
the ever-transforming horizon, the traders use their knowledge of
past configurations, present statistics, and anticipated tendencies
to weave a fabric of temporality, which like the fabled magic carpet
is perceived at once as a space one can occupy and as an event as
ephemeral and ever-changing as the air currents on which the magic
carpet rides.
(96-97) Temporality partakes of these
characteristics because the screens function as temporalized “places”
traders occupy; time in these circumstances becomes the spatialized
parameter in which communities are built and carry out their
business. . . . Time thus ceases to be constructed as a universal
“now” conceived as a point source moving unambiguously forward
along a line at a uniform pace. Driven by globalized business
pressures, time leaves the line and smears into a plane.
(97)
Greenwich time thus operates as the conventional one-way time that
always moves in one direction, whereas local time becomes
incorporated into a spatialized fabric that can be traversed in
different directions as circumstances dictate.
(97-98) In this
spatialized temporality, the traders occupy an ambiguous position. On
the one hand, they are participants in the place of temporality they
create by watching the screens, helping in significant ways to shape
the market and related events as they continuously unfold and affect
one another. . . . On the other hand, they are also observers outside
the screens, watching the action as it unfolds. . . . The net result
of these interactions is perceived by the traders as “the market.”
. . . Note that although location enters into the trader's sense of
the market, it is the temporal dimension - everything all
the time - that constitutes
the place of habitation the market creates and the traders occupy.
Instead of calling the market a mind, call it a megaentity; does this then exclude interpreting it with respect to Gallagher's body image/body schema distinction?
(98) This sense of the market as
“everything” is reinforced by the traders' experience in being so
intimately and tightly connected with the screens that they can sense
the “mind” of the market. . . . This intuition is highly
sensitive to temporal fluctuations and, when lost, can be regained
only through months of immersion in current conditions. Attributing a
“mind” to the market of course implies it is an entity possessing
consciousness, desires, and intentions; more precisely, it is a
megaentity whose
existence is inherently emergent. Containing the traders' actions
with everything else, it comes into existence as the dynamic
realization of innumerable local interactions.
(99) This is the
context in which the screens become objects of intense attachment for
the traders.
(99) So far the case study has functioned as an
object lesson demonstrating Kittler's dictum, “Media determine our
situation.” At this point, however, let us turn to consider how
cultural dynamics interact with the media conditions to codetermine
their specificities. . . . This gender predominance, far from
accidental, is deeply imbricated into the ways in which the media
dynamics play out.
(100-101) In this setting, gendered cultural
practices proliferate more uniformly and extensively than normally
would be the case. . . . Traders see themselves as engaged in combat,
if not outright war, with rival banks and other traders. . . .
Warfare here does not function to bring about media transformations,
as it often does in Kittler's analyses; rather, warfare is
encapsulated within the horizon codetermined by media conditions and
cultural formations. It is appropriated in part because it expresses
- indeed, explains and justifies - the intensified desires and fears
aroused by the traders' situation.
(101) The media conditions
alone, then, are underdetermining with respect to the culture that
actually emerges. Other factors, particularly cultural models linked
with masculine dominance, are necessary to explain how the media
function to “determine the situation.”
(102) Media provide the
simultaneity that spatializes time, creates global microsociality,
catalyzes attachment to screens, and gives rise to emergent objects
“that are not identical with themselves,” but the emotional tone,
dominant metaphors, hypermasculinized dynamics, and capitalist
economics codetermine how trading practices actually operate.
EMBODIMENT AND THE COEVOLUTION OF TECHNOLOGY
Hansen used as foil to Kittler by foregrounding role of embodiment in perceptual experience.
(102-103) No one has more forcefully argued for the importance of embodiment in relation to new media art than Mark B. N. Hansen. . . . Updating Bergson's idea in Matter and Memory that the body selects from the environment images on which to focus, Hansen contests Deleuze's reading of Bergson in Cinema 1 in order to reinstall affectivity at the center of the body-brain achievement of making sense of digital images.
(103) This is a major intervention that
serves as an important counterweight to Kittler's perspective. Hansen
posits that “only meaning can enframe information” (82), and in
his view it is humans, not machines, who provide, transmit, and
interpret meanings.
(103-104) Yet despite my overall sympathy, I
cannot help noticing places where the argument, in its zeal to
establish that embodiment trumps every possible machine capacity,
circumscribes the very potential of the body to be transformed by its
interaction with digital technologies for which Hansen otherwise
argues.
Clearly a tie into Gallagher since she mentions Francisco Varela; cannot find the reference to Damasio I remember reading.
(105) Vision, then, cannot in Hansen's account be allowed to be the dominant perceptual sense, or even on a par with privileged faculties that (not coincidentally) are much more difficult to automate, particularly what he calls
“affectivity,” the capacity of the sensorimotor body to
“experience itself as 'more than itself' and thus to deploy its
sensorimotor power to create the unpredictable, the experimental, the
new” (7). To substantiate that the sensorimotor body has this
capacity, he draws on the writing of Raymond Ruyer, a French theorist
who during the 1950s proposed to combat the mechanist tendency of
post-World War II cybernetics by positing bodily faculties that, he
argued, are nonempirical and nonobservable. Chief among these is a
“transpatial domain of human themes and values” (80) . . . If we
were to call the “transpatial domain” by the more traditional
name “soul,” its problematic nature would quickly become
evident.
(106) Although it is undoubtedly true, as Hansen argues
citing Brian Massumi (109), that proprioceptive, kinesthetic, and
haptic capacities are involved with vision, this does not mean that
they replace vision or even that they become dominant over vision in
the VR interface. Indeed, it is precisely because vision plays such
an important role in VR that VR sickness arises.
(108) Falling
back on such mystified terms as “transpatial domain” and
“absolute survey” fails to do justice to the extensive research
now available on how synesthesia actually works.
(108) there is an
unavoidable tension between Hansen's insight that technology and the
body coevolve together and his ideological commitment to the priority
of embodiment over technology.
(109) As Hansen uses the
term, however, reality becomes “mixed” when the perceptual input
for humans comes not from their unaided bodies operating alone in the
environment, but rather from their embodied interactions with
technologies.
Coevolution of body and technology takes a teleological trajectory in Hansen, erasing the material specificities of technological media; consider Malabou quoting Deleuze on the adequacy of the brain for the modern world.
(109) In this account, then, the coevolution of the body and technology is given a teleological trajectory, a mission as it were: its purpose is to show the “constituting or ontological dimension of embodiment.” Largely erased are material specificities and capacities of technical objects as artifacts. It is as though the feedback loop between technical object and embodied human enactor has been cut off halfway through: potentiality flows from the object into the deep inner senses of the embodied human, but its flow back into the object has been short-circuited, leading to an impoverished account of the object's agential capacities to act outside the human's mobilization of its stimuli.
Hansen ignores materiality of machine cognition, thus the machine dimension on my timeline.
(110) This encapsulation is problematic for several reasons. It
ignores the increasing use of technical devices that do not end in
human interfaces but are coupled with other technical devices that
register input, interpret results, and take action without human
intervention.
(111) Such an account is helpless to explain how
technology evolves within the horizon of its own limitations and
possibilities.
(111-112) As if assuming a mirror position to
Kittlerian media theory, which cannot explain why media change except
by referring to war, Hansen cannot explain why media develop except
by referring to embodied capacities. . . technologies are embodied
because they have their own material specificities as central to
understanding how they work as human physiology, psychology, and
cognition are to understanding how (human) bodies work.
(112-113)
Had early tools developed along different technological lineages,
early hominid evolution also might have developed along quite
different biological lines. . . . Instead of subordinating the body
to technology or technology to the body, however, surely the better
course is to focus on their interactions and coevolutionary
dynamics.
(113) Central to these dynamics, especially in the
context of media theory and electronic literature, are neural
plasticity and language ability. . . . Evidence indicates that
compound tools were contemporaneous with the accelerated development
of Broca's area in the frontal cortex, a part of the brain involved
in language use.
Synaptogenesis and brain plasticity combine phylogenetic selection (genetics) and ontogenic mechanisms of adaptation (learning); Baldwin organic selection. What does Gallagher think?
(114) Although synaptogenesis is greatest in infancy,
plasticity continues throughout childhood and adolescence, with some
degree continuing even into adulthood. In contemporary developed
societies, this plasticity implies that the brain's synaptic
connections are coevolving with environments in which media
consumption is a dominant factor. . . . Children growing up in
media-rich environments literally have brains wired differently than
humans who did not come to maturity in such conditions.
(115)
[James Mark] Baldwin argued for what he called “organic
selection.” In the same way that the brain overproduces neuronal
connections that are then pruned in relation to environmental input,
so Baldwin thought that “organic selection” proceeded through an
overproduction of exploratory behaviors, which are then pruned
through experience to those most conducive to the organism's
survival. This results in a collaboration between phylogenetic
selection (that is, selection that occurs through genetic
transmission) and ontogenic mechanisms of adaptation (which occur in
individuals through learning).
(115) what begins as ontogenetic
adaptation through learning feeds back into selective pressures to
affect physical biology.
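To fix the mechanism in my head, a toy simulation in the spirit of Hinton and Nowlan's 1987 experiment (my own sketch, not anything in Hayles): genomes mix hard-wired alleles with plastic ones that are guessed anew during a lifetime of learning trials; learning keeps near-miss genotypes alive, and selection then gradually hard-wires what was at first only learned.

    import random

    # Toy sketch of Baldwin's "organic selection," after Hinton and Nowlan
    # (1987). Alleles: '1' hard-wired right, '0' hard-wired wrong,
    # '?' plastic (guessed anew on each learning trial).
    ALLELES = ['0', '1', '?']
    GENOME_LEN = 10
    TRIALS = 50  # learning attempts per lifetime

    def lifetime_fitness(genome):
        if '0' in genome:            # a hard-wired wrong bit cannot be learned away
            return 1
        for trial in range(TRIALS):  # earlier learning success = higher fitness
            if all(g == '1' or random.random() < 0.5 for g in genome):
                return 1 + (TRIALS - trial)
        return 1

    def evolve(pop_size=200, generations=30):
        pop = [[random.choice(ALLELES) for _ in range(GENOME_LEN)]
               for _ in range(pop_size)]
        for gen in range(generations):
            scored = [(lifetime_fitness(g), g) for g in pop]
            total = sum(f for f, _ in scored)

            def pick():              # fitness-proportional parent selection
                r = random.uniform(0, total)
                for f, g in scored:
                    r -= f
                    if r <= 0:
                        return g
                return scored[-1][1]

            next_pop = []
            for _ in range(pop_size):
                cut = random.randrange(GENOME_LEN)   # one-point crossover
                next_pop.append(pick()[:cut] + pick()[cut:])
            pop = next_pop
            share = sum(g.count('1') for g in pop) / (pop_size * GENOME_LEN)
            print(f"gen {gen:2d}: hard-wired correct alleles = {share:.0%}")

    evolve()

Run it and the share of hard-wired correct alleles climbs generation by generation: ontogenetic adaptation through learning feeding back into phylogenetic selection, exactly the collaboration described above.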
Ambrose links compound tools to the emergence of language; Hayles argues electronic technology breaks the monopoly of vision associated with reading.
(116) [Stanley H.] Ambrose's scenario linking compound tools with the emergence of language illustrates how technology enters into the psychophysical feedback cycle by changing the ways in which learning occurs and the kinds of learning that are most adaptive. . . . If data differentiation at the beginning of the twentieth century broke the ancient monopoly of writing, the computer at the beginning of the twenty-first century breaks the monopoly of vision associated with reading. Interactive text, reminiscent in some ways of the digital art discussed by Hansen, stimulates sensorimotor functions not mobilized in conventional print reading, including fine movements involved in controlling the mouse, keyboard, and/or joystick, haptic feedback through the hands and fingers, and complex eye-hand coordination in real-time dynamic environments. Moreover, this multisensory stimulation occurs simultaneously with reading, a configuration unknown in the Age of Print. Brain imaging studies show that everyday tool use entails complex feedback loops between cognitive and sensorimotor systems. For humans who habitually interact with computers, especially at young ages, such experiences can potentially affect the neurological structure of the brain.
Deep attention versus hyper attention as examples of ontogenic mechanisms of adaptation by Steven Johnson (ironically, hyper attention is what makes me and perhaps other digital immigrants turn away from The Jew's Daughter and other EL examples); Heim discusses this shift, too.
(117)
Steven Johnson,
in Everything Bad Is Good For You: How Popular
Culture Is Actually Making Us Smarter,
cites the studies of James R. Flynn indicating that IQs rose
significantly from 1932-78, the so-called Flynn effect that Johnson
correlates with increased media consumption. Anecdotal evidence as
well as brain imaging studies indicate that “Generation M” (as
the Kaiser Family Foundation dubbed the 8- to 18-year-old cohort) is
undergoing a significant cognitive shift, characterized by a craving
for continuously varying stimuli, a low threshold for boredom, the
ability to process multiple information streams simultaneously, and a
quick intuitive grasp of algorithmic procedures that underlie and
generate surface complexity. The cognitive mode, which I have
elsewhere called “hyper attention,” is distinctively different
from that traditionally associated with the humanities, which by
contrast can be called “deep attention.” Deep attention is
characterized by a willingness to spend long hours with a single
artifact (for instance, a seven-hundred-page Victorian novel),
intense concentration that tends to shut out external stimuli, a
preference for a single data stream rather than multiple inputs, and
the subvocalization that typically activates and enlivens the reading
of print literature.
(118) The effects of hyper attention are
already being reflected in literary works, for example in John
Cayley's Translation and
Imposition,
discussed in Chapter 5, where text is accompanied by glyphs visually
indicating the algorithm's operation.
(118) As media change, so do
bodies and brains; new media conditions foster new kinds of ontogenic
adaptations and with them, new possibilities for literary
engagements. This is the context in which we should evaluate and
analyze the possibilities opened by electronic literature.
(119)
It is precisely when these multilayered, multiply sited processes
within humans and machines interact through intermediating dynamics
that the rich effects of electronic literature are created,
performed, and experienced.
THE BODY AND MACHINE IN ELECTRONIC LITERATURE
(120) The
notorious “nervousness” of this work [Lexia to Perplexia],
whereby a tiny twitch of the cursor can cause events to happen that
the user did not intend and cannot completely control, conveys
through its opaque functionality intuitions about dispersed
subjectivities and screens with agential powers similar to those we
saw with international currency traders.
(122) Memmott's
rewriting of the myth in the context of information technologies, the
“I-terminal,” a neologism signifying the merging of human and
machine, looks at the screen and desires to interact with the image,
caught like Narcissus in a reflexive loop that cycles across the
screen boundary between self/other.
(122) The feedback cycle
suggested here between self and other, body and machine, serves as a
metaphor for the coconstruction of embodiment and media technologies.
Distinguish broken code and pseudo code.
(123) The passage cited above continues with “broken” code,
that is, code that is a creolization of English with computer code,
evocative of natural language connotations but not actually
executable. . . . In particular, the play between human language and
code points to the role of the intelligent machine in contemporary
constructions of subjectivity, gesturing toward what Scott Bukatman
has called “terminal identity,” or in Memmott's lexicon, the
“I-terminal.”
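Her distinction is easy to operationalize: broken code fails the machine while still signifying for us. A minimal sketch in Python (the codework fragment is my own invention in the spirit of Memmott's creole, not a quotation from Lexia to Perplexia):

    # Broken code versus executable code: only one can be performed.
    fragments = {
        "broken code": "I-terminal.self(){ desire :: cycle.across(screen/boundary)",
        "executable code": "print('self' + '/' + 'other')",
    }

    for label, source in fragments.items():
        try:
            compile(source, "<codework>", "exec")
            print(f"{label}: compiles, so the machine can perform it")
        except SyntaxError:
            print(f"{label}: does not compile; it signifies only for human readers")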
(123-124) Engaging the hyper-attentive
characteristics of multiple information streams and rapid
transformations (images, words, graphics, lightning-quick morphing of
screens, mouseovers, and so on), the work reflects upon its own
hyper-attentive aesthetics in the final section, where the prefix
“hyper” replicates itself with every imaginable stem. At the same
time, the work obviously requires deep-attention skills to grasp the
complex interactions between verbal play, layered screen design,
twitchy navigation, and JavaScript coding. . . . In terms of the
complex dynamics between body and machine, we might say that the
gamer and the textual critic have had their neural plasticities shaped
in different but overlapping ways.
(124) While Lexia to
Perplexia is primarily concerned with the transformative effect
of information technologies on contemporary subjectivity, Young-Hae
Chang Heavy Industries engages the global microsociality and
spatialization of temporality characteristic of information-intensive
settings such as international currency trading discussed above. . .
. Programmed in Flash, their works use timed animation to display
sequential blocks of text, with the movement from one screen of text
to the next synchronized with an accompanying sound track, typically
jazz.
(125) The impression is not that the eye moves but rather
that the text moves while the eye remains (more or less) stationary.
Agency is thus distributed differently than with the print page where
the reader controls the pace of reading and rate at which pages
turn.
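The redistribution of agency is legible even in a minimal sketch of machine-paced text (the lines and timings below are invented placeholders, not YHCHI's): the program owns the tempo, and the reader can neither pause nor rewind.

    import time

    # Machine-paced sequential text: the program, not the reader, sets the tempo.
    SEQUENCE = [
        ("THE TEXT MOVES", 0.6),
        ("WHILE THE EYE", 0.4),
        ("REMAINS", 0.3),
        ("(MORE OR LESS)", 0.3),
        ("STATIONARY.", 0.8),
    ]

    for block, seconds in SEQUENCE:
        print(block)
        time.sleep(seconds)  # no pause, no rewind: pacing stays with the machine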
(125) [Bob] Brown devised a machine that he called the
“Readie,” [in the 1920s] which was intended to display text much
as it appears in YHCHI's compositions.
(126-127) In Nippon,
global microsociality is emphasized by an intimate address that
appears on a screen split between Japanese ideograms above and
English words beneath. . . . The subvocalization that activates the
connection between sound and mark in literary reading here is
complicated by the text's movement and its interpenetration by sound,
becoming a more complex and multimodal production in which embodied
response, machine pacing, and transnational semiotics, along with the
associated spatialization of temporality, all contribute to construct
the relation between text, body, and machine.
(128) The resulting
tension mandates that the user intent on comprehending the work will
necessarily be forced to play it many times, unable to escape hyper
attention by stopping the text-in-motion or deep attention by lapsing
into interactive game play.
(130) Hence the rhetoric of
imperatives employed by Kittler (“must not think,” “forbids the
leap,” and so on) finds its mirror opposite in Hansen's rhetoric of
encapsulation (“subordination of technics,” “from within the
operational perspective of the organism,” and the like). In
contrast, the model herein proposed entangles body and machine in
open-ended recursivity with one another. This framework mobilizes the
effect recursivity always has of unsettling foundations while
simultaneously catalyzing transformations as each partner in the loop
initiates and reacts to changes in the other. In this model neither
technological innovation nor embodied plasticity is foreclosed. The
future, unpredictable as ever, remains open.
CHAPTER
FOUR
Revealing and Transforming: How Electronic Literature
Revalues Computational Practice
(131)
The chapter elucidates further a framework in which digital
literature can be understood as creating recursive
feedback loops among embodied practice, tacit knowledge and explicit
articulation.
(132)
in developed societies, almost all communication, except face-to-face
talk, is mediated through some kind of digital code. . . . What
differences do these entanglements make to our sense of what it means
to communicate with one another, long understood to be a fundamental
component of human sociality and being? Or to put it another way,
what does the intermediation between language and code imply for
practices of signification?
Same Rumsfeld quote used by Zizek; this point seems obvious given the enormous role of purposive action, not consciously articulated, in transmitting practices of all sorts.
(132) Paraphrasing that well-known Zen poet Donald Rumsfeld, I propose that (some of) the purposes of literature are to reveal what we know but don't know that we know, and to transform what we know we know into what we don't yet know. Literature, that is, activates a recursive feedback loop between knowledge realized in the body through gesture, ritual, performance, posture, and enactment, and knowledge realized in the neocortex as conscious and explicit articulations. As the French sociologist Pierre Bourdieu has shown, robust and durable knowledge can be transmitted through social practices and enactments without being consciously articulated.
Hayles draws an analogy between studies of the human body in cognition and literature (print and electronic): a control-system model of the human being, in which trauma and embodiment are important, tying literature to the functional role of a control-system component that activates feedback circuits.
How these feedback loops operate in nonhuman systems is left open, as Hayles does not spend much time discussing the source code of any of the examples of EL in this book; her musings on what feelings/body and ratiocination/mind may be in nonhuman systems trace the same boundaries of fantasy as those surrounding her postulate that nonhuman intelligences exist in Memmott's representations of them.
(133-134) bodily knowledge is directly tied in with the limbic system and the viscera, as Antonio Damasio has shown, with complex feedback loops operating through hormonal and endocrine secretions that activate emotions and feelings. . . . Traumatic events are understood in precisely this way, as disruptions that disconnect conscious memory from the appropriate affect. . . . In this context, literature can be understood as a semiotic technology designed to create – or more precisely, activate – feedback loops that dynamically and recursively unite feelings with ratiocination, body with mind.
Thrift's technological unconscious is also invoked by Feenberg along with Simondon; Hayles prefers nonconscious.
(134-135) As networked and programmable media move out of the box and into the environment . . . distributed cognitive systems in which human and nonhuman actors participate become an everyday condition of contemporary life in developed societies. Nigel Thrift has written about the psychological consequences of participating in technologically mediated environments in his thick descriptions of what he calls the “technological unconscious.” . . . I prefer to use the term “technological nonconscious” to avoid confusing the sedimented embodied experiences he discusses with the Freudian unconscious. . . . I argue that while the technological nonconscious has been a factor in constituting humans for millennia, the new cognitive capabilities and agencies of intelligent machines give it greater impact and intensity than ever before.
Primary argument for studying EL, thankfully one viewing machine cognition as an intimate component of human activity rather than mere opponent or tool. A big difference between machine cognition and anything possible in human interaction with print literature is the efficacy of the machine realm to change itself dramatically; in one sense this merely echoes and completes a thought first articulated many centuries ago in Plato's Phaedrus, while in another sense it goes places unthinkable to the ancients because it adapts itself to (what I will call, daringly) the ontogenic potential of ECM.
(135-136) Electronic literature extends the traditional functions of print literature in creating recursive feedback loops between explicit articulation, conscious thought, and embodied sensorimotor knowledge. The feedback loops progress in both directions, up from embodied sensorimotor knowledge to explicit articulation, and down from explicit articulation to sensorimotor knowledge. While print literature also operates in this way, electronic literature performs the additional function of entwining human ways of knowing with machine cognition. . . . Change anywhere catalyzes change everywhere, resulting both in new understanding of embodied responses and new valuations of technical practice.
Her approach asserts effects of code on verbal narratives and distributed agency.
(136) The first proposition asserts that verbal narratives are simultaneously conveyed and disrupted by code, and the second argues that distributed cognition implies distributed agency.
Moulthrop on 404 errors; Deleuze and Guattari make the same point about breakdowns in Anti-Oedipus, Hegel the torn sock. This is not the healthiest starting point, and should not be the primary example used to study and create with nonhuman cyberspace, although it is one of the default (as in unconscious, unpreferred, unintended but highly probable) approaches.
(136-137) Stuart Moulthrop, writing on “404” errors, notes that such episodes are not simply irritations but rather flashes of revelation, potentially illuminating something crucial about our contemporary situation. The “errors,” he suggests, are actually minute abysses puncturing (and punctuating) the illusion that the human life-world remains unchanged by its integration with intelligent machines.
Once cognition is offloaded (distributed) into the environment, computational processes impossible for humans can be performed upon the memory (Clark). Should I read his book? Gallagher put him on his chart as the crossover theorist. Also of importance to him are Lakoff and Johnson.
(137) There is now a wealth of research regarding extended cognition as a defining human characteristic. In Natural Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence, Andy Clark argues that we are “human-technology symbionts,” constantly inventing ways to offload cognition into environmental affordances so that “the mind is just less and less in the head.” Edwin Hutchins makes similar points in Cognition in the Wild.
This may be an antidote to the suggestion that all narrative themes have been exhausted except science fiction by escaping print into electronic media forms.
(138-139) Electronic literature can tap into highly charged differentials that are unusually heterogeneous, due in part to uneven developments of computational media and in part to unevenly distributed experiences among users. . . . These differences in background correlated with different kinds of intuitions, different habits, and different cognitive styles and conscious thoughts. . . . Only because we do not know what we already know, and do not yet feel what we know, are there such potent possibilities for intermediations in the contemporary moment.
RECURSIVE INTERACTIONS BETWEEN PRACTICE AND ARTICULATION
(139)
William Poundstone's Project for Tachistoscope is modeled
after a technology developed for experimental psychology exploring
the effects of subliminal images.
Incomprehensible temporal orders foregrounded in Poundstone Project for Tachistoscope.
(140) In historical context, then, the tachistoscope was associated
with the nefarious uses to which subliminal perception could be put
by Communists who hated capitalists and capitalists who egged on the
persecution of “Reds” and “Commies.”
(140-141) As the
kinds and amounts of sensory inputs proliferate, the effect for
verbally oriented users is to induce anxiety about being able to
follow the narrative while also straining to put together all the
other discordant signifiers.
Borderland of machine and human cognition cooperating to evoke meanings, albeit for humans.
(142-143) Programmed by a human in the high-level languages used in Flash (C++/Java), the multi-modalities are possible because all the files are ultimately represented in the same binary code. The work thus enacts the borderland in which machine and human cognition cooperate to evoke the meanings that the user imparts to the narrative, but these meanings themselves demonstrate that human consciousness is not the only actor in the process. Also involved are the actions of intelligent machines. In this sense the abyss may be taken to signify not only those modes of human cognition below consciousness, but also the machinic operations that take place below the levels accessible to the user and even to the programmer.
Stewart's insistence on the power of subvocalization, enriching literary language as it is read in the body, could confound the audible recreation of literary texts by external mechanisms that also automate homophonic variants.
(143) His [Garrett Stewart] strategy is to approach the question
[what makes literary language literary?] not by asking how we read or
why, but rather where. We read, he suggests, in the body,
particularly in subvocalizations that activate for us a buzz of
homophonic variants surrounding the words actually on the page. These
clouds of virtual possibilities, Stewart argues, are precisely what
give literary language its extraordinary depth and richness. Without
subvocalization, which connects the activity of the throat and vocal
cords with the auditory center in the brain, literary language fails
to achieve the richness it otherwise would have. [Garrett] Stewart's
argument implies that embodied responses operating below the level of
conscious thought are essential to the full comprehension of literary
language, a proposition enthusiastically endorsed by many
poets.
(145) Automating the homophonic variants that are the stock
in trade of literary language, [Millie Niss's] Sundays in the Park
brings to conscious attention the link between vocalization and
linguistic richness.
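The automation is simple to sketch (the homophone table below is a tiny hand-made stand-in, not Millie Niss's data or method): for each word offer the word itself plus any homophones, then take every combination, mechanizing the "buzz of homophonic variants."

    import itertools

    # Automated homophonic variation in miniature.
    HOMOPHONES = {
        "sundays": ["sundaes"],
        "in": ["inn"],
        "the": ["thee"],
        "park": ["parc"],
    }

    def variants(line):
        # Each word contributes itself plus its homophones; the product of
        # these options yields every homophonic reading of the line.
        options = [[w] + HOMOPHONES.get(w, []) for w in line.lower().split()]
        for combo in itertools.product(*options):
            yield " ".join(combo)

    for v in variants("Sundays in the park"):
        print(v)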
Cayley transliteral morphing explorations of algorithms underlying phonemic and morphemic relations.
(145-146) Cayley has been exploring what he calls
“transliteral morphing,” a computational procedure that
algorithmically morphs, letter by letter, from a source text to a
target text. . . . Cayley conjectures that underlying these
“higher-level” relationships are lower-level similarities that
work not on the level of words, phrases, and sentences but individual
phonemes and morphemes. . . . Just as Mencia invokes the
philological history of language as it moves from orality to writing
to digital representation, so Cayley's transliteral morphs are
underlain by an algorithm that reflects their phonemic and morphemic
relations to one another.
(146-147) The complexity of these
relationships as they evolve in time and as they are mediated by the
computer code, Cayley conjectures, are the microstructures that
underlie, support, and illuminate the high-level conceptual and
linguistic similarities between related texts.
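The letter-by-letter procedure itself can be sketched minimally (my stand-in: plain alphabetical distance substitutes for Cayley's phonemic/morphemic similarity ordering, so this shows only the mechanics, not his algorithm):

    import string

    # Letter-by-letter transliteral morphing in miniature.
    ALPHA = string.ascii_lowercase

    def step(ch, target):
        # Move one letter closer to the target; non-letters jump straight there.
        if ch == target or ch not in ALPHA or target not in ALPHA:
            return target
        i, j = ALPHA.index(ch), ALPHA.index(target)
        return ALPHA[i + (1 if j > i else -1)]

    def morph(source, target):
        assert len(source) == len(target), "sketch assumes equal-length texts"
        src, tgt = list(source), list(target)
        frames = ["".join(src)]
        while src != tgt:
            src = [step(c, t) for c, t in zip(src, tgt)]
            frames.append("".join(src))
        return frames

    for frame in morph("body", "code"):
        print(frame)

Watching the printed frames drift from "body" to "code" gives a crude version of the intuition Cayley describes: the algorithm's structure becomes perceptible through the sequence of intermediate states.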
Importance of embodied practice in Cayley, suggesting that machine cognition can be intuited through watching transliteral morphing; why not also examine the design through reverse engineering, especially its program source code and system integration, which could become part of conscious thought?
(147) The correlations between higher and lower-level relationships
can be revealed (watching the morphs) by activating channels of
communication between embodied practice, tacit knowledge, and
conscious thought. . . . Cayley suggests that if a user watches
these long enough while also taking in the transliteral morphs, she
will gain an intuitive understanding of the algorithm, much as a
computer game player intuitively learns to recognize how the game
algorithm is structured. The music helps in this process by
providing another sensory input through which the algorithm can be
grasped.
(147) While all this is happening, through embodied and
tacit knowledge, the conscious mind grapples with the significance of
the transliterating text [Walter Benjamin, “On Language as Such and
on the Language of Man”].
(149) For Benjamin the transcendent
language associated with God ensures translatability of texts, while
for Cayley the atomistic structures of computer and human languages
are the correlated microlevels that ensure translatability.
(149-150)
The “objectivity” of this translation is guaranteed not by God
but by the entwining of human and computer cognitions in our
contemporary mediascapes.
(151) At the performance, laptops
throughout the space began playing versions while Cayley projected
the full implementation on the front screen. . . . The stunning
effect was to create a multimodal collaborative narrative distributed
on laptops, throughout the performance space, in which different
sensory modalities and different ways of knowing entwined together
with machine cognition and agency.
(151) In other recent work,
Cayley has focused on the ways in which our intuitive knowledge of
letter forms can define space and inflect time.
(152) The temporal
interactions, as well as the virtual/actual spatiality of the textual
surfaces, create an enriched sense of embodied play that complicates
and extends the phenomenology of reading.
(153) Cayley further
explores the phenomenology of reading in Lens, designed first
as a CAVE installation and then transferred to a QuickTime
maquette.
(154-155) For text, the ability to function
simultaneously as a window into the computer's performance and as a
writing surface to be decoded puts into dynamic interplay two very
different models of cognition. . . . Mediating between the brute
logic of these machinic operations and human intentions is the
program that, when run, creates a performance partaking both of the
programmer's intentions and the computer's underlying architecture as
symbolic processor. In electronic literature, authorial design, the
actions of an intelligent machine, and the user's receptivity are
joined in a recursive cycle that enacts in microcosm our contemporary
situation of living and acting within intelligent environments.
REVALUING COMPUTATIONAL PRACTICE
(156)
[Brian Kim] Stefans sees in this confrontation the possibility that
the boundaries of the conscious self might be breached long enough to
allow other kinds of cognitions, human and nonhuman, to communicate
and interact. “The space between these poles - noise and convention
- is what I call the 'attractor,' the space of dissimulation, where
the ambiguity of the cyborg is mistaken as the vagary of an
imprecise, but poetic, subjectivity” (151).
(157) Joining
technical practice with artistic creation, computation is revalued
into a performance that addresses us with the full complexity our
human natures require, including the rationality of the conscious
mind, the embodied response that joins cognition and emotions, and
the technological nonconscious that operates through sedimented
routines of habitual actions, gestures, and postures. Thus
understood, computation ceases to be a technical practice best left
to software engineers and computer scientists and instead becomes a
partner in the coevolving dynamics through which artists and
programmers, users and players, continue to explore and experience the
intermediating dynamics that let us understand who we have been, who
we are, and who we might become.
CHAPTER
FIVE
The Future of Literature: Print
Novels and the Mark of the Digital
Print is now an output form of digital data rather than separate medium.
(159)
So essential is digitality to contemporary processes of composition,
storage, and production that print
should properly be considered a particular form of output for digital
files rather than a medium separate from digital instantiation.
(160)
This engagement is enacted in multiple senses: technologically in the
production of textual surfaces, phenomenologically in new kinds of
reading experiences possible in digital environments, conceptually in
the strategies employed by print and electronic literature as they
interact with each other's affordances and traditions, and
thematically in the represented worlds that experimental literature
in print and digital media perform.
(161) How does the mark of the
digital relate to the subjectivities performed and evoked by today's
experimental print novels? . . . Rather than asking if there is evidence
that the “literary” novel may in fact be losing audience share to
other entertainment forms, however, [Kathleen] Fitzpatrick asks what
cultural and social functions are served by pronouncements about the
death of the print novel.
Characteristics of computer-mediated texts: imitating, layered, multimodal, storage separated from performance, fractured temporality; an excellent example is the futz time required to adequately see (run) Lexia to Perplexia.
(162)
imitating electronic
textuality through comparable devices in print, many of which depend
on digitality to be cost effective or even possible; and intensifying
the
specific traditions of print, in effect declaring allegiance to print
regardless of the availability of other media.
(163-164)
Computer-mediated text is layered.
. . . the layered nature of code also inevitably introduces issues
of access and expertise.
(164) Computer-mediated
text tends to be multimodal.
(164)
In computer-mediated text, storage is separate from
performance.
. . . code can never be seen or accessed by a user while
it is running.
(164)
Computer-mediated text manifests fractured
temporality.
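The storage/performance separation quoted above can be made concrete with a trivial sketch (mine, not Hayles'): the user of a running program sees only its output, while the stored source that generates that output never appears on screen during the run.

    # Storage separated from performance, in miniature: run as a script,
    # this file performs its output while its stored source stays hidden.
    def performance():
        return "what the user sees: 2 + 2 = " + str(2 + 2)

    if __name__ == "__main__":
        print(performance())
        with open(__file__, encoding="utf-8") as f:
            source = f.read()
        print(f"(meanwhile, {len(source)} characters of stored source stayed off-screen)")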
DIGITALITY AND THE PRINT NOVEL
Justified gimmick of Foer numerical code illustrating breakdown of language under trauma.
(166)
Why write it [numerical
code in Jonathan Safran Foer's Extremely
Loud and Incredibly Close]
in
code? Many reviewers have complained (not without reason) about the
gimmicky nature of this text, but in this instance the gimmick can be
justified. It implies that language has broken down under the weight
of trauma and become inaccessible not only to Thomas but the reader
as well.
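The numerical code is commonly glossed as telephone-keypad encoding, which makes the breakdown precise: the mapping is many-to-one, so the words cannot be recovered from the digits. A sketch (the sample word is mine, not Foer's text):

    # Telephone-keypad encoding: each letter collapses into its key's digit.
    KEYS = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
            "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}
    LETTER_TO_DIGIT = {ch: d for d, letters in KEYS.items() for ch in letters}

    def to_keypad(text):
        return "".join(LETTER_TO_DIGIT.get(ch, "") for ch in text.lower())

    print(to_keypad("help"))  # 4357 -- the same digits encode many letter sequences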
(169) The text, moving from imitation of a noisy machine
to an intensification of ink marks durably impressed on paper, uses
this print-specific characteristic as a visible indication of the
trauma associated with the scene, as if the marks as well as the
language were breaking down under the weight of the characters'
emotions. At the same time, the overlapping lines are an effect
difficult to achieve with letterpress printing or a typewriter but a
snap with Photoshop, so digital technology leaves its mark on these
pages as well.
(170) The novel remediates the backward-running
video in fifteen pages that function as a flipbook, showing the
fantasized progression Oskar has imagined (327-41).
(172) Further
complicating the ontology implicit in the book's materiality is the
partitioning of some chapters into parallel columns, typically with
three characters' stories running in parallel on a page spread, as if
imitating the computer's ability to run several programs
simultaneously.
What ontological levels are available for metafictional play in the genres of electronic literature Hayles has introduced can be related to Foucault's meditation upon what an author is.
(173)
Within the narrative world, however, this apparent imitation of
computer code's hierarchical structure is interpreted as the baby's
ability to hide his thoughts from the reader as well as from Saturn,
an interpretation that locates the maneuver within the
print novel's tradition of metafiction by playing with the
ontological levels of author, character, and reader.
(175)
In a now-familiar pattern, a technique that at first appears to be
imitating electronic text is transformed into a print-specific
characteristic, for it would, of course, be impossible to eradicate a
word from an electronic text by cutting a hole in the screen.
(175)
In House of
Leaves,
the recursive dynamic between strategies that imitate electronic text
and those that intensify the specificities of print reaches an
apotheosis, producing complexities so entangled with digital
technologies that it is difficult to say which medium is more
important in producing the novel's effects.
(177) As if
positioning itself as a rival to the computer's ability to represent
within itself other media, this print novel remediates an astonishing
variety of media, including film, video, photography, telegraphy,
painting, collage, and graphics, among others.
(178) Digital
technology functions here like the Derridean supplement;
alleged to be outside and extraneous to the text proper, it is
somehow also necessary.
Technology itself is perfectly representable whether or not its referent exists; enter the concept of epistemological transparency.
(180) Although it is true that digital technologies can create objects for which there is no original (think of Shrek, for instance), the technology itself is perfectly representable, from the alternating voltages that form the basis for the binary digits up to high-level languages such as C++. The ways in which the technology actually performs plays no part in Hansen's analysis. For him the point is that the house renders experience singular and unrepeatable, thus demolishing the promise of orthographic recording to repeat the past exactly. Because in his analogy the house equals the digital, this same property is then transferred to digital technology.
Points out Hansen ignores ability of digital technology to exercise agency.
(180-181) More important, in my view, is an aspect of digital technology that Hansen's elision of its materiality ignores: its ability to exercise agency. . . . the layered architectures of computer technologies enable active interventions that perform actions beyond what their human programmers envisioned.
Human attention occupies small plateaus in machine-to-machine communication networks; old thoughts: bad bots, participation in the process of preference formation.
(182) Increasingly human attention occupies only the tiny top of a huge pyramid of machine-to-machine communication. . . . We would perhaps like to think that actions require humans to initiate them, but human agency is increasingly dependent on intelligent machines to carry out intentions and, more alarmingly, to provide the data on which human decisions are made in the first place.
We have long been in the position that no single person can comprehend not just many but most programs and communications systems; Cantwell Smith on the emergence of complexity ties back to Socrates' question and von Neumann on automata.
(183) Although humans originally created the computer code, the complexity of many contemporary programs is such that no single person understands them in their entirety. In this sense our understanding of how computers can get from simple binary code to sophisticated acts of cognition is approaching the yawning gap in our understanding of the mechanics of human consciousness. . . . As Brian Cantwell Smith observes, the emergence of complexity within computers may provide crucial clues to “how a structured lump of clay can sit up and think.”
“There is no there there”: hunting for a homuncular thinking thing zooming through the computer's interior ties back to Dennett's theory of consciousness.
(185) Like the nothingness infecting the text's signifiers, a similar nothingness would confront us if we could take an impossible journey and zoom into a computer's interior while it is running code. We would find that there is no there there, only alternating voltages that nevertheless produce meaning through a layered architecture correlating ones and zeros with human language.
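The layered correlation can be shown in a few lines (a minimal sketch, mine): nothing at the bottom but ones and zeros, which stacked conventions turn into a word.

    # Each layer is a convention stacked on the one below.
    bits = "01101000 01100101 01110010 01100101"      # "voltages": just 1s and 0s
    numbers = [int(b, 2) for b in bits.split()]       # binary -> integers
    chars = [chr(n) for n in numbers]                 # integers -> ASCII characters
    print(bits, "->", numbers, "->", "".join(chars))  # -> the word 'here'

Fittingly, the only "there" the zoom ever finds is the word the conventions conjure: here.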
Connect novel forms of subjectivity performance implicit in the symposia project to digital humanities experimentation into auditory virtual realities philosophizing by programming, to escape Hayles' interpretation of the fate of authorial voices by positing that programming also produces one's memorable texts, narratives, voices; important now to start adding sound, specifically text-to-speech formant synthesis, to virtual reality generation for humans and computers (O'Gorman link).
(186) The subjectivity performed and evoked by this text differs from traditional print novels in subverting, in a wide variety of ways, the authorial voice associated with an interiority arising from the relation between sound and mark, voice and presence. Overwhelmed by the cacophony of competing and cooperating voices, the authority of voice is deconstructed and the interiority it authorized is subverted into echoes testifying to the absences at the center.
Hayles, N. Katherine. Electronic Literature. Notre Dame, IN: University of Notre Dame Press, 2008.