Notes for Jerome McGann Radiant Textuality: Literature After the World Wide Web

Key concepts: autopoetic phenomena, decentered text, deformative criticism, incarnational forms, poiesis-as-theory, postmodern incunable, quantum poetics, radiant textuality, Standard Generalized Markup Language, text, Text Encoding Initiative, textuality, un-editing, vehicular forms.

Good view of texts, scholarly editing theory, and the book as a knowledge machine. The same problem of duplication appears in data structures and programming, which can be a theme linking McGann's comments on scholarly editing of literary works to code studies. The possibilities of hyperediting themselves create new problems while addressing existing ones. Appreciate his detailed account of theory-informed and exploratory technological development. Could the flip side of the quest for designing ever more decentered tools be to foster user involvement in creating the interfaces? Scholarly books as postmodern incunables, like low-level programming languages, beg for poetic theorization of electronic editions. These instantiated arguments are like creating software projects to do humanities research. Like bricoleur programming, and different from the unknown knowns of psychoanalysis, he differentiates poiesis-as-theory from traditional theory. Compare to “The Alice Fallacy” and the random connections of Ulmer's mystory: although I thought it would be about developing the system itself, he focuses on great insights via poiesis-as-theory resulting from Photoshopping Rossetti images, thus more like 404 errors and glitches in Hayles, “behind the blip” of Fuller. He claims the chapter 3 appendix is the center of the book in terms of the chronological development of his thoughts about radiant textuality; it is also where I make the most connections to my own research, not only the image/text and surface/depth distinction, but also the use of revision history and source code comments to trace the evolution of theoretical thought embodied in poiesis. Deformations of software systems, such as stepping through processes and otherwise altering their normal temporal behavior, are described in Marino and others.

Related theorists: Batson, Brown, Burnard, Busa, della Volpe, Greg, Hayles, Kittler, Landow, Maturana, Patterson, Pitti, Renear, Tufte, Turkle, Varela, Williams.

Acknowledgements

Dan Pitti, whose work was mentioned at THATCamp, is noted in the acknowledgments.

How does writing a book by revising old texts resemble creating a new program from an old one?

(x) The texts of the following chapters are revised versions of essays originally published elsewhere.


Preface
(xi) Because the most advanced forms of textual codings are what we call “poetical,” the study and application of digital codings summons us to new investigations into our textual inheritance.
(xii) But
the general field of humanities education and scholarship will not take the use of digital technology seriously until one demonstrates how its tools improve the ways we explore and explain aesthetic works – until, that is, they expand our interpretational procedures.
(xii) We forget that ten years ago – I am writing this sentence in late February 2000 – the number of humanities scholars who used any computerized tools at all was relatively small.
(xii-xiii) A discontinuous historical event occurred during those ten years, and in the course of its unfolding emerged W3, the digital environment that organizes and commands the subjects of this book. To the speed and ubiquity of digital intercourse and transaction have been added interface and multimedia, and that, as the poet said, “has made all the difference.” Our sense of language will never be the same.

The Symposium too is one medium containing another; explicate my early work as fumbling towards philosophy of computing for lack of available VR hardware and technical skills.

Hypermedia are profane resurrection of once-sacred models of communication; the medium is the message.

(xii) However toddling they appear, contemporary instruments of hyper- and multimedia constitute a profane resurrection of those once-sacred models of communication. . . . In the rediscovered “Grotesque” art of the Middle Ages was heard – the metaphor is deliberately mixed – the first premonition of the famous proverb that would define the coming of the digital age a century later: the medium is the message.
(xiii) Recall that even before we began creating formal systems of visual signs – systems that generate this very sentence-object you are now reading – the language we use is woven from audible and visible elements.
(xiv) Computational systems are not designed like the first sentence of the previous paragraph. They are designed to negotiate disambiguated, fully commensurable signifying structures.
(xiv) It is organized to show how the work at the University of Virginia's Institute for Advanced Technology in the Humanities (IATH) from 1993 to 2000 led to the practical implementation of catastrophe and quantum models for the critical investigation of aesthetic forms.

Print is flat but expresses human complexities requiring quantum models; code is deep but based on von Neumann architecture designed to negotiate disambiguated, fully commensurable signifying structures.

(xiv) The empirical data of consciousness are texts and semiotic phenomena of all types—“autopoetic” phenomena, in the terms of Humberto Maturana and Francisco Varela. This book will argue that our “classical” models for investigating such data are less precise than they might be and that quantum dynamical models should be imagined and can be built.
(xv) “Quantum poetics” in this study does not signify certain figures and tropes that stimulated the practices of a certain group of historically located writers. On the contrary, it comprises a set of critical methods and procedures that are meant to be pursued and then applied in a general way to the study of imaginative work.
(xv) We propose to build it in the hope that it may stimulate others to develop and build more adequate critical tools.

Note on the Text


Introduction
Beginning Again: Humanities and Digital Culture, 1993-2000

(1-2) Before 1993 the computerized future of our humanities inheritance was apparent to a relatively small group of librarians and archival scholars and to very few other people in literary and cultural studies. . . . The situation in 2000 is different, for many educators now understand that our inherited archive of materials in libraries and museums will have to be re-edited with information technology tools.
(2) But also now in 2000 some are being pushed further by the inertia of the new tools being placed at our disposal. Ideas about textuality that were once taken as speculative or even imaginary now appear to be the only ones that have any practical relation to the digital environments we occupy every day.

Good view of texts and textuality.

(2) Reconnecting with certain performative and rhetorical traditions, however, writers like Jarry laid a groundwork for post-romantic procedural writing. They began to make clear once again the constructed character of textuality – the fact that texts and documents are fields open to decisive and rule-governed manipulations. In this view of the matter, texts and documents are not primarily understood as containers or even vehicles of meaning. Rather, they are sets of instantiated rules and algorithms for generating and controlling themselves and for constructing further sets of transmissional possibilities.

Points of Departure
(3) The humanities computing work sponsored by IATH – a large array of research projects in texts, media, images, and information – can now be seen to mark the end of a first and distinct phase in the history of humanities computing.

Use of IT in humanities beginning with Busa.

(3) The use of IT in humanities disciplines began in the late 1940s with Father Roberto Busa SJ, whose work on the corpus of St. Thomas Aquinas set the terms in which humanities computing would operate successfully for more than 40 years. Two lines of work dominate the period: first, the creation of databases of humanities materials – almost exclusively textual materials – for various types of automated retrieval, search, and analysis; second, the design and construction of statistical models for studying language formalities of many kinds, ranging from social and historical linguistics to the study of literary forms.
(4) To the degree that IT attracted the attention of humanities scholars, the interest was largely theoretical, engaging the subjects of media and culture in either speculative and relatively abstract ways or journalistic treatments.

Signal event of development of TEI.

(4) So far as the humanities are concerned, the signal event was the development of TEI (Text Encoding Initiative).
(4) These dates and events are important because of what happened in the larger world of IT between 1993 and 1994: the definitive appearance of the W3. . . . The scholarly meetings and journal devoted to humanities computing show with unmistakable clarity, however, that few people in those communities registered the importance of W3.
(5) The upside of these events was the coming of a large and diverse population of new people into digital fields previously occupied by small and tightly connected groups. More significantly, they came to build things with digital tools rather than simply to reflect abstractly on the new technologies.

Humanities Computing at the University of Virginia: 1992-1993

IATH was created at UVA through an IBM offer, evolving through a randomized state of affairs; compare to Hayles's account of the development of the shape and focus of cybernetics.

(6) Later that same year IBM approached UVA's computer science department with an offer of $1 million in equipment for educational use over a three-year period. Two CS faculty members, Alan Batson and Bill Wulf, contacted two humanities professors, Ed Ayers and myself, to see if IBM's offer might be useful to people in the arts and sciences division of the university.
(6) Because IATH came into being fortuitously, its shape and focus evolved through a randomized state of affairs.

The Idea of IATH

Batson wanted IATH to promote specific, demonstrable projects rather than making equipment available as soon as possible.

(7) The overwhelming initial answer to the central question was that the equipment should be made available as soon as possible to all arts and sciences departments for as long as possible.
(7)
Batson's model was different: to seek out projects with demonstrable intellectual importance for humanities scholarship and to fund those projects as completely as possible with the technical resources the projects need. His rationale: “Educational change at the level of the university is driven by the active research work of the faculty. Changes in pedagogy and classroom dynamics follow from research.”
(9) It is a fact that right now one can function most effectively as a university scholar and teacher by working within the paper-based system we inherit. (This moment, this “now,” is quickly passing away.)
(10) A crucial factor in UVA's involvement with humanities computing was the close liaison that was fostered from the start between IATH and the library.
(10) Finally, the remarkable success of IATH resulted in major part because its work from the outset was consciously developed in relation to W3.

The Rossetti Archive and the Theory of Scholarly Editing
(11) But what precisely was involved in
The Rossetti Archive's image-based approach to its materials? I can pose this question now because the hindsight of seven years has exposed how loosely and unselfconsciously we undertook our work with digital images.

Scholarly editing theory actively evolved through work on the Rossetti Archive; compare to Burnard.

(11-12) The Rossetti Archive was undertaken as a practical effort to design a model for scholarly editing that would have wide applicability and that would synthesize the functions of the two chief models for such works: the critical edition (for analyzing the historical relations of a complex set of descendant texts with a view toward locating accumulated linguistic error); and the facsimile edition (a rigorously faithful reproduction of a particular text, usually a rare work, for scholarly access and study). . . . The theory holds two positions: first, that the apparitions of text – its paratexts, bibliographical codes, and all visual features – are as important in the text's signifying programs as the linguistic elements; second, that the social intercourse of texts – the context of their relations – must be conceived an essential part of the “text itself” if one means to gain an adequate critical grasp of the textual situation.
(12) We spent the year from 1992 to 1993 theorizing the methodology of the project and designing its logical structure. Then in 1993 we built the first small demonstration model of
The Rossetti Archive, which at that time I described in the following general terms.

Beginning Again
(15) Because
The Rossetti Archive was conceived and pursued, early on, as much as a kind of thought experiment in the theory of texts as an editorial project per se, it kept a constant focus on reflexive attention.
(15) The work of those writings has been recomposed into the parts of this book, which is organized around a double vanishing point. In one perspective appears a set of related but independent explorations into the characteristics of different kinds of textualities. In another, one follows a kind of metanarrative or critical history projecting a map of future scholarly operations.

Tension between real demo and imaginary objective similar for symposia.

(16) One wants to hold this initial situation clearly in mind, for the contradiction between the web demo model, a simple visual interface built in HTML, and the archive itself, a set of logical relations and determinants conceived in SGML, would surface repeatedly in all our work.
(17) More than anything else, the making of
The Rossetti Archive has exposed the gulf that stands between digital tools and media, on one hand, and the regular practices of traditional philosophy, “theory,” hermeneutics, and arts/literary/cultural criticism, on the other.

Digital humanities scholarship misses depth by not building critical and reflective functions into the deep components; compare to discussions of unknown knowns, Reddell.

(17) Works like The Rossetti Archive or The Perseus Project or The Dickens Web are fundamentally archival and editorial. . . . Unlike works imagined and organized in bibliographical forms, however, these new textual environments have yet to develop operational structures that integrate their archiving and editorial mechanisms with their critical and reflective functions at the foundational level of their material form, that is, at the digital/computational level. . . . Thus, however primitive hyperfiction and video games may seem, we recognize their functional relation to their underlying digital processes.

Creating software to pursue humanities scholarship with incessant reflection on the design processes, making things, as he says next.

(18-19) As it emerges around us, it exposes our need for critical tools of the same material and formal order that can execute our other permanent scholarly function: to imagine what we don't know in a disciplined and deliberated fashion. How can digital tools be made into prosthetic extensions of that demand for critical reflection? . . . The next generation of literary and aesthetic theorists who will most matter are people who will be at least as involved with making things as with writing texts.

Need to reconsider fundamental problems of texts and textuality; his role is to write one more book, mine is to probe these questions by working code.

(19) These kinds of issues won't be usefully engaged without reconsidering certain fundamental problems of texts and textuality.


Part I
Hideous Progeny, Rough Beasts: 1993-1995

Wild, putatively incorrect interpretation of a poem breathes new life into otherwise stale interpretation, inspiring The Alice Fallacy.

(24) My argument begins, therefore, with a performance definition of the state of my thinking in 1993, just before I undertook The Rossetti Archive, about problems of scholarly method and aesthetic interpretation. Like its companion dialogues, “The Alice Fallacy” is an open-ended inquiry into current ideas about textuality, on one hand, and interpretive method, on the other.

Rossetti Archive developed to demonstrate feasibility of alternative approach to textuality and editing theory presented in his book Critique of Modern Textual Criticism: an act of critical programming?

(25) The Rossetti Archive was developed in major part to demonstrate the practical feasibility of the Critique's [McGann's A Critique of Modern Textual Criticism] alternative approach to textuality and editing theory.
(25) “The Rationale of Hypertext” is thus a coda to the
Critique. It argues that textuality as such operates as a radiant and decentered structure.
(26) “Editing as a Theoretical Pursuit” was the dialectical offspring of “The Rationale of Hypertext.” In a chronological sense, it is the pivotal chapter of this book.
(27) Just as we wanted to use the development of the archive as a practico-theoretical instrument for investigating the nature of texts and textuality, we saw our arrangement with the press as an opportunity for extending those investigations into the social and institutional aspects of the work.

Chapter 1
The Alice Fallacy; or, Only God Can Make a Tree
A Dialogue of Pleasure and Instruction


Chapter 2
The Rationale of Hypertext
(54) In this chapter I focus primarily on a particular feature of literary works—their physical character, whether audial or visible.
(54) My remarks here apply only to textual works that are instruments of scientific knowledge. The poet's view of text is necessarily very different. To the imagination the materialities of text (oral, written, printed, electronic) are incarnational, not vehicular forms.
(54) As we shall see, efforts toward “rethinking textuality”—traditional as well as digital—are impeded by the uncritical assumption of the authority of this same distinction.

The Book as a Machine of Knowledge
(55) We no longer have to use books to analyze and study other books or texts. That simple fact carries immense, even catastrophic, significance. . . . In literary studies, the book has evolved (over many centuries) a set of scientific engines—specific kinds of books and discursive genres—of great power and complexity. Critical and other scholarly editions of our cultural inheritance are among the most distinguished achievements of our profession.

Expands on claim by Kittler that it is challenging to study media because the study itself takes place with and through media.

(56) Their problems arise because they deploy a book form to study another book form. This symmetry between the tool and its subject forces the scholar to invent analytic mechanisms that must be displayed and engaged at the primary reading level—for example, apparatus structures, descriptive bibliographies, calculi of variants, shorthand reference forms, and so forth.

Same problem duplication in data structures and programming, which can be a theme linking McGann to code studies.

(56) Because the entire system develops through the codex form, however, duplicate, near-duplicate, or differential archives appear in different places. The crucial problem here is simple: The logical structures of the “critical edition” function at the same level as the material being analyzed.
(56-57) When a book is translated into electronic form, the book's (heretofore distributed) semantic and visual forms can be made simultaneously present to each other.

Hayles may object to his assumptions about embodiment that are based on print culture.

(57) Of course, the electronic text will be “read” in normal space-time, even by its programmers: the mind that made (or that uses) both codex and computer is “embodied.”

Hyperediting and Hypermedia
(58) In my own view, for example, a fully networked hypermedia archive would be an optimal goal. Because such an archive is not yet a practical achievement, however, one must make present design decisions in a future perfect tense. What that means in practice is the following: (1) that the hyperediting design for a specific project be imagined in terms of the largest and most ambitious goals of the project (rather than in terms of immediate hardware or software options); and (2) that the design be structured in the most modular and flexible way, so that the inevitable and fast-breaking changes in hardware and software will have a minimal effect on the work as it is being built.
(58) Hypermedia editions that incorporate audial and/or visual elements are preferable since literary works are themselves always more or less elaborate multimedia forms.

The Necessity of Hypermedia
Example A

(59) First, then, think about songs and ballads—think in particular about Robert Burns's ballad “Tam Glen.”
(60) What if one could have a critical edition of Burns's work in audial forms that allowed one to engage the songs in the same kind of scholarly environment that we know and value in works like the Kinsley/Clarendon edition?

Example B
(61) Just as Burns's poetry almost always exploits the language's auditional forms and materials, Blake's almost always exploits the print medium for expressive effects. A text of Blake's
Songs, for example—whether critical or otherwise—that does not at a minimum give us a color facsimile, is simply an inadequate text.
(62) Because Blake's texts operate simultaneously in two media, an adequate critical edition would have to marry a complete facsimile edition of all copies of Blake within the structure of a critical edition. One needs in such a case not a critical edition of Blake's work but a critical archive. . . . Hypermedia structures for the first time make this kind of archive possible; indeed, work toward the development of such a Blake archive is now underway.

Example C
(62) It has taken 100 years for scholars to realize that a typographical edition of Dickinson's writings—whether of her poetry or even her letters—fundamentally misrepresents her literary work.
(63) Dickinson set up a kind of gravitational field for her writing when she fixed an uncanceled three-cent stamp (with a locomotive design) to a sheet of paper and then wrote her poem in the space she had thus imaginatively created. Whatever this poem “means,” the meaning has been visually designed—more in the manner of a painter or a graphic artist than in the manner of writers who are thinking of their language in semantic or—more generously—linguistic terms.
(63) To edit her work adequately, then, one needs to integrate the mechanisms of critical editing into a facsimile edition—which is precisely the kind of thing that codex-based editing finds exceedingly difficult to do.

Example D
(66-67) In this example from Landon I have allowed myself to range beyond bibliographical issues into interpretive commentary. I have done this because literary history has long invisibilized Landon and the gift book traditions she used. . . . In Landon's case, the pictorial and ornamental context of gift book production can be torn away from her work only at the cost of its destruction.

Example E
(68) Mark Reed narrativized the “Five Book”
Prelude for one reason only: The book format (including the commercial factors governing that format) did not lend itself to printing yet another Prelude volume in the Cornell series. Too much of the material was viewable in the other volumes.
(68) In the computerized edition, however, the reader does not have to learn or even encounter the codes in order to execute critical operations (e.g., moving back and forth across different parts of books or separate volumes, carrying out analytic searches and comparisons).

Conclusion: The Rossetti Hypermedia Archive

Possibilities of hyperediting themselves create new problems while addressing existing problems.

Appreciate his detailed account of theory-informed and exploratory technological development as a prototype for critical programmer.

(69) How to incorporate digitized images into the computational field is not simply a problem that hyperediting must solve; it is a problem created by the very arrival of the possibilities of hyperediting. . . . Those of us who were involved with The Rossetti Archive from the beginning spent virtually the entire first year working at this problem. In the end we arrived at a double approach: first, to design a structure of SGML markup tags for the physical features of all the types of documents contained in The Rossetti Archive (textual as well as pictorial); and second, to develop an image tool that permits one to attach anchors to specific features of digitized images.

Coda. A Note on the Decentered Text
(72) The theory of hypertext flows directly from this way of imagining a noncentralized structure of complex relationships.
(72) This kind of organizational form resembles our oldest extant hypertextual structure, the
library, which is also an archive (or in many cases an archive of archives).
(73-74) The exigencies of the book form forced editorial scholars to develop fixed points of relation—the “definitive text,” “copy text,” “ideal text,” “Ur text,” “standard text,” and so forth—in order to conduct a book-bound navigation (by coded forms) through large bodies of documentary materials. Such fixed points no longer have to govern the ordering of the documents. As with the nodes on the Internet, every documentary moment in the hypertext is absolute with respect to the archive as a whole, or with respect to any subarchive that may have been (arbitrarily) defined within the archive. In this sense, computerized environments have established the new “rationale of hypertext.”
(252 endnote 10) Textual scholars will understand that this chapter stands in a consciously revisionist relation to W. W. Greg's great essay “The Rationale of Copy-Text.”
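McGann's decentered, rootless organization can be sketched as working code (my illustration, not anything from the archive; the document names are hypothetical): the archive as a graph in which any node can anchor a traversal and none plays the role of copy-text.

# A rootless archive: documents related without a privileged copy-text.
# Document names are hypothetical.
archive = {
    "ms-draft-1847": {"proof-1850", "ballad-ms"},
    "proof-1850": {"ms-draft-1847", "first-ed-1851"},
    "first-ed-1851": {"proof-1850"},
    "ballad-ms": {"ms-draft-1847"},
}

def neighborhood(doc, depth=1):
    """Any node is 'absolute': it can anchor a traversal of the whole."""
    seen, frontier = {doc}, {doc}
    for _ in range(depth):
        frontier = {n for d in frontier for n in archive[d]} - seen
        seen |= frontier
    return seen

print(neighborhood("proof-1850", depth=2))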

He seems to be taking the designer, system-centric perspective: could the flip side of the quest for designing ever more decentered tools be to foster user involvement in creating the interfaces?

(74) The interface one encounters in the actual Rossetti archive is, in fact, anything but decentered. In this respect it is quite like every other scholarly and educational hypertext work known to me—The Perseus Project, say, or any of George Landow's “webs.” All are quite “centered” and even quite nondynamical in their presentational structure. We want to be aware of this since a major part of our future with these new electronic environments will be the search for ways to implement, at the interface level, the full dynamic—and decentering—capabilities of these new tools.


Chapter 3
Editing as a Theoretical Pursuit
(75) According to Kane and Donaldson, their edition [of
Piers Plowman] is “a theoretical structure, a complex hypothesis designed to account for a body of phenomena in the light of knowledge about the circumstances which generated them” (212).
(76) Only another theory of the work
that instantiates itself as a comprehensive edition could supplant the authoritative truth of the Kane-Donaldson text.
(76) Why this requirement should be the case is one part of my subject. The other part, which is related, concerns procedures of theoretical undertaking as such. In this last respect we will be focusing on electronic textuality.
(77) Over against these two theoretical approaches to editing stands that great tradition of what Randy McLeod would call (I think) “
un-editing”: that is, the scholarly reproduction of text in documentary forms that reproduce more or less adequate replicas of the originary materials.

Scholarly books as postmodern incunables, like low-level programming languages, beg for poetic theorization of electronic editions, high-level languages.

(79) At once very beautiful and very ugly, fascinating and tedious, these books drive the resources of the codex to its limits and beyond. Think of the Cornell Wordsworth volumes, a splendid example of a postmodern incunable. Grotesque systems of notation are developed in order to facilitate negotiation through labyrinthine textual scenes. To say that such editions are difficult to use is to speak in vast understatement. But their intellectual intensity is so apparent and so great that they bring new levels of attention to their scholarly objects. Deliberate randomness attends every feature of these works, which are as well read as postmodern imaginative constructions as scholarly tools.

Theoretical Embodiments
(79) Most scholarly editions follow a path of what we call “normal science.” . . . But they [Lachmann's
Lucretius, Bowers's Dekker, the Kane-Donaldson Piers Plowman] also seek discoveries that stand beyond the purposes of customary scholarly practices.

Instantiated arguments are like creating software projects to do humanities research.

(80) Most important, their arguments are not made abstractly, nor even through a set of illustrative examples. They are instantiated arguments—what William Carlos Williams called “The Embodiment of Knowledge”—and they call attention to the theoretical opportunities involved in making an edition. The totalized factive commitments and obligations of an editorial project open into a theoretical privilege unavailable to the speculative or interpretive essay or monograph.

Finally his point about what makes electronic texts special.

(81) In a certain sense all editions end up doing that. Shakespeare and the Bible and our entire archive of textual works undergo repeated re-editing because we respond to the inadequacies and limits of previous editions. But electronic texts have a special virtue that paper-based texts do not have: They can be designed for complex interactive transformations.

Horizons of Failure
(82) Translating paper-based texts into electronic forms entirely alters one's view of the original materials. So in the first two years of the archive's development we were forced to study a fundamental limit of the scholarly edition in codex form that we had not been aware of.

Ong and McLuhan make this point that only through study of new media systems are the limits and affordances of older media systems revealed.

(82) We began our work of building the archive under an illusion or misconception—in any case, a transparent contradiction: that we could know what was involved in trying to imagine what we didn't know.

Like bricoleur programming versus hard mastery in Turkle, and different from the unknown knowns of psychoanalysis, he differentiates poiesis-as-theory, an instrument- or machine-making engineering project, from traditional theory.

(83) In terms of that ancient distinction, The Rossetti Archive is a poiesis, although modern disciplinary conventions would see it as a kind of engineering project—instrument or machine-making. Patterson's discussion of the Kane-Donaldson edition of Piers Plowman implicitly affirms the same kind of distinction, where “theory” operates through concrete acts of imagining.
(83) The close relation it bears to artistic work is important because
poiesis-as-theory makes possible the imagination of what you don't know. Theory in the other sense—for instance, Heideggerian dialectic—is a procedure for revealing what you do know but are unaware of.
(83-84) The commitment of
The Rossetti Archive to elucidating digital images kept generating one of its most important series of logical impasses and failures (for more on this, see the appendix). The problem—the images' recalcitrance to analytic treatment—meant that we were continually kvetching over the matter, which in turn meant that I would often find myself engaging the problem in undisciplined ways and circumstances.

Gives example of making crucial design discoveries by playing with Photoshop deformations: does this relate to Hayles's notions of transformations of subjectivity?

(85) What is important—and new—about our electronic deformations, however, is their arbitrary character. . . . The deformed images suggest that computerized art editing programs can be used to raise our perceptual grasp of aesthetic objects.
(85) There are critical opportunities to be exploited in the random use of these kinds of deformation.
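His Photoshop deformations are not reproduced here, but a minimal sketch of the kind of arbitrary, noninterpretive transformation he describes might look like this (my illustration; assumes Pillow is installed and a hypothetical local image file):

from PIL import Image, ImageOps

# Arbitrary deformations in the spirit of the Photoshop experiments:
# none "interprets" the picture; each re-exposes it to perception.
original = Image.open("blessed_damozel.jpg").convert("RGB")  # hypothetical file

deformations = {
    "inverted": ImageOps.invert(original),
    "mirrored": ImageOps.mirror(original),
    "posterized": ImageOps.posterize(original, 2),
    "one_channel": original.getchannel("R"),  # strip to a single color channel
}

for name, image in deformations.items():
    image.save(f"damozel_{name}.png")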

Compare to “The Alice Fallacy” and the random connections of Ulmer's mystory: although I thought it would be about developing the system itself, he is focusing on great insights via poiesis-as-theory resulting from Photoshopping Rossetti images, thus more like 404 errors and glitches in Hayles, “behind the blip” of Fuller.

(86) But that revelation was unusual, and it seemed clear to me that the deformations largely functioned in a pedagogical way. Insofar as these images brought an imagination of the unknown, they were pointing to the image editor as a critical and interpretive tool.
(86) The critical force of the Photoshop deformations develops from their ability to expose matters that will be generally recognized, once they are seen.
(86) Strange images evoke our interest exactly because they don't pretend to supply us with a generic response to the picture.
(86-87) Distortion and original stand in immediately dialectical relation to each other. . . . This is an ancient way of engaging art that was revised in symbolist and surrealist practice. Not surprisingly, it is a view that Rossetti shared.

Appendix to Chapter 3

He claims this appendix is the center of the book in terms of the chronological development of his thoughts about radiant textuality; it is also where I make the most connections to my own research, not only the image/text and surface/depth distinction, but also the use of revision history and source code comments to trace the evolution of theoretical thought embodied in poiesis.

(88) One kind of project is presentational, designed for the mind's eye (or the eyes' mind); the other is analytic, a logical structure that can free the conceptual imagination of its inevitable codex-based limits. The former tend to be image-oriented, the latter to be text-based.

Image/text is another expression of surface/depth; he invokes the Unix/Mac split and claims consciousness of this division is built into his project: putting aside his critical engagement, his instrumental engagement with technology through texts is visual, oriented to sight rather than sound, although he does mention the audial a few times; jump down to his description of Mallarmé equivocating digital and book characteristics, and to the concept of the Ivanhoe Game as a model for future virtual reality digital humanities scholarship projects.

(89) The computerized imagination is riven by this elementary split, as everyone knows. It replicates the gulf separating a Unix from a Mac world. It also represents the division upon which The Rossetti Archive was consciously built.
(89) We arrived at two schemes for achieving what we wanted. One involved a piece of original software we would develop, now called Inote. The other plan was to develop an SGML markup design that would extend well beyond the conceptual framework of TEI, the widely-accepted text markup scheme that had spun off from SGML.
(89) The overlapping structures of literary works and their graphical design features are not easily addressed by TEI markup.
(90) Texts have bibliographical and linguistic structures, and those are riven by other concurrencies: rhetorical structures, grammatical, metrical, sonic, referential. The more complex the structure the more concurrencies are set in play.
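The concurrency problem is easy to reproduce in miniature: a sentence that crosses a verse-line boundary cannot nest inside both the metrical and the syntactic hierarchy at once. A sketch of the standard milestone workaround (my illustration, not the Rossetti Archive DTD; the lines are compressed from “The Blessed Damozel”):

import xml.etree.ElementTree as ET

# The line hierarchy nests cleanly; the overlapping sentence survives only
# as paired empty "milestone" tags and must be reconstructed by a second pass.
doc = """<poem>
  <line>The blessed damozel leaned out <sStart/>from the gold bar of Heaven;</line>
  <line>her eyes were deeper than the depth<sEnd/> of waters stilled at even.</line>
</poem>"""

root = ET.fromstring(doc)
for line in root.iter("line"):
    print(ET.tostring(line, encoding="unicode").strip())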

Iterative bootstrapping development familiar to any bricoleur: theorizing via revision history and comments, as I am doing in software source code, is the big insight he and I both want to leverage as an advance in humanities scholarship emerging from engaging computer technologies as producers.

(91) Our plan was to use the construction process as a mechanism for imagining what we didn't know about the project. In one respect we were engaged in a classic form of model-building, whereby a theoretical structure is designed, built, and tested, then scaled up in size and tested at each succeeding junction. The testing exposes the design flaws that lead to modifications of the original design. That process of development can be illustrated by looking at one of our SGML markup protocols—the DTD for marking up every Rossetti archive document (or RAD). . . . My interest here is not in the SGML design as such but in the record of modifications to the design. That record appears as the list of dated entries at the top of the document.
(91) A great many modifications to the initial design were made during that year, but we did not at first think to keep a systematic record of the changes.

Seems to be attributing a cognitive role to the evolving archive as an IT integration, following the human-computer cyborg Hayles articulates in Electronic Literature: what made sense at the time as iterative changes to the protocols that made the system work better in retrospect reflects the discovery of unknowns, as if the result of Socratic self-questioning.

(91) Second, the record does not indicate certain decisive moments when the archive was discovering features of itself it was unaware of. In these cases no actual changes were made to the DTDs.
(92) Rossetti is one of the first modern artists to take a serious interest in photographs.

Leveraging out-of-copyright photographs Rossetti took of his own paintings solves an economic problem and invites new theoretical speculation. Consider the effects of copyright and licensing factors in studying software.

(92) The move allows us to temporize on the extremely vexed issue of copyright. . . . On one hand we now comprehensively represent Rossetti's visual work in the medium that was probably its major early disseminating vehicle. On another, we create a digital archive of great general significance for studying both the history of photography and the history of painting.
(93) Working out this scheme for collating Rossetti's texts revealed an interesting general fact about electronic collating tools: that we do not yet have any good program for collating units of prose texts. . . . The person who discovers a reasonably simple solution to this problem will have made a signal contribution not just to electronic scholarship but to the theoretical understanding of prose textuality in general.
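A naive word-level collation with Python's difflib shows why: the tokenization itself already begs the question of what the unit of prose variation is (my sketch; the two witnesses are invented):

import difflib

# Naive word-level collation of two invented prose witnesses. The hard
# question, what counts as the unit of prose variation, is exactly
# what this crude whitespace tokenization begs.
a = "The blessed damozel leaned out from the gold bar of Heaven.".split()
b = "The blessed damozel leant out from the golden bar of Heaven.".split()

for op, i1, i2, j1, j2 in difflib.SequenceMatcher(a=a, b=b).get_opcodes():
    if op != "equal":
        print(f"{op}: {' '.join(a[i1:i2])!r} -> {' '.join(b[j1:j2])!r}")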

Future work and intellectual problems revealed by the comments recording their iterative development of DTD tags, which constitute the history of RAD revision: collating units of prose texts, general problems of concurrency, limitations of SGML software, and text itself.

(93) But let us return to the history of RAD revision. Look at the notation for 14 June 1995:
<!-- revised: 14 Jun 95 to add group to rad for serials -->
A large-scale change in our conception of the archive's documentary structure is concealed in this small entry.
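Since I want to use revision history the same way, the dated comments could be mined programmatically. A minimal sketch (the comment format is generalized from the single example quoted above; the second entry and the element declaration are invented for illustration):

import re

# Pull the dated revision comments out of a DTD.
dtd = """<!-- revised: 14 Jun 95 to add group to rad for serials -->
<!-- revised: 2 Aug 95 hypothetical later revision for illustration -->
<!ELEMENT rad - - (radheader, body)>"""

pattern = re.compile(r"<!--\s*revised:\s*(\d{1,2}\s+\w{3}\s+\d{2})\s+(.*?)\s*-->")
for date, rationale in pattern.findall(dtd):
    print(date, "|", rationale)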
(94) That practical insight, however, was not nearly so interesting as the insights we gained into general problems of concurrency and into the limitations of SGML software.
(95) As others looked for features that would answer their interests, Inote emerged as a device for editing images with multiple-style overlays that, if clicked, would generate a text file carrying various annotations to the image. These annotations would be saved as part of the total archive structure and hence could be imbedded with hypertext links to other images or archival documents.
(96) We began by posing the question “what is the formal structure of a text page?”

Definition of text as rhetorical sequence organized by page unit with assumed organization.

Inote theoretical practice revealed ideas about computerizing text in relation to image database.

(96) We assume that a “text” is a rhetorical sequence organized by units of page, with each page centrally structured in terms of a sequence of lines commonly running from top to bottom, left to right, and within some set of margins (which may be reduced to nil [practically] on any side).
(97) The practice of the theory of Inote revealed some interesting ideas about computerizing textual materials in relation to a database of images. . . . Only if the basic unit is the page (or the page opening) can the lineation in the digital image be logically mapped to the SGML markup structure. Of course if SGML software were able to handle concurrent structures, this consequence would not necessarily follow.
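A minimal data-structure sketch of that page assumption (mine, not Inote's actual format): a page as an ordered sequence of lines, each optionally carrying a rectangular anchor into the page image, so image annotations and markup stay in register.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Line:
    text: str
    # Bounding box (x, y, width, height) in the page image, so an
    # Inote-style annotation can anchor to the region carrying this line.
    zone: Optional[Tuple[int, int, int, int]] = None

@dataclass
class Page:
    image_file: str
    lines: List[Line] = field(default_factory=list)

page = Page("rad_manuscript_p1.jpg", [
    Line("The blessed damozel leaned out", zone=(120, 340, 900, 48)),
    Line("From the gold bar of Heaven;", zone=(120, 396, 820, 48)),
])
for number, line in enumerate(page.lines, start=1):
    print(number, line.text, "->", line.zone)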


Part II
Imagining What You Don't Know: 1995-1999

(101) For [Lisa] Samuels (a poet), the resistance that aesthetic form raises against a “translation back to knowledge” is not cognitively “useless . . . private and incommunicative,” as knowledge professions from at least the time of Plato have commonly assumed or argued. On the contrary, “knowledge is also—perhaps most importantly—what we do not yet know.”
(101-102) For all its usefulness, twentieth-century theory of the sign has helped to perpetuate an uncritical understanding of the illusion of transcendental form. If we reflect on the materiality of the sign, however, semiosis emerges to our view as a system of deliberated transformations with no untransformed origin or end. The transcendental sign is a signifying transformation mapped on the discourse of formalization, where it serves an important heuristic function for both reflective and procedural thinking.

Invitation to compare deformative interpretation of poetry with deformations that occur in everyday working code.

(102) Galvano della Volpe's important insight was to see interpretation as an interface for organizing and generating critical thinking. An interpretation so-called makes a record of a particular act of critical reflection and analysis. This record is at the same time an algorithm for generating further reflection and analysis, starting with the record itself. In this respect the record is less clearly understood as a meaning or even a form than as a program, in the computational sense of the term.

Compare the invocation of Hockey to Hayles's and Turkle's way of casting questions: what does it mean that society asks these questions about technology, rather than asking about the implications of the answers?

(102) The object of critical reflection is not ultimately directed to the sign as such but to the rhetorical scene and its functional (social) operators, not least of all the person(s) engaged in the acts of deformance we commonly locate in a field headed “Interpretation.”
(103) Susan
Hockey organized an important occasion for addressing the first question, “What is Text?” . . . What were the constraints keeping the vast majority of “scholars of the book” from serious practical engagements with this new medium of textuality?

How to do humanities with computers?

(103) “Whereas we thrive in a world of analogues and fuzzy logic, computers exploit a different type of precision.” What if the point were not to try to bridge that gap but to feed off and develop it? Meditating that question is the recurrent object of this book's last five chapters. All move in pursuit of a new ground on which to build computerized tools that generate and enhance critical reflection. . . . Electronic or not, our tools are prostheses for acting at a distance. It is exactly that distance that makes reflection possible.


Chapter 4
Deformance and Interpretation
(with Lisa Samuels)

A Question of Interpretation

Is this the sort of conclusion apparent from texts and technology studies?

Bogost unit operations.

(106) To understand a work of art, interpreters try to close with a structure of thought that represents its essential idea(s).
(106) We will argue that concept-based interpretation, reading along thematic lines, is itself best understood as a particular type of performative and rhetorical operation.

Reading Backward
(107) But the
Convivio [of Dante] is not only a model of thematized interpretation: When we recall its rhetorical context we see a very different dynamic at work. That context exposes the Convivio as one of our best and earliest examples of reading “backward” within an interpretive tradition (as opposed to Dickinson's performative tradition).
(108) Emily Dickinson's thought is different. When she talks of reading poems backward she is thinking of recitation, whether silent or articulated. She proposes that an intellectual “overtaking” may come if one recites a poem from end to beginning, last line to first line (or is it last word to first word?).
(108) In this perspective, the critical and interpretive question is not “what does the poem mean?” but “how do we release or expose the poem's possibilities of meaning?”
(109) We use Dickinson's proposal for reading poems backward, then, as an emblem for rethinking our resources of interpretation. It is a splendid model for what we would call
deformative criticism. . . . Dickinson's is a protomodernist strategy of estrangement. . . . Dickinson's critical model is performative, not intellectual.

Interpretation as Performance: The Case of Dante, the Coda of Shelley

Dante Convivio as model for hermeneutics, reading backward.

(109) Reading backward is a deformative as well as a performative program.
(110) Coming before the historical period when prose gained its scientistic function, the
Convivio is especially important: for it is also the work that models and licenses many of our most basic hermeneutic procedures.

Intuiting machine embodiment where human limits are clearly crossed.

(112) The poem's ragionamento [meaning and information] is regularly exposed to its human limits through a formal devotion to the artifices of surprising pleasures. Paradoxically, then, this structure of pleasure works to draw the intellect beyond what it is able to imagine. In this sense, the elementary, linguistic pleasure of verse becomes the manifest form of divine presence.
(112) The turn of poiesis from performance to deformance marks an epoch when Dantean
ragionamento, the dream vision of enlightenment, had grown vexed to scientistic nightmare. No one exposes this turn of events better than Shelley, whose allegiance to Dante's visionary hopes is unmistakable.
(113)
Epipsychidion is a love poem that realizes a dysfunction between desire and action.

From Performance to Deformance
(114) criticism (scholarship as well as interpretation) tends to imagine itself as an informative rather than a deformative activity. . . . Here we want to point out that lines of performative and deformative critical activity have always existed. Editions and translations are by definition performative. Elaborate scholarly editions foreground their performative characteristics. Sometimes translators do the same.

Forbidden zone of deformative scholarship; are Ulmer and O'Gorman deformative scholars?

(114-115) Deformative scholarship is all but forbidden, the thought of it either irresponsible or damaging to critical seriousness. . . . Despite its bad eminence, forgery is the most important type . . . Sortes Virgilianae and subjective appropriations of poetical works are types of interpretive deformation. So are travesty retextualizations, both deliberate and unpremeditated.
(115) The reluctance shows, more interestingly, that interpreters—even radical ones—do not commonly locate hermeneutic vitality in the documentary features of literary works. Because meaning is assumed to develop as a linguistic event, critical deformance plays itself out in the field of the signifieds.

Operating system metaphor for basic units of language: when language is not artificial, deformative decompositions yield surprises; for artificial languages, it is the basic tenet of epistemological transparency that sustains our faith in their reliable operation.

(115) These forms are so basic and conventionally governed—they are alphabetical and diacritical; they are the rules for character formation, character arrangement, and textual space, as well as for the structural forms of words, phrases, and higher morphemic and phonemic units—that readers tend to treat them as preinterpretive and precritical. In truth, however, they comprise the operating system of language, the basis that drives and supports the front-end software.
(116) The computing metaphor explains why most readers don't fool around with these levels of language. To do so entails plunging to deep recesses of textual and artifactual forms. . . . Reading backward is a critical move that invades these unvisited precincts of imaginative works. It is our paradigm model of any kind of deformative critical operation.

Deformations of software systems, such as stepping through processes and otherwise altering their normal temporal behavior, are described in Marino and others, and can result in dramatic exposure of subjectivity as a lively option for interpretive commentary.

Deformative practice is what hacking meant for some time, too (thinking of what we did to Apple II games and called hacking).

(116) For more important is the stochastic process it entails. . . . When we run the deformative program through a particular work we cannot predict the results.
(116) Not the least significant consequence, as will be seen, is the dramatic exposure of
subjectivity as a live and highly informative option of interpretive commentary.

Examples and Experiments
(116-117) Pictorial deformation is a mode not explicitly addressed or exemplified here, for reasons of space and medium. Readers are referred to the critical deformations carried out on a painting by Dante Gabriel Rossetti, the Fogg Museum's copy of
The Blessed Damozel. Here we focus instead on poetic deformations, which we have so far organized into four types: reordering (for example, reading backward), isolating (for example, reading only verbs or other parts of speech), altering (exteriorizing variants—potential versions—of words in the work; or altering the spatial organization, typography, or punctuation of a work), and adding (perhaps the most subjective of our deformative poetics).
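Two of the four types are mechanical enough to run as code. A minimal sketch (mine) of reordering as Dickinson's reading backward, plus a crude isolating deformation; isolating an actual part of speech would need a POS tagger.

POEM = """The blessed damozel leaned out
From the gold bar of Heaven;
Her eyes were deeper than the depth
Of waters stilled at even."""

def read_backward(poem):
    """Reordering: reading backward, last line to first."""
    return "\n".join(reversed(poem.splitlines()))

def isolate(poem, keep):
    """Isolating: blank every word the predicate rejects, keeping layout."""
    return "\n".join(
        " ".join(w if keep(w) else "_" * len(w) for w in line.split())
        for line in poem.splitlines()
    )

print(read_backward(POEM))
print()
# Crude stand-in for isolating a part of speech: keep longer words only.
print(isolate(POEM, keep=lambda w: len(w.strip(";.,")) >= 6))

Even this determinate little program illustrates the stochastic point quoted above: the output is fixed, but what it exposes in the poem is not predictable in advance.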
(119) The arbitrary imposition of a reversed order on the original layout indicates that the poem possesses its own means for evading temporal determinateness.
(120) Deformance
does want to show that the poem's intelligibility is not a function of the interpretation, but that all interpretation is a function of the poem's systemic intelligibility.
(120) Perhaps even more crucially, deformance reveals the special inner resources that texts have when they are constituted poetically.

Conclusion: Deformance and Critical Dialectics
(127) The work of Randall McLeod is the contemporary exception proving the rule: that interpretive deformance is an unlicensed critical activity, all very well for poets and artists, but inapt for the normative rigor of the scholar and critic.
(127) In our view, however, we may usefully regard all criticism and interpretation as deformance.
(127) The truth-content of such views is further exposed when we reflect on the critical dialectics of the great Italian philologist Galvano
della Volpe. . . . Like Dante, and in contrast to, say, Coleridge or Schlegel, della Volpe sees poetry as a type of “discourse” whose rationality—ragionamento—consists in its exploitation of the “polysemous” dimensions of language, whose structures are no more (and no less) difficult or even “mysterious” than processes of logical deduction and induction.
(127-128) When he argued that “critical paraphrase” should ground interpretive method, he was consciously installing a non-Hegelian form of dialectical criticism.
(128) Critical interpretation develops out of an initial moment of the originary work's “degradation” via “uncritical paraphrase.” . . . Thus paraphrastics becomes “the
beginning and end of a whole process” of comparative explorations that get executed across the “quid” or gap that a process of interpretation brings into being (199).

Della Volpe's dialectical criticism differs from that of Hegel and Heidegger, who reveal unknown knowns; it leads instead to imaginations of what we hadn't known at all, more like Ulmer's heuretics.

(128-129) Della Volpe carefully separates his theory of interpretation from the dialectics we associate with Hegel and especially Heidegger. The latter involves a process of thought refinement: Through conversation or internal dialogue, we clarify our ideas to ourselves. We come to realize what we didn't know we knew. . . . Interpretive moments stand in nonuniform relations with each other so that the interpretation unfolds in fractal patterns of continuities and discontinuities. Besides realizing, perhaps, what we didn't know we knew, we are also led into imaginations of what we hadn't known at all.
(129) “Meaning” is important not as explanations but as residue. It is what is left behind after the experiment has been run. We develop it not to explain the poem but to judge the effectiveness of the experiment we undertook.

Consider my journal/tapoc software as deformative experimentation, and look back on the previous chapter about the evolution of the DTD for the archive.

Usefulness of self-parody and irony in interpretations, such as Derrida's textual games; the appendix offers deformations of Wallace Stevens's “The Snow Man” and Samuel Taylor Coleridge's “Limbo.”

(130) Interpretations that parody or ironize themselves become especially apt and useful, as we see in Derrida's textual games, in the brilliant philological studies of Randall McLeod, in Barthes's S/Z, and in Laura Riding's attitude toward language and understanding: “our minds are still moving, and backward as well as forward; the nearest we get to truth at any given moment is, perhaps, only an idea—a dash of truth somewhat flavoring the indeterminate substance of our minds.” This attitude toward literate comprehension, and the kind of criticism it inspires, gains its power by baring its own devices. We take it seriously because it makes sure we do not take it too seriously.


Chapter 5
Rethinking Textuality

All media are marking systems revealed to be ordered ambivalence, leading to OHCO thesis.

Gives detailed elaboration of five ideas about textuality (summarize): privileging visual texts in his frustrated study of encoding images; recall how he divides reality into images and texts on page 88.

Take-off point established on this notion of textuality, where texts include programs, which may individually be further digital humanities experiments, such as being object-oriented from natively object-oriented programming languages or object-modeling procedural programming languages.

(137) As we have seen over and over again, complex problems emerge when you try to think about digital media through our inherited codex paradigms or vice versa. The collision of these two marking systems . . . shifted into useful focus when Drucker and I undertook a simple experiment with an OCR scanner. The point of the experiment was to use computer hardware to demonstrate what our thought experiments kept suggesting to us: that the rationale of a textualized document is an ordered ambivalence and that this ambivalence can be seen functioning at the document's fundamental graphic levels.

The Initial Experimental Context

OHCO thesis of textuality evident in design of SGML hypergrammar.

(139) This [SGML] hypergrammar treats its documentary materials as organized information, and it chooses to determine the systems of organization as a hierarchy of nested elements; or, in the now well-known formulation: “text is an ordered hierarchy of content objects” (the so-called OHCO thesis).

Idea of SGML preposterous for imaginative texts.

(140) This traditional community of readers comprises the second group to which our project is critically addressed. For this group textual interpretation (as opposed to text management and organization) is the central concern. In this community of readers, the very idea of a “standard generalized markup,” which is to say a standard generalized interpretation, is either problematic or preposterous. The issue hangs upon the centrality of the poetical or imaginative text for cultural scholars.
(141) So as we proceeded with the practical construction of the archive we began to see the hidden fault lines of its design structures.

The Rewards of Failure

Also basic premise of software studies as that computer tools reflect conscious and unconscious knowledge, beliefs, preferences, biases, and intentions (in addition to economic, capitalist prerogatives).

(143) Because our computer tools are models of what we imagine we know—they're built to our specifications—when they show us what they know they are reporting ourselves back to us.

The Experiment

Realization from the experiment that all texts are marked texts.

(143) Suppose one were to try to begin a computerized analysis of texted documents at a primitive level. The first move in this case would be to choose to “read” the document at a presemantic level.
(143) These conversations brought another important realization: that the text primitives we were trying to articulate would comprise an elementary set of markup codes. And that understanding brought out a crucial further understanding about textuality in general: that
all texts are marked texts.
(144) [quoting Drucker] Jerry saw “reveal codes” as an aspect of “deformance” and I saw it as a first step in a “metalogics of the book.”
(145) Several important consequences flowed from these experiments. First, we now possessed a powerful physical argument for a key principle of “textual deformance” and its founding premise: that no text is self-identical.
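
The OCR experiment suggests a toy one can actually run: “read” a string at a presemantic level, registering nothing but marked and unmarked spaces. A minimal sketch assuming the crudest possible text primitives (my invention, not Drucker and McGann's procedure):

    def presemantic_read(page: str):
        # Register only the marking system: ink versus blank space,
        # with no semantics whatsoever.
        codes = []
        for ch in page:
            if ch == "\n":
                codes.append("LINEBREAK")  # layout itself is a markup code
            elif ch.isspace():
                codes.append("SPACE")      # unmarked space between marks
            else:
                codes.append("MARK")       # any inked glyph
        return codes

    print(presemantic_read("The Snow Man")[:5])
    # ['MARK', 'MARK', 'MARK', 'SPACE', 'MARK']

Even at this level the text arrives already marked, which is the realization quoted above.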

Importance of bibliographical codes in signification.

(145) Second, the OCR experiments showed that textual ambivalence can be located and revealed at graphical, presemantic levels. This demonstration is important if one wishes to explore the signifying value of the bibliographical codes of a textual document. For it is a commonplace in both the SGML/TEI and the hermeneutic communities that these codes do not signify in the way that semantic codes do.

Every text possesses self-parsing markup, but another parsing agent is required to read that markup; there is no unread text.

(145-146) Third, as the experiments strongly suggested, while every text possesses, as it were, a self-parsing markup, the reading of that markup can only be executed by another parsing agent. That is to say, there can be no such thing as an “unread” text.

The Present Situation
(147) What use-functions distinguish cybertext from docutext? And (how) might any of those functions promote our appreciation of texts as difference engines?

A Brief Digression
(147) Some of the most reliable promoters of cybertext—whether critical (like Espen Aarseth) or inspirational (like Janet Murray)—have themselves, I think, obscured the issues. Murray, for example, distinguishes four central properties of digital environments. . . . It can be shown, however, that none of these properties are peculiar to digital environments.

Critique of Murray and Aarseth for obscuring issues of cybertext and docutext.

(148) Useful as Aarseth's study is, however, he too, like Murray, misconstrues “ordinary text” as “linear.” . . . C.S. Peirce's turn-of-the-century effort to replace the alphanumeric text with what he called existential graphs in order to achieve a greater range and clarity of logical exposition is an extremely important event in the history of Western textuality. The graphs were an effort to develop a language for nonlinear relations.

Material Messages
(149) First, we want to recover a clear sense of the rhetorical character of traditional text. . . . Secondly, we want to remember that textual rhetoric operates at the material level of the text—that is to say, through its bibliographical codes. . . . Texts are not self-identical, but their ambiguous character—this is especially clear in poetical texts—functions in very precise and determinate ways.

Important that textual rhetoric operates at the material level, making it more like a machine-executable program than human-readable code (using the CCS distinction).

(149) We think of this as some kind of pagespace or its equivalent, but in fact text can be entertained in spaces whose elements are distributed in linear or nonlinear arrangements, or both. In the case of nonlinear, the topology may be open or closed (a cave wall, say, versus a bowl, a vase, a knife, etc.). Those spaces represent different executable programs for the deployment of text.
(150) In all these cases we are considering what texts are doing in saying what they say.

Unit analysis view of semantic materials as constitutive of language games, contextually parsed character data.

(150) Any textspace can, in the abstract, deploy any lexicon. But in fact any text coded into any textspace brings with it certain discursive instructions, that is, certain rules that delimit the discourse(s) being deployed in the textspace. . . . What is important to remember—Wittgenstein forced us to this recollection, remember?—is that semantic materials are not units of atomized meaning. They are parts of a language game—more than that, they are instantiated instructions for playing a certain language game in a certain time and place for certain particular purposes.
(153) Both grapheme and phoneme are forms of thought and not facts—not character data but parsed character data, or “data” that already functions within an instructional field.
(153) The elemental scene where those metaphoric transformations expose themselves is the marked field, the graphical or auditional record. Because this will be a record of rule-governed differences, one can extract from that field a dataset of (hypothetical and arbitrary) rules that could replicate analogous differences in comparable fields (including the original record as it might be augmented and transformed by replicant operations). The output of such operations would be collated as a calculus of variants and delivered to us for study.
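
The “calculus of variants” invites a small collation experiment. A sketch using Python's difflib, with two invented witnesses standing in for a manuscript and a printed text; the non-equal opcodes are the extracted rule-governed differences, collated for study:

    import difflib

    # Hypothetical witnesses of a single line; not McGann's examples.
    witness_a = "Many a green isle needs must be".split()
    witness_b = "Many a green isle there must be".split()

    # SequenceMatcher extracts the differences between the two records;
    # the non-equal opcodes form a primitive apparatus of variants.
    matcher = difflib.SequenceMatcher(None, witness_a, witness_b)
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op != "equal":
            print(op, witness_a[i1:i2], "->", witness_b[j1:j2])
    # replace ['needs'] -> ['there']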

Instructional Examples from The Garden of Forking Paths
A. Byron's “To the Po” / “Stanzas to the Po”
(154) These two basic textual forms [Byron's manuscript and the influential 1831 printing] represent different sets of reading instructions for the work. Choosing one or the other radically affects both how one physically negotiates the work and how one interprets it for meaning.
(156) We want to point out that in this example we have consciously chosen a poet who is not known for any special interest in exploiting graphical forms for poetical effect.

B. Byron's “When We Two Parted”

C. Byron's “Fare Thee Well!”

Digital age explodes meanings based on variations in material and transmissional forms even when texts remain stable at linguistic level.

(158) In all cases, while the linguistic level of the texts remains fairly stable, the material/transmissional forms stand as eloquent witnesses of radical changes in the poem's “meanings.”

Knowing Games
(158) What then
does distinguish cybertext from traditional docutext? Without pretending to answer that question, I would call attention to the special kinds of simulation that can be realized in cybernetic environments.
(158) For Plato, the optimal scene for thinking had to be living and dialectical. Texts are inadequate because they do not converse. When we interrogate them, Plato observed, they maintain a majestic silence. But in MUDS (Multi User Domains) and with various kinds of cybergames like ELIZA, one enters simulated environments where the user's interaction is no longer a readerly one.
(158) Computer games exploit this new dynamic space of textuality by inviting the user to play a role in the gamespace.
(159) But from the point of view of the scholar, or someone wanting to reflect upon and study our imaginative inheritance, the resources of cybernetic simulation remain underutilized. The difficulty is conceptual, not technical. Even when we work with cybernetic tools, our criticism and scholarship have not escaped the critical models brought to fruition in the nineteenth century: empirical and statistical analysis, on one hand, and hermeneutical reading on the other.

Gamification of critical analysis has become a new goal of digital humanities, but there are countless other possibilities beyond the statistical and hermeneutical traditions.

(159) What critical equivalents might we develop for MUDS, LARPS, and other computer-driven simulation programs? How would one play a game of critical analysis and reflection?

Compare McGann game rethinking Ivanhoe to Wikipedia.

(159-160) The game is to rethink Ivanhoe by rewriting any part(s) of its codes. Two procedural rules pertain: First, all recastings of the codes must be done in an open-text environment such that those recastings can be themselves immediately rewritten or modified (or unwritten) by others; second, codes can only be recast by identifiable game-players, digital or human, who have specifically assumed a role in the game.
(160) The roles may be played in various forms: in conversation or dialogue, through critical commentary and appreciation, by rewriting any received text, primary or secondary, seen to pertain to Scott's work.

Interesting trajectory for future scholarly virtual realities rethinking textuality by consciously simulating social reconstruction, which I imagine doing for Macy Conferences.

Ensoniment is a different twist on rethinking textuality that deforms by operating in different phenomenal fields, although the encoding effort itself to prepare for ensoniment represents a form of noncritical editing.

(160) The goal is to rethink the work's textuality by consciously simulating its social reconstruction.


Part III
Quantum Poetics: 1999-2000

Does his grasping for quantum conception and fractals reflect too much reliance on an analogy to fuzzy physical processes, forgetting that computational objects can operate by their own logics?

The view of the unit as a surface bit harboring potential forking paths points to the same missing appreciation of depth and structure found in textual analysis that ignores bibliographical codes and materiality in general, treating linguistic units as atomic.

(164) How: by the operation, and the experience, of what we now know best as Gödel's theorem.
(164) What we need is a poetics grounded in an epistemology congruent with a quantum conception of phenomena and the critical reflections we construct for studying those phenomena. This would entail a framework for grasping the objective instability of the subjects of our study (the works and their relational fields), of our tools, and of the results (interpretations and meanings) generated through the study processes. Gaining that frame of reference will come along two reciprocal lines: first, by exposing the fault-lines of interpretational methods that implicitly or explicitly treat any part of the study process as fixed or self-identical; second, by proposing interpretational methods that operate through different critical protocols.

New domains of study offered by artificially computed texts extended beyond fantasy to precise, feasible projects (like his archive), simulacral creations of the sciences of the artificial.

(164) The second of these goals, which is naturally the more important, emerges at (and perhaps as) the interface of human beings and their simulacral creations. . . . Models for these kinds of tool descend to us through our culture in games and in role-playing environments. . . . The remarkable ability of computerized tools for storing, accessing, and transforming unimaginably large bodies of data opens the field of what we know to what we could not otherwise bring to or hold in the field of our disciplined attention—literally, not simply to imagine what we don't know, but to be able to choose to undertake such imaginings in precise and determinate ways.
(164-165) These chapters therefore culminate the theoretical re-investigation of traditional textual and semiotic forms that grew out of the initial scholarly project begun in 1993,
The Rossetti Archive: to design and build an online model for critically editing multi-media aesthetic materials.
(165) All are creatures of what Herbert
Simon years ago called “the sciences of the artificial.”
(165) The dialogue we enter at the interface of man and machine sends us out in quest of digitized instruments that promote the kinds of critical reflection we have known for centuries in our ancient dialogue at the interface of man and book.

Software studies and CCS apply the same consideration of social and historical determinations to machine texts and other assemblages.

(166) The truth is that all such works are “special” because they call attention to a crucial general feature of textuality as such: its social and historical determinations.

How prevalent is the visual in textualities when also considering machine texts, for example is the Universal Turing Machine fetch operation really visually oriented, or does the analogy break down?

(166) As The Rossetti Archive emerged, however, its virtual form began to expose the visible languages that play in all textual forms, even those that seem without them.


Chapter 6
Visible and Invisible Books in N-Dimensional Space
(168) Right now and in the foreseeable future, books do a number of things much better than computers. There is no comparison, for example, between the complexity and richness of paper-based fictional works, on one hand, and their digital counterparts—hypermedia fiction—on the other. . . . The truth is that the hypermedia powers of the book, in this area of expression if not prima facie, far outstrip the available resources of digital instruments.
(169) This is very much a
material revolution, and in negotiating it we all—not least of all traditional scholars—would do well to recall Marx's eleventh thesis on Feuerbach, which has acquired interesting new meanings beyond those originally conceived by Marx: “The philosophers have only interpreted the world in various ways; the point, however, is to change it.”

Philosophy and computing intersect; following Marx, go beyond interpreting, change the world.

(169) Information scientists and systems engineers will be (already are) much involved with these changes. But it is the literary scholar, the musicologist, the art historian, etc. who have the most intimate understanding of our inherited cultural materials. Hence the importance that traditional scholars gain a theoretical grasp and, perhaps even more important, practical experience in using these new tools and languages.

Informational and aesthetic functions performed by books and hypermedia; book will retain aesthetic while losing informational.

(170-171) This situation does not portend the death of the book and its typographical world. It does mean, however, that one heretofore central function of book technology will be taken over by these electronic media. Think about what books do. Like computerized information tools, the book performs two basic functions: It is a medium of data storage and transmission; and it is an engine for constructing simulations. The first is an informational function, the second an aesthetic one. Computers will displace—are already displacing—most of the information functions of our bibliographical tools. The aesthetic function of books will remain, however, and it's clear to me that they will prove indispensable in this respect.

Already difficult to represent dramatic works in books, recalling Tufte's metaquestions about textuality; now sensing the difficulty of marking up recursive patterns in poetry and imaginative works with SGML.

(171) How do we exploit the aesthetic resources of digital media? The question brings to mind Edward Tufte's work. . . . Nonetheless, his studies underscore an important set of metaquestions that are too rarely asked: What is a page, what is a book, what are their parts, how do they function?
(171) The new engines could handle, in full and unabbreviated forms, vast amounts of data—far more than any book or reasonable set of books. They could also handle different kinds and forms of material data—not just textual, but visual and audial as well. . . . Digital tools also exposed the critical deficiencies of the paper-based medium as such. Any kind of performative work—dramatic works, for example, and pre-eminently Shakespeare's dramas—gets more or less radically occluded when forced into a bookish representation.
(172) The recursive patterns that constitute an essential—probably
the essential—feature of poetry and imaginative works in general cannot be marked, least of all captured, by SGML and its offspring.

Questions raised by new media: the nature of the literary work, its critical representation, and the functioning thereof.

(172-173) Here are the questions. First, what is a literary work, what are its parts, how do they function? . . . Second, what constitutes a critical representation of a literary work, and how does such a representation function? . . . A hypermedia work by choice and definition, the archive therefore obliged us to integrate in a critical way both textual and visual materials. Our efforts were continually frustrated, however, because while digital texts lie open to automated search and analysis, digital images do not.
(173) A true critical representation does not accurately (so to speak) mirror its object; it consciously (so to speak) deforms its object. The critical act therefore involves no more (and no less) than a certain perspective on the object, its acuity of perception being a function of its self-conscious understanding of its own powers and limitations. As della Volpe shows, it stands in a dialectical relation to its object, which must always be a transcendental object so far as any act of critical perception is concerned.
(173) Aesthetic forms recreate—they “stage” or simulate—a world of primary human intercourse and conversation.
(175) Our failures with implementing some of the goals of
The Rossetti Archive were bringing a series of paradoxical clarities not only about our digital tools but even more about the works those tools were trying to reconstitute. We realized that we were making inadequate assumptions about such works, and that we were using tools designed through those assumptions.

Compare his analysis of Gerard Manley Hopkins's “As Kingfishers Catch Fire” to CCS source code examples.

(178) What is this kind of text, really? First of all, it is both—and simultaneously—a perceptual and a conceptual event. Informational texts seek to minimize their perceptual features in the belief that texts calling attention to their vehicular forms interfere with the transmission of their ideas.

Opportunities for nonlexical expression in marked and unmarked spaces of texts and other material characteristics of books.

(178) We tend not to notice an elementary fact about printed or scripted texts: that they are constituted from a complex series of marked and unmarked spaces. The most noticeable are the larger regular units—the lines, the paragraphs, or (in verse) the stanzas, as well as the spaces between them. Every one of these spatial units, as well as all the others on a page or in a book, offer themselves as opportunities for nonlexical expression.
(179) It is highly significant that readers of books move from recto to verso, that their field of awareness continually shifts from page to “opening” (i.e., the space made by a facing verso/recto), and that the size of the book—length, breadth, and thickness—helps to determine our reader's perceptions at every point.
(181) Every document, every moment in every document, conceals (or reveals) an indeterminate set of interfaces that open into alternate spaces and temporal relations.

Radiant textuality defined as an indeterminate set of interfaces opening alternate spaces and temporal relations concealed or revealed at every point of every document, discovered through the study of books and carried over into electronic media.

(181) Traditional criticism will engage this kind of radiant textuality more as a problem of context than a problem of text, and we have no reason to fault that way of seeing the matter.

Quantum poetics organizes aesthetic space so that the identity of elements shifts with moving attention: shimmering signifiers.

(183) The line [from Keats] exhibits in the clearest way what I mean by a quantum poetics. Aesthetic space is organized like quantum space, where the “identity” of the elements making up the space are perceived to shift and change, even reverse themselves, when measures of attention move across discrete quantum levels.
(183) Every feature represents a determinate field of textual action, and while any one field might (or might not) individually (abstractly) be organized in a hierarchical form, the recursive interplay of the fields appears topological rather than hierarchic.
(183) Considered strictly in terms of bibliographical codes, then, poetical works epitomize a crucial expressive feature of textuality in general: that it can be seen to organize itself in terms of various relational segmentations and metasegmentations.
(184)
Every page, even a blank page, even a page of George W. Bush's ignorant and vapid prose, is n-dimensional. The issue is, how clearly has that n-dimensional space of the page—its “multivariate” character—been marked and released?

A task for scholars that will otherwise default to other actors, just as default philosophers of computing arise from industry trends and powerful voices.

(184-185) One of the great tasks lying ahead is the critical and editorial reconstitutions of our inherited cultural archive in digital forms. We need to learn to do this because we don't as yet know how. Furthermore, we scholars need to learn because it is going to be done, if not by us, then by others. We are the natural heirs to this task because it is we who know most about books.
(185) If these new machines can deliver stunning images to our view, the only images they understand are their own electronic constructions. Original objects—visual, audial—remain deeply mysterious to a computer. . . . Even when (some would say “if”) that limitation gets transcended, logical ordering through metadata will never
not be a part of computerized scholarship of literary works.

Criticizes computer-text theorist Steven DeRose.

(185) So far as I can see, nearly all the leading design models for the scholarly treatment of imaginative works operate from a naïve distinction between a text's “form” and “content.”


Appendix to Chapter 6
“What Is Text?”

Renear's famous five theses about textuality: real, abstract, intentional, hierarchical, linguistic; the account fails for poetry and many philosophers.

(187-188) Allen Renear proposed the following “five theses” about textuality. . . . That clear and succinct statement reflects an intensive involvement, over many years, with the theory of text as it was being engaged by Renear and his colleagues, principally at Brown University, as they were developing TEI as a standard for electronic markup of humanities texts. . . . Renear's “account of text,” while in certain respects a very good one indeed, has serious limitations. And “serious competitors” have been around for a long time.
(188) This ground, explicitly “abstract” (Renear, “Out of Praxis”), represents a view of text as essentially a vehicle for transmitting information and concepts (final cause). Text is “hierarchical” (formal cause) and “linguistic” (material cause), and it is a product of human intention (efficient cause).
(188) When Plato called for the expulsion of the poets from the city, he was arguing for a certain theory of textuality.
(188-189) Poetical texts are recursive structures built out of complex networks of repetition and variation. . . . The logic of the poem is only frameable in some kind of paradoxical articulation such as: “a equals a if and only if a does not equal a.”

Non-hierarchical philosophical texts challenge TEI/SGML (see chapter in Burnard on Wittgenstein archive); grateful that computer scientists understand some general problems of textuality.

(189) The case of poetry in fact defines a kind of textual ethos, as it were, that may be seen to pervade genres not normally thought of as poetical. Certain kinds of philosophers lend themselves to a hierarchical approach—St. Thomas, Kant, Hegel. Others don't. Not without reason did the Bergen Wittgenstein project shy off TEI/SGML as a system for marking up the corpus of Wittgenstein's texts.
(189) I was lucky in 1993 to begin my work with computer scientists who understood the general problems far better than I did.

Envisions human-computer symbiosis in which humans do analog and computers digital thinking, the latter ignorantly performing deformations and submitting results for human consideration; seems to foreclose on the notions of emergence and co-constituted subjectivity Hayles suggests.

(190-191) An important move will be to exploit the difference between analogue thinking, which we do so well, and digital thinking, which computers do better than their human makers. A new level of computer-assisted textual analysis may be achieved through programs that randomly but systematically deform the texts they search and that submit those deformations to human consideration. Computers are no more able to “decode” rich imaginative texts than human beings are. What they can be made to do, however, is expose textual features that lie outside the usual purview of human readers.
(191) Nonetheless, even in transacting imaginative texts our desire to close the sympathetic exchange is such that we make decisions about what we are reading, and those decisions occlude other kinds of awareness.
(191) A computer with the same set of reading codes is naturally (so to speak) inclined to be less discriminating. That lack of discrimination in computerized reading is exactly what we want to exploit. We want to see what textual possibilities have been forbidden or made nugatory by the original act of textual encoding—that is, by the decisive and particular text that stands before us. The random access procedures of digital technology can bring those possibilities to view. The fact that many will appear to us, at that point, as
impossible nonsense is exactly what holds out such promise, on two counts. First, not everything tossed up by the computer will seem nonsensical, and besides, people will differ. Second, however we judge the results, they will inevitably clarify our own thoughts to ourselves by providing a set of contrasts to throw our thinking into sharper relief.
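
This passage nearly announces critical programming, so a minimal deformance engine seems fair to sketch: deform randomly but systematically (a seeded shuffle, so every deformation is repeatable and comparable), then submit the output for human consideration. My own toy, not McGann's code:

    import random

    def deform(text: str, seed: int) -> str:
        # Randomly but systematically: the seed fixes the permutation,
        # so each deformation can be recalled, varied, and compared.
        lines = text.splitlines()
        random.Random(seed).shuffle(lines)
        return "\n".join(lines)

    stanza = ("One must have a mind of winter\n"
              "To regard the frost and the boughs\n"
              "Of the pine-trees crusted with snow")
    for seed in range(3):
        print(deform(stanza, seed))
        print("---")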


Chapter 7
Dialogue and Interpretation at the Interface of Man and Machine
(194) This chapter will thus reconsider the issues taken up in [G. Spencer Brown's]
Laws of Form. My purposes are, however, more narrow and more practical. I want to elucidate some key but neglected formalities of textual documents and to meditate satisfactory ways of communicating those formalities. . . . But with increasing numbers of humanities scholars using digital tools in their research work, the realization is growing that TEI's problems are not technical but systemic. To address them properly we have to step back and think not about TEI but about “text” itself.

Begin thinking about textuality with Dante, as in Latour's premodern?

(194-195) To begin thinking about textuality with Brown, then, let's begin again further back, by thinking about textuality with Dante, whose grasp of the subject was acute. His way of thinking is especially useful in this case exactly because it is a premodern way.

Inner Infinities
(197) The elemental condition or manifestation of form is the appearance of a mark in an otherwise unmarked space. Brown calls this mark a “distinction” so that the elemental law of form is: a distinction can be drawn.
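
Brown's elemental law lends itself to a one-line formalization. In a common bracket transcription (my gloss, not the book's own notation), writing ( ) for the mark and using \varnothing merely as a name for unmarked space:

    \[ (\,)(\,) = (\,) \quad \text{(law of calling)} \qquad ((\,)) = \varnothing \quad \text{(law of crossing)} \]

Calling says a distinction drawn again is the same distinction; crossing says a boundary recrossed returns to the unmarked state.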

Argument by Bibliographic Code

Spatial conception of textual field for Dante book of memory; pagespace elemental.

(197) Drawing on the ancient tradition of the Arts of Memory, Dante's textual divisiones point toward the inherently spatial conception he has of his textual field. The Vita Nuova is a “book of memory” shaped by visible rubrications so as to give a mirror image of the events it aims to recall.
(199) This pagespace is elemental because it replicates at a different scalar level the same kind of distinction marked within the page space by the elementary letter and graphic marks. The relation between the elementary graphic marks and the elemental page space sets the parameters for all types of graphemic directionality. . . . In bookspace, pagespace variances emerge as a set of higher order conventions of three-dimensional relations: between page rectos and versos; between the single page and the page opening; and between sequences of pages gathered together. As more explicit shapes and/or images are introduced into the paperspace, that space will be pushed toward a space governed by rules of collage rather than by rules of textuality.
(199) A page of printed or scripted text should thus be understood as a certain kind of graphic interface.
(200) Brown's root concept of “distinction” is thoroughly replicated in this text's bibliographical codes, in which various key differentials are developed through the manipulation of pagespace, changes in font, and the deployment of one simple, explicit shape (a line).
(200) That structure in Brown's book is extremely significant for understanding how laws of form operate in a textual horizon.
(203) In the reflexive note to chapter 2 of his book Brown explains that “the primary form of mathematical communication is not description, but injunction. In this respect it is comparable to practical art forms like cookery [and] music” (77).
(203) As Brown's book shows, a primary textual injunction is to make and elaborate distinctions.

Injunctive Forms
(204) Unbeknownst to itself until the moment when it turns reflexively back upon itself—and then it is too late—every form of thought is incommensurate with itself. Certain texts—and certain kinds of text—make that contradiction a primary focus of attention.
(204) Not many works of philosophy demonstrate that paradox as elegantly as
Laws of Form—perhaps Nargarjuna's treatise on “Emptiness” makes an apt comparison.
(204) Works of imagination, however—let us say henceforth “poetry”—make the discourse of paradox and contradiction the ground of the semiosis. . . . New distinctions conceal algorithms—hidden injunctions—to cancel the same distinctions by recrossing the boundary initiated when the distinction first appeared to view.
(204) Whereas everyone knows this about poetical texts, we are less clear about how and why this network of recursions unfolds. Yet clarity on the matter is particularly important in a digital horizon if we are to have any hope of building adequate electronic re-presentations of our received textual archive.

Modern aesthetic understanding of literary texts is simulacral.

(205) The concept functions reasonably well in analyses focused on informational and nonpoetic texts, but its analytic force dissipates when directed toward poetry. This happens because a modern aesthetic understanding shapes our thought about “the literary text.” Since poetical works are conceived as “communication sui generis” (or “language [oriented] toward the message itself” [Segre, 28-29]), neither affirming nor denying anything beyond their internal relations, “reference” in the literary text turns (virtually) virtual.
(205) The signifier/signified/referent structure implicitly poses two (related) questions to a text: “What is it saying?” and “What is it doing in saying what it says?”

(In)Conclusion

Plan for a text reading program starting with bibliographical codes.

(206) In such circumstances what is needed is a dynamic engagement with text and not a program aimed at discovering the objectively constitutive features of what a text “is.” That dynamic requirement follows from the laws of form themselves, as Brown's work shows. But what equally follows is that the analysis must be applied to the text as if it is performative.

I wanted to exclaim, let the Big Other speak, but McGann's AI project is not really to generate dialogue; it aims more at unexpected output for humanities scholars to interpret, as in Manovich's big data experiments, ultimately revealing unknown knowns about texts and textuality.

(206-207) The project imagined here attends only to the text's bibliographical codes in order to begin with a relatively simple set of rules for marking or interpreting textuality. We want to teach the computer a set of rules for reading texts. Trying to teach it higher order rules presents enormous difficulties. It seems possible, however, to develop an initial set of rules for bibliographical coding options and forms. Part of the programmatic operation is to implement these rules in order to expose and generate a more complex set of rules extending to higher orders of textual form.
(207) The ultimate event in this program will be a dialogue between the computer and the human beings who are teaching it how to read. We want to study the bibliographical formations that appear out of the computerized readings. These readings will, we believe, inevitably constitute a set of (de)formations full of surprises for the rule-givers. What those surprising readings will be cannot be predicted, but that they will come is, we think, as certain as the fact that no text is commensurate with itself.
(207) We begin by implementing what we think we know about the rules of bibliographical codes. The conversation should force us to see—finally, to imagine—what we don't know that we know about texts and textuality.
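
A toy rendering of the proposed program, with every rule invented here for illustration: implement a small set of bibliographical-code rules, apply them to a page description, and return the violations to the human rule-givers as candidates for higher-order rules.

    # Hypothetical page description and rules; nothing below comes from
    # McGann's actual project.
    page = {"font_changes": 3, "indented_lines": 5}

    rules = {
        "prose holds one font": lambda p: p["font_changes"] <= 1,
        "verse indents some lines": lambda p: p["indented_lines"] > 0,
    }

    for name, rule in rules.items():
        verdict = "holds" if rule(page) else "violated: a surprise for the rule-givers"
        print(name, "->", verdict)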

Conclusion
Beginning Again and Again: “The Ivanhoe Game”
The Aesthetic Interface

Basic forms of digital life correspond to advanced self-conceptions of book culture: this is a description of real virtuality production as intended by Castells, situating Mallarme, Blake, Rossetti, Swinburne, Morris, Dickinson, and Whitman as similar artisans crafting virtual reality machinery, what Murray refers to as the holodeck, in the media available to them, or else their work (texts) manifesting the asymptotic limit of those fantasies symptomatically.

(210) The work he [Mallarme] has in mind will only be realized when it is composed in at least three senses simultaneously: a typographical sense, a musical sense, and a poetic sense. The book that emerges is a machine for executing the orders that bring it into existence at each of those three orders.
(210) Within cultures of modernity, Blake was perhaps the first to adopt and execute such a view of text. Born before historical circumstances could provide a clarifying framework for his work, however, he would not become an important cultural presence until the English advent of Mallarme's generation—until the coming, that is to say, of D. G. Rossetti, A. C. Swinburne, and William Morris. . . . In the United States, Dickinson and Whitman—both in their very different ways—were playing with comparable imaginations of books and texts.
(210-211) Take these basic “forms of digital life” (as Wittgenstein might have called them):
Simulation Medium
Interaction/Interoperability
Accessible Memory
Programs and Protocols
How remarkable that these commonplaces of digital culture should so correspond with the most advanced self-conceptions of book culture.
(212) We have had some centuries with our bookish mirrors—mechanisms that simulate with a difference—and we aren't done with them yet, if we ever shall be.

Difference Engines
(213) But the striking congruences between digital and Mallarmean ideas of text and book suggest that our traditional critical procedures may themselves be lagging behind our most interesting ideas about books and texts.
(214) Thinking in practical ways about the differences between textual space and digital space—that is to say, actually building
The Rossetti Archive—finally brought us to think through those differences. It seems not within our capacity to build true cyborgs. We create such machines only in our imaginations. But it is definitely within our capacity to build machines that produce simulated forms of meaning—that is to say, machine-generated interpretative forms that can augment our own procedures of critical reflection.

Critical reflection develops at simulacral interfaces, as with the book; his interest in digital instruments (notice what he calls them) is intellectual rather than merely instrumental.

(214) We want to work with digital tools as we have always worked with our tools for making simulations—for instance, with all our semiotic tools, and pre-eminently with the book. Critical reflection emerges in the mirroring event that develops at simulacral interfaces, of which the book is the one we are most used to using.

The creative limit of programming according to McGann does not reach cyborgs, which he associates with traditional artificial intelligence; Hayles delivers us to possibilities beyond his apparent restriction of machines to yielding only output, interpretive forms for human review, rather than thinking itself.

(214) How do we—we humans—exploit this situation if our interests are primarily intellectual rather than instrumental—that is to say, if we want to use these tools the way we use books for critical and reflective purposes?

Examples of creative AI yield weak creativity at best; behavior is a mistaken index of the conscious activity of the machine, as Weizenbaum and others argue.

(214-215) Consider Project BRUTUS, the “Storytelling Machine” developed by Selmer Bringsjord and David Ferrucci. This foray into artificial intelligence and literary creativity was undertaken to examine the possibility of simulating imaginative creativity (rather than computational or problem-solving capabilities). . . . But in all these cases the stories would exhibit only what Bringsjord calls “weak creativity.”
(215) All of these programs exhibit self-reflection, that is to say they are designed to generate textual forms and transforms by studying and elaborating on their own processes.
Hofstadter isolates four of the most important features of these kinds of programs under the general heading “Dynamic Emergence of Unpredictable Objects and Pathways.”
(215) Because “one doesn't know what goes on behind the scenes” we are unjustified in imagining what our pleasure and sense of amazement suggest: that this behavior is an index of the machine's conscious agency.

Typical usage of literary AI reflects human directed inquiry; we are not interested in machine embodiment.

(216) And the interaction that Hofstadter lays before us is the sign of the simple but clear truth that it is we who want to use these tools and that we want to use them in order to understand ourselves more clearly—to understand not how the machines work but how we work when we make, use, and interact with machines of this kind.

Unrealized critical possibilities concealed in charming surfaces of existing projects.

(216-217) Concealed in the cool codes and charming surfaces of projects like AARON, BRUTUS, and RACTER lie our own unrealized critical possibilities. Let's try to think about using such creatures the same way we use traditional paper-based instruments—as vehicles for self-awareness and self-reflection.

“The Ivanhoe Game” and the Precisely Indeterminate Text
(217-218) Its central object is to make explicit the assumptions about critical practice and textual interpretation that often lie unacknowledged, or at least irregularly explored, in a conventional approach to interpretational practice. . . . Players produce text in response to the opportunities and problems raised by the texts produced by the other players.

Quantum approach of the Ivanhoe Game because each act of interpretation is a function rather than a view of the system.

(218) From the start the premise of the game—and of our critical ideas in general—was (is) that works of imagination contain within themselves, as it were, multiple versions of themselves. . . . Note that in both cases, classical and romantic, either the perceiving subject or the perceived object is artfully stabilized for purposes of an interpretive action. In what I would call a quantum approach, however, because all interpretive positions are located at “an inner standing point,” each act of interpretation is not simply a view of the system but a function of its operations.

Interesting conception of autopoetic phenomena employing Maturana and Varela offers an entity-neutral ground of emergent subjectivity.

(218-219) Artifices of reality as they propose to be, imaginative systems simulate what Humberto Maturana and Francisco Varela call an “autopoetic” reality that sustains itself by communicating with itself. . . . Understanding the system means operating with and in the system. The more this “meaning” can be defined, the more capabilities it has for generating different lines that are latent but undeveloped by the system.

Must live through it, playing with others: is it thus ergodic?

(219) That initial game involved replaying the discourse field determined by the book Scott wrote. . . . In contrast to the preponderant body of received literary exegesis, its critical method is procedural rather than expository.

Basins, strange attractors, field concept of quantum approach: compare his initial game rules to what ludologists depict as typical characteristics of all games.

(219) The discourse field of Ivanhoe—Scott's romance—is itself what topologists call a “basin” of dynamic order arbitrarily (consciously) taken out of an encompassing and indeterminate social space. That space is pervaded by “strange attractors” that organize around themselves local dynamic basins of order. . . where the concept “work” is replaced by a “field” concept.
(220) This initial set of rules was kept, deliberately, simple.
(1) That all game moves by a player get executed under the auspices of a particular and explicit “role” to be taken by the player. . . .
(2) That each player keep a “player-file” . . . . Thus the game involves two “lines” of material: the line represented by the player's moves (always a fully public line) and the line represented by the player-file, which documents the player's commentaries on the moves being played in the game.
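
The two rules translate directly into a data model. A sketch with hypothetical names throughout: every move carries an explicit role (rule 1), and the fully public line of moves is kept distinct from the private player-file of commentary (rule 2).

    from dataclasses import dataclass, field

    @dataclass
    class Move:
        role: str  # rule 1: no move without an assumed role
        text: str  # the recast portion of the discourse field

    @dataclass
    class Player:
        name: str
        public_moves: list = field(default_factory=list)  # public line
        player_file: list = field(default_factory=list)   # private commentary

        def play(self, role: str, text: str, note: str = ""):
            # Public moves remain open to rewriting by other players.
            self.public_moves.append(Move(role, text))
            if note:
                self.player_file.append(note)  # rule 2: documents the move

    p = Player("player-one")
    p.play("Rebecca of York", "Recast the tournament scene...",
           note="Testing how far the role constrains the recasting.")
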
(221) The point of the game is for players to hypothesize and then extrapolate ideas about the discourse field of
Ivanhoe within a performative and dynamic intellectual space. The point is not as such to arrive at a reading or interpretation of Ivanhoe but to refashion and reshape its discourse field in ways that bring to the fore “possible worlds” latent in the work and in the materials that transmit the work to us.

Holodeck connection: do with the Macy conferences.

(221) A computerized environment could hold the entirety of the gameplay open to random or structured transformations. . . . As in any computer game, the machine would thus be itself an active agent in the gamespace. It could intervene and constrain the human players in various ways and it could as well generate gameplayers of its own.

Dialogues of the Mind with Itself
(222) For some time various scholars have been seeking after disciplined critical methods that could exploit much greater ranges of conscious subjectivity. The impact of the rise of science during the past 200 years has had its inevitable effect on humanistic pursuits of knowledge.

Self-directed improvement of conscious subjectivity: what is new since Socrates is its being truly disciplined in being different, like the OULIPO group; McGann presents his own examples, including The Alice Fallacy, yet he makes the important point of having to modify his approach to deliberately and nontrivially (that is, not as in Socratic dialogue) collaborate with others.

Is this not license for me to give examples from my working code, following McGann use of his own scholarly software projects as nearly announcing critical programming?

(222) The first clear—that is to say disciplined and self-conscious—revolt against these methods of critical inquiry came at the end of the last century. . . . The program sketched by [Alfred] Jarry would get resurrected more than a half-century later, in our own day, in the work of the OULIPO group, most notably in the writings of Perec, Queneau, Mathews, and Calvino. Two important things to keep in mind are: first, that a “science of exceptions” must inevitably be related to statistics; second, that 'pataphysical work has largely assumed imaginative rather than critical forms.

Besides the obvious exemplar of the Symposium, whose ensoniment project I have already written substantially, a version repeating the Macy conferences and other critical periods in intellectual history is a holodeck extension of his idea that could become the brainy virtual realities into which our humanity eventually perishes, raising dialogue to a higher power.

(224) The dialogues as originally written and presented did not require the presence of others—“the real presence,” as Christian theology puts the matter. By contrast, “The Ivanhoe Game” simply cannot be played alone.
(224) Considered as a critical tool in this context, game is dialogue raised to a higher power.

IVANHOE: A Game of Interpretation

IVANHOE game of interpretation developed to use computational resources on noninformational, aesthetic and rhetorical aspects of texts; see how it is discussed in Ramsay's Reading Machines.

(224-225) IVANHOE is being developed to begin such a demonstration [suggested in the preface]. Its purpose is to bring computational resources to bear on elucidating the noninformational—the aesthetic and rhetorical—aspects of texts. . . . IVANHOE proposes to open up the transformational structure of imaginative works by promoting a dynamic connection between the digital pattern-analysis capacities of computational tools and the analogue pattern-making capacities of human beings.

IVANHOE game moves employ computer database, public and nonpublic player moves, and computer interventions in MOO and email.

(225) In IVANHOE, however, every move, when made, itself gets added to the initial discourse field, which in this case is the computational database. The database therefore grows from three sets of additions: the public moves made by the players, their nonpublic analytic moves located in the player-files, and the computer's interventions in the gameplay.
(225) The game space is extended to include a MOO, where the player roles can execute their dialogical moves in real time, a chat room, where the players can discuss the course of the game play, and various other functionalities, including dynamically generated analytical displays of the gameplay as it stands at any point in time.

IVANHOE dreams of markup as flexible as natural language, although the differences between analog and digital mapping protocols constitute part of the critical output.

(225) Computational resources alter the gameplay in certain crucial respects, all a consequence of the differences between analogue and digital mapping protocols. . . . To do this the computer will be forced to deliver what it takes to be disambiguated results from a body of inherently ambiguous data.
(226) The ideal of a markup system that would be as flexible as natural language but logically unambiguous is equivalent to the AI dream of creating a true cyborg.

IVANHOE submerges into software studies considering its fantasized interface.

(227)(Figure C.2) Diagram of IVANHOE Functions. This sketch gives an idealized presentation of the general functional elements of IVANHOE as they might appear in framed spaces on a monitor.

Producing Keats unheard melodies by leveraging digital capabilities begs for connection to symposia.

(227) IVANHOE's object is to cultivate that gap—to replicate and develop it and in the process, to expose to our thinking aspects of our own thought that would have otherwise remained only intuitively or randomly available to us. Keats called those things “unheard melodies,” and he revealed them, in that famous passage, to an ear—to an intelligence—we often hardly believe we possess.

Quantum Poetics
(228) Relativity, quantum mechanics, and non-Euclidean geometries all realize a world marked by the same kind of ambiguities, transformations, and incommensurable features that we take for granted in Ovid and Lucretius, Dante and Petrarch, Blake and Byron.

Awareness in axis of software is where I wish to extend McGann, as I sense he does not seek great insights in that region.

(228-229) The awareness we are after would move along any or all of the three axes by which the game is ordered: the axis of the literary work (Ivanhoe, Wuthering Heights, and the data of their discourse fields); the axis of digitization and the tools of analysis, display, and transformation (the software); the axis of the gameplay with its text, sound, and image outputs and their new, second-order data.
(229) But the digital architecture locates a statistical and probabilistic order at the very heart of the game. . . . These are the “strange attractors” of topology, the systemic elements of a probabilistic universe that simultaneously licenses order and disorder.

Challenge to expose every scrap of oral or typographical text to critical investigation.

(229) But the truth is that even the most pedestrian scrap of prose text—oral or typographical—might and should, for critical purposes, be investigated with a passion for fine, for microscopic, for subatomic discriminations.

Desire framework that will fracture facticities of gameplay to become refracting mirrors revealing significance.

Deploy unit analysis at the level of bibliographical codes as well as linguistic codes, adding Bogost (and Hayles, addressing his rejection of the true cyborg as the posthuman cybersage) as what is needed to extend McGann's thought, for he proposes a tool for examining subjectivity without allowing that subjectivity might already be deeply implicated in the built environment in which he is seeking to use it as a second-order, refracting mirror of a non-externalized subjectivity, the discrete mind of the human user.

(230-231) The difference between revealing and fixing significance is perhaps the crucial thing. . . . The subject of IVANHOE, after all, is not the subject of (say) physics or computer science—the natural world, digital order—it is the mind of those who have imagined and created those kinds of intellectual prostheses, the mind of Ivanhoe and IVANHOE. We want a framework in which such items can be regularly and self-consciously examined as “facts” that are also consciously seen as illusions of reality. We want a framework that will fracture our “facticities”—in this case, the actual phenomena generated in the gameplay—until they become refracting mirrors.

Discourse field of human cognitive and affective exchange ignores machine component that Hayles embraces.

(231) Like art it is a game of mirrors in which (like engineering) actual things get made; but like science it demands that the made things be studied to expose their structure and their “laws.” The latter must also be made as artistic, illusionistic forms put into gameplay. What emerges is a discourse field shaped as an evolving scene of human cognitive and affective exchange: a repertory of what we know and think we know and hence also a set of negative images and spaces for imagining what we don't know—for all that remains “still art and part” of these processes, though they remain yet to be realized.

Appendix to the Conclusion
A Round of Moves in “The Ivanhoe Game”

Appendix depicts a round of moves in Ivanhoe Game.

(232) The materials here are of two kinds: (1) the actual game moves I made; (2) my player file notes explicating those moves. The moves of the other players can be found online.


McGann, Jerome J. Radiant Textuality: Literature After the World Wide Web. New York: Palgrave, 2001. Print.