Notes for Stephen Ramsay Reading Machines: Toward an Algorithmic Criticism
Key concepts: algorithm, algorithmic criticism, deformance, declarative, digital humanities, imperative, Mathews algorithm, overpotentialized text, 'pataphysics, tamperings, textual intervention.
Related theorists: Harold Abelson, Walter Abish, Busa, Vannevar Bush, Derrida, Estelle Irizarry, Alfred Jarry, Kemeny, Willard McCarty, McGann, Franco Moretti, Oulipo, Rob Pope, Lisa Samuels, Saussure, Stefan Sinclair, Gerald Sussman.
PRECONDITIONS
Promise to highlight programming that is really only advertised as a potential; will subjectivity itself be questioned?
Primacy of pattern hailed as basic hermeneutic function yet Hayles is not in the TOC.
(x) The “algorithmic criticism” proposed here seeks, in the narrowing forces of constraint embodied and instantiated in the strictures of programming, an analogue to the liberating potentialities of art. It proposes that we create tools—practical, instrumental, verifiable mechanisms—that enable critical engagement, interpretation, conversation, and contemplation. It proposes that we channel the heightened objectivity made possible by the machine into the cultivation of those heightened subjectivities necessary for critical work.
(x-xi) Envisioning an
alternative to the strictures of the scientific metaphor entails
reaching for other, more obviously humanistic models. . . . I argue,
moreover, that this important modernist genealogy points to the
primacy of pattern as the basic hermeneutical function that unites
art, science, and criticism.
(xi) Close analysis of several apparently diverse critical works—from readings of the I Ching and Saussure's anagrams to medieval poetry and Shakespearean sonnets—reveals the essential deformative nature of critical reading.
Programming redefined in service of critical reading strategy away from generic control.
(xi) Programming, which algorithmic criticism reframes as the enactment of a critical reading strategy, undergirds all of these meditations.
1 AN ALGORITHMIC CRITICISM
Nod to Busa as founder of digital humanities with project begun in late 1940s to automatically generate Aquinas concordance using a computer, yet not algorithmic criticism.
(1)
The founder [of digital humanities] is Roberto Busa,
an Italian Jesuit priest who in the late 1940s undertook the
production of an automatically generated concordance to the works of
Thomas Aquinas using a computer.
(2) But “algorithmic criticism”—criticism derived from algorithmic manipulation of text—either does not exist or exists only in nascent form. The digital revolution, for all its wonders, has not penetrated the core activity of literary studies, which, despite numerous revolutions of a more epistemological nature, remains mostly concerned with the interpretative analysis of written cultural artifacts.
Busa admits his motivation was to reconstruct verbal system of Aquinas, a rather conservative hermeneutic approach.
Criticism that evolves from reflecting on XML schemas for creating an electronic archive or electronic scholarly edition is not in the scope of algorithmic criticism, although the estrangement, defamiliarization, and deformations produced by software are.
(3)
Even Busa would have had to concede that the effect is not the
immediate apprehension of knowledge, but instead what the Russian
Formalists called ostranenie—the
estrangement and defamiliarization of textuality.
(3) But text
analysis would take a much more conservative path. Again and again in
the literature of text analysis, we see a movement back toward the
hermeneutics of Busa, with the analogy of science being put forth as
the highest aspiration of digital literary study.
(5) The data is
presented to us—in all of these cases—not as something that is
also in need of interpretation, but as Dr. Johnson's stone hurtling
through the space of our limited vision.
(6) Hermeneutically, such
investigations rely upon a variety of philosophical positivism in
which the accumulation of verified, falsifiable facts forms the basis
for interpretive judgment.
(7) In some sense, humanistic discourse
seems to lack methodology; it cannot describe the ground rules of
engagement, the precise means of verification, or even the parameters
of its subject matter.
Data is situated and transformed for literary-critical analysis, thus inherently subjective.
(8) When it comes to literary criticism, however, we find that the “data” is almost entirely intractable from the standpoint of the computational rubric. Paper-based textual artifacts must either be transformed from a continuous field into some more quantized form (i.e., digitized), or accompanied, as in the case of markup, with an elaborate scaffolding by which the vagaries of continuity can be flattened and consistently recorded. . . . Literary-critical interpretation is not just a qualitative matter; it is also an insistently subjective manner of engagement.
Computer as component of symbiosis providing computational results for humans to engage in inference (Licklider, Kemeny).
(9) The computer is certainly incapable of offering “the shift to a redemptive worldview” as a solution to the problem at hand; it is wholly incapable of inferring this from the data. But is it likewise the case that the computational results—the data and visualizations that the computer generates when it seeks to quantize and measure textual phenomena—cannot be used to engage in the sort of discussion that might lead one to such a conclusion?
Category error mistaking questions about properties of objects with phenomenal experience of observers.
(10)
The category error arises because we mistake questions about the
properties of objects with questions about the phenomenal experience
of observers.
(10) If text analysis is to participate in literary
critical endeavor in some manner beyond fact-checking, it must
endeavor to assist the critic in the unfolding of interpretative
possibilities. . . . The evidence we seek is not definitive, but
suggestive of grander arguments and schemes.
(11) Criticism drifts into the language of mathematics. . . . A term frequency list is therefore the set of tf values for each term within that speaker's vocabulary. Such lists are not without utility for certain applications, but they tend to follow patterns that are of limited usefulness for our purposes.
(12) The list is a paratext that now stands alongside the other, impressing itself upon it and upon our own sense of what is meaningful.
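The tf list Ramsay describes is easy to make concrete. A minimal sketch, assuming whitespace tokenization and lowercase folding (simplifications the text does not specify):

```python
from collections import Counter

def term_frequencies(text):
    """Relative frequency of each term: tf = count / total tokens."""
    tokens = text.lower().split()
    counts = Counter(tokens)
    total = sum(counts.values())
    return {term: count / total for term, count in counts.items()}

# Sample input, not a speaker from Ramsay's own example.
speech = "the lady doth protest too much methinks the lady protests"
tf = term_frequencies(speech)

# Sorted by descending frequency, such lists "tend to follow patterns"
# (Zipf-like distributions) of limited critical usefulness on their own.
for term, value in sorted(tf.items(), key=lambda kv: -kv[1]):
    print(f"{value:.2f}\t{term}")
```

The sorted list is exactly the paratext Ramsay describes: a new text standing alongside the original.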
Methodological questions of algorithmic textual analysis may be as provocative as hermeneutical ones.
(13) These are provocative results, but the provocation is as much
about our sense of what we are doing (the hermeneutical question) as
it is about how we are doing it (the methodological question).
(15) We would do better to recognize that a scientific literary criticism would cease to be criticism.
(15) No serious scientist could ever
deny that interpretation, disagreement, and debate is at the core of
the scientific method. But science differs significantly from the
humanities in that it seeks singular answers to the problems under
discussion.
(15-16) The understanding promised by the critical act
arises not from a presentation of facts, but from the elaboration of
a gestalt, and it rightfully includes the vague reference, the
conjectured similitude, the ironic twist, and the dramatic turn.
Algorithmic criticism already built into reading practices.
(16) If algorithmic criticism is to have a central hermeneutical
tenet, it is this: that the narrowing constraints of computational
logic—the irreducible tendency of the computer toward enumeration,
measurement, and verification—is fully compatible with the goals of
criticism set forth above. . . . This is possible because critical
reading practices already contain elements of the algorithmic.
(16)
Any reading of a text that is not a recapitulation of that text
relies on a heuristic of radical transformation. . . . In every case,
what is being read is not the “original” text, but a text
transformed and transduced into an alternative vision, in which, as
Wittgenstein put it, we “see an aspect” that further enables
discussion and debate.
Seeking patterns, but no mention of Hayles.
(17) Or rather, it is the same thing at a different scale and with
expanded powers of observation. It is in such results that the critic
seeks not facts, but patterns. And from pattern the critic may move
to the grander rhetorical formations that constitute critical
reading.
(17) It would not be averse to the idea of
reproducibility, but it would perhaps be even more committed to the
notion of “hackability.”
Interested in evaluating robustness of discussion inspired by particular procedures of textual analysis over fitness of the procedures; in Janz terms, asking what does it mean to do philosophy in this place versus what are the philosophical conclusions.
(17) Algorithmic criticism seeks a new kind of audience for text analysis—one that is less concerned with fitness of method and the determination of interpretative boundaries, and one more concerned with evaluating the robustness of the discussion that a particular procedure annunciates.
2 POTENTIAL LITERATURE
Etymology of algorithm traced from al-Kwarizmi to step-by-step machine problem solving.
(18) Most scholars now believe the word relates back to the word “algorism,” which is in turn a corruption of the name of the Persian mathematician al-Kwarizmi, from whose book Kitab al-jabr wa'l-muqabala (“Rules for Restoring and Equating”) we get the word “algebra” (Knuth 1). . . . During the twentieth century, however, the word “algorithm” came to be associated with computers—a step-by-step method for solving a problem using a machine.
(18) If computational methods
are to be useful in the context of literary study, however, we must
consider the use of algorithms loosed from the strictures of the
irrefragable and explore the possibilities of a science that can
operate outside of the confines of the denotative.
Jarry's 'pataphysics as the apotheosis of perspectivalism.
(20) To the degree that algorithmic criticism tries to enter this debate, it does so by considering a third culture that is at once the product of both scientific and artistic investigation and has subtly suffused both cultures since the turn of the twentieth century. It begins with the “'pataphysics” of Alfred Jarry, and in particular with that extraordinary “neo-scientific” novel Gestes et opinions du docteur Faustroll, pataphysician, in which the science of “imaginary solutions” is put forth.
(21)
At its most fundamental level, 'pataphysics is the apotheosis of
perspectivalism—a mode, not of inquiry, but of being,
which refuses to see the relativity of perspective as a barrier to
knowledge.
(22) Bok correctly intuits the continuities between
Jarry's critique and the anarchic science of Feyerabend.
Science turning to narrative to explore meaning and implication of phenomena (Bok and Feyerabend).
(23)
In the light of such marvels, we witness modern science turning to
narrative—not merely as a way to explain complex phenomena, but as
a methodology for exploring the meaning and implication of phenomena.
While Jarry was formulating his new science, the
scientist-turned-philosopher Ernst Mach was coining the term “thought
experiment” to describe these new meditations.
(23) In both
cases [Maxwell's demon and Schrodinger's cat], the narrative amounts
to an impossible fantasy constructed for the purpose of divining the
possibilities of the real. In a sense, thought experiment is the
hyperbolic extreme of reductio ad absurdum—the 'pataphysical
expansion of reality to the point of absurdity, which, like the
ancient reductio,
has truth as its ultimate object. Jarry's awareness of the narrative
possibilities of such experiments are everywhere apparent in
Faustroll.
(25)
Oulipo, indeed, might be said to do with (and for) mathematics and
structural linguistics what Jarry did with physics: use the terms of
its vision in order to seek not denotative truth, but imaginative
insight.
Oulipo imaginative meaning at intersection of potentiality and constraint, for example Abish Alphabetical Africa.
(25) For Oulipo, that imaginative meaning arises at the intersection of potentiality and constraint. . . . Oulipo thus approaches the literary work as Jarry approaches the watch face—as an object rife with exceptions, brimming with paths not taken and possibilities unexplored.
(27-28) Constraint,
which might at first seem to oppose the exuberant perspectivalism of
potentiality, reveals itself in the work of the Oulipo as the
condition under which perspective shifts and potential emerges. The
constraints of form—like the strictures of scientific and
mathematical reasoning—alter one's vision and expose the explosive
potentiality of the subject and of subjectivity.
(28) Few Oulipian works illustrate the liberation of constraint as well as Walter Abish's Alphabetical Africa (1974), which uses a series of seemingly impossible strictures to construct a coherent prose narrative. The first chapter permits only the use of words that begin with the letter “a”; the second with the letters “a” or “b”; the third with “a,” “b,” or “c”; and so on until the full range of letters has been employed, at which point the process reverses itself. . . . The intelligibility of Abish's text—which, as Schirato demonstrates, extends to the richness of political metaphor—is not a fortuitous accident of its form, but a direct result of its constraints. As Jarry asks (of the objects of reality) what infinite smallness would entail, so Abish's text asks what narrative might emerge from a text in which no one can “die” until chapter 4 or “suffer” until chapter 19.
Algorithmically generated poetry, like that produced by the Mathews algorithm, instantiates the phonemic potentiality of ordinary words.
(29-30) There is, however, a third type that represents the most obvious literary analogue to computer-assisted criticism—namely, poetry generated by purely algorithmic processes. One of the most famous of these is the so-called Mathews algorithm, which remaps the data structure of a set of linguistic units (letters of words, lines of poems, paragraphs of novels) into a two-dimensional tabular array. . . . These maneuvers create a serendipitous morphology—an instantiation of the phonemic potentiality of ordinary words.
(30) The algorithm therefore
represents “a new means of tracking down this otherness hidden in
language (and, perhaps, in what language talks about)” (126). Form,
in other words, is both a means of poetic communication and an
enunciation of possible procedures for analyzing that communication.
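A sketch of one variant of the tabular remapping described above, not Mathews's full procedure: arrange n units of n elements each in a square table, shift the i-th row left by i places, and read down the columns. The sample words are illustrative, not Mathews's own.

```python
def mathews_shift(table):
    """Shift the i-th row of a square table left by i places, then read columns."""
    n = len(table)
    shifted = [row[i:] + row[:i] for i, row in enumerate(table)]
    # Reading down the columns yields the recombined units.
    return [[shifted[r][c] for r in range(n)] for c in range(n)]

# Four four-letter words treated as rows of linguistic units (letters).
words = [list(w) for w in ["tine", "sale", "male", "vine"]]
for column in mathews_shift(words):
    print("".join(column))
```

The first column of this particular table happens to spell "tale"—the serendipitous morphology the quote describes, arising purely from the procedure.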
Sonnet form as example of procedural rhetoric.
(30-31) As a critical work, the new poem makes obvious a
long-standing intuition about sonnet form—namely, that the form
itself has a rhetorical structure that is almost independent of the
words themselves, insofar as the form raises expectations that may
condition us to pursue particular patterns of sense making.
(31)
The computer revolutionizes, not because it proposes an alternative
to the basic hermeneutical procedure, but because it reimagines that
procedure at new scales, with new speeds, and among new sets of
conditions. It is for this reason that one can dare to imagine such
procedures taking hold in a field like literary criticism.
3 POTENTIAL READINGS
Pope's textual intervention, McGann and Samuels's deformance, and Irizarry's tamperings propose radical eisegesis (perhaps katagesis) rather than radical exegesis, deliberately and literally altering the graphic and semantic codes of textuality.
(32) The hermeneutic proposed by algorithmic criticism does not oppose the practice of conventional critical reading, but instead attempts to reenvision its logics in extreme and self-conscious forms. As such, it is of a piece with recent work on the notion of “textual intervention” as set forth by Rob Pope; of “deformance” as proposed by Jerome McGann and Lisa Samuels; and with the computationally enacted “tamperings” undertaken by Estelle Irizarry. All three set forth a bold heuresis—one that proposes not a radical exegesis, but a radical eisegesis (perhaps a katagesis) in which the graphic and semantic codes of textuality are deliberately and literally altered.
Eisegesis examples of reading poem backward and entropic poem.
(33-34)
Reading a poem backward is like viewing the face of a watch
sideways—a way of unleashing the potentialities that altered
perspectives may reveal. . . . In pouring the “well of English
undefiled” through the thin opening of Von Neumann's bottleneck, we
discover strange tensions, exceptions, and potentials.
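The backward reading is the simplest deformance to mechanize. A minimal sketch, using a Dickinson stanza only as sample input:

```python
def read_backward(poem):
    """The deformance McGann and Samuels begin with: reverse the order of lines."""
    lines = poem.strip().splitlines()
    return "\n".join(reversed(lines))

stanza = (
    "Because I could not stop for Death\n"
    "He kindly stopped for me\n"
    "The Carriage held but just Ourselves\n"
    "And Immortality"
)
print(read_backward(stanza))
```

The program does nothing interpretative itself; it merely produces the altered perspective from which reading then proceeds.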
(36)
Irizarry's work, in what the Oulipians might gleefully call an
instance of “anticipatory plagiary,” enacts the principles of
deformance in explicitly machinic terms. . . . Irizarry thus
envisions a group of what we might call “deformance machines”:
small programs designed to effect algorithmic transformations of
poetic works.
(37) The entropic poem shares a family resemblance
with the output of word-frequency analysis tools, which are among the
fundamental computational primitives of text analysis. . . . It is a
readable work that maintains its coherence fully until the thinning
logic of compression overtakes it.
(38) The entropic poem does not
so much provide data about the original poem as focus our attention
on certain energies in the original—in this case, similar movements
in thought redescribed in new terms at the ends of stanzas.
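Irizarry's entropic procedure is not specified in these notes; one plausible reconstruction, consistent with a "thinning logic of compression" that overtakes the text, deletes every word after its first occurrence, so the text thins as its vocabulary is exhausted.

```python
def entropic(text):
    """Hypothetical entropic deformance: keep only the first occurrence
    of each word; later repetitions are compressed away."""
    seen = set()
    kept = []
    for word in text.split():
        key = word.lower().strip(".,;:!?")
        if key not in seen:
            seen.add(key)
            kept.append(word)
    return " ".join(kept)

print(entropic("the sun rose and the moon rose and the stars"))
```

Like a word-frequency list, the output foregrounds repetition, but it does so by removal rather than by counting.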
(38) To speak of algorithmic criticism is to take a further step and imagine this generalization as an explicit technological program for critical reading. Texts that have become proverbial among students of new media, like the Talmud and the I Ching, are particularly useful here. Because they are often held up as foreshadowings of the ergodic, the interactive, and the hypertextual—there has been a tendency to deemphasize their continuity with the more normative practices of reading and writing.
(41-42) The minute someone proposes to explain the
meaning of a narrative—to speculate, conjecture, extrapolate, or
shout abuse at it, whether in the privacy of one's thoughts or in a
critical journal—the narrative changes, because we are no longer
able to read it without knowledge of the paratextual revolt. Chinua
Achebe's charges of racism in Joseph Conrad's Heart of Darkness is a case in point.
(45) Both the I Ching and the work of the Oulipo call attention to the always dissolving boundaries between creation and interpretation. Despite this, both productions are ordinarily considered aesthetic in nature and thus impervious to the objections often leveled against more overtly interpretative works. The agonistic relationship between artistic deformation and critical legitimacy is far more evident when a work declares itself as primarily interpretative, and nowhere is this anxiety more poignant than in Ferdinand de Saussure's research on pre-classical Latin poetry, the details of which form the subject of a number of unpublished notebooks written between 1906 and 1909.
(47) In dozens of examples Saussure finds an encrypted
message running alongside, over, and against the aural and graphic
elements of the text.
(47-48) Such questions seem so natural to us
that we tend to overlook the obvious similarities between Saussure's
apparently eccentric inquiries and the more ordinary act of
literary-critical interpretation. . . . Like any literary critic,
Saussure deforms and reforms his text, revealing unknown aspects of
its ontology—literally creating it anew.
(48) In one sense,
deformation is the only rational response to complexity. . . . It is
precisely this fear of an eviscerated objectivity that gives rise to
those rhetorical structures that work to conceal the deformations
that lie between text and interpretation.
(50) The “multiple
clamor” is nothing less than the text's status as a work already
deformed, already mediated by the accumulated experience of language
that produced it and that the reader must have in order to read it.
It lies “hidden” only if we believe that the new organizations
that arise from deformative activity are revelatory of something
inherent in the text before the act of interpretation.
Acknowledge deformance of all interpretation, though more obvious in algorithmic operations.
(50-51)
Saussure's anxieties are rooted in a basic assumption about text and
meaning. Statements of methodology, generalizations about literary
significance, surmises concerning authorial intention, and various
other forms of literary-theoretical philosophizing about these
engagements all give the appearance of existing outside or somehow
above the textuality of the object under discussion; even when we
speak of meaning as “in” or arising “from” the text, we
nonetheless proceed as if the meanings we generate and the texts
themselves were separate entities. This same belief does not obtain
from algorithmic procedures, which, because they explicitly deform
their originals, tread upon the rhetorically maintained separation
between text and reading. . . . To read a poem as postcolonial artifact, as evidence of generic protest, as cultural touchstone (the preposition in each case signalizing the onset of deformation) is to present a narrative that depends upon a number of discrete (de)formal procedures.
(52) The existence of so many competing, perhaps incommensurable readings of a work of literature is part of the normal course of literary studies. . . . The paragraph almost resembles the control structures of modern programming languages: if x is true, then y is also true, or else we must default to a different set of variables or pursue a different procedure.
(54) We might conclude that “The Wife's Lament” is
a testimony to the poststructuralist insight that textuality is a
shifting pattern of signification incapable of coalescing into any
stable textual identity. We would do better to conclude that “The
Wife's Lament” is a work that is always coalescing into stability
by virtue of the readerly process of deformation.
Dickinson implicit faith that something will overtake the mind.
(55-56) It is precisely the absence of this detail that renders Dickinson's suggestion (and the algorithmic criticism from which it descends) so strange. The apparent randomness with which she suggests the procedure and the implicit faith that the “Something” will overtake the mind deliberately eschew those rhetorical procedures that seek to conceal the status of a text as alternative.
Place re-performances already exercised in print texts into computational environment; replace fear of breaking faith with text with faith in liberating capacity of subjective engagement.
(57) Algorithmic criticism is, in this sense, nothing more than a self-conscious attempt to place such re-performances into a computational environment. But within this move there lies a fundamental remonstration against our anxiety about the relationship between text and reading. Those activities that are usually seen as anathema to the essential goal of literary criticism—quantitative analysis chief among them—will need to be reconsidered if it turns out that backward poems lie at the root of our forward endeavors. Our fear of breaking faith with the text may also need to give way to a renewed faith in the capacity of subjective engagement for liberating the potentialities of meaning.
4 THE TURING TEXT
(62)
Something like this occurs when one considers text-analytical results
generated using imperative routines. If something is known from a
word-frequency list or a data visualization, it is undoubtedly a
function of our desire to make sense of what has been presented. . .
. As with the Turing test, the reader invariably engages not one
text, but two texts operating within an orbit of fruitful antagonism:
the text that creates the results (the code) and the results
themselves.
Machinic inflection of programming at the base of algorithmic criticism, hermeneutics of how to; different from mathematical text because it describes the step by step movement of the process.
(63)
Algorithmic criticism is easily conceived as the form of engagement
that results when imperative routines are inserted into the wider
constellation of texts stipulated by critical reading. But it is also
to be understood as the creation of interactive programs in which
readers are forced to contend not only with deformed texts, but with
the “how” of those deformations. Algorithmic criticism therefore
begins with the machinic inflection of programming—a form of textual creation that, despite the apparent determinism of
the underlying machine, proceeds always in organic and unexpected
ways. . . . It is by nature a “meticulous” process, since to
program is to move within a highly constrained language that is
wholly intolerant toward deviation from its own internal rules. But
the goal of such constraint is always unexpected forms of knowing
within the larger framework of more collective understandings. . . .
The hermeneutics of “what is” becomes mingled with the
hermeneutics of “how to.”
(65) What is needed, then, is not a mathematical text, but an algorithmic text. . . . The former tells us that the definition of F for numbers greater than 1 is related to F in a particular way; the latter describes a process in which we move step-by-step through the relationship itself.
Programming languages emphasize imperative versus declarative descriptions of mathematics (Abelson and Sussman).
(65-66) Harold Abelson and Gerald Sussman, in The Structure and Interpretation of Computer Programs explain: “ . . . In mathematics we are usually concerned with declarative (what is) descriptions, whereas in computer science we are usually concerned with imperative (how to) descriptions” (26). Mathematics undergirds computing at every turn, and yet “executable mathematics” is more dream than reality for designers of programming languages. . . . If code represents a radical form of textuality, it is not merely because of what it allows us to do but also because of the way it allows us to think.
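Ramsay's F is given only by its recursive character; taking the factorial as a stand-in (an assumption, not his example), the declarative and imperative descriptions that Abelson and Sussman distinguish look like this:

```python
# Declarative ("what is"): F(1) = 1, and for n > 1, F(n) = n * F(n - 1).
# That statement relates F(n) to F(n - 1) but prescribes no process.

def F(n):
    """Imperative ("how to"): walk the recurrence one step at a time,
    as an algorithmic text does."""
    result = 1
    for step in range(2, n + 1):
        result *= step
    return result

print(F(5))  # 120
```

Both texts describe the same function; only the second describes the step-by-step movement through the relationship itself.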
Hermeneutic understanding is required to develop programs for textual deformation; this toys with the difference between a run-once arrival at a deformation to be interpreted and constant operation under conditions of reciprocal transformation between programs and texts.
(66)
In order to write the program, the critic must consider the “how
to” of a deformative operation, but once that text is written, the
output will be one that places the same critic into a critical
relationship not only with the text of the result but with the text
of the program as well. . . . But in another sense it is a recursive
process. In order to understand the text we must create another text
that requires an understanding of the first text. The former might
suggest analogies with science, but the latter suggests analogy with
the deepest philosophical questions of the humanities, including the
hermeneutic circle that has so preoccupied poststructuralist
thought.
(66) I was attempting to write a
program that could draw directed graphs of the scene changes in
Shakespeare's plays.
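A sketch of such a program, assuming scene settings have already been extracted as an ordered list (the settings shown are hypothetical, and Graphviz DOT is one convenient output format for directed graphs):

```python
def scene_graph(scenes):
    """Directed edges between consecutive scene settings, emitted as Graphviz DOT."""
    edges = set()
    for a, b in zip(scenes, scenes[1:]):
        if a != b:
            edges.add((a, b))
    lines = ["digraph play {"]
    lines += [f'  "{a}" -> "{b}";' for a, b in sorted(edges)]
    lines.append("}")
    return "\n".join(lines)

# Hypothetical settings for the opening scenes of a play.
settings = [
    "Elsinore. A platform",
    "A room of state",
    "Elsinore. A platform",
    "A room in the castle",
]
print(scene_graph(settings))
```

Even this toy raises the questions the quote anticipates: does a return to a setting merit one edge or two? Is an unlabeled location a node at all? The computer waits for an answer.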
(67) Questions will be raised about any
possible choice. Suspicions will arise about the machinic text that
underlies the apparent. Yet even as we consider such matters, the
computer waits. It still demands an answer.
OHCO needs analysis in part because it fits so well with computer forms.
(67-68) Even if we are loath to regard texts as being, in the words of one commentator, “ordered hierarchies of content objects” (DeRose), we must acknowledge that this is the way the computer would prefer to have it.
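The OHCO view is easy to demonstrate: to the computer, a poem is a tree of nested content objects. A minimal sketch using a simplified, TEI-like encoding (the element names here are illustrative, not a conformant TEI schema):

```python
import xml.etree.ElementTree as ET

# A text modeled as an ordered hierarchy of content objects:
# text contains div, div contains line-group, line-group contains lines.
doc = """<text>
  <div type="poem">
    <lg>
      <l>Shall I compare thee to a summer's day?</l>
      <l>Thou art more lovely and more temperate.</l>
    </lg>
  </div>
</text>"""

tree = ET.fromstring(doc)
# The computer's preferred view: every content object nested inside another.
for line in tree.iter("l"):
    print(line.text)
```

Whatever we believe texts to be, this nesting is the form in which the machine most readily receives them.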
Immateriality of code may arise from this distinction between the form inhering in the material versus arising from potentialities.
(68) The word processor or Web browser
does not inhere in the material. It is not, as Michelangelo is said
to have believed, a matter of chipping away all that is not the
sculpture. To contend with the “how to” of programming is to
discover the potentialities of constraint. To read the outputted
text is to do the same.
(68) The goal, after all, is not to
arrive at the truth, as science strives to do. In literary criticism,
as in the humanities more generally, the goal has always been to
arrive at the question.
5 'PATACOMPUTING
Beating on TAPoR as text analysis toolset.
(74) Again and again, the language of TAPoR [Text Analysis Portal for Research] points not to methods or procedures, but to “tools”—things to be wielded against any text on the Web (the default examples optimistically include both a corpus of French medieval poetry and the Universal Declaration of Human Rights). Yet despite these metaphors, all of which mingle marketing with mechanization in a way that suggests anything other than the sober, meandering parole of humanistic discourse, TAPoR confidently asserts a rhetoric of self-interrogation. . . . However foreign its interface might be, text analysis is insistently put forth by TAPoR as “an interactive practice of discovery with its own serendipitous paths comparable to, but not identical to, the serendipitous discovery that happens in rereading a text.” (Rockwell, “What Is Text Analysis”)
(74) Few tools better illustrate these serendipitous paths than Stefan Sinclair's HyperPo, one of the tools for which TAPoR acts as a portal.
(76-77) Goblin Market becomes what Jacques Derrida, in “Ulysses Gramophone,” called an “overpotentialized text.”
(77) Text analysis of the sort put forth by WordHoard, TAPoR, and HyperPo suggests other antonyms to close reading, including what Franco Moretti has called “distant reading.”
Reference to Derrida overpotentialized text; Hayles is more measured in her critique of algorithmic deformation.
(78)
What is different about digital archives is the way in which text
analysis procedures (including that most primitive of procedures: the
keyword search) has the potential to draw unexpected paths through a
documentary space that is distinguished by its overall
incomprehensibility. Even Vannevar Bush,
amid a conception of hypertext still more sophisticated than that
offered by the World Wide Web, imagines the negotiation of the
document space as it has been for centuries.
(80) The “result” of a system like MONK [Metadata Offer New Knowledge] is the same as that for virtually any text-analytical procedure: a textual artifact that, even if recapitulated in the form of an elaborate interactive visualization, remains essentially a list.
Computational practices may not become critically tractable until they are also commonplace, when hacker/scholar are not mutually exclusive.
Add Plato's Symposium to the texts susceptible to algorithmic reading, although in the case of symposia the outcomes are sonic experiments and readings informed by the consideration of generating audio environments from the text rather than the lamentably inevitable list; that this is accomplished as a programming exercise reiterates Kemeny's vision of making coding a basic skill of all intelligent citizens.
(80) It may be that the tools of algorithmic criticism are like Wittgenstein's ladder. When we have used them to “climb up beyond,” we recognize them as nonsensical and cast the ladder aside (Tractatus 74).
(81)
If algorithmic criticism does not exist, or exists only in nascent
form, it is not because our critical practices are computationally
intractable, but because our computational practices have not yet
been made critically tractable. . . . Once those changes are
acknowledged, the bare facts of the tools themselves will seem, like
the technical details of automobiles or telephones, not to be the
main thing at all. . . . For by then we will have understood
computer-based criticism to be what it has always been: human-based
criticism with computers.
POSTCONDITIONS
(84)
Humanities computing was part of—and, indeed, the result of—the
same set of epochal changes that had produced the personal computer
and at that very moment were in the process of producing the World
Wide Web.
Wondering at the radical shift to building versus merely theorizing programmed objects, without citing any working code, indicates the continued hegemony of literary criticism; critical programming considers working code.
(84) Humanities
computing had its theorists, its administrators, its teachers, and
its historians, but nearly everyone in the field was involved, in one
way or another, with building something.
(84) But it is nowhere
near as jarring—or, frankly, as radical—as the shift from
theorizing about games and Web sites to building them.
(85)
Humanists concern themselves with the study of the human experience;
digital humanists find that building deepens and enriches that
engagement.
Algorithmic criticism provides domain for hacker scholar besides toiling with TEI.
(85) Algorithmic criticism offers a vision of the hacker/scholar as unperturbed by the tension these two words elicit.
Ramsay, Stephen. Reading Machines: Toward an Algorithmic Criticism. Urbana: University of Illinois Press, 2011. Print.