Notes for Software Studies: A Lexicon edited by Matthew Fuller

Key concepts: multiple coding, code, pataphysics, von Neumann architecture.

Related theorists: Babbage, Ceruzzi, Alfred Jarry, Knuth, Stallman, Tukey.

Introduction, the Stuff of Software
(1) What is covered here includes: algorithms; logical functions so fundamental that they may be imperceptible to most users; ways of thinking and doing that leak out of the domain of logic and into everyday life; the judgments of value and aesthetics that are built into computing; programming's own subcultures and its implicit or explicit politics; or the tightly formulated building blocks working to make, name, multiply, control, and interrelate reality.
(2) Secondly, Software Studies proposes that software can be seen as an object of study and an area of practice for kinds of thinking and areas of work that have not historically “owned” software, or indeed had much of use to say about it. Such areas include those that are currently concerned with culture and media from the perspectives of politics, society, and systems of thought and aesthetics, or those that renew themselves via criticism, speculation, and precise attention to events and to matter, among others.

Software's Roots and Reverberations

Tukey first to use the term software.

(2) Recent etymological research credits John W. Tukey with the first published use of the term “software.”
(2-3) Another crucial moment was the decision by IBM in 1968, prompted in no small part by antitrust court actions, to split its software section off from its hardware section.
(3) Software is seen as a tool, something that you do something with. It is neutral, grey, or optimistically blue. On the one hand, this ostensive neutrality can be taken as its ideological layer, as deserving of critique as any such myth. But this interpretation, itself one that emphasizes only critique, can block a more inventive engagement with software's particular qualities and propensities. Working with the specificities of software at the multiple scales at which it occurs is a way to get past this dichotomy.

Hayles's media-specific approach; also alludes to Robert Johnson, Feenberg, and FOSS epistemological transparency.

(4) Technologisation of the senses and structuring of relations by technology is often carried out by naturalized means, lessening our ability to understand and engage with such change. . . . The optimal solution becomes the one that is most amenable to technical description, usually a description that is only in terms of certain already normalized precursors. By contrast, when technology is used in a way that is interrogable or hackable, it allows and encourages those networked or enmeshed within it to gain traction on its multiple scales of operation.
(4) Another theoretical blockage that this collection seeks to overcome is the supposed “immateriality” of software. . . . The new lexicon relies upon an understanding of the materiality of software being operative at many scales.
(5-6) it is this paradox, the ability to mix the formalized with the more messy – non-mathematical formalisms, linguistic, and visual objects and codes, events occurring at every scale from the ecological to the erotic and political – which gives computation its powerful effects, and which folds back into software in its existence as culture.
(7) The range of articulation software allows due to the nature of the joints it forms with other layers of reality means that this freedom (that of a closed world), while somewhat paralyzing, has also guaranteed it a space for profound and unfinishable imagination.

Parallels and Precursors
(7) It comes out of a wider set of interlocking areas of activity in digital cultures, but two other key areas, historical research into the genesis of computing and the discourse associated with free and open source software, have provided a context for the work here.
(7) While art and design have for a reasonably long period had something of an inkling that objects, devices, and other material entities have a politic – that they engage in the arrangement and composition of energies, allow, encourage or block certain kinds of actions – these concerns have also more recently been scrutinized by the interdisciplinary area of science and technology studies.

Make the list: software art, concretization, epistemological transparency, alien temporality.

(8) The area that has become known as software art is perhaps the most direct feed into this lexicon. . . . Art understands that the style of thought is crucial – style not simply as a metric for the deposition of flourishes and tricks, but as a way of accessing multiple universes of reference. . . . The project provides a space for interactions between art and mathematics outside of clean-room purity in dirtier interactions with cultures, economies, hardware, and life. Mathematics becomes applied, not to the cleanly delineated sets of problems set it by productivity and efficiency goals in software projects, but to the task of inventing and laughing with its own goofy serene self and in doing so regaining its “pure” task of establishing systems and paroxysms of understanding.

What Is a Lexicon?
(9) words work in relation to another set of dynamics, a technical language that is determined by its relation to constants that are themselves underpinned by a commitment to an adequately working or improved description.

Stuff behind Stuff
(10) One rule of thumb for the production of this book is that the contributors had to be involved in some way in the production of software as well as being engaged in thinking about it in wider terms.
(10) Part of this underlying shift is that software is now, unevenly, a part of mass and popular cultures. . . . Knowledge about how to make it, to engage with programming and how to use software more generally, circulates by an increasing number of formal and informal means.

Social life of knowledge; renewed interest in what literacy should mean: Hayles on digital literacy, Mateas's Procedural Literacy.

(10) Intelligence arises out of interaction and the interaction of computational and networked digital media with other forms of life conjugate new forms of intelligence and new requirements for intelligence to unfold. As a result, a number of authors collected in this book have called for a renewed understanding of what literacy should mean contemporarily. Amongst others, Michael Mateas has made an important call for what he describes as Procedural Literacy.

The articles in the book are arranged alphabetically as in a lexicon.

Andrew Goffey
(17-18) Appropriately translated into the field of software studies, however, focusing on the development and deployment of algorithms and an analysis of the actions they accomplish both within software and externally might lead us to view the latter as a sort of machinic discourse, which addresses the ways in which algorithms operate transversally, on themselves, on machines, and on humans.
(18) the problem with the purely formal conception of the algorithm as an abstract machine is not that it is abstract. It is that it is not abstract enough. That is to say, it is not capable of understanding the place of the algorithm in a process which traverses machine and human.

Algorithm: logic plus control; machine embodiment of rationality.

(19) Understanding things, activities, tasks, and events in algorithmic terms appears only to exacerbate this situation. What is an algorithm if not the conceptual embodiment of instrumental rationality within real machines?
(19) In questioning the self-sufficiency of the algorithm as a formal notion by drawing attention to its pragmatic functioning, however, it becomes possible to consider the way that algorithms work as part of a broader set of processes. Algorithms act, but they do so as part of an ill-defined network of actions upon actions, part of a complex of power-knowledge relations, in which unintended consequences, like the side effects of a program's behavior, can become critically important.
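Kowalski's slogan behind the "logic plus control" note can be sketched in code (a hypothetical illustration, not from Goffey's text): one logical definition, run under two control regimes, gives identical answers through radically different processes.

```python
# Two programs sharing one "logic" (the Fibonacci recurrence) under
# different "control" regimes, after Kowalski's "algorithm = logic +
# control": the answers coincide, the processes do not.

def fib_naive(n):
    # Control: blind recursion; an exponential number of actions.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_tabled(n):
    # Control: bottom-up tabulation; a linear number of actions.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert fib_naive(10) == fib_tabled(10) == 55
```

The pragmatic difference between the two controls is exactly the kind of thing the purely formal view of the algorithm leaves out.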

Derek Robinson
Analogies and Amplifications


Feedback Everywhere

Analog Again

Søren Pold
(32) interface buttons disguise the symbolic arbitrariness of the digital mediation as something solid and mechanical in order to make it appear as if the functionality were hardwired: they aim to bring the old solid analog machine into the interface. In this sense buttons are a part of a remediation of the machine in the computer interface, a way of dressing it up as something well known and well understood, but there is more to it than this. It points directly to our limited understanding of the computer as a machine and as a medium and how it functions in culture and society.
(33) Just think about how many codes and values – from programming, commerce, and ideology – are mobilized when you click “buy,” pay with your credit card, and download a tune in a proprietary file format with technically and juridically imposed restrictions on how you can use, play, and copy it. The cultural, conventional, and representational elements are disguised or “black-boxed” as pure technical functionality, the money transfer via your credit card company, or the way the music is produced, commercialized, and regulated by the recording company, the outlet, and the artist.
(35) As exemplified by the function buttons, software buttons turned into hardware are often reconfigurable, programmable, and, as such, they reverse the logic of mechanical buttons from giving the interface a hardwired functional trustworthiness to softening the buttons on the box.
(36) From this perspective, the interface button becomes an emblem of our strong desire to handle the increasingly complex issues of our societies by efficient technical means—what one may call the “buttonization” of culture, in which our reality becomes clickable.

Class Library : Graham Harwood

Class Library: the entire entry is Perl source code, with large comment sections providing needed context; the code is imagined to be run to produce its message, an embodiment of procedural rhetoric that simultaneously enacts automated capitalist class operations.

(39) # This is the core subroutine designed to give away
# cash as fast as possible
sub ReDistributeCash {
  my $RichBastard_REFERENCE = shift;
  # go through each on the poor list
  # giving away Cash until each group
  # can afford clean drinking water
  while($RichBastard_REFERENCE->{CASH} >= TO_MUCH) {
    foreach my $Index (keys %{$Poor}) {
      $RichBastard_REFERENCE->{CASH}--;
      if( $Poor->{$Index}->{Cash} >= $Poor->{$Index}->{PriceOfCleanWater}) {

Code (or, How You Can Write Something Differently)
Friedrich Kittler
Imperium Romanum

Code: universal but necessarily technological, Roman empire (war) substrate via Suetonius allowing Antony link. Note this was translated by Tom Morrison with Florian Cramer.

Kittler refers to encrypted letters in Suetonius's Lives of the Caesars; introduce Plutarch.

(40-41) Codes materialize in processes of encryption, which is, according to Wolfgang Coy's elegant definition, “from a mathematical perspective a mapping of a finite set of symbols of an alphabet onto a suitable signal sequence.” . . . Contrary to current opinion, codes are not a peculiarity of computer technology or genetic engineering; as sequences of signals over time they are part of every communications technology, every transmission medium. On the other hand, much evidence suggests that codes became conceivable and feasible only after true alphabets, as opposed to mere ideograms or logograms, had become available for the codification of natural languages. . . . developed communications technology. . . In his Lives of the Caesars, Suetonius . . . recounts discovering encrypted letters among the personal files left behind by both the divine Caesar and the divine Augustus. . . . the basis on which command, code, and communications technology coincided was the Empire . . . “command, control, communication, computers.”

Code: American copyright law, and the possibility of being unaffected by its sway.

(41) The etymon codex . . . for the first time be leafed through. And that was how the word that interests us here embarked on its winding journey to the French and English languages. . . . Message transmission turned into data storage, pure events into serial order. And even today the Codex Theodosius and Codex Iustinianus continue to bear a code of ancient European rights and obligations in those countries where Anglo-American common law does not happen to be sweeping the board. In the Corpus Iuris, after all, copyrights and trademarks are simply meaningless, regardless of whether they protect a codex or a code.

(42-43) For a long time the term code was understood to refer to very different cryptographic methods whereby words could still be pronounced, but obscure or innocuous words simply replaced the secret ones. . . . Separate character sets, however, are productive. Together they brew wondrous creatures that would never have occurred to the Greeks or Romans. . . . Alberti's proposal that every new letter in the clear text be accompanied by an additional place-shift in the secret alphabet was followed up until World War II. . . . Only since Viete have there been equations containing unknowns and universal coefficients written with numbers encoded as letters. This is still the work method of anybody who writes in a high-level programming language that likewise allocates variables (in a mathematically more or less correct manner) to alpha-numeric signs, as in equations.
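The "additional place-shift in the secret alphabet" Kittler attributes to Alberti can be sketched as a toy progressive cipher (a simplification for illustration; Alberti's actual cipher-disk procedure was richer):

```python
# A minimal sketch of the place-shift idea: each successive plaintext
# letter is enciphered with an alphabet shifted one place further than
# the last, so identical letters no longer encrypt identically.

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encipher(plaintext):
    out = []
    for i, ch in enumerate(plaintext):
        k = ALPHABET.index(ch)
        out.append(ALPHABET[(k + i) % 26])  # shift grows with position
    return "".join(out)

def decipher(ciphertext):
    out = []
    for i, ch in enumerate(ciphertext):
        k = ALPHABET.index(ch)
        out.append(ALPHABET[(k - i) % 26])
    return "".join(out)

assert encipher("CAESAR") == "CBGVEW"
assert decipher("CBGVEW") == "CAESAR"
```

Note how the two A's of "CAESAR" come out as different letters, which is what defeated simple frequency analysis.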

Code: Viete footnote on combinations of character sets versus crossing languages that are pronounced (blituri); tie in to high-level programming languages.

(42 footnote 9) Viete himself chose vowels for unknowns, and consonants for coefficients.

Global Message Traffic

Code: trans-semantic optimized languages as products of technical problem solving yielded lookup tables, fetching, not cipher computing.

(43) For the first time [in Morse Universal Code Condensers] a system of writing had been optimized according to technical criteria – that is, with no regard to semantics. . . . What used to be called deciphering and enciphering has since then been referred to as decoding and encoding. . . . The twentieth century thus turned a thoroughly capitalist money-saving device called “code condenser” into highest mathematical stringency.
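The "code condenser" Kittler describes works by table lookup rather than computation: whole phrases are swapped for short codewords purely to save chargeable signs. A toy codebook (entries invented for illustration) makes the mechanism plain:

```python
# A toy commercial-telegraph codebook: encoding and decoding are pure
# lookups, optimized for transmission cost with no regard to semantics.
# The phrase/codeword pairs are invented for illustration.

CODEBOOK = {
    "ARRIVING TOMORROW MORNING": "AMORN",
    "MARKET PRICES FALLING": "MFALL",
    "SEND FUNDS IMMEDIATELY": "SFUND",
}
DECODEBOOK = {v: k for k, v in CODEBOOK.items()}

def condense(message):
    return CODEBOOK.get(message, message)

def expand(codeword):
    return DECODEBOOK.get(codeword, codeword)

msg = "SEND FUNDS IMMEDIATELY"
assert expand(condense(msg)) == msg
assert len(condense(msg)) < len(msg)  # the whole point: fewer signs
```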

The Present Day – Turing
(43) All that remains to ask is how the status quo came about or, in other words, how mathematics and encryption entered that inseparable union that rules our lives.

Code: Clear philosophies of embodiment tie in with Turing acknowledging embodiment in decoding natural language, as well as environmental knowledge. Does Hayles make these points too? Kittler died today.

(44) Turing himself raised the question of the purpose for which computers were actually created, and initially stated as the primary goal the decoding of plain human language: [quoting from “Intelligent machinery”] . . . This field seems, however, to depend rather too much on sense organs and locomotion to be feasible. . . . the subject matter of cryptography is very easily dealt with by discrete machinery, physics not so easily.


Code: overused term for law of subjecting empire holding sway.

(45) the notion of code is as overused as it is questionable. . . . But perhaps code means nothing more than codex did at one time: the law of precisely that empire which holds us in subjection and forbids us even to articulate this sentence. . . . for instance, in the case of DNS . . . in the case of bioengineering. . . . Therefore, only alphabets in the literal sense of modern mathematics should be known as codes, namely one-to-one, finite sequences of symbols, kept as short as possible but gifted, thanks to grammar, with the incredible ability to infinitely reproduce themselves: Semi-Thue groups, Markov chains, Backus-Naur forms, and so forth. . . . For while Turing's machine was able to generate real numbers from whole numbers as required, its successors have – in line with Turing's daring prediction – taken command. Today, technology puts code into the practice of realities, that is to say: it encodes the world.
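Kittler's gloss on finite symbol sequences "gifted, thanks to grammar, with the incredible ability to infinitely reproduce themselves" can be illustrated with a tiny BNF-style grammar (a hypothetical toy, not from the text):

```python
# Three rules over a four-symbol alphabet generate an unbounded
# language of nested expressions: a finite code, infinite reproduction.
import random

GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"], ["(", "<expr>", ")"], ["x"]],
}

def derive(symbol, depth=0):
    # Randomly expand nonterminals; force the terminal rule at depth
    # to guarantee termination.
    if symbol not in GRAMMAR:
        return symbol
    rules = GRAMMAR[symbol]
    rule = rules[-1] if depth > 3 else random.choice(rules)
    return "".join(derive(s, depth + 1) for s in rule)

random.seed(0)
print(derive("<expr>"))  # one finite string from an infinite language
```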

Power? Obvious FOSS linkage. DNS? Or DNA? A slip. Like von Neumann's emphasis on self-reproducing automata. Manovich, Heidegger, even as remediating Zizek's reality of the virtual.

Code, to Kittler, presents an insoluble dilemma yielding random buzz either way: either society produces humans writing working code, software industries full of casual philosophers; or, after turning the work over to machines operating on their own, with humans merely tending their server farms, extended cognition descends into necessarily inexplicable, incomprehensible, unphilosophical zones and temporal orders of magnitude. Instead of studying code we must be good scholars and cultural observers.

If only more philosophers had been programming back then: the wish is undercut by lack of feasibility, operating rhetorically like ancient literary conventions of reading as formant synthesis, audible techniques repeated in centuries of writing techniques.

Code: not really sure what the point of his dilemma is: that good code arrives only when the programmer is not thinking of words to describe the program to humans but is addressing the machine world multipurposively and optimally as component, constituent, memory structure. Suggest a different way of considering versions than the final one with all the bugs out, so that code is always in medias res (Chun, Mackenzie).

Code: philosophy of embodiment tie in with Turing recognizing need for knowledge of environment.

Code: gives reasons to avoid speaking about actual code because of the bizarre way that programmers are most creative; I am suggesting a different way of considering versions than the final one with all the bugs out, the terminus of Kittler's approach: is this a question of to code, or to write?

(45-46) Turing himself, when he explored the technical feasibility of machines learning to speak, assumed that this highest art, speech, would be learned not by mere computers but by robots equipped with sensors, effectors, that is to say, with some knowledge of the environment. However, this new and adaptable environmental knowledge in robots would remain obscure and hidden to the programmers who started them up with initial codes. The so-called “hidden layers” in today's neuronal networks present a good, if still trifling, example of how far computing procedures can stray from their design engineers, even if everything works out well in the end. Thus, either we write code that in the manner of natural constants reveals the determinations of the matter itself, but at the same time pay the price of millions of lines of code and billions of dollars for digital hardware; or else we leave the task up to the machines that derive code from their own environment, although we then cannot read – that is to say: articulate – this code. Ultimately, the dilemma between code and language seems insoluble. And anybody who has written code even only once, be it in a high-level programming language or assembly, knows two very simple things from personal experience. For one, all words from which the program was by necessity produced and developed only lead to copious errors and bugs; for another, the program will suddenly run properly when the programmer's head is emptied of words. And in regard to interpersonal communication, that can only mean that self-written code can scarcely be passed on with spoken words. May myself and my audience have been spared such a fate in the course of this essay.

Adrian Mackenzie

(48) Codecs are intimately associated with changes in the “spectral density,” the distribution of energy, radiated by sound and image in electronic media.

Patent Pools and Codec Floods
(49) Economically, MPEG-2 is a mosaic of intellectual property claims (640 patents held by entertainment, telecommunications, government, academic, and military owners, according to Wikipedia). . . . It participates in geopolitical codec wars.

Trading Space and Time in Transforms
(50) From the standpoint of software studies, how can these different algorithms be discussed without assuming a technical background knowledge? . . . One strategy is to begin by describing the most distinctive algorithmic processes present, and then ask to what constraints or problems these processes respond.
(51) In what way can a videoframe be seen as a waveform? The notion of the transform is mathematical: It is a function that takes an arbitrary waveform and expresses it as a series of simple sine waves of different frequencies and amplitudes. Added together, these sine or cosine waves reconstitute the original signal.
(51) The decomposition of a spatial or temporal signal into a series of different frequency components allows correlation with the neurophysiological measurements of human hearing and sight.
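The transform Mackenzie describes can be sketched as a naive discrete Fourier transform in pure Python (illustrative only; real codecs use the discrete cosine transform with fast algorithms):

```python
# A signal is expressed as a sum of sinusoids of different frequencies
# and amplitudes; summing those components reconstitutes the original.
import cmath

def dft(signal):
    N = len(signal)
    return [sum(signal[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N))
            for k in range(N)]

def idft(spectrum):
    N = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)).real / N
            for n in range(N)]

signal = [0.0, 1.0, 0.0, -1.0]      # one cycle of a sampled sine wave
restored = idft(dft(signal))
assert all(abs(a - b) < 1e-9 for a, b in zip(signal, restored))
```

Compression enters when small frequency components, those least salient to human perception, are quantized away before reconstruction.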

Motion Prediction – Forward and Backward in Time
(52) Video codecs are very preoccupied with reordering relations between frames rather than just keeping a series of frames in order.

Codecs: motion vectors are elementary component rather than picture itself.

(53) If intrapicture compression is the first major component of MPEG-2, motion prediction between frames is the second. . . . Here the picture itself is no longer the elementary component of the sequence, but an object to be analyzed in terms of sets of motion vectors describing relative movements of blocks and then discarded.
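The motion prediction described here can be sketched as an exhaustive block-matching search for the displacement that best explains a block (a toy; MPEG-2 encoders search far more cleverly over macroblocks):

```python
# For a block of the current frame, find the displacement (dy, dx)
# into the previous frame with the smallest sum of absolute
# differences: that displacement is the block's motion vector.

def match_block(prev, block, top, left, search=2):
    bh, bw = len(block), len(block[0])
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y0, x0 = top + dy, left + dx
            if not (0 <= y0 and y0 + bh <= h and 0 <= x0 and x0 + bw <= w):
                continue  # candidate must lie wholly inside the frame
            err = sum(abs(block[y][x] - prev[y0 + y][x0 + x])
                      for y in range(bh) for x in range(bw))
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# A bright 2x2 patch at the top-left of the previous frame has moved
# down and right by one pixel; the block now at (1, 1) came from (0, 0).
prev_frame = [[9, 9, 0, 0],
              [9, 9, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]]
curr_block = [[9, 9],
              [9, 9]]
assert match_block(prev_frame, curr_block, 1, 1) == (-1, -1)
```

The vector (-1, -1) plus a (here empty) residual is all that need be transmitted; the picture itself is analyzed and discarded, as the note above says.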

From Complicated to Composite
(54) As a convention, the MPEG-2 standard refers implicitly to a great number of material entities ranging from screen dimensions through network and transmission infrastructures to semiconductor and data storage technologies. . . . And the codec more or less performs the function of displaying light, color, and sound on screen within calibrated psycho-perceptual parameters.

Codecs: materiality of images replaced by approximate operations working within calibrated psycho-perceptual parameters.

(54) However, the way the MPEG-2 codec pulls apart and reorganizes moving images goes further than simply transporting images. Transform compression and motion estimation profoundly alter the materiality of images, all the while preserving much of their familiar cinematic or televisual appearance.

Computing Power
Ron Eglash
The Need for Alternatives to the Realist Critique

(56) When we blindly start putting categories of the Real on the ethical side, and categories of the Unreal on the unethical side, we imply a system of morality which mimics the Christian story of the fall from the Garden, or Rousseau's dichotomy between nobility of the natural and the evils of artifice.

Three Dimensions of Computer Power: Speed, Interactivity, and Memory
(57) But such formal definitions for computing power, collectively termed the Chomsky hierarchy, are essentially absent in the world of commercial computing.

(58) Like the Marxist observation that “money is congealed labor,” special effects are congealed computing. The power to command reality to do your bidding is sexy even if it is only a virtual reality.


(60) Computing memory is comparable to social memory, interactivity is comparable to social discourse, and computing speed is comparable to social rhetoric. Thus we see the rhetorical power of special effects, the discursive power of interactive websites, and the mnemonic power of large-scale lay constructions and professional simulations.

Elite versus Lay Public Access to Computing Power
(61) But by the early 1990s a gradient of computing power began to resolidify in which the “cutting edge” of elite computer simulations could leverage truth claims in ways unavailable to the “trailing shadow” of the lay public's computer power (figure 3).
(62) We need organizations like the National Science Foundation to support research specifically directed to the challenge of problem definition in the application of supercomputing power to nonelite humanitarian causes.

Concurrent Versions System
Simon Yuill
(65-66) In many ways CVS has been essential to the success of FLOSS (Free/Libre Open Source Software), as it facilitates the collaboration of the widely dispersed individuals who contribute to such projects. . . . Because CVS focuses on cohering code implementations, it is arguably not well suited to facilitating discussion of more abstract, conceptual aspects of a particular project. While mailing lists and IRC are often the forums for such discussions, they do not, by the very temporality of their nature, allow for such discussions to be built into identifiable documents. Similarly, comments in source code, while also facilitating this, can become too diffuse to gather such ideas together. The Wiki emerged as a response to this need, adapting the version control of CVS into a simpler web-based system in which the more conceptual modeling of projects could be developed and archived, exemplified in the very first Wiki, the Portland Pattern Repository, which gathered different programming “design patterns” together.
(66) forms of sociological analysis have developed based around “archaeological” studies of CVS repositories.
(67) These kinds of studies provide an understanding of agency and governance within FLOSS, and clarify how software development operates as a form of discursive formation.

Concurrent Versions System: support working code argument for understanding digital literacy as reading and writing via version control systems, including the archaeological investigation of software history (Foucault).

(68) The ways in which tools such as CVS are used will carry a residue of these factors, and the CVS repository can become a territory in which these issues and debates are inscribed. CVS is not simply a tool to manage the production of code therefore, but as “the space in which code emerges and is continuously transformed” (to paraphrase Foucault), also an embodiment and instrument of its discursive nature.
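What a repository like CVS records, successive revisions with their differences recoverable, can be sketched with Python's difflib standing in for the RCS diff machinery (illustrative only, not how CVS stores deltas internally):

```python
# A back-of-envelope revision store: commit successive versions of a
# text, then recover the difference between any two as a unified diff.
import difflib

revisions = []

def commit(text):
    revisions.append(text.splitlines())

commit("print('hello')\n")
commit("# greet the world\nprint('hello, world')\n")

diff = list(difflib.unified_diff(revisions[0], revisions[1],
                                 "r1", "r2", lineterm=""))
assert diff[0].startswith("--- r1")
assert "+# greet the world" in diff
```

It is exactly this recoverable history of inscriptions that the "archaeological" studies of CVS repositories mentioned above mine for sociological evidence.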

Jussi Parikka

(70) In Alberti's time, the spiritual concept of imitatio (Latin) or mimesis (remediated from the philosophy of Ancient Greece) became the cornerstone of art theory, which lasted for hundreds of years, but also turned at the same time into the material process of copying: especially the texts of the ancients.

Copy: meme theory expresses cult of the copy of digital era contrasted to those of imitatio and mimesis.

(72) What makes meme theory interesting is not whether or not it is ultimately an accurate description of the basic processes of the world, but that it expresses well this “cult of the copy” of the digital era while it abstracts “copying” from its material contexts into a universal principle.
(73-74) Theological issues defined the importance of what was copied and preserved, whereas nowadays the right to copy and to reproduce culture is to a large extent owned by global media companies. This illustrates how copying is an issue of politics in the sense that by control of copying (especially with technical and juridical power) cultural production is also hierarchized and controlled.
(74) Copy production as the dominant mode of cultural production culminated in the digital production techniques of GUI operating systems that originated in the 1980s.

Copy: society of control docility through microcontrol user behavior built into data.

(75-76) Hence we move from the error-prone techniques of monks to the celluloid-based cut and paste of film, and on to the copy machines of contemporary culture, in which digitally archived routines replace and remediate the analog equivalents of prior discourse networks. . . . The novelty of the digital copy system is in the capability to create such copy management systems or digital rights management (DRM) techniques, which act as microcontrollers of user behavior. Data is endowed with an inherent control system, which tracks the paths of software (for example, restricting the amount of media players a digitally packed audiovision product can be played on).
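Parikka's point that data can be "endowed with an inherent control system" can be sketched as a toy play-count restriction (purely illustrative; real DRM enforces limits cryptographically, not by honor system):

```python
# A media object that carries its own usage counter and refuses
# playback once a limit is reached: microcontrol built into the data.

class RestrictedTrack:
    def __init__(self, title, max_plays):
        self.title = title
        self.max_plays = max_plays
        self.plays = 0

    def play(self):
        if self.plays >= self.max_plays:
            return f"{self.title}: playback denied"
        self.plays += 1
        return f"{self.title}: playing ({self.plays}/{self.max_plays})"

track = RestrictedTrack("tune", 2)
assert track.play() == "tune: playing (1/2)"
assert track.play() == "tune: playing (2/2)"
assert track.play() == "tune: playback denied"
```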

Data Visualization
Richard Wright

Data Visualization: compare overcoming distances to Ihde on instrumentation, Hayles on nonrepresentational imaging such as PET scans.

(79) Visualizations are created for people rather than for machines—they imply that not all computational processes can be fully automated and left to run themselves. . . . One of the fundamental properties of software is that once it is being executed it takes place on such a fine temporal and symbolic scale and across such a vast range of quantities of data that it has an intrinsically different materiality than that with which we are able to deal unaided. Visualization is one of the few techniques available for overcoming this distance.
(81) At this end of the spectrum, visualization is nonrepresentational because it is speculatively mapped from the raw, isolated data or equations and not with respect to an already validated representational relation. A visualization is not a representation but a means to a representation.
(81) As recently as 2004, visualization scientist Chaomei Chen described visualization as still being an art rather than a science.
(82) Visualization as a practice is not just a question of designing for human perception but of being perceptive. In fact, some people's eyes have been “retrained” by visualization itself until it has altered their apprehension of the world.
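Wright's claim that a visualization is a speculative mapping rather than a representation can be sketched by mapping raw numbers onto crude text marks (one arbitrary mapping among the many possible):

```python
# Map raw data speculatively onto visual marks: here, proportional
# bars of "#" characters. Nothing validates this mapping in advance;
# it is a means to a representation, not the representation itself.

def bars(data, width=20):
    top = max(data)
    return [("#" * round(v / top * width)).ljust(width) + f" {v}"
            for v in data]

for line in bars([3, 7, 12, 5]):
    print(line)
```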

Matthew Fuller
(87) In Literate Programming, Donald Knuth suggests that the best programs can be said to possess the quality of elegance. Elegance is defined by four criteria: the leanness of the code; the clarity with which the problem is defined; spareness of use of resources such as time and processor cycles; and, implementation in the most suitable language on the most suitable system for its execution.
(88) The benefit of these criteria of elegance in programming is that they establish a clear grounding for the evaluation of approaches to a problem.
(88) For users of software configured as consumers such “metaphysical” questions aren't often the most immediately apparent, although questions of elegance, as will be suggested below, are also recapitulated at the scale of interface.
(89) At a certain threshold, the possibility of stating the tersest formula for arriving at an answer is undecidable. . . . Elegance, because it cannot be proven, comes down to a rule of thumb, something that emerges out of the interplay of such constraints, or as something more intuitively achievable as style (in Knuth's terminology, an “art”).
(90) Finding a way of aligning one's capacities and powers in a way that arcs through the interlocking sets of constraints and criteria, the material qualities of software, and the context in which it is forged and exists is key to elegance.
(90) Elegance then is also the capacity to make leaps from one set of material qualities and constraints to another in order to set off a combinatorial explosion generated by the interplay between their previously discrete powers.
(90-91) Elegance can also be seen in the way in which a trajectory can be extended, developing the reach of an abstraction, or by finding connections with domains previously understood as being “outside” of computation or software.
(91) While elegance, then, demands that we step outside of software, keep combining it with other centers of gravity, computation also suggests a means by which we can think it through, prior to its formulation. . . . Stephen Wolfram's figure of the “computational universe” suggests that it is possible to map out every possible algorithm, and every possible result of every algorithm. . . . If an ironic computational universe is not the one we currently inhabit, it will inevitably occur as soon as computation snuggles up to its outside. The condensation of multiple meanings into one phrase or statement turns elegance from a set of criteria into a, necessarily skewed, way of life.

Elegance: Feenberg relates elegance and concretization; here Fuller ties it to understanding digital literacy.

(91) Elegance exists in the precision madness of axioms that cross categories, in software that observes terseness and clarity of design, and in the leaping cracks and shudders that zigzag scales and domains together.

Matti Tedre and Ron Eglash

(93) The first argument is that a better understanding of the cultural dimensions of computing can improve the design of computational devices and practices in disadvantaged groups and third world populations. The second argument is that an understanding of the cultural dimensions of computing can enrich the disciplinary self-understanding of computer science at large.

Theory: Conceptual Starting Points
(93-94) Unlike the natural sciences, where most theoretical and practical problems arise from the complexity of the physical world, in computer science the difficulties usually stem from computer scientists' earlier work—computer scientists have created the complexity of their own discipline.

Research Directions
(94) Computers are cultural artifacts in which a Western understanding of logic, inference, quantification, comparison, representation, measurement, and concepts of time and space are embedded at a variety of levels.

Applications in ICT Education

Applications in Innovation and Diffusion

Other Ethnocomputing Exemplars

Cellular Automata Model for Owari

Simputer and the $100 Laptop

IAAEC Alternative to the Desktop Metaphor Project

Culturally Embedded Computing Group

Native American Language Acquisition Toys


Derek Robinson

(102) The regressus of expressions composed of functions whose definitions are expressions composed of functions ultimately bottoms out in a small and irreducible set of atomic elements, which we may call the “axioms” or “ground truths” of the symbol system. In a computer these are the CPU's op-codes, its hardwired instruction set. In the system of arithmetic they would be the primitives “identity” and “successor,” from which the four basic arithmetic operations can be derived and back into which they can be reduced.
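Robinson's reduction of arithmetic to the primitives “identity” and “successor” can be sketched in ordinary code. This is an illustrative reconstruction, not anything from the text: it derives addition and multiplication from repeated succession, keeping the built-in increment confined to a single primitive.

```python
# Illustrative sketch: arithmetic bottoming out in the primitives
# "identity" and "successor", as Robinson describes.

def identity(n):
    return n

def successor(n):
    return n + 1  # the only place a built-in operation appears

def add(a, b):
    # addition as repeated succession: apply successor b times to a
    for _ in range(b):
        a = successor(a)
    return a

def multiply(a, b):
    # multiplication as repeated addition
    total = 0
    for _ in range(b):
        total = add(total, a)
    return total
```

The regressus Robinson describes is visible here: `multiply` is defined in terms of `add`, which is defined in terms of `successor`, below which there is nothing further to unpack.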

Functional Programming
(103) The first functional programming language was GEORGE, created in 1957 by Charles Hamblin for use on DEUCE, an early Australian computer.

Functions and Mappings
(104) A function proper is propaedeutic, telling how the thing should behave, giving the theory but not concerning itself with how it is to be implemented.
(104) A function can be regarded as a look-up table (often enough it may be implemented as one too) which is to say a mapping from a certain symbol, the look-up key, to a value associated with this key.

Functions and Logic
(105) A function is an abstract replica of causality.
(106) There's a one-wayness to functions, an asymmetry.
(107) The index, as it were, “reverses time.”

Embedding Functions

Olga Goriunova and Alexei Shulgin
(111) Let us look at machine aesthetics as formed by functionality and dysfunctionality, and then proceed to the concept of glitches as computing's aesthetic core, as marks of (dys)functions, (re)actions and (e)motions that are worked out in human-computer assemblages.
(112) Historically, the shape, style, and decoration of every new technology has been introduced in a manner owing much to the aesthetics and thinking customary of the time.
(113) There are moments in the history of computer technology that are rich in computer functionality producing distinct aesthetics. At such times, computer functionality reveals itself through technological limitations.

Glitch: communal decision to return to aesthetics of obsolete technologies like 8-bit music.

(114) Returning to a genuine computer aesthetics of obsolete technology is not a question of individual choice, but has the quality of a communal, social decision.

Glitch: dysfunctional event allowing insight into alien computer aesthetics reminiscent of Freudian method.

(114) A glitch is a singular dysfunctional event that allows insight beyond the customary, omnipresent, and alien computer aesthetics. A glitch is a mess that is a moment, a possibility to glance at software's inner structure, whether it is a mechanism of data compression or HTML code. Although a glitch does not reveal the true functionality of the computer, it shows the ghostly conventionality of the forms by which digital spaces are organized.

Lev Manovich

(122) Everybody who is involved in design and art today knows that contemporary designers use the same set of software tools to design everything. The crucial factor is not the tools themselves but the workflow process, enabled by “import” and “export” operations.
(123) Instead of “digital multimedia”—designs that simply combine elements from different media—we see what I call “metamedia”—the remixing of working methods and techniques of different media within a single project.
(123) The result is a hybrid, intricate, complex, and rich media language—or rather, numerous languages that share the basic logic of remixability.
(124) The consequences of this compatibility between software and file formats that was gradually achieved during the 1990s are hard to overestimate.

Ted Byfield

(126) The word itself dates in English to the late fourteenth century, and even so many centuries ago was used in ways that mirror current ambiguities.
(126-127) “Fisher information” has had ramifications across the physical sciences, but its most famous and most influential elaboration was in the applied context of electronic communications. . . . According to Hartley, information is something that can be transmitted but has no specific meaning.
(128) In The Mathematical Theory of Communication, he [Shannon] and Weaver explained that “information is a measure of one's freedom of choice when one selects a message” from a universe of possible solutions. . . . Put simply, the more freedom one has, the less one knows.
(128) However disparate these prescriptions and descriptions may be, both typically have one general and essential thing in common: mediation.

Andrew Goffey
(133) Both the Turing test and the Turing machine are indicative of how machine intelligence has historically been conceptualized as imitation.
(133-134) The rivalry and conflict characteristic of the libidinal underpinnings of the ways in which issues of machine intelligence have been posed tap into a far broader material and conceptual issue. . . . That machines can replace humans tells us nothing special about intelligence, particularly if this is as part of an economy that, in its entropic repetition of the eternally self-same, generally produces stupidity rather than intelligence.
(134) Is it possible to arrive at an understanding of intelligence without implicitly or explicitly referring to the human as our model? Is it possible, in other words, to think of the intelligence that traverses machines and our relations with them as really alien?

Intelligence: great discussion that hints at the 2x and 10x rules governing electronics, giving it the ability to behave deterministically; Deleuze and Guattari's natal; the Lacanian unconscious; Kittler.

(135) Critical common sense would find the idea of an alien, machinic intelligence not only rebarbative but contradictory. Because humans program machines, machines must in principle be under the control of humans. The tacit assumption here is that it is impossible to make something autonomous.
(135) The research of actor-network theorist Bruno Latour and philosopher of science Isabelle Stengers has alerted us to the ways in which the world gets divided by scientists, technologists, and their cultural critics into the unproblematically real and the socially or culturally constructed.
(136) That the computer scientist operates on symbols and codes or the chip designer on the properties of silicon, and so on is little different from the complex set of processes characteristic of disciplinary society. In each case the aim is to construct a cofunctioning ensemble of elements that acts autonomously, in a stable and predictable fashion. . . . This is because a computer, like pretty much anything else, is made up of a series of agents that through a process of interactive stabilization have been tamed enough to work together on their own.
(137) The concept of intelligence operative in AI is closely related to the intelligence of computing, as both rely on the formal possibilities of symbol systems (and such systems have the engineering advantage of being relatively easy to implement physically). It is perhaps not that surprising then that cognitive science subsequently found itself arguing, as a consequence of the success of the abstractions of symbol manipulation, that human intelligence itself was computation.
(139) Both artificial intelligence and artificial life research provide us with some interesting insights into the kind of intelligence that is operative within software, but neither are well equipped to help us understand the exteriority of a kind of intelligence that exceeds both software and its human users. Our contention is that such intelligence must be understood in terms of a logic of events: It is the process-flux of events of which software is a part that bears the intelligence, not the relatively closed systems that we program and over which we believe we have control.
(139-140) The concept of the natal proposed by Gilles Deleuze and Felix Guattari in A Thousand Plateaus provides a helpful way to work through this argument. . . . because material reality and symbol systems do not “add up,” there is an unformalized excess that undercuts our understanding of intelligence. This excess continues to undermine attempts to manage intelligence by means of coded, rationally deductible properties.
(140) Friedrich Kittler's amusing view of computers as operating like the Lacanian unconscious, expressed best in his statement that all coding operations are ultimately “signifiers of voltage differences,” casts light on why machine intelligence has been and needs to be seen as a libidinal problem. If Kittler's view is followed, programmable machines would be, as Turing imagined, like the child in the proverbial family triangle: in training them to do what we ask of them, they internalize the (formal) law on which the desire for recognition depends and give us the answers we deserve to the questions we ask. However, as Gilles Deleuze and Felix Guattari have shown, the artificial isolation of a “primal scene” (of programming, in this instance) makes it all too easy to forget the flux of events that gnaws away at the laws of formalism and that makes intelligence something in excess of the symbols that we might choose to represent it.

Michael Murtaugh
Input Tape

(144) Licklider links interaction to a crucial shift from computer as problem-solver to computer as problem-finder or problem-explorer in a space of necessarily unforeseen possibilities.
(145) Lippman describes five “corollaries” or properties he felt were necessary to add to this to attain true interactivity: interruptibility; graceful degradation; limited look-ahead (not pre-computed); no default pathway; and the “impression of an infinite database.”

Work Tape

(146) Interaction always involves simultaneity, as computation occurs iteratively through feedback to a shared and changing environment. Designing with interaction requires a sensitivity to the timing of the process involved.

Plasticity and Accretion

Interruption and Incompleteness
(147) Incompleteness is a necessary price to pay for modeling independent domains of discourse whose semantic properties are richer than the syntactic notation by which they are modeled.

Output Tape

Florian Cramer and Matthew Fuller

(149) Computer programs can be seen as tactical constraints of the total possible uses of hardware. . . . In other words, they interface to the universal machine by behaving as a specialized machine, breaking the former down to a subset of itself.
(150) The distinction between a “user interface,” an “Application Program Interface” (API), and a computer control language is purely arbitrary.
(150-151) In this sense the term interface emphasizes the representation or the re-articulation of a process occurring at another scalar layer, while the term language, in a computer context, emphasizes control. . . . It is this alienness, particularly at moments when one is attempting to understand software's workings or to program it, that engenders the delicious moments of feedback between the styles of perception and ordering, logic and calculation, between the user and the computer, and that makes them so seductive and compelling.
(152) The meshing of poetic and formal language in the area of writing known as “codeworks” explores the rich experimental and speculative potential of alphanumerical computer control languages.

Adrian Mackenzie

(153) Pathological software forms such as viruses, worms, trojan horses, or even bugs are one facet of otherness marked in software. Much of the architecture and design, as well as much everyday work, pivots on security measures meant to regulate the entry and presence of these others, and at the same time to permit software to translate smoothly between institutional, political, linguistic and economic contexts.

“Greetings,” “Inquiry,” “Farewell”: Technical Universality

Internationalization: the first of two entries so far that include source code, used to develop a criticism of the bias toward European languages built into Java locales and Unicode; the result is a merely technical universality, in the sense that the software runs anywhere.

(154-155) Java supports a standard set of locales that correlated with well-developed, affluent countries. . . . In the character series for European languages, the order of Unicode characters corresponds to alphabetical order. This is not guaranteed for all languages. Sorting strings in non-European languages requires different techniques.
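Mackenzie's point about code-point order can be demonstrated in any language with Unicode strings. In this illustrative Python sketch (not from the text, which uses Java), a naive sort by Unicode code point handles unaccented English correctly but misorders even a nearby European language: the German “Äpfel” lands after “Zebra” because Ä (U+00C4) has a higher code point than any ASCII letter.

```python
# Illustrative sketch: Unicode code-point order is not alphabetical
# order outside plain ASCII. Proper collation needs locale-aware
# techniques rather than a raw code-point sort.
words = ["Banane", "Äpfel", "Zebra"]

# sorted() compares code points: B (U+0042) < Z (U+005A) < Ä (U+00C4),
# so "Äpfel" sorts last, contrary to German alphabetical order.
naive = sorted(words)
assert naive == ["Banane", "Zebra", "Äpfel"]
```

Locale-sensitive collation (e.g., via a collation library or the Unicode Collation Algorithm) is what the quote means by “different techniques.”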

Software for “Human Beings”: Fictitious Universality
(155) Software is becoming social. Ubuntu, “Linux for Human Beings,” a project supported heavily by Mark Shuttleworth, a South African entrepreneur, is a Linux/GNU distribution in which internationalization of distribution itself figures centrally as part of the project.
(156) In this respect, no matter how distributed its production might become, and how many eyes and hands contribute to it, there is no Other figured in software because software itself now garners universality from that other universal, “human beings,” free individuals who are normalized in important ways.

Tropically Relevant Code and Ideal Universality
(157) It is difficult to articulate any viable alternative to technical universality (software that runs anywhere, as Java claims) or to fictitious universality (Ubuntu's software for human beings) because universality itself is a deeply ambiguous concept. To highlight this ambiguity, I want to point out some of the underpinnings of all software: reliance on practices of numbering, enumerating, and sorting.
(157-158) That is, numerals are elements in a writing system, but numbers are things that marshal, order, and define bodies in the most general sense. The translation from inscribed numeral to embodied number occurs through practices of enumeration that are lived, singular, and specific.
(158) In a less radical difference, programming languages could be analyzed in terms of their enumeration strategies and the ways they generate unities and pluralities. . . . The very same construction and manipulation that transform numerals (graphic forms) into numbers (things in relations of plurality), constitute bodies in structural relationships. Interpellation is one way of theorizing the ritual hailing that brings bodies of all kinds into forms of subjecthood in relation to number. This singularizing effect is deeply embedded in the graphical writing systems on which software so heavily draws.
(159) Enumeration has specificities that relate to rituals of interpellation embedded in language, gesture, and writing. This point has deep implications for what software does, and how “others” are designated and predicated in software. . . . Although enumeration practices are usually “naturalized” (that is, taken for granted), making particular enumerations work is political: it concerns how people belong together.

Problems of Actual Internationalization

Simon Yuill

Interrupt: signals are software interrupts, breaking through default program flow established by the language.

(161) In polling, the computer periodically checks to see if any external signals have arrived but the processor retains control over when they are handled. In interrupts, the signals are handled whenever they arrive, “interrupting” the processor in whatever it is doing, and giving some control over its activities to an external agent. While polling continues to be used on some simple processor devices, the interrupt enabled more sophisticated forms of interaction between a computer and the external world. . . . The interrupt is the main mechanism through which an operating system seeks to maintain a coherent environment for programs to run within.
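The contrast Yuill draws can be simulated schematically. In this illustrative sketch (my construction, not from the text), polling means the program checks a queue at moments it chooses, while an interrupt means an external agent invokes a handler immediately, via a table of handlers standing in for the interrupt vector.

```python
# Schematic contrast between polling and interrupts, simulated
# in ordinary Python (illustrative only).

events = []  # signals waiting to be noticed

# Polling: the processor retains control, checking when it chooses.
def poll_once():
    if events:
        return events.pop(0)
    return None

# Interrupts: the "interrupt vector" maps an interrupt number to a
# handler; raising an interrupt passes control to the handler at once.
interrupt_vector = {}

def raise_interrupt(irq, data):
    interrupt_vector[irq](data)

handled = []
interrupt_vector[1] = lambda data: handled.append(data)

events.append("key A")       # arrives, but waits until polled
raise_interrupt(1, "key B")  # handled immediately, out of program flow
```

The asymmetry in the quote is visible: `"key A"` sits inert until the program deigns to call `poll_once()`, whereas `"key B"` seizes control the moment it arrives.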

Interrupt: makes software social, social inscription of assemblages of social relations; Derrida gram; liminal and porous boundaries.

(162) It breaks the solipsism of the computer as a Turing Machine, enabling the outside world to “touch” and engage with an algorithm. The interrupt acknowledges that software is not sufficient unto itself, but must include actions outside of its coded instructions. In a very basic sense, it makes software “social,” making its performance dependent upon associations with “others”--processes and performances elsewhere.
(162-163) The interrupt vector, then, becomes a carrier through which different elements of a social assemblage are associated. . . . In this sense, we could say that software's “cognition” of the social is comparable to Derrida's. Indeed, the action of interruption, of the break, is fundamental to the notion of the “gram.” . . . The interrupt, therefore, is the mechanism through which the social, as a process of making and breaking associations with others, is inscribed into a piece of running software.
(163) Whereas we might describe the operational space of software in the context of a user at a desktop system as having a liminal boundary, these other, far more distributed, forms of software operate in a much more porous situation. . . . The interrupt can therefore be thought of, on an extended level, as the vector that not only constructs associations between actors, but also traverses operational spaces.
(164) The operational space of software extends over large physical areas in which algorithms become the arbiters of normative behavior and of inclusion and exclusion.
(165) The interrupt increases the contingency of the environment in which a piece of software runs.

Interrupt: humans are interrupts, therefore software criticism must be social; tie in keyboard and bell of Burks, Goldstine, von Neumann.

(165) If the interrupt teaches us anything about software, it is that software is in many cases only as effective as the people who use it, those nondeterministic machines with their complex, non-reproducible behaviors, those “others” on whom it relies – can it really control such beasts? . . . Software engineering is simultaneously social engineering. Software criticism, therefore, must also be simultaneously social.

Florian Cramer
(168) There are at least two layers of formal language in software: programming language in which the software is written, and the language implemented within the software as its symbolic controls. . . . “Natural” language is what can be processed as data by software; since this processing is formal, however, it is restricted to syntactical operations.
(168) There is nothing “natural” about spoken language; it is a cultural construct and thus just as “artificial” as any formal machine control language. To call programming languages “machine languages” doesn't solve the problem either, as it obscures that “machine languages” are human creations.
(168) computer control languages are a formal (and as such rather primitive) subset of common human languages.
(169) Any computer control language is thus a cultural compromise between the constraints of machine design – which is far from objective, but based on human choices, culture, and thinking style itself – and the equally subjective user preferences, involving fuzzy factors like readability, elegance, and usage efficiency.

Language: codework defined; computer code distinguished from performative speech and reversible coding; need to study code to be critically informed about computers.

(170) As with magical and speculative concepts of language, the word automatically performs the operation. Yet this is not to be confused with what linguistics calls a “performative” or “illocutionary” speech act, for example, the words of a judge who pronounces a verdict, a leader giving a command, or a legislator passing a law. The execution of computer control languages is purely formal; it is the manipulation of a machine, not a social performance based on human conventions such as accepting a verdict. Computer languages become performative only through the social impact of the processes they trigger, especially when their outputs aren't critically checked.
(172) The converse to utopian language designs occurs when computer control languages get appropriated and used informally in everyday culture. . . . These “code poems” or “codeworks” often play with the interference between human agency and programmed processes in computer networks.
(172-173) In computer programming and computer science, “code” is often understood either as a synonym of computer programming language or as a text written in such a language. . . . The only programming language that is a code in the original sense is assembly language, the human-readable mnemonic one-to-one representation of processor instructions.
(173) But since writing in a computer control language is what materially makes up software, critical thinking about computers is not possible without an informed understanding of the structural formalism of its control languages.

Alison Adam

(175) List-making is often seen as a fundamental activity of modern society.
(176) The list provides an elegant data structure for the processing of symbols, rather than numbers, which is vital for the science of artificial intelligence.

Lists: illustrative LISP code presented as example of ancient programming language of modern world to compare to cuneiform.

(177) Compare the cuneiform tablets of old, and an “ancient” programming language of the modern world. LISP offers a promise of the power of both the old lists, the nineteenth-century scientific lists, and something beyond.

Wilfried Hou Je Bek

(181) In the Coleridgian view iteration would be “the imprisonment of the thing” and recursion the “self-affected sphere of agency.”
(182) Recursion is surrounded in the programmer's mind with a nimbus of warm light captured in an oft-quoted bit of programmers' wisdom, variously attributed to L. Peter Deutsch and Robert Heller: “To iterate is human, to recurse, divine.”
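The distinction behind the Deutsch/Heller aphorism is easy to make concrete. This illustrative sketch (my own, not from the text) computes the same factorial both ways: iteration as a loop that imprisons the computation in explicit steps, recursion as a function defined in terms of itself.

```python
# "To iterate is human, to recurse, divine": the same computation
# expressed both ways (illustrative sketch).

def factorial_iterative(n):
    # iteration: an explicit loop accumulating a result
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

def factorial_recursive(n):
    # recursion: the function is defined in terms of itself,
    # Coleridge's "self-affected sphere of agency"
    return 1 if n <= 1 else n * factorial_recursive(n - 1)
```

Both return the same values; what differs is where the bookkeeping lives, in the programmer's loop variable or in the call stack.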

Warren Sack

(184) Presupposed by this methodology of rhetorical analysis is the idea that the words employed in the design and evaluation of new technologies shape the form and function of those technologies.

Memory: the trope from Plato to Derrida (Kittler) is a good example of how theories from other disciplines frame understanding of computers; now the computer is the preferred model (Hayles), which is itself based on bureaucratic paper forms; Bartleby-the-Scrivener.

(184-185) Aristotle's trope does not begin or end with him. Plato wrote of the analogy before Aristotle; and, Cicero, Quintilian, Sigmund Freud, and Jacques Derrida explored the trope of memory-as-wax-tablet after him. Each new generation of memory theorists tends to incorporate the latest media technology to explore its similarities with human memory.
(185) This belief, that the computer is the best model of the object of study, is not unique to cognitive science. . . . The first set of models devised by cognitive psychologists to explain the structure and dynamics of human memory recapitulated many architectural aspects of then-contemporary computational hardware.
(186) When the machines we now call computers were first designed, they were designed to do the work of a human computer. . . . So from Aristotle's seals we have moved to a newer technology of bureaucracy, namely numbered paper forms.
(188-189) The human that serves as the model for these cyborg sciences is culturally coded in a very specific moment. . . . Here then is the true picture of the “human” that is the model for computer memory: he is a bureaucrat squirreling around in the back office, shuffling through stacks of gridded paper, reading, writing, and erasing numbers in little boxes. This Bartleby-the-Scrivener is the man so many cyborg scientists would like to portray or recreate as an assemblage of computational machinery.
(189) The technical literature is completely preoccupied with the management and allocation of memory.
(190) The metaphors of the desk, the trash can, and the mind-numbing operations of office work and bureaucracy are built right into the foundations of the computer and its user interface.
(190) When these genealogies of software are forgotten, one loses sight of the highly particular and ultimately idiosyncratic images of memory and reasoning that are reified in the design and design principles of software.
(190) But, computer science's working theories of memory are very specific and idiosyncratic to the concerns of bureaucracy, business, and the military. This is largely because funding for computer science has come from these sources.

Memory: acknowledge idiosyncratic assumptions about memory and reasoning reified in computer technologies likely due to their primary business and military motivators: can alternative designs be theorized and enacted, such as Proust example suggesting value of considering theories from other disciplines, leads to sound studies.

(190) Juxtaposition with very different images of memory helps one to imagine alternatives to the “closed world” conditions that contemporary computational models circumscribe. For example, Marcel Proust's image of memory does not provide a better model of memory than the computer model, but it does provide a different model: a contrasting image that can be seen to highlight issues, ideas, and materialities uncommon to the military-(post)industrial technologies of memory: [quoting from Remembrance of Things Past] And suddenly the memory revealed itself. The taste was that of the little piece of madeleine which on Sunday mornings at Combray . . . bear unflinchingly, in the tiny and almost impalpable drop of their essence, the vast structure of recollection.

Obfuscated Code
Nick Montfort

Obfuscated Code: this entry contains some code snippets, including the Commodore BASIC that is the topic of a recent book of the same name in the Software Studies series.

(193) In the practice of obfuscated programming, the most pleasing programs are held to be those that are concise but which are also dense and indecipherable, programs that run in some sort of surprising way.
(193) The following program, for instance, when run on a Commodore 64, displays random mazes:
10 PRINT CHR$(109+RND(1)*2); : GOTO 10
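A rough modern paraphrase of this one-liner can be written in Python. This is an illustrative translation, not from the text: ASCII slashes stand in for the PETSCII diagonal-line characters, and the unterminated GOTO loop is made finite.

```python
import random

# Illustrative paraphrase of the Commodore 64 maze one-liner:
# each cell is randomly one of two diagonal strokes, which visually
# join into maze-like passages when printed in a grid.
def maze(width=40, height=10, seed=None):
    rng = random.Random(seed)  # seedable for reproducibility
    rows = []
    for _ in range(height):
        rows.append("".join(rng.choice("/\\") for _ in range(width)))
    return "\n".join(rows)

print(maze(seed=1))
```

The original's surprise, that so little code produces so much apparent structure, survives the translation: the whole "maze" is one random binary choice repeated.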
(194) A program may be clear enough to a human reader but may have a bug in it that causes it not to run, or a program may work perfectly well but be hard to understand. Writers of obfuscated code strive to achieve the latter, crafting programs so that the gap between human meaning and program semantics gives aesthetic pleasure.

Confusion between data, code, and comment is also discussed by Tanaka-Ishii.

(195) This data/code/comment confusion is invited by flaws or curiosities in a language's specification, but can be accomplished in several different languages, including C and Perl.
(195) By showing how much room there is to program in perplexing ways—and yet accomplishing astounding results at the same time—obfuscated C programs comment on particular aspects of that language, especially its flexible and dangerous facilities for pointer arithmetic.
(195) Perl's powerful pattern-matching abilities also allow for cryptic and deft string manipulations.
(197) A repository of JAPHs [programs producing the output “Just Another Perl Hacker”] is available online and explications of several have been provided. An explication of an introductory obfuscated C program is also available.

Obfuscated Code: most Perl poetry only needs to be valid, not interesting, representing an asymmetrical form of multiple coding.

(197) This is a type of double coding; more generally, multiple coding can be seen in “bilingual” programs, which are valid computer programs in two different programming languages. . . . Perl poetry is a prominent modern-day form of double-coding, distinguished from obfuscated programming as a practice mainly because it is not as important in Perl poetry that the program function in an interesting way; the essential requirement is that the poem be valid Perl.

Obfuscated Code: the primary insight is that source code is a text interpreted by human readers, with a nod to Jarry's 'Pataphysics; footnote to Maurice Black's PhD dissertation The Art of Code.

(198) Another heritage is the tradition of overcomplicated machinery that has manifested itself in art in several ways. Alfred Jarry's 'Pataphysics, “the science of imaginary solutions,” which involves the design of complicated physical machinery and also the obfuscation of information and standards, is one predecessor for obfuscated programming.
(198) While obfuscation shows that clarity in programming is not the only possible virtue, it also shows, quite strikingly, that programs both cause computers to function and are texts interpreted by human readers. In this way it throws light on the nature of all source code, which is human-read and machine-interpreted, and can remind critics to look for different dimensions of meaning and multiple codings in all sorts of programs.

Object Orientation
Cecile Crutzen and Erna Kotkamp
Data and Data Processing

Ready-Made Action
(200-201) However, this exclusion of the ambiguity of human acting did not prevent computer scientists from interfering in human activity. On the contrary, the modeling and construction of many complex interaction patterns between humans is still based on the same transmission model used for the representation of data-exchange between artificial senders and receivers.

Object Orientation: challenges the deemphasis on individuality and specificity of human interaction in favor of the generalized, mathematical data-exchange model characteristic of procedural programming, but nonetheless exhibits its own imperialism, biases, and limitations; compare to Hayles's analysis of cybernetics.

(201) This focus on generalizing information, communication, and interaction in computer science pushes the multiform character of individuality and the specificity of human interaction into the background. The exploration of the object-oriented approach is a significant example of this.

Object-Oriented Programming
(201) Within procedural programming software behavior is defined by procedures, functions, and subroutines. In OOP these behaviors are contained in the methods of the objects.
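The distinction can be shown in a few lines. In this illustrative sketch (my construction, not from the entry), the same behavior appears first as a free-standing procedure operating on data, then as a method contained in the object that owns the data.

```python
# Schematic contrast (illustrative): procedure vs. method.

# Procedural style: behavior lives in a function; data is passed in.
def withdraw(balance, amount):
    return balance - amount

# Object-oriented style: the behavior is a method of the object,
# and the data (state) belongs to the object itself.
class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        self.balance -= amount
```

The shift Crutzen and Kotkamp describe is visible in miniature: in the OO version, the behavior is no longer separable from a modeled “thing” whose properties and permitted actions have been fixed in advance.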

Object-Oriented Approach
(201) OO is used for the representation of the dynamics of interaction worlds, leading us beyond the data-oriented approach and making room for the opportunity to discuss the character of human behavior.
(202) Within the ontology of OO the behavior of humans can only be represented as frozen routine acting. With abstraction tools in OO such as classification, separation, and inheritance, the process of real world analysis is colonized.

Colonization of Analysis
(202) This use of OO as a methodology in informatics is exemplary for the ontological and epistemological assumptions in the discipline: Not only is it possible to “handle the facts” but also to handle and therefore control real behavior itself.

(203) In software and hardware products constructed through the OO approach, the fear of doubt is embedded and transferred into the interaction worlds of which they are part. . . . Abstractions are simplified descriptions with a limited number of accepted properties. They rely on the suppression of a lot of other aspects of the world.

(203-204) Differences, which are not easy to handle, or may not be relevant in the view of the observer and modeler, will be neglected and suppressed.

(204) The models produced by computer scientists using the OO approach as methodology for interpreting and analyzing human behavior leave no room for negotiation or doubt. Models translated into ready-made products, interaction, and communication are only defined on a technical and syntactical level. But the same models are also used on a semantic and pragmatic level to construct the planned and closed interactions of humans.


Object Orientation: difficult to enact Heideggerian change inspired by doubt while under sway of conditions built into OO products; orthogonal to typical concerns of free software advocates.

(205) Doubt leading to exploration and change is, according to Heidegger, the essence of technology; it is not simply a means to an end, it is a way of revealing the world we live in.
(205) However, is this change of meaning still possible? It requires the blowing up of the pre-established conditions for change embedded in OO-products.

Perl
Geoff Cox and Adrian Ward

Perl: invented by Larry Wall as the first postmodern programming language, combining the best of low- and high-level languages.

(207) Perl (an acronym for “Practical Extraction and Report Language”) is a programming language, first developed for Unix by Larry Wall in 1987 as an open source project to build on the capabilities of the “awk” utility. . . . Perl therefore lies somewhere between low-level programming languages and high-level programming languages. It combines the best of both worlds in being relatively fast and unconstrained.

Perl: plenty of code examples in this entry.

(207-208) Perl programs are generally stored as text source files, which are compiled into virtual machine code at run-time. . . . The politics of Blake's poem describing the social conditions of London are translated to a contemporary cultural and technical reality in which people are reduced to data.

Hayles uses many Perl examples in Writing Machines.

(209) Programming with Perl emphasizes material conditions, which evokes how N. Katherine Hayles, in Writing Machines, stresses materiality in relation to writing.
(209) [She] adds the materiality of the text itself to the analysis in a similar way to those who consider code to be material. In this way, it is the materiality of writing itself that is expressed through the relationship between natural language and code—one, code, tending toward control and precision; the other, language, tending toward free form and expression.

Perl: intentional engagement with modernism and postmodernism by Wall in programming language design, intended to allow more degrees of freedom; relevant to critical code studies and critical programming.

(210) In the lecture, “Perl, the first postmodern computer language,” Larry Wall is keen to point out that modernist culture was based on “or” rather than “and,” something he says that postmodern culture reverses.
(210-211) In claiming “AND has higher precedence than OR does,” Wall is focusing on the eclecticism of Perl and how algorithms can be expressed in multiple ways that express the style of the programmer. . . . The suggestion is that Perl is not only useful on a practical level but that it also holds the potential to reveal some of the contradictions and antagonisms associated with the production of software.

Pixel
Graham Harwood

(214) While there are many threads in the story of the quantification of vision resulting in the pixel, I have chosen to draw a line from perspective as a technical progenitor.
(214) Pixels first appeared at Princeton's Institute for Advanced Study in New Jersey in 1954. At that time the word “pixel” simply described the glowing filaments of the machine's vacuum memory registers.
(215) The human eye is sensitive to a very narrow band of frequencies, namely the frequencies between 429 terahertz (THz) and 750 THz. This is the same sensitivity range as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) chip found in our digital cameras.
(216) In this way, the pixel on the screen models the component light values held within the cells of the eye.
(216) As with their ancestor, perspective, today's binary seeing machines have managed to convince us that now we really can possess an infallible method of representation: a system for the automatic and mechanical production of truths about the material world. . . . Aided by the political and economic ascendance of Western systems of objectification and piggy-backing on photography's history, artificial seeing has conquered the world of representation.
(217) This is now the natural mode of representation in most rich countries and through it we enjoy our neutral appropriation of the pixel's reality.

Pixel: normalization of representation by pixel technologies, enjoyed by rich countries, hides the environmental damage done by the industries that produce them; a critical hardware study more so than a software one.

(217) Seen from some future water-table polluting slag-heap of heavy metals made from last year's cast-off monitors, printers, and scanners, the pixel will glint and wink at us, the guiding light in the reordering of our individual and collective sight, reduced to the soft/hardware systems that are used to record, judge, display, and manipulate the ambient variables of light.

Preferences / settings /options / control panels
Soren Pold

(219) In general, the preferences regulate three spheres around the software interface: functionality, power relations, and aesthetics.
(219) But there is more to aesthetics than surface. The preferences set up and negotiate an equivalent to the contract that a theater audience or a reader adhere to when entering a fictional representation: a mental, cultural contract negotiating one's expectations and how one is supposed to act and react in the representational space. The relations between the software's senders and receiver(s) or user(s) are defined, most often within very strict limits.

Preferences: Hayles too argues that preferences and other user residue not merely mirror but co-constitute human brains; playing around with the preferences palette is how everyday users transcend unreflexive consumption, by engaging with representational machinery outside their brains.

(220) Software increasingly constructs dynamic models of its user and customizes itself accordingly. . . . Still it highlights how “my” preferences on my personal computer become some sort of automated autobiography within the medium of software; my personal computer becomes a cybernetic mirror of me.
(220 footnote 8) To sum up, a computer, with the passing of time, ends up looking like its owner's brain.
(221) One could argue that Word promotes an office perspective on writing, a typographical writing that has not taken the various digital developments of writing fully into account.
(222) Still, the trend toward letting the user control the superficial aesthetics can be seen as a symptom of users wanting to become more than plain users.
(222) The preferences palette is where the common, everyday user—with no access to or knowledge of code—can make his mark and play around with the representational machinery of the software.

Programmability
Wendy Hui Kyong Chun

Programmability: disciplining of hardware through discretization also disciplines thinker as programmer, who is rewarded with causal pleasure.

(225) The programmability and accuracy of digital computers stems from the discretization (or disciplining) of hardware.
(227) Just as Schrodinger links programmability to an all-penetrating mind, programmability is linked to the feelings of mastery attributed to programming, its causal pleasure.

Sonic Algorithm
Steve Goodman

(229) For this reason, an analysis of the abstract culture of music requires the contextualization of digital forms within the contagious sonic field of memetic algorithms as they animate musicians, dancers and listeners.
(230) If, as Gottfried Leibniz proposed, all music is “unconscious counting,” then clearly, despite its recent popularity, algorithmic music composition cannot be considered the exclusive domain of computing.
(232) The focus of such generative music revolves around the emergent behavior of sonic lifeforms from their local neighborhood interactions, where no global tendencies are preprogrammed into the system.

Sonic Algorithm: artificial life techniques used in music software aim at creative contingency in precoded work.

(233) In summary, then, the development of artificial life techniques within music software culture aims to open the precoded possibilities of most applications to creative contingency.

Source Code
Joasia Krysa and Grzesiek Sedek

Source Code: starts with an incomplete C program because that is all that will fit on one page; a thoughtful program to consider as a way of creating space for virtual ingredients in the same proportion as cookbook recipe ingredients and measures, given Knuth comparing programming to recipes and cookbooks. In the take function, for example, ingredient = malloc((quantity+1) * sizeof(*ingredient)) assumes all pointers are the same address width; it matters whether that width is five, eight, or sixty-four. Fitting to open the lexicon at random to this page.

It is also codework, pseudocode not well formed enough to compile and execute, since its base operation returns a character and not a pointer to pointer to character.

(236)

/* Barszcz C recipe
 * string based cooking
 * Copyleft (C) 2006 Dennis “Jaromil” Rojo
 */

#include <stdio.h>

#define ingredient char

ingredient **take(int quantity, ingredient **ingr) {
    int c;
    int len = strlen(ingr) + 10;
    ingredient = malloc((quantity + 1) * sizeof(*ingredient));
    for (c = 0; c < quantity; c++)
        ingredient[c] = malloc(len * sizeof(ingredient));
    ingredient[c+1] = NULL;
    return ingredient;

(236-237) In The Art of Computer Programming Donald Knuth equates programming and recipes in a cookbook as a set of instructions to follow. . . . The importance of source code for the description of software is that, alongside computer commands, it also usually provides programmers' comments—that is, a documentation of the program including a detailed description of its functionality and user instructions. Furthermore, the importance of source code is that any modifications (improvements, optimizations, customizing, or fixes) are not carried out on compiled binary code (object code or machine code) but on the source code itself. The significance of this is that the source code is where change and influence can be exerted by the programmer. . . . Although recipes are clearly not reducible to code—and vice versa—the analogy emphasizes that both programming and cooking can express intentionality and style.

Source Code: well-stated definition of source code and von Neumann architecture machinery invoking Knuth, forming a defining statement of the post-postmodern cybersage; Knuth should be on reading lists for philosophers of programming, Ceruzzi among historians and philosophers of computing.

Source Code: basic description of the earliest electronic computing machines perfectly instantiates the possibility of artificial intelligence in cybernetic self constituting operation; let this be the definition of artificial intelligence rather than human discursive texts.

(237-238) Source code (usually referred to as simply “source” or “code”) is the uncompiled, non-executable code of a computer program written in higher level programming languages. . . . In the history of computation, programs were first written and circulated on paper before being compiled in the same way as recipes were written and shared before being compiled in cookbooks. . . . The source code of a modern digital computer derives from the further adaptation (in the 1940s) of Babbage's ideas. What came to be known as the “von Neumann architecture” is important as it presented a single structure to hold both the set of instructions on how to perform the computation and the data required or generated by the computation; it demonstrated the stored-program principle that has led to development of programming as separate from hardware design. Remington Rand's UNIVAC (Universal Automatic Computer, 1951) was one of the first machines to combine electronic computation with a stored program and capable of operating on its own instructions as data. . . . In A History of Modern Computing, Paul E. Ceruzzi explains this development, from building up libraries of subroutines, then getting the computer to call them up and link them together to solve a specific problem, to a more general notion of a high-level computer language with the computer generating fresh machine code from the programmer's specifications.
(238) The principle of re-using or sharing code relies on storing collections of codelines, or functions, in “libraries.”

Source Code: gives examples of foss repositories and invokes Stallman.

(239) There are other examples that extend the online repository model to the cultural realm. . . . Another example is, which presents a themed and contextualized (reviewed) systematic selection of links to innovative free software.
(239) In Free Software, Free Society, Richard Stallman suggests that the sharing of software is as old as computing, just as the sharing of recipes is as old as cooking. However, the reverse of this analogy holds too.

Source Code: aesthetic properties of source code; the preference for brief examples yields an attraction to code poetry, quines, minimal code, and of course obfuscated code, which perhaps redefines beauty as extreme style.

(239-240) The idea of source code, and indeed the open source model, extends beyond programming and software. For instance, Knuth points to creative aspects of programming alongside technical, scientific, or economic aspects, and says that writing a program “can be an aesthetic experience much like composing poetry or music.” Source code can be considered to have aesthetic properties; it can be displayed and viewed. It can be seen not only as a recipe for an artwork that is on public display but as the artwork itself—as an expressive artistic form that can be curated and exhibited or otherwise circulated. . . . The software art repository lists obfuscated code under the category of “code art” alongside code poetry, programming languages, quines, and minimal code.

Source Code: ends with calls to the cook function displayed at the beginning; important that the authors' example uses a personal free, open source project as part of a human-oriented (philosophical) text, alluding not so much to critical code as to critical programming study.

(240) An online repository and a platform for presenting and sharing barszcz soup recipes in the form of source code written in a number of programming languages, the [] project brings together cooking recipes and source code in a literal sense.

System Event Sounds
Morten Breinbjerg

(243) Every day we expose ourselves to these sounds; they form the soundscape of our computers.

The Semiotic Use of Sound
(245) The system event sounds of Windows XP are mostly symbolic, although some can be characterized as iconic. The icon is a type of sign that resembles the object signified, while the symbol is a sign that represents its object purely by convention.
(245) Nevertheless, some of them are value-laden because they attribute an emotional state to the action being performed; this lies beyond simple feedback information and beyond the semantics of the action.

The Aesthetic Use of Sound

System Event Sounds: now part of broader culture beyond corporate control; no mention of cell phone system sounds, but an obvious extension.

(246) The ascending logon melody is perceived as the positive energized action and the descending logoff as the negative one.
(246) In fact these media have stereotyped these ways of hearing and comprehending. As such, the immediate understanding of the “critical battery” sound as a warning and the experience of the logon sound as a positive action is due to both innate experiences of music and cultural ways of listening.
(248) In an interview Brian Eno explained that Microsoft presented him with a list of adjectives (inspiring, optimistic, futuristic, sentimental, emotional, etc.) that they wanted the sound to reflect. He composed eighty-four different pieces of music, from which they chose one [the Windows 3.1 “Microsoft Sound”].
(248) Furthermore, Brian Eno not only makes music but also publishes theoretical work and as such belongs to the intelligentsia of rock and electronic music. With his background Microsoft not only hired a competent musician, they hired a cultural icon.
(248) By means of the use of sound, the computer is given a voice and thereby the ability to contact and communicate with its user and the world around it.
(249) System event sounds as aesthetic objects have become a part of broader culture outside the control of Microsoft. . . . As aesthetic objects, system event sounds have themselves equally become part of a culture (and of a new billion dollar industry) of sharing, buying, managing, recording, and downloading.

Text Virus
Marco Deseriis

(251) In the impossibility of ascertaining their origins, such hoaxes appear as epiphenomena of a machinic system characterized by a high level of commixture of natural language and computer code.
(252) By archiving, labeling, and rating viruses and hoaxes, antivirus firms set a tradition and enact the same preservative function of the clergy. My argument here is that this categorization freezes the ever-sliding nature of (machinic) writing, and prevents us from discovering the power of this ambivalence.
(252) In order to articulate this thesis, I have to step back to the Phaedrus. . . . What disturbs Socrates most (according to Plato and “retraced” by Derrida) is the fact that writing is a supplement that, circulating randomly without its father, cannot be interrogated, and thus diverts us from the search for truth.

Text Virus: writing as pharmakon; machinic writing stifled by virus categorization.

(253) For this reason writing is a pharmakon—a Greek term that stands both for medicine and poison—an errant simulacrum of a living discourse that comes from afar and whose effects are unknown to those who take it.
(254) for the Greeks [quoting David Abram] “a direct association is established between the pictorial sign and the vocal gesture, for the first time completely bypassing the thing pictured.”
(254-255) Thus, in a machinic environment the hoax constantly redoubles the acts of magic through which programmers translated one language into another after they lost their respective parents (the external world for the alphabet, the machine for code). Both orphans, the two systems can now exchange their functions and look for a different destiny. But to express its virtuality, machinic writing constantly struggles with the gatekeepers that try to disambiguate it and reinscribe it in a proper and productive system of signification.

Timeline (sonic)
Steve Goodman

(257) Bergson criticized the cinematographic error of Western scientific thought, which he describes as cutting continuous time into a series of discrete frames, separated from the temporal elaboration of movement, which is added afterward (via the action, in film, of the projector) through the perceptual effect of the persistence of vision. Yet sonic time plays an understated role in Bergson's (imagistic) philosophy of time, being often taken as emblematic of his concept of duration as opposed to the cinematographic illusion of consciousness.
(258) Aside from its improvement of the practicalities of editing and the manipulation of possibility, the digital encoding of sonic time has opened an additional sonic potential in terms of textual invention, a surplus value over analog processing.

Timeline: textual effects of timestretching.

(258-259) The technique referred to as time-stretching cuts the continuity between the duration of a sonic event and its frequency. In granular synthesis, discrete digital particles of time are modulated and sonic matter is synthesized at the molecular level. In analog processing, to lower the pitch of a sound event adds to the length of the event. . . . Timestretching, however, facilitates the manipulation of the length of a sonic event while maintaining its pitch, and vice versa. Timestretching, a digital manipulation process common to electronic music production, is used particularly in the transposing of project elements between one tempo (or timeline) and another, fine-tuning instruments, but also as a textual effect producing temporal perturbations in anomalous durations and cerated consistencies.

Variable
Derek Robinson

(260) The root of the difference is that a programmer's variables are implemented on a computer, which means they must concretely exist in a computer's memory, in accordance with whose concreteness they must be named, ordered, addressed, listed, linked, counted, serialized, unserialized, encoded, decoded, reveled, and unraveled; how this happens bears little resemblance to algebraic symbols scratched on a chalkboard.
(261) At bottom this is what any variable is: a name standing for a number that is interpreted as an address that indexes a memory location where a program is directed to read or write a sequence of bits. Electronic sensors attached to a computer are de facto variables registering external events in a set-aside range of addresses that act as portholes to view sampled digital representations of the changing voltages provided by the sensor.
(262) The single most critical constraint on a variable's use is that it, and its every instance, must be uniquely determined in the context or “namespace” of its application, if it is to serve naming's ambition of unambiguous indication.

Variable: programmers do practical ontology, as in account of fetch and execute variable and requirement of unambiguous indication in namespaces; see Smith On the Origin of Objects.

(262 footnote 7) Brian Cantwell Smith's On the Origin of Objects plumbs software's ontology very deeply and very densely (however it's only recommended for people not put off by infinite towers of procedural self-reflection).
(262) Pronouncing upon the thingness of things has historically been considered the special preserve of philosophers, but programmers, being the practical engineering types that they are, simply had to get on with the job.

Weird Languages
Michael Mateas

(267) Weird programming languages are not designed for any real-world application or normal educational use; rather, they are intended to test the boundaries of programming language design itself. A quality they share with obfuscated code is that they often ironically comment on features of existing, traditional languages.
(267) INTERCAL is the canonical example of a language that parodies other programming languages.
(269) Minimalist languages strive to achieve universality while providing the smallest number of language constructs possible.
(270) Some weird languages encourage double-coding by structuring the play within the language such that valid programs can also be read as a literary artifact.
(272) Thus, in some sense, Chef structures play to establish a triple-coding: the executable machine meaning of the code, the human meaning of the code as a literary artifact, and the executable human meaning of the code as steps that can be carried out to produce food.
(273) By commenting on the nature of programming itself, weird languages point the way toward a refined understanding of the nature of everyday coding practice.

Weird Languages: all coding involves double-coding; study of weird languages seems necessary component of critical programming as well as link to traditional humanities.

(274) All coding inevitably involves double-coding. “Good” code simultaneously specifies a mechanical process and talks about this mechanical process to a human reader. Finally, the puzzle-like nature of coding manifests not only because of the problem solving necessary to specify processes, but because code must additionally, and simultaneously, make appropriate use of language styles and idioms, and structure the space of computation. Weird languages thus tease apart phenomena present in all coding activity, phenomena that must be accounted for by any theory of code.

Fuller, Matthew. Software Studies: A Lexicon. Cambridge, MA: MIT Press, 2008. Print.