Notes for Friedrich Kittler “There Is No Software”
Key concepts:
Related theorists:
Mistakenly saved at first as "these is no software." The basic question: if there is no software, why keep producing working code?
The problem of the era beyond literacy is that writing is hidden in computer memory cells that are able to read and write autonomously.
(147) The bulk of written texts—including the paper I am actually reading to you—no longer exist in perceivable time and space, but in a computer memory's transistor cells. . . . our writing scene may well be defined by a self-similarity of letters over some six orders of decimal magnitude. . . . It also seems to hide the very act of writing.
Evocative image of the manually blueprinted microprocessor circuit as the last historical act of writing, which now becomes real virtuality through the geometrical or autorouting powers of the actual generation.
(147-148) The crazy kind of software engineering that was writing suffered from an incurable confusion between use and mention. . . . manmade writing passes instead through microscopically written inscriptions, which, in contrast to all historical writing tools, are able to read and write by themselves. The last historical act of writing may well have been the moment when, in the early seventies, the Intel engineers laid out some dozen square meters of blueprint paper in order to design the hardware architecture of their first integrated microprocessor. The manual layout of two thousand transistors and their interconnections was then miniaturized to the size of an actual chip and, by electro-optical machines, written into silicon layers. . . . Actually, the hardware complexity of microprocessors simply discards such manual design techniques; in order to lay out the next computer generation, the engineers, instead of filling countless meters of blueprint paper, have recourse to Computer Aided Design, that is, to the geometrical or autorouting powers of the actual generation.
Explosion of software, postmodern Tower of Babel of extensions of programming languages from machine to high-level.
We do not know what our writing does, especially now that it mixes into autonomous machine behavior: how can we know what our writing does to us if we cannot follow it, running incredibly fast and small in circuits instead of on paper?
(148) This claim in itself has had the effect of duplicating the implosion of hardware by an explosion of software. Programming languages have eroded the monopoly of ordinary languages and grown into a new hierarchy of their own. This postmodern Tower of Babel reaches from simple operation codes whose linguistic extension is still a hardware configuration, passing through an assembler whose extension is the very opcode, up to high-level programming languages whose extension is that very assembler. In consequence, far-reaching chains of self-similarities in the sense defined by fractal theory organize the software as well as the hardware of every writing. What remains a problem is only recognizing these layers which, like modern media technologies in general, have been explicitly contrived to evade perception. We simply do not know what our writing does.
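The layered "Tower of Babel" Kittler describes, in which each language's extension is the layer beneath it, can be glimpsed with the standard-library `dis` module: a single high-level statement decomposes into opcode-level mnemonics one layer down. (A minimal sketch, not Kittler's own example; exact opcode names vary across Python versions.)

```python
import dis

# A high-level statement...
def add(a, b):
    return a + b

# ...decomposes into opcode mnemonics one layer down. Each mnemonic
# (LOAD_FAST, RETURN_VALUE, ...) stands in for a lower-level, ultimately
# hardware-dependent operation -- the chain of self-similar layers
# the passage describes.
for instr in dis.get_instructions(add):
    print(instr.opname)
```

Running this makes the layering visible without ever reaching the silicon: even the opcode listing is itself a symbolic representation, one more layer contrived to evade perception of the hardware beneath.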
Commercial files were required for a first look at programming; FOSS has since changed this.
Nonvocalized acronyms fall outside the phonetic reading underlying thought/subjectivity.
Machine cognitive-embodied processes are self-constituted, much as subjectivity arises from habituation to phonetic reading.
(148-149) To wordprocess a text, that is, to become oneself a paper machine working on an IBM AT under Microsoft DOS, one must first of all buy some commercial files. . . . On the one hand, they bear grandiloquent names like WordPerfect, on the other hand, more or less cryptic, because nonvocalized, acronyms like WP. . . . Executable computer files encompass, by contrast not only to WordPerfect but also to big but empty Old European words such as the Mind or the Word, all the routines and data necessary to their self-constitution.
Software manuals cross into the realm of literature; compare to Ryan's lackluster narrative of the software agent.
(149) Written to bridge the gap between formal and everyday languages, electronics and literature, the usual software manuals introduce the program in question as a linguistic agent ruling with near omnipotence over the computer system's resources, address spaces, and other hardware parameters.
(150) Not only no program, but also no underlying microprocessor system could ever start without the rather incredible autobooting faculty of some elementary functions that, for safety's sake, are burnt into silicon and thus form part of the hardware. Any transformation of matter from entropy to information, from a million sleeping transistors into differences between electronic potentials, necessarily presupposes a material event called reset.
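The bootstrap chain implied here, in which nothing runs until a hardware reset hands control to routines fixed in silicon, can be modeled in a toy simulation (all names and stages below are hypothetical illustration, not any real firmware):

```python
# Toy model of the autobooting chain Kittler describes: execution begins
# only with a material reset event that starts routines fixed in "ROM";
# each later stage exists only because an earlier stage loaded it.

ROM = ["load_stage2"]                       # burnt in; never itself loaded
DISK = {"stage2": ["load_os"], "os": ["run_programs"]}

def reset():
    """The material event: start executing from ROM and nowhere else."""
    trace = []
    queue = list(ROM)                       # control enters only via hardware
    while queue:
        instr = queue.pop(0)
        trace.append(instr)
        if instr.startswith("load_"):       # a stage pulls in the next stage
            queue.extend(DISK.get(instr[5:], []))
    return trace

print(reset())
```

The point of the sketch is structural: remove the `ROM` line (the part "burnt into silicon") and nothing in `DISK` can ever execute, however complete the software on it.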
Everything is in hardware, but skeuomorphs of human everyday languages and early machine languages retain the notion of software because of the human mental component, although Clark's extended cognition seems to eliminate even that immaterialism. Strong criticism of which philosophers of computing? What about Floridi, Maner?
Gradualism ontology: GUI and protected mode obfuscate completely, as Turkle begins to theorize, although the epistemological transparency of FOSS and open standards has somewhat reversed this.
(151-152) When meanings come down to sentences, sentences to words, and words to letters, there is no software at all. Rather, there would be no software if computer systems were not surrounded by an environment of everyday languages. . . . On the contrary, the so-called philosophy of the so-called computer community tends systematically to obscure the hardware with software, electronic signifiers with interfaces between formal and everyday languages. . . . Consequently, in a perfect gradualism, DOS services would hide the BIOS, WordPerfect the operating system, and so on and so on until, very recently, two fundamental changes in computer design (or DoD politics) have brought this system of secrecy to closure.
Buried-redundancy valuation of algorithms is akin to Nietzsche's criticism of American philosophy as striving to get things done as quickly as possible.
(151-152) One-way functions, in other words, hide an algorithm from its result. . . . And, finally, IBM has done research on a mathematical formula for measuring the distance in complexity between an algorithm and its output . . . [quoting] “its buried redundancy . . . the value of a message is the amount of mathematical or other work plausibly done by its originator, which the receiver is saved from having to repeat.” . . . this algorithm intended to compute the cost of algorithms in general is Turing-uncomputable itself.
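A standard one-way function from the standard library illustrates the hiding Kittler cites, an algorithm's result betraying nothing of the work that produced it. (This is a generic cryptographic hash, not the IBM complexity measure he quotes, which is itself Turing-uncomputable.)

```python
import hashlib

# SHA-256 as a standard one-way function: computing the digest is cheap,
# but recovering the input, or the "work plausibly done by its originator",
# from the 64 hex characters is believed infeasible.
message = b"the amount of mathematical or other work done by its originator"
digest = hashlib.sha256(message).hexdigest()
print(digest)

# Changing a single byte yields an unrelated digest (avalanche effect):
# the result hides the algorithmic path that led to it.
digest2 = hashlib.sha256(message + b"!").hexdigest()
print(digest == digest2)
```

The asymmetry is the point: the forward computation is trivial to run, while the backward question "what produced this?" is exactly what the function is designed to make unanswerable.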
German law has defined software as material thing rather than mental property; good grounds to assume priority of hardware.
(152) Under these tragic conditions, criminal law, at least in Germany, has recently abandoned the very concept of software as mental property; instead, it defines software as necessarily a material thing. . . . On the contrary, there are good grounds to assume the indispensability and, consequently, the priority of hardware in general.
Working code limited by hardware building argument for inherent materiality.
(152) Only in Turing's paper On Computable Numbers with an Application to the Entscheidungsproblem does there exist a machine with unbounded resources in space and time, with an infinite supply of raw paper and no constraints on computation speed. All physically feasible machines, in contrast, are limited by these parameters in their very code.
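The contrast between Turing's paper machine and physically feasible ones can be made concrete with a minimal Turing-machine step loop whose tape and step budget are finite (the rule table below is a hypothetical toy, not from Turing's paper):

```python
# Minimal Turing-machine loop where, unlike the unbounded machine of
# "On Computable Numbers", tape length and step count are finite.
# The toy rule table just writes 1s while moving right.

RULES = {("write", 0): (1, +1, "write")}   # (state, symbol) -> (write, move, next)

def run(tape_cells, max_steps):
    tape, head, state, steps = [0] * tape_cells, 0, "write", 0
    while steps < max_steps:
        key = (state, tape[head])
        if key not in RULES:
            return "halt", tape
        write, move, state = RULES[key]
        tape[head] = write
        head += move
        if not 0 <= head < len(tape):
            return "out of tape", tape     # a paper machine never hits this
        steps += 1
    return "out of time", tape

print(run(tape_cells=4, max_steps=100))
```

With unbounded tape the toy machine would run forever; on any physical substrate it terminates with "out of tape" or "out of time", the two parameters Kittler says limit all feasible machines "in their very code".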
Hasslacher's discretization of continuous algorithmic descriptions as real programming, versus the Turing-machine computational imagination: failure to appreciate the materiality of computation is the flaw in the philosophies of programming that Kittler criticizes.
(153) [quoting Brosl Hasslacher] We must reduce a continuous algorithmic description to one codable on a device whose fundamental operations are countable, and we do this by various forms of chopping up into pieces, usually called discretization. . . . The compiler then further reduces this model to a binary form determined largely by machine constraints. The outcome is a discrete and synthetic microworld image of the original problem, whose structure is arbitrarily fixed by a differencing scheme and computational architecture chosen at random. . . . This is what we actually do when we compute up a model of the physical world with physical devices. This is not the idealized and serene process that we imagine when usually arguing about the fundamental structures of computation, and very far from Turing machines.
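Hasslacher's "chopping up" can be shown with the simplest case: the continuous equation dy/dt = -y (exact solution e^-t) discretized by Euler steps. The differencing scheme and step count below are arbitrary choices, exactly the arbitrariness the quote names, and the result lives in binary floating point rather than the continuum.

```python
import math

# Euler discretization of the continuous description dy/dt = -y:
# the derivative is replaced by a finite differencing scheme whose
# step size is arbitrarily fixed, producing a "synthetic microworld
# image" of the continuous original.
def euler(y0, t_end, n_steps):
    dt = t_end / n_steps
    y = y0
    for _ in range(n_steps):
        y += dt * (-y)        # differencing scheme stands in for d/dt
    return y

exact = math.exp(-1.0)        # the continuous answer at t = 1
for n in (10, 100, 1000):
    print(n, abs(euler(1.0, 1.0, n) - exact))
```

Each choice of `n_steps` yields a different discrete image of the same continuous problem; none of them is the continuous problem, which is Hasslacher's (and Kittler's) point.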
From Shannon's thesis to the transistor to the microprocessor: the technology narrative of the schematism of media recommended in GFT.
When Claude Shannon, in 1937, proved in what is probably the most consequential M.A. thesis ever written that simple telegraph switching relays can implement, by means of their different interconnections, the whole of Boolean algebra, such a physical notation system was established. And when the integrated circuit, developed in the 1970s out of Shockley's transistor, combined on one and the same chip silicon as a controllable resistor with its own oxide as an almost perfect isolator, the programmability of matter could finally "take control," just as Turing had predicted. . . . That is to say, millions of basic elements work under almost the same physical conditions, especially as regards the most critical, namely, temperature-dependent degradations, and yet electrically all of them are highly isolated from each other.
(154) Thus, the very isolation between digital or discrete elements accounts for a drawback in connectivity that otherwise, “according to current force laws” as well as to the basics of combinatorial logics, would be bounded only by a maximum equalling the square number of all elements involved.
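The combinatorial bound Kittler invokes is elementary: among n mutually connectable elements, the number of distinct pairwise links is n(n-1)/2, growing with the square of n. A one-function illustration:

```python
# The connectivity ceiling "bounded only by a maximum equalling the
# square number of all elements involved": distinct pairwise links
# among n elements number n*(n-1)/2, i.e. on the order of n squared.
def max_links(n):
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(n, max_links(n))
```

Isolated digital elements deliberately forgo this quadratic connectivity; that forgoing, on Kittler's reading, is the price of programmability.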
The maximal-connectivity limitation of isolated switching components may be overcome with physical, nonprogrammable systems, changing the nature of programming activity away from the stored-program ideology of Turing machines: at this point, too, there is no software, but there remain forms of programming or engineering; compare to the two directions for the future in the conclusion of "Code" in Software Studies.
(154-155) Precisely this maximal connectivity, on the other, physical side, defines nonprogrammable systems, be they waves or beings. . . . Software in the usual sense of an ever-feasible abstraction would not exist any longer. . . . programming it will have little to do any longer with approximated Turing machines.
Biochauvinistic future computers based on neural networks, complemented by the very nearness of existing silicon systems to this truly solid-state model.
Perhaps a theoretical return to noise, similar to second orality, moving away from the clarity of the liberal humanist subject and toward the cognitive-embodied processes suggested by Hayles, Clark, and Jenkins as a replacement for subjectivity.
(155) In what I have tried to describe are badly needed machines that . . . certain Dubrovnik observer's eyes might be tempted to recognize, under evolutionary disguises or not, the familiar face of man. . . . Maybe. At the same time, however, our equally familiar silicon hardware obeys many of the requisites for such highly connected, nonprogrammable systems. . . . To minimize all the noise that it would be impossible to eliminate is the price we pay for structurally programmable machines. The inverse strategy of maximizing noise would not only find the way back from IBM to Shannon, it may well be the only way to enter that body of real numbers originally known as chaos.
Kittler, Friedrich A. "There Is No Software." Literature, Media, Information Systems: Friedrich A. Kittler Essays. Ed. John Johnston. Amsterdam: Overseas Publishers Association, 1997. 147-155. Print.