Notes for David M. Berry The Philosophy of Software: Code and Mediation in the Digital Age

Key concepts: parasitic subjectivity.

Ultimate ethic is being a good stream.

Related theorists: Kittler.

We can assume this philosophical approach focuses on mediation and code in the digital age, not on reliving the writing fantasies of past media-overdetermined epistemological ages of orality, literacy, and PHI. Software studies, critical code studies, working code: this book points to differences for my niche, towards the end of the literature review. Can the rest be plotted and derived working backwards from the survey Berry presents?

Acknowledgements
(x) I would like to take this opportunity to thank Nikki Cooper, the Callaghan Centre for the Study of Conflict, Power, and Empire, and the Research Institute for Arts and Humanities (RIAH) at Swansea University for funding the workshop, The Computational Turn, which explored key issues around software. Thanks also to N. Katherine Hayles and Lev Manovich and the other participants at this workshop who enthusiastically discussed many of the themes pertinent to this book in a rigorous and critical setting.


1
The Idea of Code

The built environment of late capitalist information society is supported by software; much of it happens by software, for the sake of software, even when serving human ends at the far ends of its causal chains. The materiality of code follows from its being in and affecting the physical world, spun like webs, largely database-to-database information.

(1-2) To do this requires millions, if not billions of lines of computer code, many thousands of man-hours of work, and constant maintenance and technical support to keep it all running. These technical systems control and organize networks that increasingly permeate our society, whether financial, telecommunications, roads, food, energy, logistics, defense, water, legal or government. . . . [quoting Roger Bohn] “This is primarily 'database to database' information—people are only tangentially involved in most of it” (The Economist 2010d).

Invokes Heidegger as first philosopher of computing. Make a list of them for literature review: Heidegger, Kittler, Tanaka-Ishii, Berry. Later add Turing, von Neumann, Licklider, Engelbart, and others revealed by computer and software industry histories.

(2) But the specific difference introduced by software/code is that it not only increases the speed and volume of these processes, it also introduces some novel dimensions: (1) . . . delegation of mental processes of high sophistication into computational systems. . . . (2) networked software, in particular, encourages a communicative environment of rapidly changing feedback mechanisms that tie humans and non-humans together into new aggregates. . . . (3) there is a greater use of embedded and quasi-visible technologies, leading to a rapid growth in the amount of quantification that is taking place in society. . . . This transforms our everyday lives into data, a resource to be used by others, usually for profit, which Heidegger terms standing-reserve (Heidegger 1993a: 322).

The power of software over other tools.

(3) More accurately, we might describe it as a society that is more dependent on the computation of information, a computational knowledge society. . . . And yet, this is not the whole story, for each of the computers and technologies is actually mediating its own relationship with the world through the panoply of software. These computers run software that is spun like webs, invisibly around us, organizing, controlling, monitoring and processing.
(3) Software can revolutionize the limitations of the physical world.

Software hidden by interfaces; second philosopher of computing introduced is Kittler.

(4) But this software is too often hidden behind a facade of flashing lights, deceptively simple graphic user interfaces (GUIs) and sleekly designed electronic gadgets that re-presents a world to the user. . . . It is time, therefore, to examine our virtual situation.

Stiegler, the newest significant philosopher of computing introduced following Heidegger and Kittler, offers a materialist definition, the dynamic of organized inorganic matter, inviting a pass through Burks, Goldstine, and von Neumann; mechanology is a new philosophical position introduced without fanfare here.

(4) The challenge is to bring software back into visibility so that we can pay attention to both what it is (ontology), where it has come from (through media archeology and genealogy) but also what it is doing (through a form of mechanology), so we can understand this 'dynamic of organized inorganic matter' (Stiegler 1998: 84).

Interesting new forms of scholarship around software studies, cultural analytics, and critical code studies, comparable to the list Hayles makes in How We Think: platform studies, media archeology, software engines, soft authorship, genre analysis of software, graphical user interfaces, digital code literacy, temporality and code, sociology and political economy of the free software and open source movement. This enumeration could found the literature review for the prospectus.

(4-5) Thankfully, software is also starting to become a focus of scholarly research from a variety of approaches loosely grouped around the field of software studies/cultural analytics (Fuller 2003; Manovich 2001, 2008) and critical code studies (Marino 2006; Montfort 2009). Some of the most interesting developments in this area include: platform studies (Montfort and Bogost 2009), where there is a critical engagement with an 'infrastructure that supports the design and use of particular applications, be it computer hardware, operating systems, gaming devices, mobile devices, and digital disc formats' (Gillespie 2008); media archeology, which uncovers histories and genealogies of media, insisting on researching differences rather than continuity (Parikka 2007); research into software engines, which form an increasing part of the way in which software is packaged to perform a wide variety of functions, e.g. gaming engines, search engines, etc. (Helmond 2008); research into 'soft' authorship (Huber 2008) and genre analysis of software (Douglas 2008), which look at the way in which the notion of the author is problematized by the processual and bricolage nature of software development; graphical user interfaces, which focuses on the human-computer interface and the machine (Bratton 2008; Chun 2008; Dix et al 2003; Manovich 2001; Manovich and Douglas 2009); digital code literacy, which investigates how people read and write digital forms (Hayles 2004; Hayles 2005; Montfort 2008); research into temporality and code (Raley 2008); the sociology and political economy of the free software and open source movement, particularly with respect to the way in which the software has been rethought and subject to deliberation and contestation (Berry 2008; Chopra and Dexter 2008; Coleman 2009; Kelty 2008; Lessig 2002; May 2006; Weber 2005).

Pragmata of code entails situatedness if not materiality; a phenomenological approach to problems that include ephemerality, both of source code revisions and of entire operating environments, and the high technical skill required. The latter, I claim, demands lots of practice, such as is developed through employment writing software and through extreme hobbyist activity like free, open source software projects. Good statement of the significance of the undertaking, including examining the influence on academia itself.

(5-6) What remains clear, however, is that looking at computer code is difficult due to its ephemeral nature, the high technical skills required of the researcher and the lack of analytical or methodological tools available. This book will attempt to address this lack in the field by pointing towards different ways of understanding code. It will do so through a phenomenological approach that tries to highlight the pragmata of code. . . . Indeed, I argue that to understand the contemporary world, and the everyday practices that populate it, we need a corresponding focus on the computer code that is entangled with all aspects of our lives, including reflexivity about how much code is infiltrating the academy itself.

Understanding software avidities affects human freedom, so it is worthwhile to study, imbricating classic Socratic questioning with the tracing of agentic paths constituting human experience.

(6-9) The way in which these technologies are recording data about individuals and groups is remarkable, both in terms of the collection of: (1) formal technical data, such as dataflows, times and dates, IP addresses, geographical information, prices, purchases and preferences, etc.; (2) but also qualitative feelings and experiences. . . . This information is not collected passively, but processed and related back into the experience of other users either as a news feed or data stream, or else as an information support for further use of the software. . . . However, without an understanding of how computation is tying data, news, practices and search results together through computer code, the process of 'search' is difficult to explain, if not strangely magical. It also precludes us from concentrating on the political economic issues raised by the fact that an American corporation is capturing this data in the first place, and is able to feed it back through pre-populating the search box and hence steer people in particular directions. . . . Essentially, Google creates advertising markets by the real-time segmentation of search requiring computer power to understand who, in terms of a particular consumer group, is searching and what can be advertised to them. Google, therefore, harnesses the power of computation to drive an aggressive advertising strategy to find out who the user is, and what are their preferences, tastes, desires, and wants. . . . To understand these kinds of issues, which are essentially about the regulation of computer code itself, we need to be able to unpack the way in which these systems are built and run. This means a closer attention to the multiple ways in which code is deployed and used in society.

Code, or better software, is material because it transforms media. Who will do this? Turn those already familiar with working code into philosophers, and train young people as programmers so they might become technically skilled philosophers. As Hayles (or is it Berry) points out, the shortage or gap may not be with money, but with time, or competence.

(9-10) Therefore, it seems to me that we need to become more adept at reading and writing computer code in order to fully understand the technical experiences of the everyday world today. . . . Without this expertise, when tracing the agentic path, whether from cause to effect, or through the narrative arcs that are used to explain our contemporary lives, we will miss a crucial translation involved in the technical mediation provided by software. . . . Code is not a medium that contains the other mediums, rather it is a medium that radically reshapes and transforms them into a new unitary form.

Paying attention to the computationality of code, tracing the agentic path, is the crucial, effective position to join to other approaches: we can only understand code by reading it and watching its operation, the latter suggesting materiality or at least situatedness within the instrumentalized world.

(10) To understand code and place the question of code firmly within the larger political and social issues that are raised in contemporary society, we need to pay more attention to the computationality of code. In other words, the way in which code is actually 'doing' is vitally important, and we can only understand this by actually reading the code itself and watching how it operates. As a contribution to developing our understanding of what is admittedly a complex subject, I take a synoptic look at the phenomena of code, and try to place it within phenomenological context to understand the profound ways in which computational devices are changing the way in which we run our politics, societies, economies, the media and even everyday life.

Understanding computation

Derivation of computation from Latin computare.

Understand how being-in-the-world is made possible through application of computational techniques manifested in processes touched by software and code.

Similar to Foucault, examine how computation actually occurs rather than theoretical underpinnings.

(10-11) The term computation itself comes from the Latin computare, com-'together' and putare 'to reckon, to think or to section to compare the pieces'. . . . My intention is not to evaluate or outline the theoretical underpinnings of computability as a field within the discipline of computer science, rather, I want to understand how our being-in-the-world, the way in which we act towards the world, is made possible through the application of these theoretical computational techniques, which are manifested in the processes, structures and ideas stabilized by software and code.

Distinguish the computational from computationalism, which seems orthogonal to the issues, providing useful metaphors (Hayles).

(11) It is also worth clarifying that I do not refer to computational in terms of computationalism, a relatively recent doctrine that emerged in analytic philosophy in the mid 1990s, and which argues that the human mind is ultimately 'characterizable as a kind of computer' (Golumbia 2009: 8), or that an increasing portion of the human and social world is explainable through computational processes.
(12-13) Although I will not be looking in detail at the questions raised by analog computation, nor the digital philosophy of Fredkin and others, these examples demonstrate the increasing importance of the digital in how people are conceptualizing the world. . . . These metaphors help us understand the world, and with a shift to computational metaphors, certain aspects of reality come to the fore, such as the notion of orderliness, calculability, and predictability, whilst others, like chaos, desire and uncertainty, retreat into obscurity.

Important distinction between computationalist and instrumentalist notions of reason, locating the former in the materially entangled matrix operations of distributed cognition, and the latter in a more restricted sense of a type of agency.

(13) I argue for a distinction between computationalist and instrumentalist notions of reason. . . . In effect, instrumental rationality is a mode of reasoning employed by an agent. In contrast, computational rationality is a special sort of knowing, it is essentially vicarious, taking place within other actors or combinations and networks of actors (which may be human or non-human) and formally algorithmic. . . . This means that the location of reasoning is highly distributed across technical devices and agents. This strongly entangles the computational with the everyday world; after all, only a limited number of computational tasks are self-contained and have no user or world input. . . . In this sense then, computational rationality is a form of reasoning that takes place through other non-human objects, but these objects are themselves able to exert agential features, either by making calculations and decisions themselves, or by providing communicative support for the user.
(14) In a certain sense, this is an agonistic form of communicative action where devices are in a constant stream of data flow and decision-making which may only occasionally feedback to the human user.

Everyday computational comportment to be developed via digital Bildung.

(14) This 'everyday computational' is a comportment towards the world that takes as its subject-matter everyday objects which it can transform through calculation and processing interventions. . . . This reminds us that computation is limited to specific temporal durations and symbolic sets of discrete data to represent reality, but once encoded, it can be resampled, transformed, and filtered endlessly.

Kittler's sense of media convergence is translation into digital forms, whereas Berry senses new knowledge and productive potential because discretization is not equivalent to immaterialism.

(14-15) This demonstrates the plasticity of digital forms and points toward a new way of working with representation and mediation, facilitating the digital folding of reality. . . . In other words, a computer requires that everything is transformed from the continuous flow of our everyday reality into a grid of numbers that can be stored as a representation of reality which can then be manipulated using algorithms. The other side of the coin, of course, is that these subtractive methods of understanding reality (episteme) produce new knowledges and methods for the control of reality (techne).

Hidden versus visible affordances complicate computational objects but also leave a saving power of epistemological transparency.

(15) In a similar way to physical objects, technical devices present the user a certain function . . . but this set of functions (affordances) in a computational device is always a partial offering that may be withheld or remain unperformed. This is because the device has an internal state which is generally withheld from view and is often referred to as a 'black box', indicating that it is opaque to the outside view.
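
To make the withheld internal state concrete, here is a minimal sketch in C (all names are my own hypothetical illustration, not Berry's): the caller sees only the affordances of a Counter, while its internals remain opaque, a black box in miniature.

#include <stdlib.h>
#include <stdio.h>

/* "Interface": the only affordances offered to the user of the black box. */
typedef struct Counter Counter;          /* incomplete type: internals withheld */
Counter *counter_new(void);
void counter_tick(Counter *c);
int counter_value(const Counter *c);

/* "Implementation": the internal state, opaque to outside view. */
struct Counter { int ticks; };

Counter *counter_new(void) { Counter *c = malloc(sizeof *c); c->ticks = 0; return c; }
void counter_tick(Counter *c) { c->ticks++; }
int counter_value(const Counter *c) { return c->ticks; }

int main(void) {
    Counter *c = counter_new();
    counter_tick(c);
    counter_tick(c);
    printf("%d\n", counter_value(c));    /* prints 2; the "how" stays hidden */
    free(c);
    return 0;
}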

Similar to Turkle's robotic moment, but more encompassing: double mediation requires entrusting agency to software design.

(16) In this sense then, the computational device is a mediator between entities and their phenomenal representation in the everyday world, and its affordances help inform us and guide us in using it. . . . We have to trust the machine has properly captured, transformed, and rendered the desired image.

Is double mediation a special case of Baudrillard simulacrum, or more, deserving fresh analysis?

(16) This demonstrates the double mediation which makes the user increasingly reliant on the screen image that the computer produces, but also renders them powerless to prevent the introduction of errors and mistakes (unless they have access to the computer code).

Most code experience is visual rather than haptic.

(17) That is not to say matter too is not also the subject of feverish research activity into making it 'computable'. . . . However, much of the code that we experience in our daily lives is presented through a visual interface that tends to be graphical and geometric, and where haptic, through touch, currently responds through rather static physical interfaces but dynamic visual ones, for example iPads, touch screen phones, etc.

Code as cultural technique affecting culture (Kittler, Kramer, Manovich).

(17) Computer code is not solely technical though, and must be understood with respect to both the 'cultures of software' that produce it, but also the cultures of consumption that surround it. . . . Therefore, following Kittler's (1997) definition of media, I also want to understand computational reasoning as a cultural technique, one that allows one to select, store, process, and produce data and signals for the purposes of various forms of action but with a concentration on its technical materiality (Kramer 2006: 93).
(18) This means that we can ask the question: what is culture after it has been 'softwarized'? (Manovich 2008:41). Understanding code can therefore be a resourceful way of understanding cultural production more generally, for example, digital typesetting transformed the print newspaper industry, eBook and eInk technologies are likely to do so again.

Towards digital humanities
(18) By problematizing computationality, we are able to think critically about how knowledge in the 21st century is transformed into information through computational techniques, particularly within software.

Super-critical modes of thought yield not collective intelligence but collective intellect (Jenkins, Hayles); humanists should focus on determining appropriate practices, not specific ICT.

(20) However, I want to suggest that rather than learning a practice for the digital, which tends to be conceptualized in terms of ICT skills and competences (see for example the European Computer Driving License), we should be thinking about what reading and writing actually should mean in a computational age. . . . The digital assemblages that are now being built, not only promise great change at the level of the individual human actor. They provide destabilizing amounts of knowledge and information that lack the regulating force of philosophy that, Kant argued, ensures that institutions remain rational. . . . This introduces not only a moment of societal disorientation with individuals and institutions flooded with information, but also offer a computational solution to them in the form of computational rationalities, what Turing (1950) described as super-critical modes of thought.
(21) This is not the collective intelligence discussed by Levy (1999), rather, it is the promise of a collective intellect. This is reminiscent of the medieval notion of the universitatis, but recast in a digital form, as a society or association of actors who can think critically together mediated through technology.

Computational hard core in all disciplines may be a new paradigm; if not practicing working code, super-critical modes of thought circulate exclusively within consumer subjectivity, missing the potential of creative control.

(21) This would mean that the disciplines would, ontologically, have a very similar Lakatosian computational 'hard core' (Lakatos 1980). . . . Perhaps we are beginning to see reading and writing computer code as part of the pedagogy required to create a new subject produced by the university, a computational or data-centric subject.

New distributed, computational subjectivity, necessarily dehumanizing, but also only potentially democratizing.

(22) Instead, reasoning could change to more conceptual or communicational method of reasoning, for example, by bringing together comparative and communicative analysis from different disciplinary perspectives and knowing how to use technology to achieve a result that can be used—a rolling process of reflexive thinking and collaborative thinking. Relying on technology in a more radically decentered way, depending on technical devices to fill in the blanks in our minds and to connect knowledge in new ways, would change our understanding of knowledge, wisdom and intelligence itself. It would be a radical decentering in some ways, as the Humboldtian subject filled with culture and a certain notion of rationality, would no longer exist, rather, the computational subject would know where to recall culture as and when it was needed in conjunction with computationally available others, a just-in-time cultural subject, perhaps, to feed into a certain form of connected computationally supported thinking through and visualized presentation.
(22-23) This doesn't have to be dehumanizing. Latour and others have rightly identified the domestication of the human mind that took place with pen and paper (Latour 1986). . . . Computational techniques could give us greater powers of thinking, larger reach for our imaginations, and, possibly, allow us to reconnect to political notions of equality and redistribution based on the potential of computation to give to each according to their need and to each according to their ability.

Computational literacy, which Manovich is milking, is also hinted at as a new academic goal (Whithaus).

(24-26) Further, it is not merely the quantification of research which was traditionally qualitative that is offered with these approaches, rather, as Unsworth argues, we should think of these computational 'tools as offering provocations, surfacing evidence, suggesting patterns and structures, or adumbrating trends' (Unsworth, quoted in Clement et al. 2008). . . . This is a distinction that Moretti (2007) referred to as distant versus close readings of texts. . . . Computational approaches facilitate disciplinary hybridity that leads to a post-disciplinary university that can be deeply unsettling to traditional academic knowledge. . . . Indeed, there is a cultural dimension to this process and as we become more used to computational visualizations, we will expect to see them and use them with confidence and fluency. . . . This is a subject that is highly computationally communicative and able to access, process and visualize information and results quickly and effectively. At all levels of society, people will increasingly have to turn data and information into usable computational forms in order to understand it at all.

Orality, literacy, computationality, ontotheology; a bold philosophical position hinted at by Hayles, Turkle, and others.

(27) Here, following Heidegger, I want to argue that there remains a location for the possibility of philosophy to explicitly question the ontological understanding of what the computational is in regard to the positive sciences. Computationality might then be understood as an ontotheology, creating a new ontological 'epoch' as a new historical constellation of intelligibility.
(27) Metaphysics, grasped ontotheologically, 'temporarily secures the intelligible order' by understanding it 'ontologically', from the inside out, and 'theologically' from the outside in, which allows for the formation of an epoch, a 'historical constellation of intelligibility which is unified around its ontotheological understanding of the being of entities' (Thomson 2009: 150).

Why this is philosophy of computing: role is to grasp the ontic and ontological.

(28) If code and software is to become an object of research for the humanities and social sciences, including philosophy, we will need to grasp both the ontic and ontological dimensions of computer code. Broadly speaking, then, this book takes a philosophical approach to the subject of computer code, paying attention to the broader aspects of code and software, and connecting them to the materiality of this growing digital world.


2
What is Code?

Perl vs. C/C++, interpreted vs. compiled: a pretty example, but it does not do much. Code is the textual artifact and software the running process; rather than awkwardly quoting code in academic writing, do theory in situ in code repositories.

Code as textual artifact and running program the most basic distinction.

(29) The perl poem, Listen, shown below, demonstrates some of the immediate problems posed by an object that is at once both literary and machinic (Hopkins n.d.). Source code is the textual form of programming code that is edited by computer programmers. The first difficulty of understanding code, then, is in the interpretation of code as a textual artifact. . . . The second difficulty is studying something in process, as it executes or 'runs' on a computer, and so the poem Listen has a second articulation as a running program distinct from the textual form.
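
A minimal sketch of this dual articulation, in C rather than Hopkins's Perl (the file name and program are my own illustration, not from the book): the same file is at once a readable textual artifact and, once compiled, a distinct executing process.

#include <stdio.h>

/* As text, this file is a readable, quotable artifact. As process --
   after compilation, e.g. cc listen.c -o listen && ./listen -- it has
   a second, distinct articulation: behavior unfolding in time. */
int main(void) {
    for (int i = 3; i > 0; i--)
        printf("listen %d\n", i);   /* output exists only in the running form */
    return 0;
}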

Clever example of the Alliance for Code Excellence's version of indulgences for bad code, like carbon offsets.

(30-31) Knuth is also pointing towards the aesthetic dimension of coding that strives for elegance and readability of the final code – 'good' code. . . . Code offsets allow you to program badly, but through the funding of open-source programming set aside these 'bad' practices. . . . Rather fittingly, the money raised is used to pay out 'Good Code Grants' to the open source movement to encourage more open source software.

Code understood as the textual and social practices of static source code writing, testing, and distribution, implying close reading; software as the processual operating form, implying distant reading.

(31-32) Throughout this book I shall be using code to refer to the textual and social practices of source code writing, testing and distribution. . . . In distinction, I would like to use 'software' to include commercial products and proprietary applications, such as operating systems, applications or fixed products of code such as Photoshop, Word and Excel, which I also call 'prescriptive code'. . . . Or to put it slightly differently, code implies a close reading of technical systems and software implies a form of distant reading. . . . code is the static textual form of software, and software is the processual operating form.
(33) We must also not be afraid of using other technical devices to mediate our access to the code, indeed, even as Fuller problematizes the reading of 'subscopic' code (Fuller 2003:43), we can use devices to open any level of code to view (e.g. debuggers, IDEs, oscilloscopes). . . . Naturally, the use of software to view software is an interesting methodological recursion, but with reflexivity it seems to me a perfectly reasonable way of developing tools for us to understand software and code within the humanities and social sciences.

Code

Hermeneutic circle using the Clausewitzian sense of absolute rather than real code: is this an opposition favoring phenomenological processes as tiresome to contemplate, and as complex as any other highly complex thing, like a roaring flowing river or household air conditioning? Berry follows Latour toward a wide range of multiple conceptions of software.

Conceive absolute code like Marx's absolute labor, and the problematic like the Marxian analysis of industrial production; technical, social, material, and symbolic aspects.

Absolute versus real code to capture how programmers think computationally in hermeneutic circle.

Robin Milner's comment that languages influence how programmers think about tasks is evident in OOD.

(33-36) Code is striking in its ability to act as both an actor performing actions upon data, and as a vessel, holding data within its boundaries. . . . In a Clausewitzian sense, I want to think through the notion of 'absolute' code, rather than 'real' code, where real code would reflect the contingency and uncertainty of programming in particular locations. Absolute code would then be thinking in terms of both the grammar of code itself, but also trying to capture how programmers think algorithmically, or better, computationally. . . . Programmers have tended to need to continually use this hermeneutic circle of understanding 'the parts', 'the whole' and the 'parts in terms of the whole' in order to understand and write code (and communicate with each other), and this is striking in its similarity to literary creativity, to which it is sometimes compared. . . . Robin Milner (creator of ML, a programming language) comments, Faced with a particular task, I think a programmer often picks the language that makes explicit the aspects of the task that he considers most important. However, some languages achieve more: they actually influence the way that the programmer thinks about the task. . . . Object oriented techniques, such as object oriented design (OOD), have certainly contributed to changing the way people think about software, but when people undertake OOD it is in the sense of absolute code. . . . So, although an expressive medium, computer languages remain constrained in what may be 'said' due to the requirements that the computer in the final instance understands it. . . . Also computer programming can be an intensely social activity in addition to the perceived loneliness of the computer programmer. . . . Code is therefore technical and social, and material and symbolic simultaneously. This is not a new problem but it does make code difficult to investigate and study, and similar to understanding industrial production as Marx explained.

Code is more than textual files that reduce to mathematical representation like the lambda calculus and UTMs; the approach is similar to Sterne, Latour, and others, extending from discrete object analysis to cultural context, invoking Wardrip-Fruin as another philosopher of computing. Further complicated by double mediation (Kittler on media).

Approach code in its multiplicity as literature, mechanism, spatial form, repository of social norms.

(36-37) Rather, code needs to be approached in its multiplicity, that is, as a literature, a mechanism, a spatial form (organization), and as a repository of social norms, values, patterns and processes. . . . Due to improvements over the last forty years or so, programmers can now take advantage of tools and modular systems that have been introduced into programming through the mass engineering techniques of Fordism. . . . through the study of code we can learn a lot about the structure and processes of our post-Fordist societies through the way in which certain social formations are actualized through crystallization in computer code.

Code work occurs in the mediated environment of software engineered for developing software, which means the appearance of code snippets on a printed page is actually a skeuomorph of earlier times, and we only really experience working code through double mediation.

(37) This means that software is mediating the relationship with code and the writing and running of it. When we study code we need to be attentive to this double mediation and be prepared to include a wider variety of materials in our analysis of code than just the textual files that have traditionally been understood as code.

Definition based on the sequential fetch-and-execute cycle; Manovich's media performances supply the processual context of code.

(38) Media performances, refers to the fact that increasingly our media is constructed on the fly from a number of modular components derived from a wide variety of networked sources. Code is therefore connective, mediating and constructing our media experiences in real-time as software. Code must then be understood in context, as something that is in someway potentially running for it to be code. Code is processual, and keeping in mind its execution and agentic form is crucial to understanding the way in which it is able to both structure the world and continue to act upon it.

Think about code by sequential ticks through each statement, with attention to illusion of concurrency.
(38) This, perhaps, gives us our first entry point into an understanding of code; it is a declarative and comparative mechanism that 'ticks' through each statement one at-a-time (multiple threaded code is an illusion except where several processors are running in a machine simultaneously, but even here the code is running sequentially in a mechanical fashion on each processor). This means that code runs sequentially (even in parallel systems the separate threads run in an extremely pedestrian fashion), and therefore its repetitions and ordering processes are uniquely laid out as stepping stones for us to follow through the code, but in action it can run millions of times faster than we can think – and in doing so introduce qualitative changes that we may miss if we focus only on the level of the textual.
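
A small illustrative sketch in C (my own, not Berry's) of this 'ticking': even a seemingly single update is mechanically a strict sequence of read, calculate, and write steps.

#include <stdio.h>

/* Each statement "ticks" in strict sequence: read, compute, write back. */
int main(void) {
    int balance = 100;
    for (int step = 1; step <= 3; step++) {
        balance = balance - 10;   /* one tick at a time */
        printf("step %d: balance = %d\n", step, balance);
    }
    /* A multithreaded program would interleave such ticks, but on each
       processor the statements still execute in pedestrian sequence. */
    return 0;
}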

Phenomenology of computation performed by reverse engineering the discretizations, political economy, property relations, breakdowns, and moral depreciation concretized in code and software systems; although slowing down is not possible in systems with real-time requirements, like high-speed process control systems or even the Atari VCS, this sort of tracing analysis founds a phenomenology of computation that is after the fact, reverse engineered.

Slowing down to step-by-step execution is the entry point to a phenomenology of computation, but this is only the beginning.

(38) By slowing down or even forcing the program to execute step-by-step under the control of the programmer the branches, loops and statements can be followed each in turn in order to ensure the code functions as desired.
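
Absent a debugger, a crude way to impose this programmer-controlled, step-by-step execution is to gate each statement on input; a hypothetical sketch:

#include <stdio.h>

/* Execution pauses until Enter is pressed, so each branch and loop
   iteration can be followed in turn, as under a debugger's single-step. */
int main(void) {
    int total = 0;
    for (int i = 1; i <= 3; i++) {
        total += i;
        printf("after iteration %d, total = %d (Enter to step)\n", i, total);
        getchar();   /* halt here, as a breakpoint would */
    }
    printf("final total = %d\n", total);
    return 0;
}
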
(38-39) The external 'real' world must be standardized and unified in a formal manner which the code is able to process and generate new data and information – and this we can trace. This is where a phenomenology of code, or more accurately a phenomenology of computation, allows us to understand and explore the ways in which code is able to structure experience in concrete ways.

Code is manufactured, an always unfinished project continually updated; thus the political economy of software is important; see Mackenzie.

Second phenomenological perspective foregrounds political economy of software as something manufactured.

(39) The second entry point into understanding code is that computer code is manufactured and this points us towards the importance of a political economy of software. . . . However, code needs to be thought of as an essentially unfinished project, a continually updated, edited and reconstructed piece of machinery.

Interesting argument about where factory practices are most resisted and master craftsmanship most preserved, the opposite of Manovich's cultural software.

(39-40) Code is also treated as a form of property in as much as it is subject to intellectual property rights (IPRs), like copyright. . . . However, for the creation of specialist software, particularly for time-critical or safety-critical industries, the literacy craftsmanship of programming remains a specialized hand-coded enclave.

Software continually breaks down; much never reaches working state, much is never used, much remains hidden in large code repositories; Chun also distinguishes these forms of code from that which executes.

Third phenomenological perspective involves irrationality and breakdowns.

(40-41) Thirdly, it is important to note that software breaks down, continually. . . . The implications are interesting; much software written today never reaches a working state, indeed a great quantity of it remains hidden unused or little understood within the code repositories of large corporate organizations.
(42) Software, therefore, always has the possibility of instability being generated from within its different logics. Even when it is functioning, code is not always 'rational'.

Software lifecycle includes moral depreciation of code borrowed from Marx; complexity through distributed authorship over many revisions means it is likely that nobody comprehends all of any given application.

Fourth phenomenological perspective envelops software within its lifecycle, including moral depreciation and death, as well as technical debt.

(42) Therefore, and lastly, software, contrary to common misconceptions, follows a cycle of life. It is not eternal, nor could it be. It is created as code, it is developed whilst it is still productive, and slowly it grows old and decays, what we might call the moral depreciation of code (Marx 2004: 528).
(43) So software too can suffer a kind of death, its traces only found in discarded diskette, the memories of the retired programmers, their notebooks, and personal collections of code printouts and manuals.

The ontology of Code

Phenomenologically derived ontological characteristics from the experience of programmers: code understood through programming practices via habituation, structural constraints, and shared knowledge.

(43) Nonetheless we should be clear that the ontology of code is specifiable, indeed, programmers already know what code is, qua code. For example, each of the following highlights the way in which an ontology of code is inculcated in the programmer and serves to reinforce how code is both understood and materialized as part of programming practices.

Through habituation/training/education
(44) These methods of habit become deeply ingrained in the way in which a programmer will tackle a computing problem and are demonstrated by the way in which programmers often become attached to particular programming languages, shells, editing environments or computer platforms (e.g. Unix). . . . This has been reinforced by marketing efforts which use 'fear, uncertainty, and doubt' (FUD) to encourage customer loyalty.
Through structural constraints (e.g. IDE, compiler)
(44-45) In addition to the habituation and education of programmers are the constraints offered by the programming environments themselves which can be very unforgiving. . . . Code therefore requires a very high degree of proof-reading ability in order to ensure the correctness of the program under development. . . . We can think of this as a form of prescribed literate programming.
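
A trivial sketch of that unforgivingness (my own example): the compiler accepts only the exactly prescribed form, and a single missing character fails the whole program.

#include <stdio.h>

/* int x = 5      <-- without the semicolon, the compiler rejects the
   whole program; only the exactly prescribed form below is accepted. */
int main(void) {
    int x = 5;
    printf("%d\n", x);
    return 0;
}
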
Through a constellation of shared knowledge and practices

Likely big difference in practices between collaborative and personal projects.

(45) The source code will likely at some stage be maintained by others, consequently, programming can be an extremely social activity with shared norms as to how the code should be laid out, what are the acceptable and unacceptable ways of writing a program. . . . Techniques such as agile programming, which encourages programming with a partner, code reviews, where a committee double-checks the logic of the code, and free/libre and open source software (FLOSS) which allows anyone to download and read the code, all act to promote readability and excellence in the programming.

Microprocessors have a limited vocabulary defined by their instruction set (as microcode and/or as assembly language)
(45) Usually the processor will have an instruction set specific to it which means that a binary file (usually the form in which an application, such as Word, is distributed) runs only on specific processors.
Compilable and executable code is structured and formatted by precise rules

Code may compile but not work, or may contain syntax errors, deprecated functions, or other flaws so that it will not compile or run despite appearing programmatically correct; compare to Derrida refuting the arbitrariness of language.

Metaphorical code

Grand narratives and cultural tropes related to metaphorical code: engine, image, communication medium, container.

(46) There is also a metaphorical cultural relation to these ontologies which have become cultural tropes that structure the way in which people understand code.

Code as an engine

Progress timeline: basic mechanical process, to which the stored program computer is the next stage, then multiprocessing and internetworking, then Web 2.0.

(47) The designs generated by Babbage were inspired by the use of cards to 'program' mechanical looms such as the Jacquard loom developed in 1801 to program woven patterns in power looms. This notion of computation through mechanical processes was further embedded in cultural representations by the use of a variety of mechanical devices.

Shifting notion of calculation and computation from engine to symbolic processing: what is its current trajectory?

(47-48) Whilst there were a number of key innovations in this field, finally with theoretical contributions from Alan Turing (1936), Emil Post (1936), and Claude Shannon (1938) the notion of calculation moved from a problem of arithmetic to that of logic, and with it the notion that 'information can be treated like any other quantity and be subjected to the manipulations of a machine' (Beniger 1986: 406). . . . It was also the move away from the metaphor of the engine and towards a notion of computation and symbolic processing.

Code as an image or picture

Symbols of the stored program computer inspire picture meditations; finally a meaty example, the quicksort algorithm as beautiful code, though with no explanation of its operation, so it must be derivable from its own presentation.

Quicksort example of code as image or picture begs for critical study like 10 PRINT.

(48) An example of which is given by Jon Bentley in (Oram and Wilson 2007), in which he describes beautiful code by saying that the rule that “vigorous writing is concise” applies to code as well as to English, so [following this] admonition to “omit needless words” this algorithm to sort numbers is the result:

void quicksort(int l, int u)
{   int i, m;
    if (l >= u) return;
    swap(l, randint(l, u));
    m = l;
    for (i = l+1; i <= u; i++)
        if (x[i] < x[l])
            swap(++m, i);
    swap(l, m);
    quicksort(l, m-1);
    quicksort(m+1, u);
}
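
Bentley's fragment presupposes a global array x and helper routines swap and randint that Berry does not reprint; a self-contained completion (the helpers and test data below are my assumptions, not Bentley's text) lets the algorithm actually be compiled, run, and followed:

#include <stdio.h>
#include <stdlib.h>

int x[] = {5, 3, 8, 1, 9, 2};                 /* assumed global array */

void swap(int i, int j) { int t = x[i]; x[i] = x[j]; x[j] = t; }

int randint(int l, int u) { return l + rand() % (u - l + 1); }

void quicksort(int l, int u)
{   int i, m;
    if (l >= u) return;
    swap(l, randint(l, u));                   /* random pivot into x[l] */
    m = l;
    for (i = l + 1; i <= u; i++)
        if (x[i] < x[l])
            swap(++m, i);                     /* grow the "less than" region */
    swap(l, m);                               /* pivot into its final place */
    quicksort(l, m - 1);
    quicksort(m + 1, u);
}

int main(void) {
    quicksort(0, 5);
    for (int i = 0; i < 6; i++)
        printf("%d ", x[i]);                  /* prints: 1 2 3 5 8 9 */
    printf("\n");
    return 0;
}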

Code as a medium of communication

Docile mediation enabling. What is the American version of the BBC? Government access websites?

(49-50) Code here is understood as a form to facilitate communication, transformation, transfer of data and information, voice calls and all other sorts of media. . . . This communicational trope tends to connect to questions raised in media policy, such as the regulatory system in the nation state and arguments for a tacit particular broadcasting model.

Code as container

Domain of double mediation, calling for oscilloscopes and tcpdump (Kirschenbaum); also narratives of cyberspace constitution.

(50) Kirschenbaum (2004) offers an exemplary example of researching code as a container, where he undertakes a 'grammatology of the hard drive' through looking at mechanisms of extreme inscription of magnetic polarity on the hard disk platters to understand this form of 'electric writing'.

Towards a grammar of code

Weberian ideal-types of analytical categories to build grammar: data, code, delegated (source), prescriptive (software), critical, commentary.

Is this grammar-of-code method phenomenological? What kind of ontological division is this? Any concern that not focusing on concrete historical examples defers serious engagement by philosophers of computing with working code?

(51) tentative Weberian 'ideal-types' to help us think about the different forms or modalities of code, namely: (i) digital data structure, (ii) digital stream, (iii) delegated code, (iv) prescriptive code, (v) commentary code, (vi) code object and (vii) critical code. Ideal-types are an analytical construct that are abstracted from concrete examples. They also provide a means whereby concrete historical examples of code may be compared and allows us to consider the ways in which code might deviate from this form. . . . By creating these ideal-types I aim to unpack the different modalities of code (as a digital form) and allow us to develop our understanding of the way in which it is used and performed in computer technology.

Data
(52) Data is therefore a key element of understanding code and as an analytic category it allows us to understand the way in which code stores values and information in a form that is stable and recallable.

Code
(52) Code can be understood as the mechanism that operates upon and transforms symbolic data, whether by recombining it, performing arithmetic or binary calculation or moving data between different storage locations. As such, code is operative and produces a result (sometimes at the end of a number of sub-goals or tasks), often in an iterative process of loops and conditionals.
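
A minimal sketch (mine, not Berry's) of the data/code pairing: stable stored values operated upon by an iterative mechanism of loops and conditionals that produces a result.

#include <stdio.h>

/* Data: stable, recallable symbolic values. Code: the mechanism that
   operates on and transforms them through loops and conditionals. */
int main(void) {
    int data[] = {3, 7, 2, 9};           /* data: stored values */
    int sum = 0, max = data[0];

    for (int i = 0; i < 4; i++) {        /* code: iteration */
        sum += data[i];                  /* arithmetic transformation */
        if (data[i] > max)
            max = data[i];               /* comparison and selection */
    }
    printf("sum = %d, max = %d\n", sum, max);   /* the produced result */
    return 0;
}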

Delegated code (or source code)
(52-53) Code has a dual existence, as delegated code residing in a human-readable frozen state that computer programmers refer to as 'source code', and as 'autonomous' prescriptive code that performs operations and processes. . . . Nonetheless, at some point the abstractions manipulated by the programmer within delegated code will have to be translated into the necessary binary operations for the 'correct' functioning of the prescriptive code.

Prescriptive code (or software)
(53) The production of computer code at this low level would be prohibitively complex, time-consuming and slow for all but the smallest programs.

Critical code

Critical code is democratizing, liberating, and affords epistemological transparency.

(53-54) This is code that is written to open up existing closed forms of proprietary computer code, providing users the ability to read/hack existing digital streams and hence to unpack the digital data structure (see below). . . . Therefore, a requirement of critical code would be that the source/executable would be available for inspection to check the delegated processing that the code undertakes. If the source was unavailable then it would be impossible to check the prescriptive code to ensure it was not bogus or malicious software and it could not then be critical code.

Commentary Code

Hermeneutic and historical record most obvious in commentary ideal-type.

(54) These comments assist both the programmer and others wishing to understand the programming code and I introduce the ideal-type commentary code to describe these areas. These textual areas are used to demonstrate authorship, list collaborators and document changes – thus source code offers a hermeneutic and historical record in commentary code in addition to the processing capabilities of the explicitly delegated code within the file.
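
A schematic example of commentary code (every name and date below is hypothetical): a header block that delegates nothing to the machine yet records authorship, collaborators, and changes, the hermeneutic and historical record alongside the delegated code in the file.

/*
 * matrix.c -- routines for small dense matrices
 *
 * Author:       A. Programmer <ap@example.org>
 * Contributors: B. Maintainer, C. Reviewer
 *
 * History:
 *   2009-03-01  ap  initial version
 *   2009-06-12  bm  fixed off-by-one in row indexing
 *
 * Nothing in this block is executed; it exists for human readers,
 * as a hermeneutic and historical record.
 */

const double EPSILON = 1e-9;   /* inline commentary explaining a design choice */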

Digital data structure

Regarding digital data structure, the embodiment of transducers and encoders matters in digitalization even if code is putatively immaterial; types include digital stream, code objects, functions and methods, network code.

(54) Digitalization is therefore the simplification and standardization of the external world so that it can be stored and manipulated within code. . . . This highlights the importance of a focus on the materiality as different embodiments fix data in different ways.

Digital stream

Digital stream as one-dimensional flow of 0s and 1s becomes core of new computational subjectivity.

(55) When computers store media content to a hard disk or other medium, the media is encoded into binary form and it is written to a binary file as a digital stream, as a one-dimensional flow of 0s and 1s. . . . The flexibility of being able to render information, whether audio-visual or textual, into this standardized digital stream form allows the incredible manipulation and transformation of information that computers facilitate. . . . This transitional quality of digital representation and storage (albeit at an often degraded resolution within digital data structures) is something that highlights the strong remedial qualities of digital representation.
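
A minimal sketch in C (my own illustration): encoded content written to and read back from a binary file as exactly such a one-dimensional stream.

#include <stdio.h>

/* Content, once encoded, is written and read back as a one-dimensional
   stream of bytes, themselves flows of 0s and 1s. */
int main(void) {
    const unsigned char media[] = {72, 101, 108, 108, 111};  /* "Hello" encoded */
    FILE *f = fopen("stream.bin", "wb");
    if (!f) return 1;
    fwrite(media, 1, sizeof media, f);       /* content becomes a digital stream */
    fclose(f);

    f = fopen("stream.bin", "rb");
    if (!f) return 1;
    int byte;
    while ((byte = fgetc(f)) != EOF)         /* read back, byte by byte */
        printf("%d ", byte);                 /* prints: 72 101 108 108 111 */
    printf("\n");
    fclose(f);
    return 0;
}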

Code objects

Object oriented thinking transcends need to conceive machine embodiment; immateriality in conceptual abstraction.

(56) The further the programmer is positioned from the atomic level of 0s and 1s by the programming language, the further from the 'metal' – the electrical operation of the silicon. Therefore the programmer is not required to think in a purely mechanical/electrical fashion and is able to be more conceptual.

Functions/methods
(56) These are discrete parts of code that do things, usually the processing or iterative actions on data, particularly the digital data structure. . . . Essentially, these areas of the code can be used and reused and are usually written in a general fashion to be applicable to standard data types.
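
A small sketch (mine) of this ideal-type: one discrete, general function over a standard data type, used and reused against different data.

#include <stdio.h>

/* A reusable piece of code written generally, applicable to any
   array of a standard data type. */
int sum_array(const int *a, int n) {
    int total = 0;
    for (int i = 0; i < n; i++)
        total += a[i];
    return total;
}

int main(void) {
    int prices[] = {5, 10, 15};
    int scores[] = {1, 2, 3, 4};
    printf("%d %d\n", sum_array(prices, 3), sum_array(scores, 4));  /* 30 10 */
    return 0;
}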

Web 2.0 and network code

Technical imaginary in Lacanian/Zizekian sense?

(57) Indeed, Web 2.0 is not a technology, as such, rather it is an ideal for the way in which certain social technologies might be imagined as working together to create useful applications. It is a technical imaginary intended to create the possibility for rethinking a particular technical problem – particularly the Internet as it existed in 2004.
(59) They had designed based on existing notions of what could be done, rather than with a notion of more powerful technologies and communications being available in the future. This produced a version of the Web that had underlying protocols and plumbing of the Web which had remained within the Client-Server paradigm of exchanging information – Hypertext Transfer Protocol (HTTP) used for web pages is a good example of this.
(59-60) This discourse of Web 2.0 has spread from the technical sphere into many related areas and in this translation Web 2.0 has become firmly associated with notions of participation, social networking, sharing open source and free software, open access to knowledge and democratic politics.

Web 2.0 is no threat to Berry's methodology.

(60-61) Web 2.0, then, has to be understood as a particular constellation of code-based technologies. Although interesting in terms of its focus on increasing interactivity and real-time delivery of data, Web 2.0 does not represent something outside or beyond existing ways of understanding code, indeed, it highlights the importance of critical approaches to new movements and fashions within programming and technology.

Understanding code
(61) Any study of computer code should acknowledge that the performativity of software is in some way linked to its location in a (mainly) capitalist economy. Code costs money and labor to produce, and once it is written requires continual inputs of energy, maintenance and labor to keep functioning. This has important implications when it is understood that much of the code that supports the Internet, even though it is free software or open source, actually runs on private computer systems and networks (see Berry 2008). . . . This is currently part of the ongoing debates over the importance of Network Neutrality – whereby private networks agree to carry public and private data in reciprocal arrangements which are implemented in code.

Link to Jameson and cultural theory. Good characterization of network neutrality.

(61-62) Code constructs the relationship we have with technology, and it is here where questions of ownership, through patents and copyrights, for example, and technologically mediated control, through digital rights management, become key issues. . . . in this book I will largely bracket out the question of political economy of software.

Code lies on plane of immanent connections performing the network form (Bogost, Galloway, Callon).

(62) We sometimes find it easier to understand code through a hierarchical relationship but strictly speaking code lies on a plane of immanent connections and consequently, no code is 'bigger' or 'more important' than another, except to the extent that it has a larger number of connections. In a political economy of the information society, a more nuanced understanding of the way in which power is located in the network, for example through connections, or through protocol (Galloway 2006), demonstrates that we need to take account of the way in which software as dispositifs socio-technique (socio-technical devices) acts to perform the network form (Callon 2007).

The control operation, logotropos, dis-embeds materialities it controls.

(62-63) We must remember that we should keep in mind that code-based devices are a 'tangle, a multi-linear ensemble' (Deleuze 1992, 159). . . . Code follows directions; it 'traces' processes interacting with its own internal state . . . But this code 'abstracts' or dis-embeds a logic, whether as here analytically understood as engine, container, image or communications channel.

Code located within material devices forms technical devices; a Hayles MSA-compatible definition of code as computational logic located within material devices.

(63) we need to bring to the fore how code is computational logic located within material devices which I will call technical devices. Now we turn to look at some concrete examples of the materiality of code and how we can understand it as both written, read and executed.


3
Reading and Writing Code

Realization of importance of finding good examples, not necessarily instructional because they may be tedious; similar difficulty to presenting examples in scholarly and scientific writing.

(64) One of the biggest problems with trying to understand code is finding the right kinds of examples to illustrate this discussion so here I will present some examples of code that will make the code more visible and show why reading code is useful. . . . I also want to avoid a tedious programming lesson in what can soon become a rather dry subject of discussion. . . . Secondly, I have tried to be clear that when one is discussing code one should be aware that snippets of code can be difficult to understand when taken out of context and often require the surrounding documentation to make sense of it.
(65) The way in which the code is created, and tested, maintained and run, forms part of our discussion to the materiality and obduracy of code. Thus . . . we will remain attentive to code in its multiplicity, that is as a literature, a mechanism, a spatial form (organization), and as a repository of social norms, values, patterns and processes.

Tests of strength

Latour's trial of strength related to the software engineering test case; locate materiality in trial-of-strength legitimation practices, the opposite of the atemporal perfect state of code that appears in a textbook to demonstrate an algorithm or is reduced to mathematical forms such as the lambda calculus, logical notation, or UTMs.

(65-66) To locate the materiality of code, I develop Latour's (1988) notion of 'trial of strength' introduced in Irreductions. . . . An overriding requirement is the obligation to specify the type of strength that is involved in a specific test and to arrange a testing device. . . . The notion of a test of strength is also similar to the idea of a 'test case' in software engineering, which is a single problematic that can be proved to be successful, and therefore designates the code free from that error or problem. . . . To be included in a particular 'society of code' then, the code must be legitimated (realized) through a series of tests.

Uses computer programming contests to situate his discussion of tests of strength; could also apply tests of strength to facticity of FOSS development communities for this validation.

(66) It is only after this point that the prototyping and testing phases really begin and code is written, but it remains an iterative process to construct the detailed structure and content of the required software systems.

Software development life cycle from requirements and design to alpha, beta, release candidate and gold master; note that emphasizing concrete design work in the life cycle reflects the hard-mastery programming style, at whose opposite pole Turkle places the bricoleur style.

(67) Each step creates physical entities (e.g. documentation) and tests that further reinforce the materiality of code.

Link symbolic level of human programmer with machinic requirements to compile and execute.

(68) Throughout these case studies, the intention is to link the symbolic level of the literate programmer with the machinic requirement of compilation and execution of the software.

Reading code
The leaked Microsoft source code

Reading code example of leaked Microsoft source reveals corporate build process, hacks, role of APIs, which is developed by other authors as well.

(68) In terms of the analysis we are undertaking here, we might think of the build as an important 'test of strength' for the materiality of Microsoft Windows development software.
(69) By looking at the code in these files, an insight into Microsoft's daily build process is given.
(70) The documents also showed where Microsoft employees were required to break programming conventions and perform 'hacks', or inelegant software fixes to get around stupid, restrictive or problematic bottlenecks in the existing codebase.
(70) By reading the Microsoft source code one also begins to get connections to the political economy of software development more generally. For example, Microsoft uses certain specialized function calls called Application Programming Interfaces (APIs) which are kept private and internal to the company. These are then used by its own software which, it is alleged, give it a performance boost over its rivals' third-party software.
(71) Many of the commentators remarked on the unlikely situation of anyone finding much of interest to take from Microsoft code, pointing to the difference between having the source code as a textual repository and actually getting it to compile.

Climate research code
(74) Whilst not a comment on the accuracy or validity of [Eric] Raymond's claims, this example shows the importance of seeing the code and being able to follow the logic through code fragments that can be shared with other readers.

Through the example of climate research code, a question is raised: does the democratization of programming require competent citizens? What does this suggest about late capitalist, surface-oriented, consumer technology comportment? Rather than programming from childhood, such a society must rely on the double mediation of specialists and machine expertise to do the reading and critiquing.

(74) Not only is this a clear example of the changing nature of science as a public activity, but also demonstrates how the democratization of programming means that a large number of people are able to read and critique the code.

Writing code

Examples of writing code are based on contests.

(75) These cases also allow us to see how the materiality of code is demonstrated by abiding closely to the prescribed legitimate tests for the code being developed.
(75) The first case study is the Underhanded C Contest, an online contest that asks the contestants to submit code that disguises within fairly mundane source code another hidden purpose. The second case study is The International Obfuscated C Code Contest (IOCCC), a contest to write the most Obscure/Obfuscated C program possible that is as difficult to understand and follow (through the source code) as possible.

The Underhanded C Contest

His three examples are for a redactor program that does not really obliterate the original text. Is there any MSA equivalent, a devious text that unexpectedly evokes a particular feeling in the reader?

(76) The goal is to engineer malicious behavior that is not noticed as part of a test or code review.
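
To make the underhanded idea concrete, here is a minimal sketch in C (my own, not one of the contest entries; the scenario and names are hypothetical): the redactor looks innocent at code review but overwrites only alphanumeric characters, so word lengths and punctuation survive and leak the shape of the supposedly obliterated text.

    #include <ctype.h>
    #include <stdio.h>

    /* "Redact" a line in place. Looks correct at review, but only
       alphanumerics are overwritten: word lengths, punctuation and the
       '#' of the payroll number survive, leaking the text's shape. */
    static void redact(char *s)
    {
        for (; *s; s++)
            if (isalnum((unsigned char)*s))
                *s = 'X';
    }

    int main(void)
    {
        char line[] = "Agent Maria Flores, payroll #4471";
        redact(line);
        puts(line);  /* prints: XXXXX XXXXX XXXXXX, XXXXXXX #XXXX */
        return 0;
    }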

Materiality of source code as text itself foregrounded in underhanded, obscure and obfuscated code.

(81) This is a technique called obfuscation and demonstrates both the materiality of source code itself, and the fact that unreadable source code can still be executable.

The International Obfuscated C Code Contest

We may joke that the obvious MSA equivalent of obfuscated code is early Heidegger and other philosophical writings.

(83) Code obfuscation means applying a set of textual and formatting changes to a program, preserving its functionality but making it more difficult to reverse-engineer.
(84) It requires an ability to not only craft a suitable program to perform a function, but to think about presentation and visual impact.
Obfuscated code examples
(93) For all these examples, however, the actual execution of the code is secondary to the textual source and therefore demonstrates only a single side of the code/software distinction.
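
To keep the obfuscation point concrete without a tedious programming lesson, a minimal sketch of my own (not one of the IOCCC entries): both functions below compute the same sum, but the second has had purely textual changes applied (renamed identifiers, collapsed whitespace, recursion in place of the loop), preserving functionality while resisting reading.

    #include <stdio.h>

    /* Readable version: sum the integers 1..n. */
    int sum(int n)
    {
        int total = 0;
        for (int i = 1; i <= n; i++)
            total += i;
        return total;
    }

    /* The same function after textual obfuscation only. */
    int s(int n){return n?n+s(n-1):0;}

    int main(void)
    {
        printf("%d %d\n", sum(10), s(10));  /* prints: 55 55 */
        return 0;
    }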


4
Running Code

(94) Clearly, the first step is to look at how the code runs, through a method of slowing down the code to a human time frame, secondly, using a device to examine the running code from a distance.

Analogical music by Miwa is a curious example of exemplary code ethnography, later setting up the reverse remediation theme; could learn more from the mundane models used in a Computer Organization course.

(94) In essence, this [work of Masahiro Miwa] is an attempt to follow the logic of code through a form of code ethnography, observing and watching how code functions in the activities of musicians that attempt to model their approach to music through computer code.
(96) Each of these instructions tells the computer to undertake a simple task, whether to move a certain piece of data from A to B in the memory, or to add one number to another.
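
A hedged illustration of the simple tasks Berry describes: two C statements with the rough machine-level steps as comments (the instruction names are generic pseudo-assembly of my own, not any particular processor's).

    #include <stdio.h>

    int main(void)
    {
        int a = 5, b = 7, c;

        c = a;      /* LOAD r0,[a]; STORE [c],r0 : move data from A to B */
        c = a + b;  /* LOAD r0,[a]; LOAD r1,[b]; ADD r0,r1; STORE [c],r0 :
                       add one number to another */
        printf("%d\n", c);  /* prints: 12 */
        return 0;
    }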

Compare analysis of running code to network layer model.

(97) When we analyze running code, we clearly have to face the different levels at which code is running, which we can imagine as a number of different planes or levels for analysis. We might consider that they are made up of: (i) hardware; (ii) software; (iii) network; (iv) everyday.

The temporality of code

Clock-based computers are the norm; this introduction could be broadened to define the stored-program, fetch-and-execute sequential binary computer (that is, von Neumann architecture) to make better sense of temporality and spatiality.

(97) For machine code to execute requires that a single actor conducts the entire process, this is the 'clock' that provides the synchronicity which is key to the functioning of computer systems. . . . However, all parts of the system need to be operating according to the master clock speed if things are to be delivered to the right place at the right time.
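
A minimal sketch (my own construction, not Berry's) of the clock as the single actor conducting the entire process: in the toy machine below, nothing happens except on a tick of the one master loop, which fetches, decodes and executes each instruction in turn.

    #include <stdio.h>

    /* A toy single-clock accumulator machine. */
    enum { HALT, LOAD, ADD, PRINT };

    int main(void)
    {
        int program[] = { LOAD, 40, ADD, 2, PRINT, HALT };
        int pc = 0, acc = 0;

        for (;;) {                            /* each iteration = one clock tick */
            int op = program[pc++];           /* fetch */
            switch (op) {                     /* decode and execute */
            case LOAD:  acc  = program[pc++]; break;
            case ADD:   acc += program[pc++]; break;
            case PRINT: printf("%d\n", acc);  break;  /* prints: 42 */
            case HALT:  return 0;
            }
        }
    }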

The spatiality of code

Global address space implies mediation by networks and other processes to yield the illusion of linear spatial memory.

(98) Another curious feature of code is that it relies on a notion of spatiality that is formed by the peculiar linear form of computer memory and the idea of address space. As far as the computer is concerned memory is a storage device which could be located anywhere in the world.
(99) Understanding computer code and software is difficult enough, but when they are built into complicated assemblages that can be geographically dispersed and operating in highly complex inter-dependent ways, it is no surprise that we are still struggling to comprehend these systems and technologies as running code.
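
A small sketch of the linear address space Berry describes: the program sees memory only as numbered addresses, while the layers beneath (virtual memory, caches, in principle even network storage) are free to locate the bytes anywhere.

    #include <stdio.h>

    int main(void)
    {
        int datum = 42;
        int *where = &datum;  /* a linear address into a flat space */

        /* The pointer value is just a number; the physical location
           behind it (RAM chip, swap file, remote page) is mediated by
           layers the program never sees. */
        printf("value %d lives at address %p\n", datum, (void *)where);
        return 0;
    }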

Reverse remediation
(99) Miwa is a Japanese composer who has been experimenting with a form of music that can be composed through the use of programming metaphors and frameworks.
(100) Rather than modelling within a computer space the various phenomena of the world based on the laws of physics, phenomena that have been verified within a computer space are modelled in the real world, hence the name, reverse-simulation. (Miwa 2003b)
(102) In Matari-sama, each player acts as an individual XOR gate using their left hand with a castanet to signify a binary '1' output and a bell in the right hand to signify a '0'.
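
A hedged sketch of the rule as code: each 'player' holds one bit and acts as an XOR gate, castanet for 1, bell for 0. The chain arrangement below is my own guess for illustration, not a transcription of Miwa's actual score.

    #include <stdio.h>

    #define PLAYERS 4
    #define BEATS   8

    int main(void)
    {
        int state[PLAYERS] = { 1, 0, 1, 1 };  /* each player's held bit */

        for (int beat = 0; beat < BEATS; beat++) {
            int in = 1;  /* seed input to the first player */
            for (int p = 0; p < PLAYERS; p++) {
                state[p] ^= in;   /* the player's XOR operation   */
                in = state[p];    /* output passes down the line  */
                putchar(state[p] ? 'c' : 'b');  /* castanet/bell  */
            }
            putchar('\n');  /* one 'bar' of the piece per beat */
        }
        return 0;
    }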

Compare Miwa reverse-simulation examples of logic gates to symposia simulating virtual reality.

(103) Here, the musicians are acting as if they were running 'autonomous' prescriptive code performing as individual logic gates performing logic operations based on the internal logic operations defined by XOR. . . . But crucially, it should be possible, at least theoretically, to follow through each step of the process unfolding, rather as one would if debugging computer software.
(104) It is interesting to note that the music generated sounds like an idealized version of what one would assume the internals of computer circuitry might be.

Running code and the political
(107) After this close reading of Miwa's work, I now undertake a brief distant reading of running code, through the example of an e-voting system.
(107-108) This analysis has two aspects, recognizing both the need for engagement with the problematic of computer code itself as an object which materializes political action and policy, but also the relation between the technical objects themselves and an increasing visibility in political discourse.
(108) Voting is therefore a particular political activity that due to its explicitly quantitative form is particularly thought to be suited to reform through the application of technical methods.
(109) In some sense it might be argued that this is a form of quasi-citizenship that rests on a procedural notion of politics whereby certain technically mediated processes legitimate 'being political', increasingly realized through certain technical forms, whether voting through e-voting systems, or deliberation and debate through real-time streams such as Twitter and Facebook.
(111) I want to look at the way in which voting is translated into technical representations of the right to vote, and how these are instantiated in computer code by a reading of code that is available as FLOSS software in the Google Code Repository.

Is this the only reference to Campbell-Kelly?

(111) (note 9) Essentially leaving the nation state open to the desire of the manufacturer to encourage software updates in a similar manner to the rest of the software industry (see Campbell-Kelly 2004).
(112) The systems I have chosen to focus on in this chapter are free/libre and open source software (FLOSS) systems (Berry 2008). FLOSS are surrounded by an important form of software practice that is committed to openness and public processes of software development. This means that the groups involved in FLOSS projects typically place all the source-code for the project and documentation online in an easily viewable and accessible form.
(111) (note 10) The kinds of documentation that are useful for programming include: requirements specification, flowcharts and diagrams, formal language specifications (e.g. UML, Z) and test/use cases. There is also documentation within the source code called 'comments' written by the programmers to help others understand the code.

Include issue tracking, news feeds and forums among FLOSS cultural objects besides voting machinery.

(113) We might think here of the relation between the ability to read the structures and processes of the voting system as presented in the FLOSS source code as transparent e-voting as opposed to the dark e-voting which is given in proprietary systems.
(113) VoteBox is described as a tamper-evident, verifiable electronic voting system created by researchers in the Computer Security Lab at Rice University (VoteBox 2009d).
(113) The way in which a voter acts is visualized by means of a process or flow-chart.

Subject position of user in voting machinery; contrast system-centric, idealized voter to user-centric design (Norman, Johnson, Barker).

(114) As we discuss below, this idealized voter constantly seeps into the source code in a number of interesting ways, captured in the shorthand used in the commentary code and documentation. Although here we do not have the space to go into the interesting gender assumptions that are being made, they demonstrate how programming commentary is important to analyze and take into account as part of the assemblage that makes up running software systems.
(115) in the particular discourse of computer programming one notes the key dichotomy created between the programmer and the user, with the user being by definition the less privileged subject position. The term user also carries a certain notion of action, most notably the idea of interactivity, that is that the user 'interacts' with the running software interface in particular circumscribed ways.
(116) A revealing moment by the programmer in this example, demonstrates that a particular gender bias is clearly shown when the programmer refers to the request of the 'voter' to challenge 'his' vote.
(116) Perhaps even more interesting is the inability of the user-voter to cast a spoilt ballot, whether as an empty ballot, or a ballot that has more than one candidate selected.
(117) All of these small technical decisions act together to format the voting practice and provide a given set of processes and digital objects that are associated with them. They thus act to stabilize a particular instantiation of the voting process, rendering it more legitimate and material than other forms; additionally, through the use of prescription, the software in effect disciplines the voter to act in particular ways (e.g. vote only once and select just one candidate) and circumscribes other forms of voting (e.g. spoiling the ballot paper, leaving it blank, throwing it away, taking it home, etc.).
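
A minimal sketch of how such discipline is inscribed in code (hypothetical, not VoteBox's actual source): the validation function simply refuses the blank and over-voted ballots that paper would have permitted.

    #include <stdio.h>

    #define CANDIDATES 3

    /* Returns 1 only for the single legitimated form of the vote:
       exactly one candidate selected. Blank and spoilt ballots,
       always possible on paper, are circumscribed out of existence. */
    static int ballot_is_valid(const int selections[CANDIDATES])
    {
        int count = 0;
        for (int i = 0; i < CANDIDATES; i++)
            count += selections[i] ? 1 : 0;
        return count == 1;
    }

    int main(void)
    {
        int blank[CANDIDATES]    = { 0, 0, 0 };
        int overvote[CANDIDATES] = { 1, 1, 0 };
        int single[CANDIDATES]   = { 0, 1, 0 };

        printf("%d %d %d\n", ballot_is_valid(blank),
               ballot_is_valid(overvote), ballot_is_valid(single));
        /* prints: 0 0 1 */
        return 0;
    }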

Software makes docile voters (users); e-voting and so many other social transactions and networks depend on Latour's immovable mobiles, which are doubly mediated by computer technologies where data seems immaterial but is always instantiated in something.

(117) Secondly, the voter must now rely on the correct inscription of their vote within the material substrates of the computer software and hardware and these represent what Latour (2007) called 'immovable mobiles', that is that the vote remains stabilized throughout its passage from the booth to the data collection systems (the supervisor in this case) and then on to when it is expressly counted. In the case of paper, there is always a paper trail, that is the vote can always be followed through the process by the human eye. In the case of software, the vote is encrypted and signed, such that this digital signature can indicate whether the vote has been changed or tampered with, however, once cast into the digital the only way to follow the vote is through its mediation through other software tools.
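
A toy sketch of the tamper-evidence principle, with a trivial checksum standing in for the real cryptographic signature (VoteBox's actual scheme is far more elaborate): change one byte of the stored vote and the recorded tag no longer matches, but the vote can only be checked, not followed, by machine.

    #include <stdio.h>

    /* Stand-in for a digital signature: a trivial (djb2-style) checksum.
       Real systems use keyed cryptographic signatures; the principle of
       tamper evidence is the same. */
    static unsigned tag(const char *vote)
    {
        unsigned t = 5381;
        while (*vote)
            t = t * 33 + (unsigned char)*vote++;
        return t;
    }

    int main(void)
    {
        char vote[] = "candidate:2";
        unsigned recorded = tag(vote);  /* stored alongside the vote */

        vote[10] = '3';                 /* tampering in transit */
        printf(recorded == tag(vote) ? "intact\n" : "tampered\n");
        return 0;                       /* prints: tampered */
    }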

Would a debugger example, which merges the close and distant readings, have been too tedious for this chapter on running code? Is the book form itself holding back much richer approaches?

(117-118) In this chapter, I have looked at some examples of how to analyze running code, namely through either a form of close (code in action) or distant reading (software in action).
(118) To look into this further and to broaden and deepen the question of code, I now turn explicitly to the questions raised through a phenomenological understanding of the computational, through a discussion of the work of Martin Heidegger, through a phenomenology of computation.


5
Towards a Phenomenology of Computation

Bernard Stiegler's interpretation of Heidegger, Simondon's platform, and Wilfrid Sellars' phenomenology: the materiality of code as it is tied to phenomena, whether prescriptively creating them or being part of them, must be understood in terms not only of its potentialities as a force, but also as a platform, only ever partially withdrawn (unreadiness-to-hand).

(119) I want to explore the idea that technology is actually only ever partially forgotten or 'withdrawn', forcing us into a rather strange experience of reliance, but never complete finesse or virtuosity with the technology.
(119-120) I want to develop the argument that we should not underestimate the ability of technology to act not only as a force, but also as a 'platform'. This is the way in which the loose couplings of technologies can be combined, or made concrete (Simondon 1980), such that the technologies, or constellation of technologies act as an environment that we hardly pause to think about. . . . But should we lose the phone then we have lost not just the list of numbers, but also the practiced habits of how we used to find information about our friends. . . . Further, when leaving Facebook due to the closed nature of the technology it is very difficult to extract your contacts, in effect meaning that Facebook attempts to hold onto your friends in order to hold onto you. Code is therefore used as a prescriptive technology.

Phenomenological exploration of experience of digital technology.

(121) To look at the specific instance of computation devices, namely software-enabled technologies, I want to make a particular philosophical exploration of the way in which we experience digital technology. This is a method called phenomenology, and as such is an approach that keeps in mind both the whole and the parts, and that is continually reminding us of the importance of social contexts and references (i.e. the referential totality or the combined meaning of things). . . . So here we need to explore the way in which we can both know our way around technologies, but also the way in which technologies can shape what it is possible for us to know in the first place.

Key part of the book on thinking computationally by delegating classes of cognitive duties to technical devices: knowing-how versus knowing-that; connect Zuboff to Applen and McDaniel's rhetorical XML; study materiality concretized in instrumentation, too; it is not completely hidden, and in fact must reveal its interface to be usable. Computational comportment is different from pancomputationalism; the key concern is the substitution of knowing-that for Dasein, rather than the potential for immortality or finding the algorithm behind everything (like freakonomics?).

(121-123) In this chapter, then, I want to understand in the broadest possible sense how to know one's way around computationally with respect to things in the world. . . . In this case, one must have an embodied set of practices that frame and make available necessary knowing-that, towards which one is able to computationally know one's way around. . . . Technical devices are delegated performative and normative capabilities which they prescribe back onto humans and non-humans. That is, a person lives in the midst of technical beings that have specific forms of agency, or as Zuboff (1988) states, 'technology . . . is not mute. . . . Information technology not only produces action, but also produces a voice that symbolically renders events, objects, and processes so that they become visible, knowable, and shareable in a new way' (Zuboff 1988: 9). For example, knowing-how to browse the world wide web, or knowing-how to use a satellite navigation system in a car calls for the user to think computationally in order to transform the inner state of the device such that it performs the function that is required of it. . . . But to fully interact with technical devices running code one is further encouraged to have some technical knowledge and an understanding of the collection of electronic resources that stand behind them. . . . So, when one views the world computationally, one is already comported towards the world in a way that assumes it has already been mapped, classified, digitized.
(123) The exemplar is perhaps the augmentation technologies that attempt to re-present reality back to the user via a picture of the world which is automatically overlaid with the result of computational geodata, tags and other content.

Heidegger's circumspection mixed with symbolically sophisticated non-human actors yields unreadiness-to-hand phenomena, making evident how Berry casts the danger inherent in technology.

(124-126) In the case we are discussing here, the contemporary milieu is suffused with technical devices with which we have to develop a familiarity if we are to be at home in the world. . . . Computation has moved from a small range of activities to a qualitative shift in the way in which we engage with knowledge and the world which highlights how important an understanding of the computational is today. . . . Of course, we have always used devices, mechanical or otherwise, to manage our existence, however, within the realm of digital computational devices we increasingly find symbolically sophisticated actors that are non-human. . . . This [example of Google instant search] demonstrates the very lack of withdrawal or semi-withdrawal of computational devices . . . this is the phenomena of 'unreadiness-to-hand' which forces us to re-focus on the equipment, because it frustrates any activity temporarily (Blattner 2006:58), that is that the situation requires deliberate attention.
(127) The critical question throughout is whether 'computation', a concept seemingly proper to knowing-that, has been projected onto knowing-how/Dasein and therein collapses the distinction between knowing-how and knowing-that, hence inducing the substitution of knowing-that for Dasein.

Phenomenology and computation

Hegemony of computational understanding, decentered fragmentary subjectivity unified by devices; suggests Clark, Hayles, especially Turkle Alone Together.

(128-129) I want to suggest that what is happening in the 'digital age' is that we increasingly find a computational dimension inserted into the 'given'. Or better, that the ontology of the computational is increasingly hegemonic in forming the background presupposition for our understanding the world. . . . Life experiences, then, become processual chains that are recorded and logged through streams of information stored in databanks. Experience is further linked to this through a minimal, decentered and fragmentary subjectivity which is unified through the cognitive support provided by computational devices which reconcile a 'complete' human being. . . . We might say that these devices call to us to have a particular computationally structured relationship with them. . . . In sum, computer scientists attempt to transform the present-at-hand into the ready-to-hand through the application of computation.

Time-sharing operating system example of transformation of present-at-hand into ready-to-hand; television and Atari VCS do similar work.

(129-130) What Sellars is trying to draw our attention towards is the contradiction within the two images, whereby the manifest image presents a world of flow, continuous and entangled experiences, and the scientific image postulates a world of discrete elements, particles and objects. . . . In effect, computation aims to perform this task by fooling our senses, assembling the present-at-hand objects together at a speed that exceeds our ability to perceive the disjunctures.

The computational image

Add the computational to Sellars' scientific and manifest images; per Harman, is materiality implied if devices do not fully withdraw?

(131-132) Where Heidegger contrasts universe and world, and for Sellars this indicates the scientific and the manifest image, here I want to think through the possibility of a third image, that of the closed 'world' of the computer, the computational image. . . . Here, I want to connect the computational image to Heidegger's notion of equipment, but crucially, I want to argue that what is exceptional about the computational device is that unlike other equipment which is experienced as ready-to-hand, computational devices do not withdraw, rather they are experienced as radically unready-to-hand.

Proposal to develop Lyotard's distracted consciousness stream and Deleuze and Guattari's schizophrenic into a computational way-of-being, perhaps as Turing super-cognition and Clark extended cognition, enabling exteriorization of cognition and reflexivity, while at the same time being careful to avoid screen essentialism.

(133-135) This unreadiness-to-hand, Heidegger argues, is a kind of partial present-at-hand (i.e. scientific image) which forces dasein to stop coping and instead sense a contextual slowing-down which Heidegger calls conspicuousness. . . . This places dasein in a relationship of towards-which that maximizes the experience of conspicuousness perceived as a constant series of pauses, breaks, and interruption. . . . Both conspicuousness and obtrusiveness, I want to argue, create a fragmentary and distracted flow of consciousness which, following Lyotard (1999: 5), I want to call a 'stream' and Deleuze and Guattari (2003) call the schizophrenic. . . . Of course this is the other side of the coin in that in the historical specificity of the computational way-of-being is offered the revolutionary potential of this recurring experience of infrastructural emancipation in a distributed notion of cognitive support through social-technical devices.
(135-136) For a computational device any withdrawal is partial, as it requires constant attention to keep it functioning and 'right' for the task it is to assist with, that is, a computational device remains in a state of conspicuousness. . . . For Stiegler, it is the ability to place memory outside the body in some material form that gave rise to the possibility of reflexive thought and is a key aspect of how we came to be human in the first place. Here, I want to make the connection with the way in which modern computational technology is enabling the exteriorization of cognition and reflexivity itself.
(136-137) Within the domain of the computational processes, in the interstices between the manifest image and the digital representation is the possibility for the monitoring of and if necessary the realignment of the commands of the user. . . . With software, however, the incorporeal transformation requested may not have been carried out, but the user is convinced by the software display that it has done so. This is a vicarious relationship, that is a relationship whereby following the command (order-words), the user transacts with the code to execute the action. . . . This again highlights the importance of avoiding a screen essentialism if we are to open the black box of computational devices.

Vicarious transformations

Heim on vicarious causation relates to Turkle on surface enjoyment, what Berry calls screen essentialism, due to double articulation, and Hayles investigates through Oreo models of PET scan: these phenomena harbor Bogost units containing universes, and double articulation entails materiality of interface and substrate as well as the properly immaterial virtual reality of representational space within the device.

(138-139) In other words, there is no direct contact between our phenomenal reality and that represented within the computational device except through the interfaces, computer code, and input devices that mediate it, such as a mouse and a windowing system. As Heim (1987) explains: The writer has no choice but to remain on the surface of the system underpinning the symbols. . . . Digital entities can then be said to have a double articulation in that they are represented both spatially within our material universe, but also with the representational space created within the computational device – a digital universe. The computational device is, in some senses, a container of a universe (as a digital space) which is itself a container for the basic primordial structures which allow further complexification and abstraction towards a notion of world presented to the user.

The loose coupling of network layers is another good example of loosely independent connections, thought in terms of Heidegger's Gelassenheit, with abstraction taking place all the way down.

(140) The computational device is an unstable form of equipment that must continually gather and reinforce its equipmental qualities against a hostile world of breakdown. This is then repeated through numerous layers of software that serve to create inner unstable universes within which further abstraction takes place, all the way down. Crucially though, the agency of each universe is loosely independent and defined at its creation in computer code by a series of constraints which serve as a framework within which the new abstract layer must function. Each layer promises uncertain affordances to the latter, eventually culminating in the partial affordance offered to the user through a risky encounter with a vicarious transformation which here I argue is radically unreadiness-to-hand.
(140) This loose coupling of the user and the computational technical device offers possibilities that may be thought of in terms of Heidegger's notion of Gelassenheit. For Heidegger, Gelassenheit is a particular type of relationship with technical devices that is a letting go, 'serenity, composure, release, a state of relaxation, in brief, a disposition that “lets be”' (Arendt 1971). . . . This points towards the possibility of a relationship with technology that is not built of the will to power, by virtue of the impossibility of control in a system that exceeds the comprehension of a human subject; this will be explored in the next chapter.
(141) In some senses then we might start to speculate on the nature of the computational image as a form of cultural analog-digital/digital-analog converter that translates entities between the manifest and scientific images but does so in an uneven and fragmentary way.

The switching costs of the unready-to-hand technological comportment of running software affect contemporary subjectivity.

(141) By way of conclusion, I suggest that by thinking about computationality, in particular code and software, as unready-to-hand, helps us to understand the specific experience of our increasingly code saturated environment. Linked to this is the notion of a distributed form of cognition (we might think of this as a database of code enabled cognitive support), which we can draw on, like Google Instant, but which remains unready-to-hand. That is, that it causes us to suffer switching costs, which, even if imperceptibly, change our state of being in the world.


6
Real-Time Streams

Riparian habitus of real-time streams for a new notion of the subject; watching at multiple levels is one component of digital literacy (ever just watch tcpdump?), perhaps constituting narratives; makes sense that the next type of philosophical production is informed by the technological imaginary (Zizek).

(142-145) This has traditionally been a rather static affair, however, there is evidence that we are beginning to see a change in the way in which we use the web, and also how the web uses us. This is known as the growth of the so-called 'real-time web' and represents the introduction of a technical system that operates in real-time in terms of multiple sources of data fed through millions of data streams into computers, mobiles, and technical devices more generally. . . . In essence, the change represents a move from a notion of information retrieval, where a user would attend to a particular machine to extract data as and when it was required, to an ecology of data streams that forms an intensive information-rich computational environment. . . . The real-time stream is not just an empirical object; it also serves as a technological imaginary, and as such points the direction of travel for new computational devices and experiences. . . . The new streams constitute a new kind of public, one that is ephemeral and constantly changing, but which modulates and represents a kind of reflexive aggregate of what we might think of as a stream-based publicness – which we might call riparian-publicity. Here, I use riparian to refer to the act of watching the flow of the stream go by. . . . To be a member of the riparian public one must develop the ability to recognize patterns, to discern narratives, and to aggregate the data flows. Or to use cognitive support technologies and software to do so. . . . In a sense, one could think of the real-time streams as distributed narratives which, although fragmentary, are running across and through multiple media, in a similar way to that Salman Rushdie evocatively described in Haroun and the Sea of Stories. . . . These technologies may provide a riparian habitus for the kinds of subjectivity that thrives within a fast moving data-centric environment, and through a process of concretization shape the possibility of thought and action available.

Materiality of code is inscribed in programmers through long habituation, internalized to the point of dreaming (Rosenberg), whereas the materiality of software is inscribed in users, for example multitasking synaptogenesis.

Building computational subject as stream from Lyotard fables, Massumi affective fact, software avidities, Husserlian comet, processing multiple streams at once (Aquinas).

Restructuring a post-human subjectivity riding on top of a network of computationally-based technical devices is the key point of the book: is it a phenomenological result?

(145-149) The question now arises as to the form of subjectivity that is both postulated and in a sense required for the computational subject. . . . To do this, I want to look at the work of Jean-Francois Lyotard, a French philosopher and literary theorist, especially his ideas expressed in Postmodern Fables. . . . using the fable as an exploratory approach. . . . This concept of the stream as a new form of computational subjectivity also represents a radical departure from the individualized calculative rationality of homo economicus and tends rather toward the manipulation of what Brian Massumi calls 'affective fact', that is through an attempt to mobilize and distribute the body's capacity to think, feel and understand (either through a self-disciplinary or institutional form). . . . A link is formed between affective and empirical facts that facilitates and mobilizes the body as part of the processes of a datascape or mechanism directed towards computational processes as software avidities, for example, complex risk computation for financial trading, or eBay auctions that structure desire. . . . This notion of computationally supported subject was developed in the notion of the 'life-stream' [by Freeman and Gelernter]. . . . This is a life reminiscent of the Husserlian 'comet', that is strongly coupled to technology which facilitates the possibility of stream-like subjectivity in the first place. . . . This is the restructuring of a post-human subjectivity that rides on top of a network of computationally-based technical devices. This notion of a restructured subjectivity is nicely captured by Lucas (2010) when he describes the experience of dreaming about programming. . . . This is the logic of computer code, where thinking in terms of computational processes, as processual streams, is the everyday experience of the programmer, and concordantly, is inscribed on the programmer's mind and body. The particular logic of multiple media interfaces can also produce a highly stimulated experience for the user, requiring constant interaction and multi-tasking. . . . [quoting Richtel 2010] Going back a half-century, tests had shown that the brain could barely process two streams, and could not simultaneously make decisions about them. . . . This can change the very act of writing itself, as Heim writes.

Being a good stream

Lyotard's focus on speed-up and the technical time of the computer; examples of Nietzsche and Kittler.

(150-151) In Postmodern Fables he is expressly interested in technology's ability to speed up the exchange of information to such an extent that critical thought itself might become suppressed under the quantity of information. . . . This improvement in the 'efficiency' of the individual recalls Marx's distinction between absolute and relative surplus value and the importance to capitalism of improvements in both organizational structure and technological improvements to maximizing profit (Marx 2004: 429-38). . . . It is the reduction in total time between the inputs and outputs of a process that Lyotard is drawing attention to as, following Marx, 'moments are the elements of profit' (Marx 2004: 352). In the computational, the moments are not measured in working days or hours, but rather in the 'technical time' of the computer, in milliseconds or microseconds.
(150) (endnote 1) Nietzsche was the first German professor of philology to use a typewriter; Kittler is the first German professor of literature to teach computer programming (Kittler 1999: XXXI).

Materializing the stream

No computation without inscription, and some material apparatus, even if ultimately part of universal computer; compare Kittler Aufschreibesystem to Sterne transducer.

(151-152) To be computable, the stream must be inscribed, written down, or recorded, and then it can be endlessly recombined, disseminated, processed and computed. . . . The consistencies of the computational stream are supported by aggregating systems for storing data and modes of knowledge, including material apparatuses of a technical, scientific and aesthetic nature (Guattari 1996: 116).

Role of technical objects in preference formation goes beyond mediating influence to structural foundation and efficient cause: Doel excess, Latour plasma, Kittler time axis manipulation.

(152) Here, I am not thinking of the way in which material infrastructures directly condition or direct collective subjectivity, rather, the components essential 'for a given set-up to take consistency in space and time' (Guattari 1996: 117). We might think about how the notion of self-interest is materialized through technical devices that construct this 'self-interest', for example, through the inscription of accounting notions of profit and loss, assets and liabilities, which of course increasingly take place either through computer code which is prescribed back upon us.
(152) In other words, computational data is artifactualized and stored within a material symbolization. . . . We have the assemblage of a network which builds the material components into an alliance of actors and which is a referential totality for the meaning that is carried over it, and past its borders, policed by human and non-human actors, we have what Doel (2009) calls excess and Latour (2005) calls plasma.
(152-153) This [Kittler time axis manipulation] is the storing of time as space, and allows the linear flow to be recorded and then reordered.
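
A small sketch, my own, of time axis manipulation: once a temporal flow is stored as spatial positions in memory, the linear flow can be re-read in any order; here it is simply played backwards.

    #include <stdio.h>

    #define SAMPLES 5

    int main(void)
    {
        /* The flow of time recorded as space: each instant
           becomes an array index. */
        const char *events[SAMPLES] = { "do", "re", "mi", "fa", "sol" };

        /* The stored time axis re-read in reverse order. */
        for (int t = SAMPLES - 1; t >= 0; t--)
            printf("%s ", events[t]);
        putchar('\n');  /* prints: sol fa mi re do */
        return 0;
    }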

Challenges to liberal humanist individual, noting Heidegger authentic time versus time of the computational stream; bounded rationality replaced by extended cognition, thus appeal to visual rhetoric to comprehend big data (Manovich).

(153) This notion of the computational dataspace is explicitly linked to the construction of the stream-like subject and raises many important questions and challenges to the liberal humanist model of the individual. Most notably in their bounded rationality – here the information and processing to understanding is off-loaded to the machine – but also in the very idea of a central core of human individuality. It also returns us to the question of digital bildung and how we structure the kind of education necessary in a computationally real-time world.
(154) Heidegger considered 'authentic' time to be time in relation to death, as finitude and mortality. In the time of the computational stream, however, time is found in the inauthentic time of measurement, the attempt to determine the 'undetermined' through technical devices.
(154-155) These streams are undoubtedly creating huge storage issues for the companies that will later seek to mine this collection of streamed data. . . . clearly with the amount of data available the skills of a visual rhetoric will become increasingly important to render the patterns in the data meaningful.
(155) Users treat their lives as one would a market portfolio, constantly editing the contents through buying and selling, creating new narratives through the inclusion or exclusion of certain types of product or data stream.

Financial streams
(156-157) Financialization has implicit within it, certain ways of acting, certain ways of being and certain ways of seeing that are connected to a particular comportment of the world, one that is highly attenuated to notions of leverage, profit and loss and so forth. . . . Here, I follow the work in the sociology of markets to understand financialization as the uneven process of formation of a socio-technical network that is used to stabilize a certain kind of calculative cognitive-support, that mediates the self and the world through financial practices, categories, standards and tests (Callon 1998). More importantly, I want to link the processes of financialization to the creation of rapidly changing data streams of financial information.

Callon socio-technical network stabilizing financial subjectivity using Deleuze agencements (see Hayles on high frequency trading), focusing attention with an extended mind, cyborg subjectivity composite Latour plug-ins.

(157) A financialized assemblage is connected together through the use of equipment or financial computational devices (what Deleuze would call agencements) whose aim is to maintain an anticipatory readiness about the world and an attenuated perception towards risk and reward which is mediated through technical affective interfaces (i.e. the computer user interface). . . . But none of these practices of intensification could have been possible without information technology, which acts as a means of propagation but also a means of structuring perception – or better, of 'focusing' attention in the sense of an extended mind.
(158) Rather, you can obtain a complete human being by composing it out of composite assemblages which is a provisional achievement, through the use of computer cognitive support (what Latour (2005) neatly calls 'plug-ins') and we might think of as software interfaces or technical devices.
(158) Wherever the investor is, they are able to call up the portfolio and judge their asset worth as defined by the external forces of the financial markets but crucially simplified and visualized through the graphical capabilities of the mobile device.
(159) In the case of financial markets, software has completely changed the nature of stock and commodity markets creating 24 hour market trading and enabling the creation of complex derivative products and services, often beyond the understanding of the traders themselves.

Guattari processual, device-dependent subjectivity; possible alternative computational theory of mind inclusive of Derridian archive checking against key arguments about delegating classes of cognitive duties to technical devices; involvement of code situated materially significant even if its goal is to strive to erase spatiality and temporality as in financial systems.

(160) Software that acts in this cognitive support capacity can therefore be said to become a condition of possibility for a device-dependent, co-constructed subjectivity. . . . Following Lyotard, we might declare that the subject becomes a computational 'stream', in this case a stream attenuated to the risk associated with finance capital mediated through financial software.

Compare Gaussian risk to Zizek chocolate laxative; also latent risk in software bugs, sloppy integration, poor object modeling.

(161) Risk itself becomes mediated through software and becomes a processual output of normative values which are themselves the result of computational processes usually hidden within the lines of computer code. . . . Indeed, it is this misunderstanding of risk that Taleb (2007) blames for the huge leveraged asset bubble in 2007-2009 and the resultant financial crisis.
(161-162) Here, we can think of the external management of the internal perception of time that is linked to a form of Heideggerian angst towards a future event – sickness, old age, and so forth – which provides a new affective fuel source for capitalism.
(162) This life stream is therefore a performative subjectivity highly attenuated to interactivity and affective response to an environment that is highly mediatized and deeply inscribed by computational datascapes.

Consider other manifestations of computationalism such as through deliberate exercises for intuiting machine embodiment.

(162) From the material experience of the financialized user of code, both trader and consumer, to the reading and writing of code, and then finally to the execution and experience of code as it runs on financial trading systems, we need to bring to the fore how code is a condition of possibility for a computational stream whether of financial news and data, or of a datastream cognitive support for everyday life.

Lifestreams

Lifestreams are Kitchin and Dodge capta trails, and can be studied phenomenologically for their impact on everyday life.

(162-164) I now want to look at the practice of creating lifestreams, particularly through the example of Twitter. . . . Particularly as politicians and the media have caught on to the unique possibilities generated by this rapid communicational medium. Designated as solipsistic and dismissed at first by the pundits, the growth in Twitter's use has meant that it can no longer be ignored and indeed it has become a key part of any communication strategy for politics, corporations and the media more generally. . . . There is an increasing need for a cartography of both the production and empirical content of a number of these collaborative, streamed institutions and their recording of political events, power and interests.
(164) They also have the capacity to create a form of social contagion effect whereby ideas, media and concepts can move across these networks extremely quickly. . . . As such, real-time streams presents an excellent opportunity for tracing the impact of computational real-time devices in everyday life and the way in which they capture the informal representations of issues with which contemporary communities are becoming increasingly concerned.
(164) The network effects combined with the vast amount of information flowing through the network are difficult for the user to understand.

Huge distributed machine-cognized memory of lifestreams; think of how machine subjectivity arose from avatars in the science fiction series Caprica.

(165) But it is the technology that makes up Twitter that is surprising: a simple light-weight protocol that enables the fast flow of short messages. . . . Increasingly, we are also seeing the emergence of new types of 'geo' stream, such as location, which give information about where the user is in terms of GPS coordinates, together with mixed media streams that include a variety of media forms such as photos, videos and music. . . . But this is not just a communications channel, it is also a distributed memory system, storing huge quantities of information on individuals, organizations and objects more generally.
(166) These streams are fascinating on a number of different levels, for example questions remain over the way in which national identity might be mediated through these computational forms in terms of an imagined community composed of twitter streams that aggregates institutions, people and even places.
(167) In the final section, I want to shift focus and consider the wider implications of thinking-streams, computer code and software.

Subterranean streams

Default ontological insecurity, inability to distinguish knowing-how and knowing-that; relate to Turkle robotic moment of being alone together.

(167) This could lead to a situation in which the user is unable to perceive the distinction between 'knowing-how' and 'knowing-that', relying on the mediation of complexity and rapidity of real-time streams through technology. Heidegger would presumably describe this as dasein no longer being able to make its own being an issue for itself. . . . Indeed, if these computational devices are the adhesives which fix the postmodern self into a patterned flow of consciousness (or even merely visualized data), an ontological insecurity might be the default state of the subject when confronted with a society (or association) in which unreadiness-to-hand is the norm for our being-in-the-world.

Heideggerian danger includes reclassifying entities from persons to objects, so seeking to promote gathering; thus our mission as philosophers of computing is digital Bildung (self-cultivation), fostering super-critical, versus sub-critical and acritical subject positions.

(167-169) To return to the question from Sellars and reframe it: it still remains difficult to reconcile the homogeneity of the manifest image with the non-homogeneity of the scientific one, but we have to additionally address the unreadiness-to-hand of the computational image which offers the possibility of partial reconciliation through uncertain affordances. . . . This would represent the final act in a historical process of reclassification of entities from persons to objects – potentially, dasein becoming an entity amongst entities, a stream amongst streams – with challenging political and cultural implications for our ability to trace the boundary between the human and non-human. . . . This is where the importance of digital Bildung becomes crucial, as a means of ensuring the continued capability of dasein to use intellect to examine, theorize, criticize and imagine. . . . Instead, we should be paying attention to how computation can act as a gathering to promote generative modes of thinking, both individually and collectively, through super-critical modes of thinking created through practices taught and developed through this notion of digital Bildung.
(169) Understanding software is a key cultural requirement in a world that is pervaded by technology, and as Vico argued, as something made by humans, software is something that can and should be understood by humans.

The final thought is a Serres-inspired parasite subjectivity in symbiotic relationship to the enormous machinery generating the digital standing reserve, although I see a flaw in this image: passing from an underground cavity to surface waters involves reduction, filtering through porous solid material like sand, and so losing the coherence of a human navigating cyberspace as on a surfboard or in an automobile. It is an uncanny coincidence that the book ends with Lyotard's reference to subterranean streams when I am contemplating life along the Santa Fe river to build on Ulmer's Florida School precedent. However, this choral reading elides other significant parts of this scholarship, such as super-critical cognition involving Clark, stretching to theorists outside Berry's galaxy of meaning.

Berry promotes an ethic of being a good stream, whose Serresian parasitic subjectivity, although requiring thoughtful comportment toward computer technologies, does not endorse learning and practicing programming outright as itself critically important to being a good stream, the way literacy was for prior generations, or as a substantial component of the intellectual labor of humanities scholarship.

(170-171) In the spirit of Lyotard's expression of an aesthetics of disruption, however, I want to end the book with an elusive ought. This is an ought that is informed by a reading of Aesop's Tales through Michel Serres and his notion of the parasite (Serres 2007). The parasite is used not as a moral category, but in connection with an actor's strategic activities to understand and manipulate the properties of a network. . . . The question of who this subject 'eats next to', is perhaps reflected in the way in which streams pass through other streams, consumed and consuming, but also in the recorded moments and experiences of subjects who remediate their everyday lives. This computational circulation, mediated through real-time streams, offers speculative possibilities for exploring what we might call parasitic subjectivity. Within corporations, huge memory banks are now stockpiling these lives in digital bits, and computationally aggregating, transforming and circulating streams of data – literally generating the standing reserve of the digital age. Lyotard's (1999: 5) comment to the streams that flow through our postmodern cultural economies seems as untimely as ever: 'true streams are subterranean, they stream slowly beneath the ground, they make headwaters and springs. You can't know where they'll surface. And their speed is unknown. I would like to be an underground cavity full of black, cold, and still water'.


Berry, David M. The Philosophy of Software: Code and Mediation in the Digital Age. Basingstoke, Hampshire: Palgrave Macmillan, 2011. Print.