Notes for Alexander Galloway Protocol: How Control Exists after Decentralization

Key concepts: abstract, anti-federalism through universalism, auctionism, biopower, diagram, distributed network, dividuated, hacker, hyperlinguistic, institutional ecologies, layering, moment of disconnectivity, information organisms, net.art, network, ontology standards, open source, OSI Reference Model, packet-switching, periodization, political technologies, protocol, Request For Comments, tactical media, tiger team, virtual.


Related theorists: Baran, Barthes, Bazin, Bergson, Ted Byfield, Frederick Cohen, Manuel DeLanda, Deleuze, Derrida, Enzensberger, Foucault, Marina Grzinic, Hardt, Hayles, Jameson, Kittler, Lessig, Levy, Lovink, Marx, Negri, Sadie Plant, Jon Postel, Stallman, Bruce Sterling, Sandy Stone, Turkle, Wiener.

Foreword: Protocol Is as Protocol Does
Eugene Thacker

(xii) Recent discussions of the post-industrial society, the information society, the network society, disciplinary society, control society, informatization, scale-free networks, small worlds, and smart mobs are all ways of attempting to understand how social change is indissociable from technological development (research, design, use, distribution, marketing, naturalization, consumption)--though not determined by it. This last point is crucial. If one is to foster an understanding and awareness of how the social and the political are not external to technology, then it is important to understand how the technological is in some sense isomorphic to the social and the political.

Isomorphism of technological and social/political; therefore material understanding of technology.

(xii) Throughout the discussions on power, control, and decentralization, Protocol consistently makes a case for a material understanding of technology. . . . In short, the technical specs matter, ontologically and politically. . . . Code is a set of procedures, actions, and practices, designed in particular ways to achieve particular ends in particular contexts. Code = praxis.

Practical technical understanding for experimentation rather than explanation helps ground critical programming studies for Applen and McDaniel theorist-practitioner approach.

(xiii) Protocol puts forth an invitation, a challenge to us: You have not sufficiently understood power relationships in the control society unless you have understood “how it works” and “who it works for.” Protocol suggests that it is not only worthwhile, but also necessary to have a technical as well as theoretical understanding of any given technology. “Reading” code is thus more programming or development or debugging than explanation. In this sense, Protocol aims less to explain the society of control than to experiment with it; in fact, it might just as well be subtitled “experiments with code.”

Networks Are Real but Abstract

Code always process-based.

(xiii) The first point is that networks are not metaphors. . . . Networks are not tropes for notions of “interconnection.” They are material technologies, sites of variable practices, actions, and movements. . . . A code, in the sense that Protocol defines it, is process-based: It is parsed, compiled, procedural or object-oriented, and defined by ontology standards.

Abstract but material in sense of Bergson virtual; networks are not metaphors but material materializing media.

(xiv) When the book suggests that networks are not metaphors (or not merely metaphors), the dichotomy is not one between material and immaterial, but rather between two types of “abstract.” . . . An abstract that is real is a potential. (Henri Bergson uses the term “virtual” for the immanent unfolding of duration as potentiality.) . . . Rather, this abstract-but-real is the network that is always enacted and always about to enact itself. . . . The network as real-but-abstract may involve “information” as an immaterial entity, but that information always works toward real effects and transformations, no matter how localized.
(xiv) Thus, in an important way, networks are not metaphors.

Answers question how does the Internet work with analysis of TCP/IP and DNS.

(xv) Understanding networks not as metaphors, but as materialized and materializing media, is an important step toward diversifying and complexifying our understanding of power relationships in control societies. . . . This is what Protocol does. It asks how a particular type of network functions—the information networks that undergird the Internet. It shows how a network is not simply a free-for-all of information “out there,” nor is it a dystopia of databanks owned by corporations. It is a set of technical procedures for defining, managing, modulating, and distributing information throughout a flexible yet robust delivery infrastructure. . . . It is constituted by a bi-level logic that Protocol explains. . . . Understanding these two dynamics in the Internet means understanding the essential ambivalence in the way that power functions in control societies. . . . To grasp “protocol” is to grasp the technical and the political dynamics of TCP/IP and DNS at the same time.

Political character of protocol displayed in moment of disconnectivity as others emphasize glitches, blips and breakdowns.

(xvi) The moment of disconnectivity is the moment when protocol most forcefully displays its political character.
(xvi) Again, the mere technical details, such as RFCs, suddenly become the grounds for contesting the way in which control takes shape in the materiality of networks.

Protocol, or Political Economy
(xviii) It argues for a methodological shift from a generalized understanding of networks to a specified one, in which the protocological systems of TCP/IP and DNS operate as what Foucault termed “political technologies.”

Foucault political technologies and Deleuze diagram.

(xviii) Protocol considers networks through a “diagram,” a term borrowed from Gilles Deleuze. Protocol considers first a network as a set of nodes and edges, dots and lines.
(xix) If we are indeed living in a post-industrial, postmodern, postdemocratic society, how does one account for political agency in situations in which agency appears to be either caught in networks of power or distributed across multiple agencies?
(xix) By looking closely and carefully at the technical specifications of TCP/IP and DNS, Protocol suggests that power relations are in the process of being transformed in a way that is resonant with the flexibility and constraints of information technology.

Isomorphic Biopolitics

Doubly materialist as networked bodies and conditions of experience.

(xix-xx) It is in this sense that Protocol is doubly materialist—in the sense of networked bodies inscribed by informatics, and in the sense of this bio-informatic network producing the conditions of experience.
(xx) From the perspective of protocol, there are no biologies, no technologies, only the possible interactions between “vital forms” which often take on a regulatory, managerial, and normative shape.
(xxi) Layering is a central concept of the regulation of information transfer in the Internet protocols.

Use of ontology standards outside philosophy as foundation of portability and layering in networks materialized in protocols.

(xxi-xxii) Ontology standards is a strange name for agreed-upon code conventions, but in some circles it is regularly used to signify just that. Newer, more flexible markup languages such as XML have made it possible for researchers (be they biologists or engineers) to come up with a coding schema tailored to their discipline. . . . If layering is dependent upon portability, then portability is in turn enabled by the existence of ontology standards.
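
A minimal Python sketch of the idea, with an invented biology vocabulary (the tag names are hypothetical, not an actual standard): an ontology standard is just an agreed-upon tag set, and any program that knows the agreement can parse the data, which is what makes it portable across disciplines and layers.

    import xml.etree.ElementTree as ET

    # A hypothetical discipline-specific schema: biologists agree that
    # <gene> elements carry a "symbol" attribute and an <organism> child.
    document = """
    <genes>
      <gene symbol="TP53"><organism>Homo sapiens</organism></gene>
      <gene symbol="BRCA1"><organism>Homo sapiens</organism></gene>
    </genes>
    """

    # Any tool that knows the agreed-upon tag set can read the data;
    # the agreement, not the tool, is what carries the "ontology."
    root = ET.fromstring(document)
    for gene in root.iter("gene"):
        print(gene.get("symbol"), gene.find("organism").text)
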


Preface
(xxiii-xxiv) Like film was to Andre Bazin or fashion was to Roland Barthes, I consider computers to be fundamentally a textual medium. The reason is obvious: computers are based on a technological language called code. This underlying code provides the textual link between computers and critical theory.

Code is textual link between computers and critical theory; computer languages need to be considered more like natural languages.

(xxiv) It is my position that the largest oversight in contemporary literary studies is the inability to place computer languages on par with natural languages—something I address in chapter 5 below on hacking.


Acknowledgments
(xxv) This work follows directly from my doctoral thesis. I would like to thank Fredric Jameson, Michael Hardt, Jane Gaines, and Lawrence Grossberg who sat on my dissertation committee and provided invaluable support along the way.


I
How Control Exists After Decentralization
Introduction

Introduction epigraph from Foucault by Deleuze: Every society has its diagram(s).

(3) I am particularly inspired by five pages from Gilles Deleuze, “Postscript on Control Societies,” which begin to define a chronological period after the modern age that is founded neither on the central control of the sovereign nor on the decentralized control of the prison or the factory. My book aims to flesh out the specificity of this third historical wave by focusing on the controlling computer technologies native to it.
(4) Just as Marx rooted his economic theory in a strict analysis of the factory's productive machinery, Deleuze heralds the coming productive power of computers to explain the sociopolitical logics of our own age.

Distributed agency of Baran packet-switching network in surrounding equipment; the packets themselves do not find their own ways.

(5) [Paul] Baran's network was based on a technology called packet-switching that allows messages to break themselves apart into small fragments. Each fragment, or packet, is able to find its own way to its destination. Once there, the packets reassemble to create the original message.
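
A minimal sketch of fragmentation and reassembly in Python (not Baran's actual mechanism; and per the note above, the routing intelligence lives in the surrounding equipment, not in the packets themselves): sequence numbers let the message be rebuilt even when packets arrive out of order.

    def fragment(message, size=8):
        # Break the message into numbered packets of at most `size` bytes.
        return [(seq, message[i:i + size])
                for seq, i in enumerate(range(0, len(message), size))]

    def reassemble(packets):
        # Packets may arrive in any order; sequence numbers restore it.
        return b"".join(data for _, data in sorted(packets))

    packets = fragment(b"How control exists after decentralization")
    packets.reverse()  # simulate out-of-order arrival
    assert reassemble(packets) == b"How control exists after decentralization"
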
(6) In the early 1980s, the suite of protocols known as TCP/IP (Transmission Control Protocol/Internet Protocol) was also developed and included with most UNIX servers.
(6) At the core of networked computing is the concept of protocol. A computer protocol is a set of recommendations and rules that outline specific technical standards. The protocols that govern much of the Internet are contained in what are called RFC (Request For Comments) documents.

Protocols are the core of networked computing, governed by organizations like the IETF and W3C that freely publish them as RFCs and other document types; compare to self-defined XSD versus DTD, noting the reterritorialization of consideration and sense by logic and physics, and the original diplomatic usage of a fly-leaf glued to the beginning of a document.

(6) The RFCs are published by the Internet Engineering Task Force (IETF). They are freely available and used predominantly by engineers who wish to build hardware or software that meets common specifications. . . . Other protocols are developed and maintained by other organizations. For example, many of the protocols used on the World Wide Web (a network within the Internet) are governed by the World Wide Web Consortium (W3C).
(7) Etymologically it refers to a fly-leaf glued to the beginning of a document, but in familiar usage the word came to mean any introductory paper summarizing the key points of a diplomatic agreement or treaty.
(7) What was once a question of consideration and sense is now a question of logic and physics.
(7-8) These regulations always operate at the level of coding—they encode packets of information so they may be transported; they code documents so they may be effectively parsed; they code communication so local devices may effectively communicate with foreign devices. . . . Viewed as a whole, protocol is a distributed management system that allows control to exist within a heterogeneous material milieu.
(8) I argue in this book that protocol is how technological control exists after decentralization. The “after” in my title refers to both the historical moment after decentralization has come into existence, but also—and more important—the historical phase after decentralization, that is, after it is dead and gone, replaced as the supreme social management style by the diagram of distribution.
(8) Emblematic of the first machinic technology, the one that gives the Internet its common image as an uncontrollable network, is the family of protocols known as TCP/IP.
(8) One machine radically distributes control into autonomous locales, the other machine focuses control into rigidly defined hierarchies.

Key theme of book is contrast between distributing TCP/IP and hierarchizing DNS.

(8) Emblematic of the second machinic technology, the one that focuses control into rigidly defined hierarchies, is the DNS.
(9) All DNS information is controlled in a hierarchical, inverted-tree structure. Ironically, then, nearly all Web traffic must submit to a hierarchical structure (DNS) to gain access to the anarchic and radically horizontal structure of the Internet.
(9) Because the DNS system is structured like an inverted tree, each branch of the tree holds absolute control over everything below it.
(10) Without the foundational support of the root servers, all lesser branches of the DNS network become unstable. Such a reality should shatter our image of the Internet as a vast, uncontrollable meshwork.

Distributed network is native landscape of protocol, whose content is another protocol, thus important Deleuzean diagram.

(10) To steal an insight from Marshall McLuhan, the content of every new protocol is always another protocol.
(11) Protocol's native landscape is the distributed network. Following Deleuze, I consider the distributed network to be an important diagram for our current social formation. Deleuze defines the diagram as “a map, a cartography that is coextensive with the whole social field.”

Distributed network the Deleuze diagram of current social formation.

(11) A distributed network differs from other networks such as centralized and decentralized networks in the arrangement of its internal structure.
(11-12) The network contains nothing but [quoting Hall Internet Core Protocols] “intelligent end-point systems that are self-deterministic, allowing each end-point system to communicate with any host it chooses.” Like the rhizome, each node in a distributed network may establish direct communication with another node, without having to appeal to a hierarchical intermediary. Yet in order to initiate communication, the two nodes must speak the same language.

Foucault biopower and Deleuze dividual express embodied protocols.

(12) I turn now to Michel Foucault to derive one final quality of protocol, the special existence of protocol in the “privileged” physical media of bodies. Protocol is not merely confined to the digital world. As Deleuze shows in the “Postscript on Control Societies,” protocological control also affects the functioning of bodies within social space and the creation of these bodies into forms of “artificial life” that are dividuated.
(12) I later suggest that Foucault's relationship to life forms is a protocological one. This is expressed most clearly in his later work, particularly in the twin concepts of biopolitics and biopower.
(13) But which technologies in particular would correspond to Foucault's biopolitical scenario? I argue here that they are the distributed forms of management that characterize the contemporary computer network and within which protocological control exists.
(13) Foucault's treatment of biopower is entirely protocological. Protocol is to control societies as the panopticon is to disciplinary societies.
(16) Deleuze recognized this, that the very site of Foucault's biopower was also a site of resistance.
(16) While the new networked technologies have forced an ever more reticent public to adapt to the control structures of global capital, there has emerged a new set of social practices that inflects or otherwise diverts these protocological flows toward the goal of a utopian form of unalienated social life.

Danger of protocol like danger of technology for Heidegger, pharmakon for Plato/Derrida; efforts must be guided through protocol, not against it (Licklider symbiosis, Heim component, Hayles coevolution).

(16) What is wrong with protocol? To steal a line from Foucault, it's not that protocol is bad but that protocol is dangerous.
(17) I hope to show in this book that it is through protocol that one must guide one's efforts, not against it.

Focus on bodies and material stratum of computer technology rather than minds and epistemology; like Bazin, Barthes and Hayles analyzing material specific formal functions and dysfunctions.

(17-18) I draw a critical distinction between this body of work [Minsky, Dennett, Searle, Dreyfus], which is concerned largely with epistemological and cognitive science, and the critical media theory that inspires this book. Where they are concerned with minds and questions epistemological, I am largely concerned with bodies and the material stratum of computer technology.
(18) While my ultimate indebtedness to many of these authors will be obvious, it is not my goal to examine the social or culturo-historical characteristics of informatization, artificial intelligence, or virtual anything, but rather to study computers as Andre Bazin studied film or Roland Barthes studied the striptease: to look at a material technology and analyze its specific formal functions and dysfunctions.

Other inspirations include Kittler discourse networks, Wiener cybernetic control, Lovink Net criticism, DeLanda institutional ecologies.

(18) I hope to build on texts such as Friedrich Kittler's groundbreaking Discourse Networks, 1800/1900, which describes the paradigm shift from a discourse driven by meaning and sense, to our present milieu of pattern and code.
(18) Norbert Wiener is also an important character. His books laid important groundwork for how control works within physical bodies. . . . Other important theorists from the field of computer and media studies who have influenced me include Vannevar Bush, Hans Magnus Enzensberger, Marshall McLuhan, Lewis Mumford, and Alan Turing.
(18-19) I am also inspired by Lovink's new school of media theory known as Net criticism. . . . Much of this intellectual work has taken place in online venues such as CTHEORY, Nettime, and Rhizome, plus conferences such as the annual Ars Electronica festival and the Next 5 Minutes series on tactical media.
(19) This alternate path recognizes the material substrate of media, and the historical processes that alter and create it. It attempts to chart what Manuel DeLanda calls “institutional ecologies.”

Reading code as a natural language, but likely employing close, hyper, and machine techniques.

(20) Indeed, I attempt to read the never-ending stream of computer code as one reads any text (the former having yet to achieve recognition as a natural language), decoding its structure of control as one would a film or novel.

Periodization

Broad periods of sovereign, disciplinary, and control societies with characteristic political and technological forms.

(20) I refer to the axiom, taken from periodization theory, that history may be divided into certain broad phases, and that the late twentieth century is part of a certain phase that (although it goes by several different names) I refer to alternatively as the postmodern or digital age.
(21) Deleuze reinforces the historical arguments, first presented by Foucault, in his book Foucault, as well as in several interviews and incidental texts in the collection Negotiations.
(22) Kittler agrees roughly with this periodization in his book Discourse Networks, 1800/1900.
(23) In the sociopolitical realm many thinkers have also charted this same periodization.

Protocol the technical theory of Hardt and Negri Empire as social theory.

(26) The computer protocol is thus in lockstep with Hardt and Negri's analysis of Empire's logics, particularly the third mode of imperial command, the managerial economy of command. . . . In fact, one might go so far as to say that Empire is the social theory and protocol the technical.
(26) Further to these many theoretical interventions—Foucault, Deleuze, Kittler, Mandel, Castells, Jameson, Hardt and Negri—are many dates that roughly confirm my periodization.
(27) At best, periodization theory is an analytical mindgame, yet one that breathes life into the structural analyses offered to explain certain tectonic shifts in the foundations of social and political life. My book implicitly participates in this game, mapping out certain details of the third, “control society” phase, specifically the diagram of the distributed network, the technology of the computer, and the management style of protocol.


1
Physical Media

Chapter 1 epigraph from Where Wizards Stay Up Late: the language of the RFC was warm and welcoming.

(29) While many have debated the origins of the Internet, it's clear that in many ways it was built to withstand nuclear attack. The Net was designed as a solution to the vulnerability of the military's centralized system of command and control during the late 1950s and beyond.
(29) As I show in this chapter, the material substrate of network protocols is highly flexible, distributed, and resistive to hierarchy.
(30) Normal military protocol serves to hierarchize, to prioritize, while the newer network protocols of the Internet serve to distribute.

Protocol is algorithmic, and may be centralized, distributed or decentralized.

(30) I attempt to show that protocol is not by nature horizontal or vertical, but that protocol is an algorithm, a proscription for structure whose form of appearance may be any number of different diagrams or shapes.
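
The three shapes can be sketched as adjacency lists in Python (node names invented): the same protocol, as an algorithm, could be instantiated over any of these diagrams.

    # Centralized: a star; every node is subordinate to one hub.
    centralized = {"hub": ["a", "b", "c", "d"]}

    # Decentralized: several linked stars, each with its own hub.
    decentralized = {"hub1": ["a", "b", "hub2"],
                     "hub2": ["c", "d", "hub1"]}

    # Distributed: a mesh; no node has more links than any other.
    distributed = {"a": ["b", "c"], "b": ["a", "d"],
                   "c": ["a", "d"], "d": ["b", "c"]}
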

Centralized networks hierarchical, operating from central hub; examples of American military and judicial systems, Foucault panopticon.

(30) Centralized networks are hierarchical. They operate with a single authoritative hub. Each radial node, or branch of the hierarchy, is subordinate to the central hub.
(31) The American judicial system, for example, is a centralized network.
(31) The panopticon, described in Foucault's Discipline and Punish, is also a centralized network.

Decentralized networks of distributed autonomous agents following system rules, exemplified by Deleuze and Guattari rhizome; most common diagram of modern era, examples of airline system, US interstate highway, Internet.

(31) There are many decentralized networks in the world today—in fact, decentralized networks are the most common diagram of the modern era.
(31) One example is the airline system.
(33) First, distributed networks have no central hubs and no radial nodes. Instead each entity in the distributed network is an autonomous agent.
(33) A perfect example of a distributed network is the rhizome described in Deleuze and Guattari's A Thousand Plateaus. . . . The rhizome links many autonomous nodes together in a manner that is neither linear nor hierarchical. Rhizomes are heterogeneous and connective, that is to say, “any point of a rhizome can be connected to anything other.” They are also multiple and asymmetrical.
(35) One actually existing distributed network is the Dwight D. Eisenhower System of Interstate and Defense Highways, better known as the interstate highway system.
(38) Of course the Internet is another popular and actually existing distributed network.
(38) Distributed networks have no chain of command, only autonomous agents who operated according to certain pre-agreed “scientific” rules of the system.
(38) For the Internet, these scientific rules are written down. Called protocols, they are available in documents known as RFCs, or “Requests for Comments.” Each RFC acts as a blueprint for a specific protocol. It instructs potential software designers and other computer scientists how to correctly implement each protocol in the real world. Far more than mere technical documentation, however, the RFCs are a discursive treasure trove for the critical theorist.

RFCs as discursive treasure trove for critical theorists; defines Internet as series of interconnected networks.

(38) The RFC on “Requirements for Internet Hosts,” an introductory document, defines the Internet as a series of interconnected networks, that is, a network of networks, that are interconnected via numerous interfacing computers called gateways.
(39) The RFC on “Requirements for Internet Hosts” defines four basic layers for the Internet suite of protocols: (1) the application layer (e.g., telnet, the Web), (2) the transport layer (e.g., TCP), (3) the Internet layer (e.g., IP), and (4) the link (or media-access) layer (e.g., Ethernet).
(39) This diagram, minus its “layer” captions, appears in RFC 791. The four layers are part of a larger, seven-layer model called the OSI (Open Systems Interconnection) Reference Model developed by the International Organization for Standardization (ISO).
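
A toy Python sketch of the four-layer logic (the bracketed headers are placeholders, not real packet formats): each layer wraps what it receives in its own header and stays indifferent to the contents it carries.

    def encapsulate(data):
        # The four layers of RFC 1122, innermost first: application,
        # transport, Internet, link. Each header is a stand-in only.
        for header in (b"[HTTP]", b"[TCP]", b"[IP]", b"[ETH]"):
            data = header + data  # wrap without inspecting the payload
        return data

    print(encapsulate(b"hello"))  # b'[ETH][IP][TCP][HTTP]hello'
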

OSI preferred model for considering everything as code; no special anthropomorphic uses of data, and affords ontology of amalgamation of multiple processes occurring in multiple temporal orders of magnitude and systems exhibiting distributed control, fitting models described by Deleuze and Guattari (assemblage, abstract machine, body without organs, lines of flight, strata).

(40) (footnote 15) The critical distinction is that the OSI model, my preferred heuristic, considers everything to be code and makes no allowances for special anthropomorphic uses of data. This makes it much easier to think about protocol. The other models privilege human-legible forms, whose reducibility to protocol is flimsy at best.
(41) The different responsibilities of the different protocol layers allow the Internet to work effectively. For example, the division of labor between the transport layer and the Internet layer, whereby error correction is the sole responsibility of the transport layer and routing (the process by which data is “routed,” or sent toward its final destination) is the sole responsibility of the Internet layer, creates the conditions of existence for the distributed network.
(42) As long as the hosts on the network conform to the general suite of Internet protocols—like a lingua franca for computers—then the transport and Internet layers, working in concert, will take care of everything.
(43) TCP creates an imaginary circuit between sender and receiver.
(43) The primary value of TCP is its robust quality. TCP allows communication on the Web to be very reliable: Information is monitored during transport and is re-sent if lost or corrupted.
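
A schematic sketch of the reliability idea only (real TCP adds sequence numbers, sliding windows, checksums, and congestion control): transmit, wait for acknowledgment, retransmit on loss.

    import random

    def unreliable_send(segment):
        # Stand-in for the network: an ACK returns only half the time.
        return random.random() > 0.5

    def reliable_send(segment, max_tries=10):
        # TCP-style persistence: retransmit until acknowledged.
        for attempt in range(1, max_tries + 1):
            if unreliable_send(segment):
                return attempt  # how many transmissions it took
        raise TimeoutError("gave up; the imaginary circuit is broken")

    print(reliable_send(b"segment 1"))
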

Datagram as linguistic unit is a true container rather than any kind of symbol.

(44) IP is responsible for one thing: moving small packets of data called “datagrams” from one place to another.
(44) Technically, then, IP is responsible for two things: routing and fragmentation. Routing is the process by which paths are selected for moving data across a network.
(45) This flexible routing system is achieved through a “hopping” process whereby data is passed from computer to computer in sequence. . . . Each node in the network knows not where the final destination is, but simply which direction, or “next-hop,” will get it closer to its destination.
(45) In this way the message hops around until it arrives in the immediate vicinity of its destination, whereby the exact location of the destination is in fact known and final delivery is possible.
(45) The second responsibility of the Internet Protocol is fragmentation. When messages are sent across the network, they are inevitably too large to be sent in one piece. Hence, each message is fragmented, or disintegrated into several small packets, before it is sent.
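
A sketch of next-hop routing over an invented topology: no node holds the full path; each consults only a local table saying which neighbor brings the datagram closer to its destination.

    # Hypothetical routing tables: destination -> next hop, per node.
    tables = {
        "A": {"D": "B"},
        "B": {"D": "C"},
        "C": {"D": "D"},
    }

    def route(source, destination):
        # Hop from node to node; each hop consults only local knowledge.
        path, node = [source], source
        while node != destination:
            node = tables[node][destination]
            path.append(node)
        return path

    print(route("A", "D"))  # ['A', 'B', 'C', 'D']
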

Distinct protocological characteristics: peer to peer, distributed, universal language, robust and flexible, open to unlimited variety of computers and locations, result of action of autonomous agents; protocol layers likely inconceivable to early computing theorists and practitioners, which is a good reason to consider periodization theory applies to different trajectories for machine intelligence, possibilities of machine operations, and human computer symbiosis.

(46-47) At this point, let me pause to summarize the distinct protocological characteristics of the TCP/IP suite. . . . Each of these characteristics alone is enough to distinguish protocol from many previous modes of social and technical organization. Together they compose a new, sophisticated system of distributed control.

DNS heroic project of mapping humanized names to machinic numbers; it is a language.

(47) “The basic problem at hand,” writes DNS critic Ted Byfield, is “how we map the 'humanized' names of DNS to 'machinic' numbers of the underlying IP address system.”
(48-49) The tree structure allows [Paul] Mockapetris to divide the total name space database into more manageable and decentralized zones through a process of hierarchization. . . . each portion of the database is delegated outward on the branches of the tree, into each leaf.
(49) Like this, the process starts at the most general point, then follows the chain of delegated authority until the end of the line is reached and the numerical address may be obtained. This is the protocol of a decentralized network.
(50) DNS is the most heroic of human projects; it is the actual construction of a single, exhaustive index for all things. It is the encyclopedia of mankind, a map that has a one-to-one relationship with its territory. . . . DNS is not simply a translation language, it is language. It governs meaning by mandating that anything meaningful must register and appear somewhere in its system. This is the nature of protocol.
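
A toy model of the inverted tree in Python (zones and addresses invented): resolution walks the chain of delegated authority from the root outward, so removing the root entry breaks every name below it, which is the fragility noted earlier.

    # Each zone knows only its own delegations; "" is the root.
    zones = {
        "": {"edu": "edu-server"},
        "edu": {"mit": "mit-server"},
        "mit.edu": {"www": "18.0.0.1"},  # the leaf holds the number
    }

    def resolve(name):
        # Follow delegated authority from the most general point down.
        zone, answer = "", None
        for label in reversed(name.split(".")):
            answer = zones[zone][label]
            zone = label if zone == "" else label + "." + zone
        return answer

    print(resolve("www.mit.edu"))   # '18.0.0.1'
    # del zones[""] would make every lookup fail at the first step.
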

Protocol is materially immanent, endogenous language that is indifferent to content (against interpretation).

(51) Second, as the discussion of TCP/IP shows, protocol is materially immanent. That is, protocol does not follow a model of command and control that places the commanding agent outside of that which is being commanded. It is endogenous.
(52) At each phase shift (i.e., the shift from HTML to HTTP, or from HTTP to TCP), one is able to identify a data object from the intersection of two articulated protocols. In fact, since digital information is nothing but an undifferentiated soup of ones and zeros, data objects are nothing but the arbitrary drawing of boundaries that appear at the threshold of two articulated protocols.
(52) Protocols do not perform any interpretation themselves; that is, they encapsulate information inside various wrappers, while remaining relatively indifferent to the content of information contained within.
(52) The consequences of this are legion. It means that protocological analysis must focus not on the sciences of meaning (representation/interpretation/reading), but rather on the sciences of possibility (physics or logic), which I address in more detail in chapter 5 on hacking.

Protocological analysis eschews meaning and focuses on envelope of possibility; compare to Applen and McDaniel critical reverse engineering and Bogost unit operations, and do not be afraid to leave interpretative realm of critical theory: protocol is a circuit, not a sentence.

(53) To follow a protocol means that everything possible within that protocol is already at one's fingertips. Not to follow means no possibility. Thus, protocological analysis must focus on the possible and the impossible (the envelope of possibility), not a demystification of some inner meaning of “rational kernel” within technology. Protocol is a circuit, not a sentence.


2
Form

(55) The physical realm refers to not only the whole mass of circuits, wires, terminals, routers, and so on that constitute protocol's material layer, but also the technical software that keeps it running.
(55) By formal apparatus I mean the totality of techniques and conventions that affect protocol at a social level, not simply a technical one. If the previous chapter was about protocol from the point of view of the systems administrator, then the current chapter is about protocol from the point of view of the webmaster. Thus, just as film theorists have analyzed the apparatus of film in terms of film form, and ideology theorists have analyzed the apparatus of ideology in terms of its formal structure, I discuss in this chapter the formal qualities of the apparatus of computer protocols.
(56) But what would a Marxist theory of the media actually look like? This is the problem faced by Hans Magnus Enzensberger in his essay “Constituents of a Theory of the Media.”

Formal apparatus involves social level of protocol along with technical specifications; media are dirty because they require involvement to critique them (Enzensberger).

(57) And to the extent that transmission itself means being able to manipulate ([quoting Enzensberger] “every use of the media presupposes manipulation”), then everyone interested in an emancipated media should be a manipulator. In this sense, media are by their very nature “dirty” for they require, in the very act of critique, to engage with the dominant technologies of manipulation.
(58) The discovery of processes where once there were objects—this is perhaps the most fundamental moment in a Marxist method.
(59) Wiener's theory of dynamic systems, known as cybernetics, acts as an alternative or even a precursor to network theory.
(59) While Wiener's focus on systemic dynamism was certainly emulated by later network theorists, his focus on small, closed systems was not.
(59-60) The innovation of the Memex, however, is its architecture. It was to constitute a type of meshwork, a relational database of records operating on the principle of associative, rather than hierarchical, indexing.
(60) Both Wiener and Bush have therefore unwittingly contributed greatly to the tradition of Marxist media theory inaugurated by Brecht.
(64) The project of this book is to show that protocol is in fact both poles of this machinic movement, territorializing structure and anarchical distribution.
(64) If the Internet were truly rhizomatic, it would resist identification. It would resist the deep, meaningful uses that people make of it everyday.

Use of continuity concept from film theory for networks.

(64) One concept that I will borrow from film theory is continuity. Despite being a decentralized network composed of many different data fragments, the Internet is able to use the application layer to create a compelling, intuitive experience for the user.
(64) Legions of computer users live and play online with no sense of radical dislocation.
(64) What follows are some of the most important techniques of continuity.
(64) Conceal the source. Many media formats have a tendency to conceal their own making. This is one reason why Marx's formal critique of the commodity form has been so useful for film theorists, because the commodity itself has a tendency to conceal its own making.
(65) Two common examples are HTML and IP addresses.
(65) Programming languages also follow the rules of continuity: The visible code written by the programmer is made invisible at the moment the code is compiled.
(65) Eliminate dead links. On the Internet, dead links are called “404 errors.” For successful continuity, 404 errors are to be avoided at all costs. . . . If something is pointed to, it must exist.
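
A minimal sketch of how a webmaster might police this rule, using only the Python standard library (the URL is hypothetical, and network failures are left unhandled): a HEAD request answered with 404 marks the link dead.

    from urllib.request import Request, urlopen
    from urllib.error import HTTPError

    def is_dead(url):
        # A 404 answer breaks continuity; report it.
        try:
            urlopen(Request(url, method="HEAD"), timeout=5)
            return False
        except HTTPError as err:
            return err.code == 404

    # Usage (URL hypothetical):
    # print(is_dead("https://example.com/missing-page"))
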
(66) Eliminate no links. . . . Each page must go somewhere else, even if that somewhere else is “back.”
(66) Green means go. . . . Capitalize on the user's preexisting knowledge of iconography to designate a link.
(66) True identity. Worse than a dead link is a deceptive link.
(66) Remove barriers. Each click that a user is forced to make on the Web is an unnecessary barrier to that user and will hinder his or her movement.
(67) Continuity between media types. . . . In essence, all differentiation between different types of media—text, images, animations—must be eliminated.
(67) Prohibition against low resolution. . . . Low resolution shatters the illusion of continuity because it means that the source, the code, is not being properly concealed.
(67) Highest speed possible. . . . Speed helps perpetuate the illusion that personal movement on the Net is unmediated, that the computer is a natural extension of the user's own body.
(67) Prohibition on crashes. . . . Not only does the crash disrupt the movement of the user, it is offensive, attacking the user itself with threats of data loss and software corruption.
(68) Prohibition on dead media. . . . Dead media are those media that have fallen out of use.
(68) Eliminate mediation. . . . All traces of the medium should be hidden, hence the evolution from the less intuitive “QWERTY” keyboard to technologies such as the touch screen (e.g., Palm and other PDAs) and voice recognition software.

Hidden feedback loops of technological nonconscious help produce subjectivity.

(68) Feedback loops. . . . Feedback loops are necessary to help produce the active subjectivity of the user.
(69) Anonymous but descriptive. . . . Demographics and user statistics are more important than real names and real identities. . . . The clustering of descriptive information around a specific user becomes sufficient to explain the identity of that user.

Foucault biopower interprets material objects as information at statistical rather than individual level.

(69) Foucault introduced the concept of “biopower” to help explain this phenomenon. His formulation was consistent with the functioning of protocol, for biopower is the power to interpret material objects as information, to affect objects at the statistical or informational level, not at the level of individual content.
(69-72) The Net does not rely on the text as its primary metaphor; it is not based on value exchange; it is not time-based like film or video; it is not narrative in a conventional sense; its terms are not produced in a differential relationship to some sort of universal equivalent. Digital technology necessitates a different set of object relations.

Casts software as immaterial despite stressing materiality of networks.

(72) However, the niceties of hardware design are less important than the immaterial software existing within it. . . . Thus, the key to protocol's formal relations is in the realm of the immaterial software.

Record
(72) The first term in Net form is the record. The record has its roots in the ability of physical objects to store information. A record is any type of nonrandom information, not simply something that records language or data.
(73) The technological recording of the real entered into competition with the symbolic registration of the Symbolic.
(73) The record is, in the most abstract sense, any nonchaotic something.

Object
(73-74) A record is one particular form-of-appearance of an object. The object is the digital economy's basic unit. . . . Digital objects are pure positivities. They are the heterogeneous elements that exist in what Deleuze and Guattari have called “machinic” processes.
(74) Objects exist only upon use.
(74) Different objects are understood as such due to their irreconcilability, their separation within a machinic process.

Protocol

Protocol as universal description language of objects, chivalry of objects.

(74) Protocol is a universal description language for objects.
(74-75) Protocol is a language that regulates flow, directs netspace, codes relationships, and connects life-forms. . . . Protocol is always a second-order process; it governs the architecture of the architecture of objects. Protocol is how control exists after distribution achieves hegemony as a formal diagram. It is etiquette for autonomous agents. It is the chivalry of the object.

Browser
(75) In digital space this “hiding machine,” this making-no-difference machine, is epitomized in the Internet browser.
(76) Its goal is to display all media formats.

HTML
(76) As the Net's universal graphic design protocol since its introduction in 1990, HTML designates the arrangement of objects in a browser.
(77) The final design layout is never actually included in the HTML file; it is merely described through a series of tags.
(77) HTML is therefore nothing more than a protocol for graphic design. As a protocol, it facilitates similar interfacing of dissimilar objects.

Fonts
(77) A font is not analogous to a signifier. Rather it renders the signifier itself internally complex.
(77-78) Computer fonts do the same work in the digito-semiotic world that HTML does in the virtual world. Both are a set of instructions for the compilation of contents. . . . They are at once totally crucial to the transfer of textual information and yet they are completely disposable, contingent, and atemporal. They are a readable example of protocol.
(78) Like Marx's analysis of the commodity, or Bazin's analysis of film form, Net form must be decoded to reveal its inner complexities.


3
Power

Chapter 3 epigraph from Deleuze Foucault: Technology is social before it is technical.

(81) I argue in this chapter that protocol has a close connection to both Deleuze's concept of “control” and Foucault's concept of biopolitics. I show here that protocol is an affective, aesthetic force that has control over “life itself.” This is the key to thinking of protocol as power.

Materiality of life due to imbrication with protocols supports argument that protocol is an affective, aesthetic force as well.

(82) life, hitherto considered an effuse, immaterial essence, has become matter, due to its increased imbrication with protocol forces (via DNA, biopower, and so on discussed later).

Foucault search for autochthonic transformation in realm of words and things.

(83) He claims that he wants to uncover the principles of an “autochthonic transformation”--that is, a transformation in the realm of words and things that is immanent, particular, spontaneous, and anonymous.
(84) Indeed Foucault defines “life” in a fashion very similar to power itself. So similar, in fact, that in the late Foucault, the two terms merge into one: biopower.
(85) Biopolitics, then, connects to a certain statistical knowledge about populations.
(85) Biopolitics is a species-level knowledge.
(87) For it is not simply Foucault's histories, but Foucault himself that is left behind by the societies of control. Foucault is the rhetorical stand-in for the modern disciplinary societies, while Deleuze claims to speak about the future.

Second Nature
(88) For my purposes, “second nature” refers to the way in which material objects in the modern era have a tendency to become aesthetic objects. Through being aestheticized, they also tend to become autonomous, living entities in some basic sense. This tendency is a necessary precondition for protocol, and I would like to look closely at Marx's Capital to illustrate how this happens.

Analysis of vitalism in Marx Capital to illustrate second nature as how material objects become aesthetic objects.

(90) This vitalism in Marx heralds the dawning age of protocol, I argue, by transforming life itself into an aesthetic object. . . . The moments in Marx when he lapses into metaphor and imagery appear to be his own attempt at cinematography—that is, his attempt to aestheticize the vital forms contained in the body.
(92) Capitalism, for Marx, is second nature. It is at once intuitive and naturalized—what Barthes would call a second-order system of signification. It is a “layer” that has been folded back on itself such that it is simultaneously its core self and its own patina. It is both raw and coded.
(96) The type of vitalistic discourse seen most clearly in Capital is that of vital objects. Although his “rational kernel” and “mystical shell” may be the most well known, Marx is obsessed with animals, plants and minerals of all kinds.
(97) Often Marx's vital objects take on more sinister, supernatural personalities. Specters, monsters, and vampires riddle his text. As Derrida has shown in Specters of Marx, the concept of haunting appears several times in Capital (although perhaps not as often as Derrida would lead us to believe).
(98) A third category (after the world of objects and the world of supernatural objects) within Marx's vitalist discourse is that of natural processes. . . . Congealing is an incredibly powerful process in Marx since it involves both a change in category and a change in form.
(99) The famous Hegel headstand that Marx posits in the “Postface” is based on the idea that inversion, or upside-downness, is linked directly to illusion, mystification, and misrecognition.
(99) Hiddenness is as powerful a force of mystification (i.e., naturalization) as inversion is. . . . The social hieroglyphic refers to something that does not announce on its surface what it is on the inside.
(100-102) These concepts of mystification and naturalization may be defined further, both more generally in the concept of “form of appearance” and more specifically in the fetish form. . . . Finally, form of appearance in its most advanced usage gestures toward what one would today call a theory of ideology. . . . To simplify the formula: natural misrecognition = ideology. For this reason I argue that form of appearance is an incredibly powerful moment in Marx's theorization of vital forms.

Intuitive capitalistic apparatus alluded to by vitalistic imagery foreshadows protocol.

(102) The use of vitalistic imagery, no matter how marginalized within the text, quite literally aestheticizes capitalism. It turns capitalism into media. Perhaps then the conventional wisdom on Capital, that Marx's goal was to denaturalize the apparatus of capitalism, can be rethought. The existence in the text of vital forms allows for both an intuitive and estranged capitalistic apparatus.

Emergence of Artificial Life Forms (Matter Becoming Life)
(103) Indeed, protocol is a theory of the confluence of life and matter (and ultimately we will see that protocol shows how life is matter).
(105) Wiener's position is, thus, what one might today call Deleuzian. Wiener sees entropy as a gradual procession toward the Plane of Immanence, Deleuze's term for the undifferentiated, contingent state matter finds itself in when it has yet to organize itself in any identifiable way. Life, then, is a type of stratification within that Plane. It is, quite literally, an organization, a set of “patterns that perpetuate themselves.”

Wiener argues that both people and machines are communicative organisms, which today live inside protocol; essence of cybernetics is self-determinism of material systems, like Foucault biopower.

(105-106) What makes Wiener's theory so radical, however, is that he recognized that machines also resist entropy. . . . It's not simply that machines are like people, or that people are like machines, but that both entities are like something else, what Wiener calls “communicative organisms,” or what today might be called “information organisms.” These are the same organisms that live inside protocol.
(107) The self-determinism of material systems is therefore the essence of cybernetics, and it is a positive essence, one that also reflects the positive potential of protocological organization.

Artificial Life
(107-108) Pseudo-artificial entities such as robots have been in existence for many years already. The emergence of “artificial life” proper happens as computers shift from being primarily linear calculation machines to being clusters of parallel, distributed submachines.
(108) In computer science, this shift is characterized by the change from “procedural” (or linear) programming to so-called object-oriented programming.

Appeal to shift from procedural to object-oriented programming as indicative of potential for true emergence of artificial life from parallel, distributed networks of submachines; Turkle ties to shift from modern to postmodern eras.

(108-109) Sherry Turkle writes that this shift—from procedural to object-oriented—follows the shift from the modern to the postmodern eras. . . . This shift, from centralized procedural code to distributed object-oriented code, is the most important shift historically for the emergence of artificial life.
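
A schematic Python contrast of the two styles Turkle describes (the simulation itself is invented): the procedural version is one central recipe; the object-oriented version is a population of submachines, each carrying its own state and advancing itself.

    # Procedural: a single, centralized sequence of steps.
    def step_all(positions):
        return [p + 1 for p in positions]

    # Object-oriented: distributed, autonomous submachines.
    class Agent:
        def __init__(self, position):
            self.position = position

        def step(self):
            self.position += 1  # each agent advances itself

    agents = [Agent(p) for p in range(3)]
    for agent in agents:
        agent.step()
    print(step_all([0, 1, 2]), [a.position for a in agents])
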

Life as Medium (Life Becoming Matter)
(110) I assert that, further to the anti-entropic theory of life (which by itself has little to say about protocol), life forms, both artificial and organic, exist in any space where material forces are actively aestheticized, resulting in a type of sculpted materiality, a materiality in which vital agents are managed, organized, affected, and otherwise made aesthetically active.
(111) The “information age”—a term irreverently tossed to and fro by many critics of contemporary life—is not simply that moment when computers come to dominate, but is instead that moment in history when matter itself is understood in terms of information or code. At this historical moment, protocol becomes a controlling force in social life.

Transformation of matter to media, life as code, immaterial soul replaced with aesthetized biometrics, key to understanding rise of protocological control.

(111) But what has been overlooked is that the transformation of matter into code is not only a passage from the qualitative to the quantitative but also a passage from the non-aesthetic to the aesthetic—the passage from non-media to media.
(111) This historical moment—when life is defined no longer as essence, but as code—is the moment when life becomes a medium.
(113) One's lived experience was no longer tied to material realities, but instead was understood in terms of numbers—a telephone number, a zip code, a social security number, an IP address, and so on.
(113) It considers living human bodies not in their immaterial essences, or souls, or what have you, but in terms of quantifiable, recordable, enumerable, and encodable characteristics. It considers life as an aesthetic object.
(113) Authenticity (identity) is once again inside the body-object, yet it appears now in sequences, samples, and scans.
(114) Collaborative filtering is therefore an extreme example of the protocological organization of real human people. Personal identity is formed only on certain hegemonic patterns.
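
A bare-bones sketch of the collaborative-filtering logic in Python (users and ratings invented): identity is nothing but proximity to an already existing pattern of ratings.

    # Ratings by item, per user; users are known only as patterns.
    ratings = {
        "u1": {"film_a": 5, "film_b": 1},
        "u2": {"film_a": 4, "film_b": 2},
        "u3": {"film_a": 1, "film_b": 5},
    }

    def similarity(a, b):
        # Crude closeness score over the items both users rated.
        shared = set(ratings[a]) & set(ratings[b])
        return -sum(abs(ratings[a][i] - ratings[b][i]) for i in shared)

    def nearest_neighbor(user):
        # The "hegemonic pattern" the user is assimilated to.
        others = [u for u in ratings if u != user]
        return max(others, key=lambda u: similarity(user, u))

    print(nearest_neighbor("u1"))  # 'u2'
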

Periodization table for control matrix from feudal, modern, postmodern to future considering machine, energy mode, disciplinary mode, control diagram, virtue, active threat (resistance), passive threat (delinquency), political mode, stratagem, personal crisis.

(115) The matrix describes protocol's successes, its failures, and its future forms.


II
Failures of Protocol
4
Institutionalization

Part II epigraphs from Paul Baran and Tim Berners-Lee; chapter 4 begins recounting birth of spam on 4/12/1994.

Compare this violence against humans by machines (spam) to ancient violence of humans against humans, when making an analysis of behavior that seems to judge indiscriminately, as pebbles used in calculation.

(119-120) This protocological covenant outlining open channels for Usenet's growth and governance, hitherto cultivated and observed by its large, diverse community of scientists and hobbyists, was sullied in the spam incident by the infraction of a few. The diversity of the many groups on Usenet was erased and covered by a direct-mail blanket with a thoroughness only computers can accomplish.
(120) This chapter, then, covers how protocol has emerged historically within a context of bureaucratic and institutional interests, a reality that would seem to contradict protocol.

Most literature on protocol concerns its relation to bureaucracy and law, such as Lessig; Galloway emphasizes technology and use.

(120-121) To date, most of the literature relating to my topic has covered protocol through these issues of law, governance, corporate control, and so on. Lawrence Lessig is an important thinker in this capacity. . . . Bureaucracy is protocol atrophied, while propriety is protocol reified.
(121) The argument in this book is that bureaucratic and institutional forces (as well as proprietary interests) are together the inverse of protocol's control logic. . . . Protocol gains its authority from another place, from technology itself and how people program it.

Postel credits Internet success to public documentation, free and cheap software, vendor independence.

(121-122) As long-time RFC editor Jon Postel put it, “I think three factors contribute to the success of the Internet: (1) public documentation of the protocols, (2) free (or cheap) software for the popular machines, and (3) vendor independence.” Commercial or regulatory interests have historically tended to impinge upon Postel's three factors.

Protocol controlling logic transcends institutions, governments, and corporations while tied to them.

(122) In short, protocol is a type of controlling logic that operates outside institutional, governmental, and corporate power, although it has important ties to all three.
(122) Like the philosophy of protocol itself, membership in this technocratic ruling class is open. . . . But, to be sure, because of the technical sophistication needed to participate, this loose consortium of decision makers tends to fall into a relatively homogeneous social class: highly educated, altruistic, liberal-minded science professionals from modernized societies around the globe.

Loose affiliations of technocratic ruling class yet localized (Castells); importance of Unix and C/C++ as protocological technologies.

(122) Of the twenty-five or so original protocol pioneers, three of them—Vint Cerf, Jon Postel, and Steve Crocker—all came from a single high school in Los Angeles's San Fernando Valley. Furthermore, during his long tenure as RFC editor, Postel was the single gatekeeper through whom all protocol RFCs passed before they could be published.
(123) A significant portion of these computers were, and still are, Unix-based systems. A significant portion of the software was, and still is, largely written in the C or C++ languages. All of these elements have enjoyed unique histories as protocological technologies.
(123-124) It is not only computers that experience standardization and mass adoption. Over the years many technologies have followed this same trajectory. The process of standards creation is, in many ways, simply the recognition of technologies that have experienced success in the marketplace.

IEEE as world's largest protocological society.

(126-127) The IEEE works in conjunction with industry to circulate knowledge of technical advances, to recognize individual merit through the awarding of prizes, and to set technical standards for new technologies. In this sense the IEEE is the world's largest and most important protocological society.
(127-128) ANSI, formerly called the American Standards Association, is responsible for aggregating and coordinating the standards creation process in the United States. It is the private-sector counterpart to NIST.
(128) Besides being consensus-driven, open, transparent, and flexible, ANSI standards are also voluntary, which means that, like NIST, no one is bound by law to adopt them. . . . And in fact, proven success in the marketplace generally predates the creation of a standard. The behavior is emergent, not imposed.
(129) On the international stage several other standards bodies become important.
(129) Another ISO standard of far-reaching importance is the Open Systems Interconnection (OSI) Reference Model.
(130) ISO, ANSI, IEEE, and all the other standards bodies are well-established organizations with long histories and formidable bureaucracies. The Internet, on the other hand, has long been skeptical of such formalities and spawned a more ragtag, shoot-from-the-hip attitude about standard creation.
(131) Four groups make up the organizational hierarchy in charge of Internet standardization. They are the Internet Society, the Internet Architecture Board, the Internet Engineering Steering Group, and the Internet Engineering Task Force.
(131) (footnote 24) Another important organization to mention is the Internet Corporation for Assigned Names and Numbers (ICANN).

IETF defined by various RFCs, some of which feature social relations and cultural biases.

(132) The bedrock of this entire community is the IETF. The IETF is the core area where most protocol initiatives begin.
(133) (footnote 29) This RFC [“IETF Guidelines for Conduct” (RFC 3184, BCP 54)] is an interesting one because of the social relations it endorses within the IETF. Liberal, democratic values are the norm. . . . Somewhat ironically, this document also specifies that “English is the de facto language of the IETF.”
(133) The IETF is the least bureaucratic of all the organizations mentioned in this chapter. In fact it is not an organization at all, but rather an informal community. . . . “Membership” in the IETF is simply evaluated through an individual's participation. If you participate via email, or attend meetings, you are a member of the IETF. All participants operate as unaffiliated individuals, not as representatives of other organizations or vendors.
(133-134) The IETF is divided by topic into various Working Groups. Each Working Group focuses on a particular issue or issues and drafts documents that are meant to capture the consensus of the group.
(134) The process of establishing an Internet Standard is gradual, deliberate, and negotiated. Any protocol produced by the IETF goes through a series of stages, called the “standards track.” The standards track exposes the document to extensive peer review, allowing it to mature into an RFC memo and eventually an Internet Standard.
(134) Cronyism is sometimes a danger at this point, as the old-boys network—the RFC editor, the IESG, and the IAB—have complete control over which Internet-Drafts are escalated and which aren't.
(135) Not all RFCs are standards. Many RFCs are informational, experimental, historic, or even humorous in nature.
(136) In addition to the STD subseries for Internet Standards, there are two other RFC subseries that warrant special attention: the Best Current Practice (BCP) documents and informational documents known as FYI.
(136) Some of the RFCs are extremely important. RFCs 1122 and 1123 outline all the standards that must be followed by any computer that wishes to be connected to the Internet.
(136) Other RFCs go into greater technical detail on a single technology. Released in September 1981, RFC 791 and RFC 793 are the two crucial documents in the creation of the Internet protocol suite TCP/IP as it exists today.
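
As a concrete illustration of what these documents specify, a minimal Python sketch (mine, not Galloway's) that parses the fixed twenty-byte IPv4 header exactly as laid out in RFC 791; the sample bytes are fabricated for illustration.

    import socket
    import struct

    def parse_ipv4_header(raw):
        # RFC 791 section 3.1 field order: version/IHL, type of service,
        # total length, identification, flags/fragment offset, TTL,
        # protocol, header checksum, source address, destination address.
        (version_ihl, tos, total_len, ident, flags_frag,
         ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw[:20])
        return {
            "version": version_ihl >> 4,
            "header_words": version_ihl & 0x0F,
            "total_length": total_len,
            "ttl": ttl,
            "protocol": proto,  # 6 = TCP (RFC 793), 17 = UDP
            "src": socket.inet_ntoa(src),
            "dst": socket.inet_ntoa(dst),
        }

    # Fabricated example header: TCP (protocol 6) from 10.0.0.1 to 10.0.0.2.
    sample = bytes([0x45, 0x00, 0x00, 0x28, 0x00, 0x01, 0x40, 0x00,
                    0x40, 0x06, 0x00, 0x00, 10, 0, 0, 1, 10, 0, 0, 2])
    print(parse_ipv4_header(sample))
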
(137) The Web emerged largely from the efforts of one man, the British computer scientist Tim Berners-Lee.
(138) [quoting Berners-Lee] The Web was not a physical “thing” that existed in a certain “place.” It was a “space” in which information could exist.
(138) But, Berners-Lee admitted, “the IETF route didn't seem to be working.”
(138) Instead he established a separate standards group in October 1994 called the World Wide Web Consortium (W3C).
(138-139) In many ways the core protocols of the Internet had their development heyday in the 1980s. But Web protocols are experiencing explosive growth today. Current growth is due to an evolution of the concept of the Web into what Berners-Lee calls the Semantic Web. . . . it is enriched using descriptive protocols that say what the information actually is.

Semantic Web as machine-understandable information, protocol that cares about meaning and could lead to emergent forms of machine intelligence; consider the recent proliferation of proprietary protocols in mobile computing as an evolutionary trend away from this open, democratic foundation.

(139) The Semantic Web is simply the process of adding extra metalayers on top of information so that it can be parsed according to its semantic value.
(139) Before this, protocol had very little to do with meaningful information. Protocol does not interface with content, with semantic value. It is, as I have said, against interpretation. But with Berners-Lee comes a new strain of protocol: protocol that cares about meaning.
(139) So it is a matter of debate as to whether descriptive protocols actually add intelligence to information, or whether they are simply subjective descriptions (originally written by a human) that computers mimic but understand little about.
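
A dependency-free sketch of the metalayer idea (my example; the URN and the Dublin Core-style property names are used illustratively, not as a real deployed vocabulary): the same information first as a bare string a machine cannot interrogate, then wrapped in subject-predicate-object triples that say what the information actually is.

    # The same information, with and without a descriptive metalayer.
    bare = "Protocol, Alexander R. Galloway, MIT Press, 2004"  # opaque to software

    # Subject-predicate-object triples: the metalayer says what each part *is*.
    triples = [
        ("urn:book:protocol", "dc:title", "Protocol"),
        ("urn:book:protocol", "dc:creator", "Alexander R. Galloway"),
        ("urn:book:protocol", "dc:publisher", "MIT Press"),
        ("urn:book:protocol", "dc:date", "2004"),
    ]

    def describe(subject):
        # A machine can now answer "who created this?" without guessing
        # at the semantics of comma-separated prose.
        return {p: o for s, p, o in triples if s == subject}

    print(describe("urn:book:protocol")["dc:creator"])  # Alexander R. Galloway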

Antifederalism through universalism reverts decision making to local level.

(140-141) It is a peculiar type of anti-federalism through universalism—strange as it sounds—whereby universal techniques are levied in such a way as to revert much decision making back to the local level.
(141) Ironically, then, the Internet protocols that help engender a distributed system of organization are themselves underpinned by undistributed, bureaucratic institutions—be they entities like ICANN or technologies like DNS.
(141-142) Thus it is an oversight for theorists like Lawrence Lessig (despite his strengths) to suggest that the origin of Internet communication was one of total freedom and lack of control. . . . The founding principle of the Net is control, not freedom.

Control has existed from the beginning.

Faults Lessig for not seeing control is endemic to all distributed networks governed by protocol, which must be partially reactionary to be politically progressive.

(141) (footnote 46) It is certainly correct of him [Lessig] to note that new capitalistic and juridical mandates are sculpting network communications in ugly new ways. But what is lacking in Lessig's work, then, is the recognition that control is endemic to all distributed networks that are governed by protocol.
(142) The generative contradiction that lies at the very heart of protocol is that
in order to be politically progressive, protocol must be partially reactionary.
(142) To put it another way, in order for protocol to enable radically distributed communications between autonomous entities, it must employ a strategy of universalization, and of homogeneity.

Tactical standardization is our Barthes Operation Margarine.

(143) Perhaps I can term the institutional frameworks mentioned in this chapter a type of tactical standardization, in which certain short-term goals are necessary in order to realize one's longer-term goals. Standardization is the politically reactionary tactic that enables radical openness. . . . It is, as Barthes put it, our “Operation Margarine.”


III
Protocol Futures
5
Hacking

Chapter 5 epigraph from Hardt and Negri on hacking.

(147) One reason for its success is the high cost of aberrance levied against those who ignore the global usage of specific technologies. . . . Protocol is fundamentally a technology of inclusion, and openness is the key to inclusion.

Control is now so imbricated in biopower and technological systems that it is like a law of nature; we must be thorough in our scientific approach to self-reflection.

(147) While control used to be a law of society, now it is more like a law of nature.
(150) In this section I address a few of the so-called resistive strains within computer culture and how they promise to move protocol into an exciting new space.
(150) What happens when the enemies of networks are also networks?
(150) Political tactics drawn from a bygone age will undoubtedly fail. This is the essence of CAE's argument.
(150-151) (footnote 5) While it is important for me to recognize the prescience of CAE [Critical Art Ensemble], because few others recognized so early and so incisively the nature of politics within the new techno and biopolitical age, several important differences exist between their position and my own. . . . A protocological analysis shows that control is almost never in abstract form. Rather, protocol ensures that control is literally inscribed onto the very cells and motherboards of bioinformatic networks.
(151) Today there are two things generally said about hackers. They are either terrorists or libertarians.
(152) Levy distilled this so-called hacker ethic into several key points. . . . Several of Levy's points dovetail with my earlier conclusions about protocol.
(153) Yet after a combination of public technophobia and aggressive government legislation, the identity of the hacker changed in the mid to late eighties from do-it-yourself hobbyist to digital outlaw.

Hackers are symptomatic of the assumption of protocol and of changing resistance, as references to Levy and then to Sterling's tiger teams illustrate.

(157) When viewed allegorically, hacking is an index of protocological transformations taking place in the broader world of techno-culture. Hackers do not forecast the death (or avoidance or ignorance) of protocol, but are instead the very harbingers of its assumption.
(158) By knowing protocol better than anyone else, hackers push protocol into a state of hypertrophy, hoping to come out the other side. So in a sense, hackers
are created by protocol, but in another, hackers are protocological actors par excellence.

Tiger Teams
(159) By “tiger teams”
Sterling refers to the employee groups assembled by computer companies trying to test the security of their computer systems. Tiger teams, in essence, simulate potential hacker attacks, hoping to find and repair security holes.
(159) The term also evokes the management style known as Toyotism originating in Japanese automotive production facilities. Within Toyotism, small pods of workers mass together to solve a specific problem. The pods are not linear and fixed like the more traditional assembly line, but rather they are flexible and reconfigurable depending on whatever problem might be posed to them.
(160) In this sense, while resistance during the modern age forms around rigid hierarchies and bureaucratic power structures, resistance during the postmodern age forms around the protocological control forces existent in networks.

Hacking means that resistance has changed.

(160) Yet this is a new type of individual. This is not the same individual who is the subject of enlightenment liberalism.
(161) CAE proposes a “nomadic” (rather than sedentary) model for resistance. . . . Different nomadic cells, or tiger teams, would coalesce around a specific problem, allowing resistance “to originate from any different points.”
(161) The structural form is similar to what Bey refers to in the “temporary autonomous zone” (TAZ).

Code

Hackers know code like a mother tongue, bolstering the similarity of computer and natural languages and the transformation of subjectivity; code is hyperlinguistic rather than sublinguistic, the only language that is executed, actually doing what it says: link to historical interest in the power of language, as well as to topics in analytical and continental philosophy.

(164) Hackers know code better than anyone. They speak the language of computers as one does a mother tongue. As I argue in the preface, computer languages and natural languages are very similar.
(165) It lies not in the fact that code is sublinguistic, but rather in the fact that it is
hyperlinguistic. Code is a language, but a very special kind of language. Code is the only language that is executed.
(165-166) So code is the first language that actually does what it says—it is a machine for converting meaning into action.

Compare analysis of executable metalayer encapsulating code, making it hyperlinguistic rather than sublinguistic, to its materiality in Berry.

(166) In this way, code is the summation of language plus an executable metalayer that encapsulates that language.
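
A toy sketch (mine, not Galloway's) of this claim: the same sentence exists once as inert description and once as an utterance the machine actually performs.

    description = "increment the counter by one"   # language about an action
    counter = 0
    statement = "counter = counter + 1"            # language that is the action
    exec(statement)   # the executable metalayer: the saying is a doing
    print(counter)    # 1
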
(167) The hacker's close relationship to code displays the power of protocol, particularly its ability to compel autonomous actors toward a more vital or affective state within their particular distributed milieu.

Possibility
(168) In fact, possibility often erases the unethical in the mind of the hacker.
(168-169) Deciding (and often struggling for) what is possible is the first step in a utopian vision based in desire, based in what one
wants. Hackers are machines for the identification of this possibility.

Levy collective intelligence and possibility of utopia in cyberspace.

Hacker insight into nature of utopia because realizable in cyberspace by working code.

(169) Pierre Levy is one writer who has been able to articulate eloquently the possibility of utopia in the cyberspace of digital computers.
(169) Thus, I suggest that the hacker's unique connection to the realm of the possible, via protocol that structures itself on precisely that threshold of possibility, gives the hacker special insight into the nature of utopia—what he or she
wants out of computers.
(170) “Sharing of software . . . is as old as computers,” writes free software guru Richard
Stallman, “just as sharing of recipes is as old as cooking.”
(170) Code does not reach its apotheosis
for people, but exists within its own dimension of perfection. . . . Commercial ownership of software is the primary impediment hated by all hackers because it means that code is limited—limited by intellectual property laws, limited by the profit motive, limited by corporate “lamers.”

Curiously ill-informed mischaracterization of FLOSS as freeware.

(171 footnote 60) The primary example of this trend is the free, open source operating system Linux. Virtually every software product has a freeware analog that often performs better than its commercial counterpart. Examples include Apache, a free Web server; MySQL, a free relational database that competes with high-end commercial databases such as Oracle; and Perl, a free open source scripting language.

Protocol is open source by definition.

(171) However, greater than this anti-commercialism is a pro-protocolism. Protocol, by definition, is open source, the term given to a technology that makes public the source code used in its creation. That is to say, protocol is nothing but an elaborate instruction list of how a given technology should work, from the inside out, from the top to the bottom, as exemplified in the RFCs described in chapter 4.
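
To make the point concrete: because the instruction list is public, anyone can speak a core protocol with nothing but a raw socket. A minimal sketch following the published HTTP/1.0 format (RFC 1945); the host name in the usage comment is a placeholder.

    import socket

    def http_get(host, path="/"):
        # Request format copied straight from the open spec:
        # request line, Host header, blank line.
        with socket.create_connection((host, 80), timeout=5) as s:
            s.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode("ascii"))
            chunks = []
            while True:
                data = s.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)

    # print(http_get("example.com")[:120])  # status line and first headers
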
(171-172) As concerned protocological actors, hackers have often called attention to commercial or governmental actions that impede protocol through making certain technologies proprietary or opaque. One such impediment is the Digital Millennium Copyright Act (DMCA) of 1998.
(172) On the Net, something is possible only if it is accessible.
(172)
2600 writes, correctly, that the real issue here is one of control over a specific technical knowledge, not potential piracy of DVD media.

Hacking reveals ways of using code creatively.

(172) What hacking reveals, then, is not that systems are secure or insecure, or that data wants to be free or proprietary, but that with protocol comes the exciting new ability to leverage possibility and action through code.


6
Tactical Media

Chapter 6 epigraph from Virilio Infowar.

(175) Tactical media is the term given to political uses of both new and old technologies, such as the organization of virtual sit-ins, campaigns for more democratic access to the Internet, or even the creation of new software products not aimed at the commercial market.
(175) That is to say, there are certain tactical effects that often leave only traces of their successes to be discovered later by the ecologists of the media.

Attitude toward viruses and proprietary software needs to be adjusted along with Microsoft monopoly predictions, hearkening to a prior struggle, although the chapter's focus on tactical media avoids this topic.

(175-176) For example, computer viruses are incredibly effective at identifying anti-protocological technologies. They infect proprietary systems and propagate through the homogeneity contained within them. Show me a computer virus and I'll show you proprietary software with a market monopoly.
(176) Instead in this chapter I would like to examine tactical media as those phenomena that are able to exploit flaws in protocological and proprietary command and control, not to destroy technology, but to sculpt protocol and make it better suited to people's real desires. . . . Tactical media propel protocol into a state of hypertrophy, pushing it further, in better and more interesting ways.

Computer Viruses

Concept of computer virus presented in 1983 seminar paper by Cohen which became dissertation.

(177) [Frederick] Cohen first presented his ideas on computer viruses to a seminar in 1983. His paper “Computer Viruses—Theory and Experiments” was published in 1984, and his Ph.D. dissertation titled “Computer Viruses” (University of Southern California) in 1986.
(178) Computer viruses acquired their current discursive position because of a unique transformation that transpired in the mid-1980s around the perception of technology. In fact several phenomena, including computer hacking, acquired a distinctly negative characterization during this period of history because of the intense struggle waging behind the scenes between proprietary and protocological camps.

Interesting comparison between early reaction to computer viruses and AIDS, and shift from technological identity to actions of human perpetrators, and weaponization of terrorism paradigm.

(179) Thus, by the late 1990s viruses are the visible indices of a search for evildoers within technology, not the immaterial, anxious fear they evoked a decade earlier with the AIDS crisis.
(181) While the AIDS paradigm dominated in the late 1980s, by the late 1990s computer viruses would become weaponized and more closely resemble the terrorism paradigm.
(181) The state attacks terror with all available manpower, while it systematically ignores AIDS. Each shows a different exploitable flaw in protocological management and control.
(184) A self-replicating program is no longer the hallmark of technical exploration, as it was in the early days, nor is it (nor was it ever) a canary in the coal mine warning of technical flaws in proprietary software, nor is it even
viral; it is a weapon of mass destruction. From curious geek to cyberterrorist.

Cyberfeminism

Focus on bugs a common cyberfeminist theme (Hayles).

(185-186) The computer bug, far from being an unwanted footnote in the history of computing, is in fact a space where some of the most interesting protocological phenomena occur.
(187-188) Cyberfeminism in its very nature necessitates a participatory practice in which many lines of flight coexist. Yet several recurrent themes emerge, among them the questions of
body and identity. Like a computer virus, cyberfeminism exists to mutate and transform these questions, guiding them in new directions within the protocological sphere.
(188) Like French feminist Luce Irigaray before her, [Sadie]
Plant argues that patriarchal power structures, which have unequally favored men and male forms in society, should be made more equal through a process of revealing and valorizing overlooked female elements.
(189)
Zeros and Ones persuasively shows how women have always been inextricably involved with protocological technology.
(189) The digital provides a space of valences that exists outside of and potentially preempts patriarchal structures.
(190) Held aloft, yet notably aloof from the cyberfeminist movement, is Sandy
Stone, theorist of the history of cyberspace, desire, and the virtual body.
(191) Through the introduction of tactical protocols, which are always negotiated and agreed to in advance by all participants, digital networks
become Cartesian, body-based, desiring, and so on. Cyberfeminism is the tactical process by which this reification will be refashioned.

Stone and Plant argue digital space conceptualized via protocol based participatory social practices.

(191) As Stone and others show, a participatory social practice (i.e., community) based on an imagined ether-scape of desiring and interacting bodies (i.e., protocol) is basic to how one conceptualizes digital space.
(191) Cyberfeminist pioneers VNS Matrix provide the frontline guerrilla tactics for Stone and Plant's theoretical efforts.
(194) Cyberfeminism aims to exorcise the essentialized, uninterrogated female body (brought into existence as a by-product of the protocological revolution) through a complex process of revalorization and rebuilding.
(196) It matters little if gender disappears completely, or if it reemerges as a moniker of militancy. The political question is simply choosing how and when to inject change into protocol so that it aligns more closely with one's real desires about social life and how it ought better to be lived. This is the essence of tactical media.

Conflicting Diagrams
(196) Throughout the years new diagrams (also called graphs or organizational designs) have appeared as solutions or threats to existing ones.
(201) Thus the Internet can survive attacks not because it is stronger than the opposition, but precisely because it is weaker. The Internet has a different diagram than a nuclear attack does; it is
in a different shape. And that new shape happens to be immune to the older.
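
A toy simulation of this point (my construction; sizes, topologies, and the random seed are invented): remove the same three nodes from a centralized star and from a distributed mesh, then measure the largest fraction of each network that can still communicate.

    import random

    def largest_fraction(adj, killed):
        # Largest connected component among surviving nodes, as a fraction
        # of the original network size.
        alive = {n: {m for m in nbrs if m not in killed}
                 for n, nbrs in adj.items() if n not in killed}
        remaining, best = set(alive), 0
        while remaining:
            stack = [remaining.pop()]
            comp = set(stack)
            while stack:
                for m in alive[stack.pop()]:
                    if m not in comp:
                        comp.add(m)
                        stack.append(m)
            remaining -= comp
            best = max(best, len(comp))
        return best / len(adj)

    N = 50
    star = {i: ({0} if i else set(range(1, N))) for i in range(N)}  # one hub
    mesh = {i: {(i - 1) % N, (i + 1) % N} for i in range(N)}        # ring...
    random.seed(1)
    for _ in range(N):                                              # ...plus random chords
        a, b = random.sample(range(N), 2)
        mesh[a].add(b)
        mesh[b].add(a)

    hit = {0, 1, 2}  # the star's hub is among the casualties
    print(largest_fraction(star, hit))  # ~0.02: the diagram dies with its center
    print(largest_fraction(mesh, hit))  # ~0.94: the distributed shape routes around damage
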
(204) In short,
the current global crisis is one between centralized, hierarchical powers and distributed, horizontal networks.
(205) In recent decades the primary conflict between organizational designs has been between hierarchies and networks, an asymmetrical war. However, in the future the world is likely to experience a general shift downward into a new bilateral organizational conflict—networks fighting networks.
(206) In a sense, networks have been vilified simply because the terrorists, pirates, and anarchists made them notorious, not because of any negative quality of the organizational diagram itself. In fact, positive liberatory movements have been capitalizing on network design protocols for decades if not centuries. The section on the rhizome in
A Thousand Plateaus is one of literature's most poignant adorations of the network diagram.

Tactical media weakens technologies in order to sculpt new forms from the degrees of freedom arising in its hypertrophic condition.

(206) These tactical effects are allegorical indices that point out the flaws in protocological and proprietary command and control. The goal is not to destroy technology in some new Luddite delusion, but to push it into a state of hypertrophy, further than it is meant to go. Then, in its injured, sore, and unguarded condition, technology may be sculpted anew into something better, something in closer agreement with the real wants and desires of its users. This is the goal of tactical media.


7
Internet Art

Chapter 7 epigraph from Alexei Shulgin on Nettime, from which the term net.art originated.

(209) Much of my analysis in preceding chapters focused on form, with the assumption that a revolutionary critique of the protocological media is simply a critique of their formal qualities: Determine a nonoppressive form and an emancipated media will follow. And indeed this is the main goal of media liberation theorists like Enzensberger.

Derrida new art is not video but digital computer art.
(210) Derrida offers an intriguing commentary on the question of video and its specificity as a medium.
(210) It is “vigilant” and “unpredictable” and it brings with it “other social spaces, other modes of production, of 'representation', archiving, reproducibility . . . [and] the chance of
a new aura.” [“Videor” 77]
(210-211) Let me suggest that the “new art” that Derrida calls for is not in fact video, but the new media art that has appeared over the last few decades with the arrival of digital computers. . . . Further, as I argue in this chapter, a subgenre of Internet art has emerged since 1995 called “net.art.” This subgenre refers to the low-tech aesthetic popularized by the
7-11 email list and artists like Jodi.
(212) I argue in this chapter that the definition of Internet art has always been a tactical one, that Internet art doesn't simply mean using browsers and HTML, but instead is an aesthetic defined by its oppositional position vis-a-vis previous, often inadequate, forms of cultural production.

Manifestation of Internet media in glitches, bugs, and errors provides specificity in lieu of creator experience that could arise through critical programming; the alternative is engaging politics or feigning ignorance (Grzinic).

(213) Following [Marina] Grzinic, I suggest here that computer crashes, technical glitches, corrupted code, and otherwise degraded aesthetics are the key to this disengagement. They are the “tactical” qualities of Internet art's deep-seated desire to become specific to its own medium, for they are the moments when the medium itself shines through and becomes important.
(214) As I suggest in part I, the protocols that underlie the Internet are not politically neutral. They regulate physical media, sculpt cultural formations, and exercise political control. This fact helps one understand the difference of opinion between the hackers and the artists/activists. If the network itself is political from the start, then any artistic practice within that network must engage politics or feign ignorance.
(215) The Internet's early autonomous communities were the first space where pure network aesthetics (Web site specificity) emerged—email lists like
7-11, Nettime, recode, Rhizome, and Syndicate.
(215) Primitive signs were seen in early net.art projects, such as Alexei Shulgin's
Refresh, an art project consisting of nothing but links between Web pages.
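
Mechanically, such a piece works like a chain of client-side redirects. A sketch (my reconstruction of the technique, with invented file names) that generates a loop of pages, each doing nothing but forwarding the browser to the next:

    # Each generated page contains only a one-second redirect to the next page.
    pages = ["one.html", "two.html", "three.html"]
    for here, nxt in zip(pages, pages[1:] + pages[:1]):
        with open(here, "w") as f:
            f.write(f'<meta http-equiv="refresh" content="1;url={nxt}">\n')
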
(216) Focusing specifically on those places where computers break down, Jodi derives a positive computer aesthetic by examining its negative, its point of collapse.

Art making moved outside aesthetic realm to often invisible working code.

(217) While the artwork may offer little aesthetic gratification, it has importance as a conceptual artwork. It moves the moment of art making outside the aesthetic realm and into the invisible space of protocols: Web addresses and server error messages.
(218) The cluster of servers that make up the
Name.Space alternative network—a web within the Web that uses a different, more flexible (not to mention cheaper and nonmonopolistic) addressing scheme—are a perfect example of this type of Internet conceptualism.
(218) The
Web Stalker is also a good example of the conceptual nature of Internet art. It is an alternate browser that offers a completely different interface for moving through pages on the Web.

Internet art periods highlighting network and software concerns.

(218-219) Let me now propose a simple periodization that will help readers understand Internet art practice from 1995 to the present. Early Internet art—the highly conceptual phase known as “net.art”--is concerned primarily with the network, while later Internet art—what can be called the corporate or commercial phase—has been concerned primarily with software. This is the consequence of a rather dramatic change in the nature of art making concurrent with the control societies and protocological media discussed throughout this book.
(219) But this primary limitation has now begun to disappear. Today Internet art is much more influenced by the limitations of certain commercial contexts.

Internet Art as Art of the Network
(224)
OSS [by Jodi] is abstract art for computers. In it, content itself has been completely subordinated to the sometimes jarring and pixelized topography of the computer operating system.
(225) By making the download time part of the viewing experience, Lialina brings protocol itself directly into the art object.

Internet Art as Art of Software
(227)
Toywar was an online gaming platform playable simultaneously by multiple users around the world. The goal of the game was to negatively affect specific capital valuations on the NASDAQ stock market.
(228) RTMark is a corporation dedicated to anticorporate sabotage activities. It was instrumental in several now famous actions such as the Barbie Liberation Organization in the 1980s, the
Deconstructing Beck CD from the late 1990s, and also the Toywar activities of December 1999.
(229) By far the most successful corporate artists are the Swiss art group Etoy.
(232) The goal of
Toywar was to wage “art war” on eToys Inc., trying to drive its stock price to as low a value as possible—and in the first two weeks of Toywar, eToys's stock price on the NASDAQ plummeted by over 50 percent and continued to nosedive. . . . The strategy worked. eToys Inc. dropped its lawsuit against the artists and declared bankruptcy in 2001.
(232-233) Like the struggle in the software industry between proprietary technologies and open, protocological ones, Internet art has struggled between an aesthetic focused on network protocols, seen in the earlier work, and an aesthetic focused on more commercial software, seen in the later work.

Auctionism
(233) One particular subgenre of Internet art that mixes both sides of the aesthetic divide (art as network and art as software) in interesting ways is auction art. . . . The communal network or social space created by the auction art piece supplements the artist's eBay Web page.
(234) These are all examples of the nonartistic uses of eBay by members of the art community. But the auction Web site has also been used as an actual medium for art making or otherwise artistic interventions.

Compare role of Kickstarter today to Toywar and auctionism.

(238) Auctionism unravels the limitations of the network by moving the location of the art object off the Web site and into the social space of the Net, particularly email lists like Rhizome, Nettime, and others.


Conclusion
(241) Protocol is not a superego (like the police); instead it always operates at the level of desire, at the level of “what we want.”
(242) This constant risk of collapse or disaster is what makes the subcultures discussed here—hacking, cyberfeminism, Net art—so necessary for the assumption and continuing maturation of protocol.

Protocol is also the management style of the ruling elite, and where the enemies of power operate as well, as Foucault notes of biopower generating new forms of control and delinquency.

(242) Yet the success of protocol today as a management style proves that the ruling elite is tired of trees too.
(243) But power's enemies are swimming in that same flow.
(243) Let me end by restating a few summarizing moments selected from previous chapters.

No longer the threat of a Microsoft monopoly but multiplicities of proprietary protocols riding atop IP, replacing the core TCP RFCs; precisely this exists today with mobile device applications that use proprietary, likely encrypted protocols atop IP, which can be observed by watching wireless network traffic with tcpdump and etherape in the vicinity of mobile devices (telephones, tablets, and so on).
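
A rough, Linux-only sketch in the same spirit as the tcpdump suggestion (must run as root; the interface name "wlan0" is an assumption, and this watches one's own interface rather than other devices' encrypted traffic): count which IP protocols are riding the local link.

    import socket
    from collections import Counter

    ETH_P_ALL = 0x0003  # capture every ethertype

    def sniff(iface="wlan0", packets=100):
        s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(ETH_P_ALL))
        s.bind((iface, 0))
        counts = Counter()
        for _ in range(packets):
            frame = s.recv(65535)
            ethertype = int.from_bytes(frame[12:14], "big")
            if ethertype == 0x0800 and len(frame) >= 24:  # IPv4
                counts[frame[23]] += 1  # IP protocol byte: 6=TCP, 17=UDP
        s.close()
        return counts

    # print(sniff())  # proprietary application traffic still shows up as TCP/UDP flows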

(244) It is very likely if not inevitable that the core Internet protocols, today largely safe from commercial and state power, will be replaced by some type of proprietary system. (The fact that Microsoft has not yet replaced TCP/IP with a commercial product of its own is one of the miracles of computer history. Chances are this will happen very soon.)
(245) It is a physical logic that delivers two things in parallel: the solution to a problem, plus the background rationale for why that solution has been selected as the best. Like liberalism, or democracy, or capitalism, protocol is a successful technology precisely because its participants are evangelists, not servants. Like liberalism, democracy, or capitalism, protocol creates a community of actors who perpetuate the system of organization.
(245) Protocol then becomes more and more coextensive with humanity's productive forces, and ultimately becomes the blueprint for humanity's innermost desires about the world and how it ought to be lived.

Protocol dangerous in double Foucauldian sense of reification and weaponry, including tactical media.

(245) This makes protocol dangerous—but in the Foucauldian sense of danger that is twofold. First it is dangerous because it acts to make concrete our fundamentally contingent and immaterial desires (a process called reification), and in this sense protocol takes on authoritarian undertones.
(245) But protocol is also dangerous in the way that a weapon is dangerous. It is potentially an effective tool that can be used to roll over one's political opponents. And protocol has already proven this in the sphere of technology. What poses a real threat to Microsoft's monopoly? Not Macintosh (the market). Not the Justice Department (the state). Instead it is the widespread use of protocols that struggle against Redmond's proprietary standards with varying degrees of success.
(246) With this analogy in place, then, a critique of protocol becomes clearer. In many ways market economies represent a dramatic leap forward in the history of mankind, for they represent a higher degree of individual freedom over previous social forms (e.g., feudalism). But at the same time market economies bring into existence high levels of social inequality.

Comparing protocol to market economy permits analogous critical questioning; however, the special feature of code being executable and autonomous requires further analysis and differentiation.

(246) Thus the same types of critiques that can be levied against so-called successful social realities such as market economies (or even liberalism, or civil society, or the bourgeois class itself) can be levied against protocol. As critics we must first ask ourselves: Do we want the Web to function like a market economy? Can we imagine future technological solutions that fulfill our social desires more fully than protocol can?



Galloway, Alexander R. Protocol: How Control Exists After Decentralization. Cambridge: MIT Press, 2004. Print.