Notes for Brian Cantwell Smith On the Origin of Objects

Key concepts: computation in the wild, Heideggerian breakdown, inscription errors, object, middle distance, philosophy of presence, registration, type-coercive style.


Related theorists: Wendy Chun, Scott Rosenberg.

Preface

Claims to be a computer scientist turned philosopher, working in language research.

(vii-viii) Having spent more than twenty-five years working in the trenches of practicing computer science, in a long-term effort to develop an empirically responsible theory of computation, I had never met such a logically pure entity, never met such a lapidary individual thing. . . . By and large, or so at least my experience suggests, the world is an unruly place—much messier than reigning ontological and scientific myths would lead one to suspect.

The conception of object in science and analytic philosophy resembles a manicured garden more than the grimy ice floe Smith learned from decades of programming.

(viii) And for better or worse—but mostly, I believe, for worse—the conception of “object” that has been enshrined in present-day science and analytic philosophy, with its presumptive underlying precision and clarity, is more reminiscent of fastidiously cropped hedge rows, carefully weeded rose gardens, and individually labeled decorative trees, than it is of the endless and rough arctic plain, or of a million-ton iceberg midwifed with a deafening crack and splintering spray from a grimy 10,000-year-old ice floe.
(ix) Neither discovered, nor in any simple sense merely constructed, gardens, in order to be gardens, must be cared for, tended—even loved. What more could one ask for, by way of ontological moral?
(ix) This is a book about metaphysics—one that attempts to do justice to the tundra, to gardening, to politics, to rock. As indicated, my path into these subjects has come through computer science, but that is mostly by the way.
(x) I hope to say two things. First, yes, it is possible to base uncompromising theoretical inquiry on alternative foundations: messier foundations, contested foundations, foundations that run closer to the wry and weathered texture of ordinary life. No one, least of all God, has decreed that intellectual rigor must (or even can) be founded on a pristine foundational atomism. Second, though, I also want to make evident just how much such a transformation costs.


Introduction

Introduces philosophy of presence operating in middle distance between naive realism and pure constructivism.

(3) This book introduces a new metaphysics—a philosophy of presence—that aims to steer a path between the Scylla of naïve realism and the Charybdis of pure constructivism.
(3-4) Fundamental to the view is a claim that objects, properties, practice, and politics—indeed everything ontological—live in what is called the “middle distance”: an intermediate realm between a proximal though ultimately ineffable connection, reminiscent of the familiar physical bumping and shoving of the world, and a more remote disconnection, a form of unbridgeable separation that lies at the root of abstraction and of the partial (and painful) subject-object divide. No sense attends to the idea of complete connection or complete disconnection; limit idealizations are outmoded. Yet an essential interplay of patterns of partial connection and partial disconnection—restless figures of separation and engagement—is shown to underlie a single notion taken to unify representation and ontology: that of a subject's registration of the world.
(4) Thus the proposal shows what any successful metaphysics must show: how an irrevocable commitment to pluralism is compatible with the recognition that not all stories are equally good.
(4) it is not a picture of a simple world, but a simple picture of a world of surpassing richness.

1 The foundations of computation

Criteria for a theory of computation are empirical and conceptual: doing justice to contemporary computational practice (computation in the wild) and providing a foundation for cognitivism.

(5) For more than twenty-five years I have been striving to develop an adequate and comprehensive theory of computation, one able to meet two essential criteria:
1. Empirical: It must do justice to computational practice (e.g., be capable of explaining Microsoft Word—including, for reasons that will emerge, the program itself, its construction, maintenance, and use); and
2. Conceptual: It must provide a tenable foundation for the computational theory of mind—the thesis, sometimes known as “cognitivism,” that underlies artificial intelligence and cognitive science.
(6) By the same token, I reject all proposals that assume that computation can be defined. By my lights, an adequate theory must make a substantive empirical claim about what I call computation in the wild: that eruptive body of practices, techniques, networks, machines, and behavior that has so palpably revolutionized late-twentieth-century life.
(6-7) In my view, that is, cognitivism holds that people manifest, or exemplify, or are, or can be explained by, or can be illuminatingly understood in terms of, whatever properties it is that characterize some identifiable species of the genus exemplified by computation-in-the-wild. . . . The cognitive revolution is fueled, both directly and indirectly, by an embodied and enthusiastically endorsed, but as-yet largely tacit, intuition based on many years of practical computational experience.
(8) Not only do these writers make a hypothetical statement about people, that they are physical, formal, or explicit symbol manipulators, respectively; they do so by making a hypothetical statement about computers, that they are in some essential or illuminating way characterizable in the same way.

No construal of computation meets either the empirical or conceptual criterion.

(8) That, then, constitutes what I will call the computational project: to formulate a true and satisfying theory of computation that honors these two criteria. Needless to say, neither criterion is easy to meet. Elsewhere, I report on a study of half a dozen reigning construals of computation, with reference to both criteria—formal symbol manipulation, automata theory, information processing, digital state machines, recursion theory, Turing machines, the theory of effective computability, complexity theory, the assumptions underlying programming language semantics, and the like—and argue, in brief, that each fails on both counts.
(9-11) The most celebrated difficulties have to do with semantics. It is widely recognized that computation is in one way or another a symbolic or representational or information-based or semantical—i.e., as philosophers would say, an intentional—phenomenon. . . . The only compelling reason to suppose that we (or minds or intelligence) might be computers stems from the fact that we, too, deal with representations, symbols, meaning, information, and the like.
(11) For someone with cognitivist leanings, therefore—as opposed, say, to an eliminative materialist, or to some types of connectionist—it is natural to expect that a comprehensive theory of computation will have to focus on its semantical aspects. This raises problems enough. Consider just the issue of representation. In order to meet the first criterion, of empirical adequacy, a successful candidate will have to make sense of the myriad kinds of representation that saturate practical systems.
(12) In order to meet the second, conceptual, criterion, moreover, any account of this profusion of representational practice must be grounded on, or at least defined in terms of, a theory of semantics or content.
(12-13) Genuine theories of content, moreover—of what it is that makes a given symbol or structure or patch of the world be about or oriented towards some other entity or structure or patch—are notoriously hard to come by.

2 The ontological wall
(14) the most serious problems standing in the way of developing an adequate theory of computation are as much ontological as they are semantical.
(16) This book can be viewed as an attempt to follow out, as simply but as rigorously as possible, the consequences of this entanglement.

3 A convergence of fields
(18) The concerns are perhaps most pressing for literary critics, anthropologists, and other social theorists, vexed by what analytic categories to use in understanding people or cultures that, by the theorists' own admission, comprehend and constitute the world using concepts alien to the theorists' own. What makes the problem particularly obvious, in these cases, is the potential for conceptual clash between theorist's and subject's worldview—a clash that can easily seem paralyzing.
(19) Consider the burgeoning interest in “complex systems” coalescing in a somewhat renegade subdiscipline at the intersection of dynamics, theoretical biology, and artificial life. . . . In spite of being virtually constitutive of the discipline, these discussions are conducted in the absence of adequate theories of what organization comes to, of what a “unit” consists in, of how “entities” arise (as opposed to how they survive), of how it is determined what predicates should figure in characterizing a fitness landscape as rough or smooth, etc.
(20) Midway between matter and mind, computation stands in excellent stead as a supply of concrete cases of middling complexity—what in computer science is called a “validation suite”—against which to test specific metaphysical hypotheses.
(21) Computation is an example, throughout; it is never the example.

4 Rhetoric
(21-22) It turns out that the normative demands on writing in computer science and philosophy are rather different. In computer science, a text is treated as something like a map or travel guide: it gives you a route to the author's real contribution, but once that contribution is reached (understood, implemented), the text has served its purpose, and might as well be thrown away.
(22) The situation in philosophy is a little different. Here the text, or perhaps the argument that the text expresses, is subject to more serious normative evaluation.

Difference between computer science and philosophy texts; this is the former.

(22-23) This book may look like philosophy, but do not be fooled. . . . I was less interested, this time around, in developing watertight arguments than in introducing a new territory—a territory that I believe is worth exploring on its own merit.


Part I Analysis
1 Computation
1 The ontology of computation

Questions for philosophy of computing, if the overall term survives.

(27-28) The first set of ontological problems that a theorist of computation encounters has to do with the nature of computation itself—with the kind and character of the workaday entities of professional computational practice. What are programs, for example, really: and how do they differ from data structures? What is an implementation level? What is an abstraction boundary? What is the relation between hardware and software (the mind/body problem for machines)? In what ways are interpreters, compilers, and emulators alike, and in what ways different? Are virtual machines physical or abstract? What exactly is state? What are the identity conditions on functions, algorithms, programs, and implementations? What is the difference between an effect and a side effect? How do computer, computation, and computability relate?
(28) the field is characterized by a remarkably robust, though rough and tacit, working consensus. . . . It consists not so much of a prior agreement on everything, in principle, as of an ability, in specific cases, to negotiate or work through to consensus.
(28) To say this is not intended to be critical or discouraging; it is simply a recognition that computer science is still in its infancy. Nevertheless, to see how inchoate our present understanding is, one only needs to see what happens when one tries to “lift” this tacit understanding outside its natural context—and redeploy it somewhere as nearby as in the debate about the computational theory of mind.
(29 footnote 5) Throughout, I will use 'they' and 'them' as syntactically plural but semantically singular third-person personal pronouns of unmarked sex.

2 The use of metaphorical terms
(33) Given the intellectual origins of computer science, it is no surprise that much of our present-day computational vocabulary was lifted from the study of logic, mathematics, and formal languages. And the objects that occupied the foreground in those traditions were, among other things, by and large either (i) linguistic or grammatical—written formulae or sentences, with a grammar and constituent structure, such as are familiar from the ordinary quantificational calculi—or (ii) abstract, such as the numbers and sets and functions that they have usually been taken to denote. Moreover, the virtually universal and by-now mythologized semantical model, which I will call the “binary model of semantics,” involved mapping elements of the first onto elements of the second.

The binary model of semantics misses the tripartite structure of program, process, and subject-matter domains, such that emphasis on one pair or the other generates different sets of philosophical problems.

(33-34) Unfortunately, in my opinion, the uncritical attempt to fit computation into this [linguistic/abstract] typology has obscured, rather than illuminated, the true nature of the computational situation. The fundamental problem stems from the fact that the paradigmatic computational situation involves at least three types of entity, not just two. The situation is caricatured in figure 1-1, which discriminates among: (i) a program, of the sort that might be edited with a text editor; (ii) the process or computation to which that program gives rise, upon being executed; and (iii) some (often external) domain or subject matter that the computation is about. Three objects naturally give rise to three binary relations, of which I will take two to be of primary importance: the program-process relation, labeled 'α' in the diagram; and the process-subject matter relation, labeled 'β'.
(34) If you adopt the simple binary model, you are forced either to ignore or to elide one of these distinctions, and (usually) thereby to conflate two of the three fundamental types of entity. In cognitive science and the philosophy of mind—and more generally, I think, in disciplines surrounding computer science—it is the distinction between program and process that is elided. This leads people to adopt two very familiar views: (i) that computation is fundamentally syntactic (like manipulation of structures that are in some essential sense like written tokens); and that it can therefore be adequately characterized using concepts that were developed for written languages, such as a simple type/token distinction, a notion of (lexical) constituent, etc.; and (ii) that 'semantics' refers to the relation β between “computation” (the conflation of program and process) and the world in which that computation is embedded. Theoretical computer science, however, takes the opposite tack: it focuses on the program-process relation α, not so much eliding as setting aside the process-subject matter relation. As a result, computer scientists view programs, not processes, as syntactic, but treat computation itself abstractly; and, more seriously, take the word 'semantics' to refer to the program-process relation (α), not to that between process and subject matter (β).
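To make the tripartite picture concrete for myself, a minimal Python sketch (my own illustration, not Smith's; the warehouse domain and all names are hypothetical): the program is static text, the process is what executing that text gives rise to, and the subject matter is the external domain the process is about.

```python
# A minimal sketch (not Smith's) of the tripartite model: program, process,
# and subject matter, with relation alpha (program -> process) and
# relation beta (process -> subject matter).

# (iii) Subject matter: an external domain the computation is about --
# here, a toy warehouse inventory.
inventory = {"widgets": 40, "gears": 12}

# (i) Program: a static text, the sort of thing one edits with a text editor.
program_text = """
def restock(stock, item, amount):
    stock[item] = stock.get(item, 0) + amount
"""

# Relation alpha: executing the program text gives rise to a process --
# here, a live function object bound in a namespace.
namespace = {}
exec(program_text, namespace)                 # alpha: program -> process

# Relation beta: the running process is *about* the inventory; its
# activity tracks and changes that external domain.
namespace["restock"](inventory, "gears", 5)   # beta: process -> subject matter

print(inventory)  # {'widgets': 40, 'gears': 17}
```

Collapsing program_text and the running namespace into a single "computation" is exactly the conflation the binary model invites.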
(35) The fact that cognitive science treats computations not just as concrete but as syntactic has misled a generation of philosophers into thinking that all standard architectures—von Neumann machines, Lisp, just about everything except connectionist networks—involve the explicit manipulation of formal symbols.

Eliding program and process prevents noticing the ontological shift toward more intrinsically dynamic ontologies; compare Chun on sourcery.

(35-36) Perhaps the most unfortunate consequence of the adoption of the traditional binary semantic model, however, has been in outsiders' tendency to elide program and process, and thereby to miss an extraordinarily important ontological shift in focus at the heart of computer science. This is a very deeply entrenched change away from treating the world in terms of static entities instantiating properties and standing in relation, and towards a view that is much more intrinsically dynamic and active.

Example of critical programming studies: Smith's design of 2-Lisp.

(37) In 1981, as something of a design exercise, I developed a programming language called 2-Lisp, with the explicit aim of exhibiting within the context of a programming language a degree of semantical clarity about these very semantical issues. More particularly, I identified two different semantical relationships: one, approximately α in the diagram, between external expressions and internal computational structures that I called impressions (i.e., using the word 'impression' to designate process ingredients); and another, approximately β, between those impressions and such external, Platonic entities as sets, numbers, and functions.
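A toy sketch of the two relations in ordinary Python (not Smith's 2-Lisp; internalize and designate are my own hypothetical names): α maps external expressions to internal impressions, β maps impressions to the abstract entities they designate.

```python
# A toy illustration (in Python, not Smith's 2-Lisp) of the two semantic
# relations he distinguishes: alpha maps external expressions to internal
# "impressions"; beta maps impressions to the abstract entities designated.
import ast

def internalize(expression: str) -> ast.Expression:
    """Relation alpha: external notation -> internal impression."""
    return ast.parse(expression, mode="eval")

def designate(impression: ast.Expression):
    """Relation beta: internal impression -> designated (Platonic) entity."""
    return eval(compile(impression, "<expr>", "eval"))

expr = "(2 + 3) * 7"        # an external expression (mere text)
impr = internalize(expr)    # an impression (a structured process ingredient)
value = designate(impr)     # the abstract number it designates

print(ast.dump(impr.body))  # the impression is a concrete internal structure
print(value)                # 35 -- the number itself, not any notation for it
```

The point of the exercise: impressions are concrete ingredients of the process, while what they designate is abstract; running the two together is the binary-model error.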
(38-40) First [moral], all three domains relevant to a computation—program, process, and semantic domain (task domain, domain of interpretation)—must be recognized by an adequate theory of computation as first-class realms in their own right. Moreover, they should also be classified with properties they actually exhibit, rather than classified metaphorically, with properties lifted from a merely analogous domain.

The type-coercive style, like Heideggerian breakdown, views representational objects as becoming visible only contextually, in contestation: relate to early versus late binding (Rosenberg)?

(40-41) It was soon clear that what was wanted, even if I did not at the time know how to provide it, was a way of allowing distinctions to be made on the fly, as appropriate to the circumstances, in something of a type-coercive style—and also, tellingly, in a manner reminiscent of Heideggerian breakdown. Representational objects needed to become visible only when the use of them ceased to be transparent. Reason, moreover, argued against the conceit of ever being able to make all necessary distinctions in advance—i.e., against the presumption that the original designer could foresee the finest-grain distinction anyone would ever need, and thus supply the rest through a series of partitions or equivalence classes. Rather, what was required was a sense of identity that would support dynamic, on-the-fly problem-specific or task-specific differentiation—including differentiation according to distinctions that had not even been imagined at a prior, safe, detached, “design” time.
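A rough sketch of breakdown-driven reification, assuming only the general idea (mine, not Smith's): objects stay transparent in use, and only when use fails do they become visible and get differentiated on the fly.

```python
# A rough sketch (mine, not Smith's) of "breakdown-driven" reification:
# objects stay transparent in use, and only when use fails do we step back,
# make the objects themselves visible, and draw a finer distinction on the
# fly -- a distinction nobody fixed at design time.

def add(a, b):
    try:
        return a + b                 # transparent use: no one looks at a or b
    except TypeError:
        # Breakdown: use has ceased to be transparent. Only now do we
        # inspect the objects themselves and coerce as the task demands.
        print(f"breakdown: {a!r} ({type(a).__name__}) + "
              f"{b!r} ({type(b).__name__})")
        return float(a) + float(b)   # a task-specific distinction, drawn late

print(add(2, 3))        # 5 -- the objects never become visible
print(add("2.5", 4))    # breakdown is reported, then repaired: 6.5
```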
(41) It was sobering, moreover, to encounter this moral (which many social theorists would argue for in much more complex settings) even in such simple arithmetic cases as essential arithmetic calculation—allegedly the paradigmatic case of the formal symbol manipulation construal of computation.

3 Computational ontology

Nature of ontology itself at stake in study of representational nature of computation.

(42) The representational nature of computation implies something very strong: that it is not just the ontology of computation that is at stake; it is the nature of ontology itself.
(44) Rather, I am concerned with the more general ontological assumptions that control the categories in terms of which these details are formulated (categories like object, property, relation, and wave); and higher-order properties of those properties, having for example to do with issues of negation, parameterization, instantiation, etc.

Designers of object-oriented languages, knowledge representation schemes, databases, and networks are all involved in ontological research.

(44) To make this concrete, consider the exploding interest, both theoretical and practical, in the development of object-oriented languages. Even if not advertised as such, this turn of events has led computer science squarely into the business of doing research in ontology.
(44) coding up the details of task-specific domains is the job of users of object-oriented languages, not their designers.

Computer scientists get wrapped up in metaphysical questions about mereology, object identity, type/token distinctions, identity criteria, and so on because exploring the details of task-specific domains is the task of users, leaving designers to supply the general ontological framework.

(44-45) As a result, computer scientists have ended up having to face all sorts of unabashedly metaphysical questions: about the nature of mereology (part/whole relations); about whether or not object identity within a system crosses different levels of abstraction or implementation, relevant to questions of theoretic reduction; about the nature of type/token distinctions; about individuation criteria, including the establishing of identity, for example in self-organizing systems; about the nature of parameterization; about the similarities and differences among sets, classes, and types; and so on and so forth. Nor are object-oriented system designers the only people involved in these issues; currently they are just the most visible. The same questions have been under investigation for decades by developers of knowledge representation schemes, data base designers, people worrying about strongly typed languages, and the rest. More recently they have been taken up anew by network designers wrestling with the relations among identifiers, names, references, locations, handles, etc., on the World Wide Web.
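A small hypothetical sketch of how routine object-oriented design runs into these questions (my illustration, not from the book):

```python
# A small illustration (my own, hypothetical) of how ordinary OO design
# runs into mereology, type/token distinctions, and individuation criteria.
from dataclasses import dataclass, field

@dataclass
class Wheel:
    diameter_cm: float

@dataclass
class Bicycle:
    wheels: list = field(default_factory=list)   # mereology: part/whole

front, rear = Wheel(66.0), Wheel(66.0)
bike = Bicycle(wheels=[front, rear])

# Type/token: two tokens of one type, equal in value yet distinct individuals.
print(front == rear)   # True  -- same type, indistinguishable properties
print(front is rear)   # False -- two tokens; which identity matters, and when?

# Individuation across change: replace every part -- is it the same bike?
same_bike = bike                            # one token, two handles
bike.wheels = [Wheel(66.0), Wheel(66.0)]    # all parts replaced
print(same_bike is bike)  # True -- the language decrees identity survives
                          # repair, but that was the designer's metaphysics
```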

Failure of traditional ontological categories.

(45) Perhaps the most interesting thing about this ontological effort, moreover, has been the ways in which it has failed. The problem is that, upon encounter with real-world problems, it is hard for practitioners to avoid realizing that such traditional ontological categories as discrete countable objects, clear and precise categories, and other products of received ontological myth, are both too brittle and too restrictive.
(47) In part, it is increasingly recognized not only that the represented categories have context-dependent meanings, but that the question of what the categories are can only be answered dynamically, within the settings in which the computational systems are deployed. This presses for a kind of representational flexibility that current object-oriented systems lack.

Example of EMACS as supporting multiple simultaneous takes on character buffer.

(48 footnote 24) Note that EMACS, a popular text and programming editor, derives much of its power from supporting multiple simultaneous “takes” on the string of characters in its buffer, in just the way suggested in the text. One command can view the buffer as a Lisp program definition; another, as a linear sequence of characters; another, as a bracketed or parenthesized region. In order to support these multiple simultaneous views, EMACS in effect “lets go” of its parse of the buffer after every single keystroke, and re-parses all over again the next time a key is struck—possibly with respect to a wholly different grammar.
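A schematic sketch of the multiple-takes idea (hypothetical Python; real EMACS is far richer and works differently in detail): no single parse is privileged or cached, so each command re-registers the same characters under its own grammar.

```python
# A schematic sketch (hypothetical; not how EMACS is implemented) of keeping
# multiple simultaneous "takes" on one buffer by re-parsing on demand
# rather than committing to a single stored parse.
buffer = "(defun greet () (message \"hi\"))"

def as_characters(buf):
    """One take: the buffer as a linear sequence of characters."""
    return list(buf)

def as_regions(buf):
    """Another take: the buffer as nested parenthesized regions."""
    stack, spans = [], []
    for i, ch in enumerate(buf):
        if ch == "(":
            stack.append(i)
        elif ch == ")":
            spans.append((stack.pop(), i))
    return spans

def as_words(buf):
    """A third take: the buffer as a flat list of symbols."""
    return buf.replace("(", " ").replace(")", " ").split()

# No parse is cached; each command re-parses the same characters afresh,
# possibly with respect to a wholly different grammar.
print(len(as_characters(buffer)))   # 31
print(as_regions(buffer))           # nested spans of matched parens
print(as_words(buffer))             # ['defun', 'greet', 'message', '"hi"']
```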
(49) Computer science does what it always does in the face of such difficulties: it makes up the answers as it goes along—inventive, specific, and pragmatic, even if not necessarily well explicated. But that leads in turn to the third broad class of ontological problem—a problem with a methodological as well as a substantive dimension.

4 Inscription errors

Inscription error: projecting ontological assumptions onto computational systems, then reading them back as if they were empirical discoveries.

(50) It is a phenomenon that I will in general call an inscription error: a tendency for a theorist or observer, first, to write or project or impose or inscribe a set of ontological assumptions onto a computational system (onto the system itself, onto the task domain, onto the relation between the two, and so forth), and then, second, to read those assumptions or their consequences back off the system, as if that constituted an independent empirical discovery or theoretical result.
(53) The justification for assigning different kinds of content to a system, that is, is vulnerable to the ways in which we (perhaps unwittingly) individuate the system itself.
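A deliberately simple sketch of an inscription error (my own; the categories and robot domain are hypothetical): the designer inscribes an ontology into the system, then reads it back off as if it were a finding.

```python
# A deliberately simple sketch (mine) of an inscription error: the theorist
# builds an ontology into the system, then "reads it back off" as a result.

# Step 1: inscribe -- the designer decides, in advance, that the world
# consists of exactly these kinds of things.
DESIGNER_CATEGORIES = {
    "can":  lambda x: x.get("cylindrical"),
    "wall": lambda x: x.get("flat"),
}

def perceive(sensor_reading):
    """Classify a reading using only the inscribed categories."""
    for kind, test in DESIGNER_CATEGORIES.items():
        if test(sensor_reading):
            return kind
    return "unknown"

# Step 2: read back -- every observation comes out in the inscribed terms,
# which can then be mistaken for an empirical discovery that the system's
# world is made of cans and walls.
world = [{"cylindrical": True}, {"flat": True}, {"crumpled": True}]
print([perceive(x) for x in world])   # ['can', 'wall', 'unknown']
```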

Inscription error example: a Coke-can-collecting robot.

(53) A similar example is provided by analyses of conditions under which a system is able to reidentify a given object as being the same as one it saw before, rather than being a new one of the same type—e.g., the sort of argument that would be used to support the conclusion that the system is capable of particular, not just generic, reference. Again, this is worth going through slowly, because the moral only emerges from the details.

5 Theorist's vs. subject's perspective
(63) The issue is not how the world seems to us, but how it is for the constructed agent. And there is no a priori reason to suppose that any of the choices that the theorist is likely to entertain, or is even capable of entertaining, will be the right one.
(64) From the fact that we can build something that is φ, that is, it does not follow that we understand what it is to be φ.

Argument for critical programming studies: actually build and modify, not just understand how to build.

(66) But I want to assert something stronger: that it is intellectually essential not just that we understand how to build them, but that we actually build and modify and use them—because of the fact that, in so building and modifying and using, we become enmeshed with them in a participatory fashion, in a way that both transcends and also grounds the representational attitudes we bear towards them.
(67) The point is easier to see in our case. How we take the world to be—to consist of objects, properties, and relations, or of other things, or whatever—cannot depend on how we take our minds or brains to be, since most of us do not take our minds or brains to be any way at all.
(68) Somehow or other—and this I take to be the most important and difficult task facing the cognitive sciences—it must be possible to have determinate representational content, i.e., for there to be a fact of the matter as to how the world is represented, without that analysis depending on any way of taking the internal structures in the mind that does the analysis.

6 Participation
(69) It leads to a single inescapable conclusion: There is no way to proceed on the overall project, of developing a comprehensive theory of computation that meets both the empirical and conceptual criteria, except by taking on metaphysics and ontology directly. . . . If there is ever going to be a satisfying theory of computation, it will have to rest on theories of both intentionality and ontology—or else (I will be recommending this alternative) a single integrated theory that covers both subject matters. Either way, there is no way to sidestep the metaphysical challenge.
(69 footnote 44) Ever since the fall of 1967, when I first took up this project, and learned how to program, my primary interest in computation has been in the potential window it might (may) give us on a new way of understanding—one with all the rigor and power and penetration of the natural sciences, but one that at the same time could do justice to the richness and complexity and sheer vivacity of human life.

After investigating computation in the wild, one cannot avoid the materiality and locatedness of code, nor the importance of participatory engagement and physical embodiment.

(72) First, it turns out that issues of physical embodiment are essential. . . . It is a theory of the flow of effect, in other words—and as such, even though it is not so advertised, is probably the best candidate yet for a scientific theory of causality.
(72-73) Second, fitting in with this essential materiality and locatedness is perhaps the most ramifying consequence of investigating computation in the wild: the recognition that computers are inextricably involved in their subject matters. . . . Experience, in any intuitively recognizable form, is too passive or receptive a category to do justice to the sorts of activity that computers engender. . . . In the end one can only conclude that any semantical theory adequate to practice will have to be a full-blooded theory of participatory engagement, not just of reasoning or representation, or even of perception, action, and experience.

Computation is not a subject matter, so there can be no philosophy of computing as such: replace it with the social construction of intentional artifacts.

(73-74) For present purposes, however, both these results pale in importance compared with a third and final lesson: Computation is not a subject matter. . . . Computers turn out in the end to be rather like cars: objects of inestimable social and political and economic and personal importance, but not the focus of enduring scientific or intellectual inquiry.
(75) Rather, what computers are, I now believe, and what the considerable and impressive body of practice associated with them amounts to, is neither more nor less than the full-fledged social construction and development of intentional artifacts. That means that the range of experience and skills that have been developed within computer science—remarkably complex and far-reaching, if still inadequately articulated—is best understood as practical, synthetic, raw material for no less than full theories of semantics and ontology.

Experience with constructing computational systems gives us the chance to witness how intentional capacities arise from mere physical mechanism, prompting better thinking, along Socratic lines, about how a structured lump of clay can sit up and think: a strong linkage between philosophy of computing and the humanities.

By acculturating ourselves with working code we may prepare ourselves to witness the emergence of intelligence from merely physical mechanism, raising the question of how that intelligence is validated: by what criteria is it judged to sit up and think, short of engaging in dialogue with us as portrayed in science fiction and television, for example KITT from Knight Rider and Commander Data from Star Trek: The Next Generation.

(75-76) Methodologically, it means that our experience with constructing computational (i.e., intentional) systems may open a window onto something to which we would not otherwise have any access: the chance to witness, with our own eyes, how intentional capacities can arise in a “merely” physical mechanism. . . . But only when we let go of the conceit that that fact is theoretically important will we finally be able to see, without distraction—and thereby, perhaps, at least partially to understand—how a structured lump of clay can sit up and think.


2 Irreduction



Smith, Brian Cantwell. On the Origin of Objects. Cambridge, MA: MIT Press, 1996. Print.