Notes for Luciano Floridi Philosophy and Computing: An Introduction
Key concepts: associative indexing, Balkanisation of knowledge, critical constructionism, cyberspace, fifth element, humanities computing, Humean theory of personality, hypertext, ideometry, infosphere, lexia, linearity, managerial function of computers, memex, multi-level linearity, ontological interpretation of databases, Super Turing machine, Turing machine.
Related theorists: Barthes, Bolter, Boole, Bush, Descartes, Kittler, Licklider, Lyotard, Manovich, Negroponte, Schumpeter, Von Neumann.
Frontispiece from the Iliad, an ancient reference to automata [xviii, 417-421].
(ix) I have written this introduction to information and communication technology (ICT) with two kinds of philosophy students in mind: those who need to acquire some ICT literacy in order to use computers efficiently, and those who may be interested in acquiring the background knowledge indispensable for developing a critical understanding of our digital age and hence beginning to work on that would-be branch of philosophy I define the “philosophy of information,” which I hope may one day become part of our Philosophia Prima.
Reaching for a philosophy of information/ICT rather than of computing or programming, despite the control-system frontispiece; admits a logocentric bias and proposes it as a discussion topic later in the book.
(ix-x) It is easier to be intellectually neutral when talking about different types of data storage than when dealing with the concept of text or the possibility of artificial intelligence. By the time the reader has reached Chapter 5, I can only hope that my “logocentric” or “neo-Cartesian” views (both labels with negative connotations for many philosophers who write about the topics covered in this book) will be sufficiently clear to allow an open discussion of their value.
Pragmatic approach to AI, for example value of speech recognition, recalls Licklider.
(x-xi) My aim here is not to debunk them but rather to redirect our philosophical interest in them. . . . We have stopped interacting audibly with our information systems sometime after Augustine. . . . A speech-recognition, hands-free application is not necessarily useful especially if it is not mobile as well.
(xi) I argue that a LAI [light AI] approach is the most promising strategy in many areas of AI application. The second half of the chapter is dedicated to the introductory analysis of these areas: fuzzy logic systems, artificial neural networks, parallel computing, quantum computing, expert systems, knowledge engineering, formal ontologies, robotics, cybernetics and artificial agents.
Appealing to collectively sensed philosophical problems arising in information culture, promoting critical constructionism as new perspective to investigate.
There are philosophers' problems, which people often find uninteresting and are glad to leave to the specialists, and there are philosophical problems, problems about which any educated person will usually feel the urge to know and understand more. The development of an information culture, a digital society and a knowledge economy poses problems of the second kind. They may not yet have gained their special place in the philosophers' syllabus, but this does not make them any less important. I suggest that they open up a whole new area of philosophical investigation, which I have labeled the “philosophy of information,” and promote the development of a general new philosophical perspective, which I define as critical constructionism.
(xi) One of the remarkable things about computer science and ICT is that they have provided and continue to contribute to a sort of metadisciplinary, unified language. Information-related concepts and powerful metaphors acting as “hermeneutic devices” through which we may more or less acritically interpret the world are now common currency in all academic subjects.
Claims this is a “webook” rather than a traditional book, originally an update of a previous work; ironically, as of 2013 the link forwards to a now defunct destination.
(xii) To my knowledge, there is no other general introduction to ICT for philosophers. . . . all printed and electronic sources used or mentioned in the book, and all suggestions for further reading have been placed in a webliography, which is available free and online at http://www.wolfson.ox.ac.uk/~floridi/webliography.htm. . . . This text may be seen as a printout of a more complex “webook.” . . . And finally, because a webliography can be corrected and updated easily and on a regular basis, thanks to interaction with the readers.
Forced to rewrite the Italian original in part because a virus damaged his information; play this against Derrida's account of archiving on his little Macintosh.
(xiii) I embarked upon this ambitious project only because I thought I could simply build on a guide to ICT for philosophers I had written in Italian (Floridi 1996). . . . The referees, however, suggested further expansions, and a virus destroyed part of the initial files. By rewriting, updating, reshaping, adapting and expanding the original text, I ended up writing what is in fact a new book.
Divide et computa: philosophy and the digital environment
The digital revolution
Conflates Moore's law with similarly improving usability; surprisingly, no reference to Castells, who differentiates informational and information societies.
(2) If Moore's famous law is still true fifty years after he proposed it – and there is currently no reason to think that it will not be – then at the beginning of the 2020s microprocessors may well be as much as 1000 times computationally more powerful than the Pentium III chip we were so proud of only yesterday, yet a child will be able to use them.
(2) What we call “the information society” has been brought about by the fastest growing technology in history. . . . Total pervasiveness and high power have raised ICT to the status of the characteristic technology of our time, both rhetorically and iconographically.
(3) The real point is that technological determinism is unjustified and that work organization and the quality of working life depend much more on overall managerial philosophies and strategic decisions than on the simple introduction of a particular type of technology.
The four areas of the digital revolution
ICT changes in social standards; compare to Manovich.
(4) In the information society, changes in social standards are ever more deeply and extensively ICT-driven or induced. Such modifications in the growth, the fruition and the management of information resources and services concern four main sectors: computation, automatic control, modeling, and information management.
(4-5) According to different perspectives, computation may be described as a logical or physical process of generation of final states (output) from initial states (inputs), based on:
1. rule-governed state-transitions, or
2. discrete or digital rule-governed state-transitions, or
3. a series of rule-governed state-transitions for which the rule can be altered, or
4. rule-governed state-transitions between interpretable states.
There are some difficulties with these definitions. . . . (4) seems to be the most satisfactory. It makes the interpretable representation of a state a necessary condition for computation. . . . We shall see in the next chapter that it may be easier to explain the concept of “computability” by referring precisely to a Turing machine and its properties. For the time being, let us rely on our intuitive understanding of the concept.
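Definition (4) above can be made concrete with a toy example. The following sketch is my own illustration, not Floridi's: a rule-governed transition system whose states are interpretable (each bit string reads as a number), generating a final state (output) from an initial state (input).

```python
# A minimal sketch of definition (4): computation as rule-governed
# state-transitions between interpretable states. The transition rule
# (binary increment) is an illustrative invention, not from the book.

def step(state):
    """One rule-governed transition: increment a binary numeral string."""
    bits = list(state)
    for i in reversed(range(len(bits))):
        if bits[i] == "0":
            bits[i] = "1"
            return "".join(bits)
        bits[i] = "0"          # carry propagates leftward
    return "1" + "".join(bits)  # overflow grows the state

def compute(initial, n_steps):
    """Generate a final state (output) from an initial state (input)."""
    state = initial
    for _ in range(n_steps):
        state = step(state)
    return state

print(compute("000", 5))  # '101' - each state is interpretable: "101" means 5
```

Every state here satisfies the interpretability condition: "101" represents the number five, which is what distinguishes definition (4) from mere physical state-change.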
Modeling and virtual reality
Instrumentation and technological nonconscious; embodied epistemology.
(7) The ultimate goal is authenticity, understood as a virtual hyper-realism. . . . Indeed, every area of human knowledge whose models and entities – whether real or theoretical no longer matters – can be translated into the digital language of bits is, and will inevitably be, more and more dependent upon ICT capacity to let us perceive and handle the objects under investigation, as if they were everyday things, pieces on a chess board that can be automatically moved, rotated, mirrored, modified, combined and subjected to the most diverse transformations and tests.
(8) After a long period in which there has been a prevailingly Cartesian view of knowledge and rational thinking as a series of intuitions enjoyed by a disembodied mind and chiefly brought about by a flow of algebraic equations and analytic steps, we are witnessing the revaluation of the sensorially-enriched (visual and tactile and synthetic) epistemology so dear to Greek and Renaissance philosophers.
(8) As regards the [global] infosphere, the symbolic-computational power of ICT tools is employed for the ends that go beyond the solution of complex numerical problems, the control of a mechanical world or the creation of virtual models.
From the analogue to the digital: the new physics of knowledge
(10) Thus one could outline the comparatively brief history of western culture in terms of the evolution of the methods whereby organized knowledge has been made potentially available to the individual mind.
Obligatory Swift reference for automation of reproduction of knowledge, looking forward to Bush but awaiting electronics.
In the end, the printed medium showed itself capable of coping with the extension of knowledge only vertically, by producing increasingly specialized guides and then further guides to guides, in an endless hierarchy. So much so that, soon after Gutenberg, the development of new, automatic methods to manipulate, access and control the encyclopedic domain became first critical, then indispensable. As a consequence, the history of modern technology is full of attempts to reproduce, at the level of the automation of the information process, the same mechanical improvements already achieved by the press and movable type at the level of the reproduction of knowledge. Jonathan Swift's description of the work done in the grand Academy of Lagado is a classic parody of such efforts.
(13) Only a new physics of knowledge, the passage from printed paper to digital-electronic data, finally made possible a thoroughly satisfactory way of managing information, and hence much more efficient control over the system of knowledge.
The digitization of the infosphere
(14) We have seen that the appearance of ICT in the second half of the twentieth century in an attempt to solve our problems of infosphere management can be interpreted, conceptually, as a perfectly coherent outcome of the self-regulating process governing the growth of the infosphere itself [extension, visualization and manipulation, convergence and integration].
The relations between philosophy and computing
Sociology of the information society
The philosophy of information
(17) All “philosophies of . . .” show a tendency to converge towards two poles, one phenomenological and the other metatheoretical.
(17) Some specific “philosophies of . . .”, however, show only a tension between the two poles, often combining both phenomenological and metatheoretical interests.
(18) The philosophy of information is therefore phenomenologically biased, and is more a philosophy of the infosphere than a philosophy of computer science or ICT tout court.
The philosophy of AI (artificial intelligence)
Philosophy and aided intelligence
(18-19) This last field, which is something described, with a wordplay, as that of aided or augmented intelligence (“Augmented Intelligence” was coined in the 1960s by Douglas Engelbart, one of the fathers of hypermedia), is the one to which I have tried to call the reader's attention in the foregoing pages, and it belongs to the wider subject known as humanities computing. In such a context, wondering what is implicit in the relation between computing, ICT and philosophy means to ask not only whether philosophers may be able to exploit recent technological innovations for their work, and if so, to what extent, but also what it means for a philosopher to be abreast of the technology of our time. . . . We shall begin by looking at the computational nature of the instruments in question, the tools of our digital workshop.
The digital workshop
From the laboratory to the house
What is a computer?
Before the computer: two semantic distinctions
Convergence based on construction of modern computer.
(24) To begin with, bits can equally well be represented logically (true/false), mathematically (1/0) and physically (transistor = on/off, switch = open/closed, electric circuit = high/low voltage, disc or tape = magnetized/unmagnetized, CD = presence/absence of pits, etc.), and hence provide the common ground where mathematical logic, the logic of circuits and the physics of information can converge. This means that it is possible to construct machines that are able to recognize bits physically and behave logically on the basis of such recognition.
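The convergence Floridi describes, where the same two-state entity supports a logical, a mathematical, and a physical reading, can be shown in a few lines. This sketch is my own illustration of the point:

```python
# One bit pattern read three ways: logically (True/False), mathematically
# (1/0), and arithmetically. The same operation - AND - works identically
# under the logical and the numerical reading, which is the common ground
# where Boolean algebra and switching circuits converge.

pattern = [1, 0, 1, 1]

logical = [bool(b) for b in pattern]                 # [True, False, True, True]
number = int("".join(str(b) for b in pattern), 2)    # binary 1011 = 11

# "Recognize bits physically and behave logically": bitwise AND on numbers
# mirrors logical conjunction applied position by position.
a, b = 0b1011, 0b0110
assert (a & b) == 0b0010
print(logical, number, bin(a & b))
```

The assertion holds because each bit position of `a & b` is the conjunction of the corresponding positions of `a` and `b`, exactly as a bank of AND gates would compute it in hardware.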
Babbage's analytical engine
(26) The road leading to semantic atomism was blocked and the analytical engine was probably as close as one could get to constructing a computer without modifying the very physics and logic implemented by the machine.
(26) Alan Turing's contributions to computer science are so outstanding that two of his seminal papers, “On Computable Numbers, with an Application to the Entscheidungsproblem” (Turing 1936) and “Computing Machinery and Intelligence” (Turing 1950), have provided the foundations for the development of the theory of computability, recursion functions and artificial intelligence.
(28) The logical sequence of TM operations is fully determined at all times by TM's internal state (the first kind of input), the symbol on the active square (the second kind of input) and the elementary instructions provided by the quintuples. The machine can be only in a finite number of states (“functional states”), each of which is defined by the quintuples.
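The quintuple mechanism just quoted can be simulated directly. The machine below (a unary incrementer) is my own illustrative example, not one of Floridi's; each quintuple maps (state, symbol) to (new state, new symbol, move), exactly as described:

```python
# A minimal Turing machine driven by quintuples: the next step is fully
# determined by the internal state and the symbol on the active square.

def run_tm(quintuples, tape, state="q0", halt="halt", blank="_"):
    cells = dict(enumerate(tape))   # unbounded tape as a sparse dict
    head = 0
    while state != halt:
        symbol = cells.get(head, blank)
        state, new_symbol, move = quintuples[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Append one '1' to a unary numeral: scan right over the 1s, then write
# a 1 on the first blank square and halt.
inc = {
    ("q0", "1"): ("q0", "1", "R"),
    ("q0", "_"): ("halt", "1", "R"),
}
print(run_tm(inc, "111"))  # '1111'
```

The machine has only a finite number of functional states (here q0 and halt), each defined entirely by the quintuple table, which is the point of the passage above.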
(28-29) How extended is this class of functions? To answer this question we need to distinguish between two fundamental results achieved by Turing, which are usually known as Turing's theorem (TT) and the Church-Turing thesis (CTT), and a number of other corollaries and hypotheses, including Church's thesis.
(29) TT is a crucial result in computation theory: to say that a UTM is a TM that can encompass any other TM is like saying that, given m specific flow charts, drawn in a standard and regimented symbolism, which describe the execution of as many specific tasks, there is a universal flow chart n, written with the same symbols, that can reproduce any of them and thus perform the same tasks.
(30) Broadly speaking, CTT suggests that the intuitive but informal notion of “effectively computable function” can be replaced by the more precise notion of “TM-computable function.”
(35) In STMs [Super Turing Machines] we gain more computational power at the expense of a complete decoupling between programming and computation (we have calculation as an input-output transformation phenomenon determined by the structure of the hardware, without having computational programmability as a rule-based procedure), while in UTM-compatible systems we gain complete coupling between the programmable algorithmic procedure and the computational process of which it is a specification (in terms of computer programs, the process takes place when the algorithm begins its fetch-execute cycle) at the expense of computational power.
(35-36) Given the differences between TMs and STMs, the class of calculable functions is therefore a superclass of the class of effectively computable functions. This is the strictly set-theoretic sense in which Super Turing machines are super: whatever can be computed by a TM can be calculated by a STM but not vice versa.
General-purpose, rule-based transformation is the key concept of computability.
Since there are discrete dynamical systems (including parallel systems, see below) that can have super-Turing capacities, the distinction between effective computation and calculation cannot be reduced to the analogue/digital or continuous/discrete systems distinction; the key concept is rather that of general-purpose, rule-based transformation.
(37) The “richness” of PPCs [parallel processing computers] and QCs [quantum computers] lets us improve our conception of the tractability of algorithms in the theory of complexity, but does not influence our understanding of the decidability and computability of problems in the theory of computation. Since they are not in principle more powerful than the classical model, richer computers do not pose any challenge to CTT.
(38) We can now state the last general result of this section. According to Church's thesis, every function that is effectively computable is also recursive (R) and vice versa. . . . Like CTT then, CT is not a theorem but a reasonable conjecture that is supported by a number of facts and nowadays results widely accepted as correct.
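Church's thesis identifies the effectively computable functions with the recursive functions. As my own sketch of what that class looks like, here is the primitive-recursion scheme in Python, building addition and multiplication from zero, successor, and the recursion operator alone:

```python
# A sketch of the primitive-recursive scheme behind Church's thesis:
# f(0, y) = base(y) and f(n+1, y) = step(n, f(n, y), y).

def succ(n):
    return n + 1

def primitive_recursion(base, step):
    """Build f from a base case and a step function, per the scheme above."""
    def f(n, y):
        acc = base(y)
        for i in range(n):
            acc = step(i, acc, y)
        return acc
    return f

# add(n, y) = n applications of successor to y; mul(n, y) = n additions of y.
add = primitive_recursion(lambda y: y, lambda i, acc, y: succ(acc))
mul = primitive_recursion(lambda y: 0, lambda i, acc, y: add(acc, y))

print(add(3, 4), mul(3, 4))  # 7 12
```

Every function definable this way is recursive (R) and hence, by Church's thesis, effectively computable; the full recursive class also needs the minimization operator, omitted here for brevity.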
(39-40) After Turing, we have a more precise idea of the concepts of “mechanical procedure,” “effective computation” and “algorithm.” This was a major step, soon followed by a wealth of mathematical and computational results. Nevertheless, a UTM leaves unsolved a number of practical problems, above all the unlimited resources (space and time) it may require to complete even very simple computations. . . . In summary, one may say that Boole and Shannon provided the former [more efficient logic], and Von Neumann, with many others including Turing himself, the latter [more efficient architecture].
(41-42) Once classic (i.e. two-value) propositional logic, Boolean algebra, the algebra of set theory and switching algebra become interchangeable, electronic technology can be used to build high-speed logical switching circuits such as adders, multipliers, etc. from simple logical units, and a real UTM can be physically implemented by assembling gates together.
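The adders mentioned in the passage really are just assembled logic gates. As an illustrative sketch (mine, not Floridi's), here are a half adder and full adder built from AND/OR/XOR, chained into a 4-bit ripple-carry adder:

```python
# Adders assembled from logical units, as the passage describes: once
# propositional logic and switching algebra are interchangeable, addition
# is gate composition.

def half_adder(a, b):
    return a ^ b, a & b            # sum bit (XOR), carry bit (AND)

def full_adder(a, b, cin):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, c1 | c2             # sum, carry-out (OR of the two carries)

def ripple_add(x_bits, y_bits):
    """Add two little-endian bit lists of equal length, gate by gate."""
    out, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 6 + 3 = 9: bits listed least significant first, so 6 is [0,1,1,0].
print(ripple_add([0, 1, 1, 0], [1, 1, 0, 0]))  # [1, 0, 0, 1, 0]
```

A physical UTM is, in this sense, nothing more than very many such gates wired together, which is what the quoted passage asserts.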
(42) As a result, the history of computing has moved so fast through so many technological stages that in fifty years it has accumulated a full archaeological past and four time scales, based on the three macro revolutions, represented by the appearance of the first computers, PCs, and the Internet – the third age of ICT; the evolution of programming languages, the evolution of interfaces (the latter time scale was suggested by John Walker, founder of Autodesk); and finally the improvements in the structure of hardware.
Von Neumann machine
Examples of how the Von Neumann machine satisfies the UTM criteria: stored-program, random-access, sequential, single path.
(44) In a famous paper, entitled “Preliminary Discussion of the Logical Design of an Electronic Computing Instrument”, John von Neumann, also relying on a wealth of work done in the area, suggested the essential architecture of a UTM that became universally accepted as standard in the following decades. A Von Neumann machine (VNM) is any UTM that satisfies the following criteria:
It is a stored-program computer.
It is a random-access machine.
It is a sequential machine.
It is a single path machine.
Programming languages and software
(47-48) An algorithm must be
1. explicit and non-ambiguous;
2. faultless and infallible;
4. deterministic and sequential. . . .
To write a good algorithm for a computing system we need an appropriate language, but when we analyze the “language” of a computer, it is important to distinguish between several levels of representation [physical, logical, abstract, conceptual systems].
(49) In high-level languages, English words, such as OPEN or PRINT, are used as commands standing for whole sequences of hundreds of machine language instructions. They are translated into machine code by the computer itself thanks to compilers or interpreters.
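Python's own toolchain makes this point observable: one high-level line compiles into a sequence of lower-level bytecode operations, which the standard `dis` module displays (bytecode, not machine code, but the layering is the same; the function is an arbitrary example of mine):

```python
# One high-level statement stands for many lower-level instructions.
# `dis` shows the bytecode that Python's compiler produces for a single
# line of source - load operations, arithmetic operations, a return.

import dis

def area(r):
    return 3.14159 * r * r

dis.dis(area)   # prints several bytecode instructions for one source line
```

The exact opcode names vary by Python version, but every version expands the single `return` expression into multiple loads and arithmetic steps, which is the compiler/interpreter translation the quoted passage describes.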
(49) The development of programming language design has essentially been determined by four factors [naturalization, problem-orientation, portability, maintainability].
(49) If we look now at the resulting software, it is common to distinguish three main categories: operating systems, utilities and applications.
Types of commercial computers
(50) Computers available on the market are often classified into five groups according to their size, power and intended use.
(54) The computer revolution began precisely when it became possible, in the 1970s, to interact with a computer in a user-friendly environment constituted by visual aids, keyboard and mouse.
A particular compilation error manifests the kind of problem associated with machine architectures that are incompatibly interfaced at the level of instruction set architecture despite sharing the same CPU hardware architecture.
(55) The world of personal computers is unfortunately divided into two: on the one hand there is a niche of Apple Macintosh (Mac) users (between 1994 and 1998 Apple's share of the global market decreased from 9.4 percent to 2.6 percent) and, on the other hand, the vast majority of PC users. . . . what may be a little more interesting is to understand why, given the fact that both a Mac and a PC are VNMs, they are not compatible.
Incompatibilities go beyond instruction set architectures, having social and legal components that may be concretized in design; it is surprising that Floridi, a self-proclaimed critical constructionist, does not make this point. Perhaps his predilection for going into the details of CTT and VNM rather than C++, Perl, or XML is symptomatic of how his casting of foundational computation affects and reflects how he thinks.
(55) The same architecture can have different hardware implementations. . . . On the other hand, since each type of CPU recognizes a certain set of instructions, if two microprocessors implement different ISA [instruction set architecture] then their software will not be compatible. This is why we cannot use software written for a Macintosh with an IBM-compatible and vice versa, although both are VNMs. It is like saying that a whale and a dog are both mammals but cannot eat the same food.
A revolution called Internet
The Internet as a basic technological change
Schumpeter's model of technological development, mentioned by other theorists.
(56) we also make it possible to apply to its evolution Schumpeter's classic model of the three stages in the development of a technology: invention, innovation and diffusion.
The invention stage (1968-84): the Internet as a Cartesian network
Convincing recapitulation of the necessary conditions of the Internet; should a philosopher of computing know these protocols to the point of technical mastery, and in what sense can we articulate the need for, and the boundaries of, an adequate working knowledge of TCP/IP?
(58-59) Three technological factors were critical in determining the genesis of the Internet: (1) The packet switching technology, first implemented in 1968. . . . (2) The adoption, in 1982, of the Transmission Control Protocol (TCP) and Internet Protocol (IP), as the standard protocols for communication over the network. . . . (3) The implementation, in 1984, of a strictly hierarchical Internet naming system, known as DNS (Domain Name System).
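The second and third factors are still the Internet's plumbing, and the standard library exposes them directly. A minimal sketch of my own: DNS maps a hierarchical name to an IP address, over which TCP then carries a reliable byte stream:

```python
# DNS in one call: getaddrinfo asks the resolver to map a hierarchical
# name to an address suitable for a TCP (IPPROTO_TCP) connection.

import socket

def resolve(host, port=80):
    """Map a DNS name to an IP address via the local resolver."""
    infos = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    return infos[0][4][0]          # first result's (address, port) tuple

print(resolve("localhost"))        # 127.0.0.1 (or ::1 on IPv6-first systems)

# A TCP connection would then be opened with
# socket.create_connection((resolve(host), port)); packet switching and
# routing happen below this API, invisibly to the caller.
```

The strict hierarchy of the naming system is visible in the name itself: resolution proceeds from the rightmost label (the top-level domain) leftward, each level delegated to the name servers below it.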
The innovation stage (1984-95)
The diffusion stage (1995-)
(60) But the year which really signaled the end of the innovation stage was 1995, when the NSFNet (the network run by the National Science Foundation in the USA) reverted to being a research network, thus allowing the main USA backbone traffic to be routed through commercial interconnected network providers.
Compare the diffusion stage of the Internet to the Phaedrus, as if philosophers were caught unaware of the emergence of writing.
(61) Despite Clarke, McLuhan, Orwell or films like War Games, it seems that its appearance found most of us, and especially the intellectual community, thoroughly unprepared. In 1994, the first eight months of the New York Times On-Disc contained 240 occurrences of the word “Internet,” while there was none in the Philosopher's Index on CD-ROM. We do not know what to think of this new phenomenon, or how to deal with it. What is the Internet exactly? What can it be used for? And what will be the effects of such a radical revolution in the way we handle the world of information?
What is the Internet?
(61) In this chapter, we shall analyze the Internet more technically as the totality of three different spaces:
the infrastructure (the physical dimension)
the memory platform (the digital dimension)
the semantic space (the cyberspace dimension).
Internet as a physical infrastructure
(61) The Internet is an information carriage, consisting of a global mesh of computer networks.
Internet as a memory platform
Hayles argues that access, rather than location, is what matters now, although location with respect to the physical infrastructure affects access and therefore memory.
(62-63) The physical infrastructure implementing the common protocols makes possible a global memory platform, which results from the cohesion of all the memories of all the computers in the network. . . . That is to say that, at any given time, the memory platform has a particular extension, to be calculated in terabytes, which makes up an anisotropic space, where the amount of memory fully available to a user depends on his or her location.
Internet as cyberspace
A famous equation in my history (Ulmer's chora), which makes cyberspace different from other media forms: interactive dissemination rather than broadcasting, point-to-point telephonic communication, and so on; fitting that AI thrives in Internet cyberspace; relate to the Derrida archive as an incomplete model of embodied cognition.
(63) The totality of all documents, services and resources constitutes a semantic or conceptual space commonly called cyberspace. Cyberspace inherits from the memory platform a discrete, anisotropic, seamless nature, to which it add two further features: semi-ubiquity . . . Cartesian saturation. Formally speaking, cyberspace can then be defined thus: PHI, where PHI' is maximally close to (immediately reachable from) y and E(r)=R is devoid of information. The two features of semi-ubiquity and saturation are what makes possible the interactive dissemination of information as opposed to broadcasting.
Argues that Internet has grown beyond human control at individual, corporate, or national level, perhaps like the climate.
(65) We shall see in Chapter 5 that, once the fundamental process of structuring and de-materializing the world has taken place, it is reasonable to expect that forms of artificial intelligence will find cyberspace a suitable place in which to operate, and that it is not by chance that Webbots and digital agents are among the most successful applications of AI. . . . There are many commercial services on the Internet, like shops in a city but, again, the streets are free and one can walk wherever one wishes. . . . Nobody is running the system or will ever be able to control it in the future. When Bill Gates speaks of his plans to dominate the Internet, what he really means is that he wants to make sure that the new version of Microsoft Windows and its network will become the most popular interface and route for single users who wish to manage their own access to cyberspace through their personal computer.
How to characterize the growth of the Internet
Yahoo provides another interesting link to literature, to which I want to add electronic devices; the absence of Google from the list of search engines shows the age of the text.
(65-66) (1) The number of documents: Consider query-engines such as AltaVista, Lycos, WebCrawler or Yahoo (the last name represents an entertaining coincidence; in Gulliver's Travels Swift describes not only the mechanical “computer” but also a land in which a race of sensible and noble horses, the Houyhnhnms, embody reason and virtue, and their savages, the human Yahoos, have only the worst human qualities; the Yahoos have become digital agents).
What can the Internet be used for?
Enumerates remote control via telnet, file transfers via FTP, running applications like Java, email, and lastly web pages (but does not name HTTP).
(67) Let us therefore concentrate our attention on five typologies of communication that can be implemented through the network.
Email, the silent dialogue
(70) One exists in cyberspace only as a particular EA or, to paraphrase Quine, in cyberspace to be is to be the reference of a URL (uniform resource locator), and in this case one's URL is one's EA.
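The Quinean slogan can be made literal: a URL is a structured reference that the standard library can take apart. The webliography URL is taken from the preface quoted earlier; the email address is a hypothetical example of mine:

```python
# "To be is to be the reference of a URL": urlparse decomposes a URL
# into the scheme spoken, the DNS name locating the host, and the path
# identifying the resource on that host.

from urllib.parse import urlparse

url = "http://www.wolfson.ox.ac.uk/~floridi/webliography.htm"
parts = urlparse(url)

print(parts.scheme)    # 'http' - the protocol
print(parts.netloc)    # 'www.wolfson.ox.ac.uk' - the host's DNS name
print(parts.path)      # '/~floridi/webliography.htm' - the resource

# A mailto URL shows how an email address (EA) fits the same referential
# scheme, supporting the identification of one's URL with one's EA.
ea = urlparse("mailto:reader@example.org")
print(ea.scheme, ea.path)   # 'mailto' 'reader@example.org'
```

In cyberspace terms, the netloc locates a region of the memory platform and the path a point within it; the EA is just the degenerate case where the referent is a person's mailbox.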
(71-72) The new koine of the electronic agora is a kind of American English, often variously adapted depending on the nationality and education of the users. . . . This introduces us to the analysis of some dialogical features that often occur in an email message: emoticons, indexicals and citations.
Declares electronic communication a secondary orality, and discusses the virtual subject.
Emoticons can be of many different types and they are interesting in that they show that an email is clearly perceived as something half way between a written and an oral communication.
(73) Electronic communication clearly represents a secondary orality, one step beyond literacy but dependent upon it. So far, we have seen several features that assimilate email communication to an oral conversation. There are others that show the originality of the electronic “silent dialogue” in full.
(75) Ironically, we are really “persons” only in a virtual context (the Latin word persona originally meant “mask” and was used to refer to a specific character in a play, the dramatis persona), and our identity is all in that string of symbols that we call an email address. The physical individual, like a homunculus inside the virtual subject, can then take liberties he or she would never dream of IRL (in real life).
BBS and mailing lists: the electronic learned societies
(77) In other words, the gopher represented the Internet for an MS-DOS-based culture. Once the extraordinary potential of a hypermedia (hypertextual logic + multimedia contents) map of cyberspace had been grasped, the next step was the World Wide Web.
(77) The World Wide Web originated from CERN, the High-Energy Physics Laboratory in Geneva, where it was designed as a very powerful tool with which to disseminate and share constantly updatable, hypermedia information within internationally dispersed teams and support groups. For once, it was the indirect result of scientific needs, not of a military research project.
(78) On the Web, every document can be linked to any other local or remote document as long as the latter's URL is known.
(79) The success of the World Wide Web has been so dramatic that it has deeply affected our conception of the Internet. If the latter can be thought of as a new mass medium and a commercial tool this is only because of the Web's possibilities.
The future of the human encyclopedia in the third age of IT: Frankenstein or Pygmalion?
How we think may be affected, as the launch of comparative media studies, texts and technology, and other new disciplines attests; Floridi determines there are at least eleven issues and attempts to rank their importance, with digital discrimination oddly ranked least important because it is not focused on the social life of information.
(80) Even the way we think may be affected in the long run, for relational and associative reasoning is becoming as important as linear and inferential analysis, while visual thinking is once again considered to be at least as indispensable as symbolic thinking.
(81) My impression is that there are at least eleven principal issues, concerning the growth of an information and communication network, worthy of attention. Let us have a look at them briefly, in tentative order of increasing importance.
Disappearance of the great compilers
Emergence of the computerized scholar
Stored knowledge > knowledge accessible
Knowledge accessible >
(83) A critical form of censorship is therefore a precondition for the individual mind to be able to survive in an intellectual environment in which exposure to the infosphere is far greater than ever before.
“404 document not found” and the forgetful memory
Paradox of digital prehistory; see others on Internet Wayback Machine.
(83) Our digital memory seems as volatile as our oral culture but perhaps even more unstable, as it gives us the opposite impression. This paradox of a digital “prehistory” will become increasingly pressing in the near future.
No epiphany on paper
The new language of the
(84) Elements of ICT will have to become part of the minimal education of any human being, if the freedom of information is to remain a universal right.
Intellectual space or polluted environment?
Produmers: his take on produsers; both human and machine produmers need to practice critical censorship.
(85) On the Internet, the relation between the producer and the consumer of information (when there is any distinction at all; today we should rather speak of the new figure of “produmer”) tends to be direct, so nothing protects the latter from corrupt information. . . . Unless responsible individuals, as well as academic, cultural institutions or recognized organizations, provide some forms of guides, filters, selective agents and authoritative services (e.g. rating services, quality-control services), we may no longer be able to distinguish between the intellectual space of information and knowledge and a highly polluted environment of junk mail and meaningless data.
(85) While we entrust ever vaster regions of the human inheritance to the global network, we are leaving the Internet itself in a thoroughly anarchic state.
(86) The retrieval systems in existence are insufficient to guarantee that, in a few decades, organized knowledge will not be lost in a labyrinth of millions of virtual repositories, while efforts and funds are wasted in overlapping projects. . . . If global plans are disregarded or postponed, and financial commitments delayed, the risk is that information may well become as easy to find on the network as a needle in a haystack.
Chapter ends with a call for a virtual national library system, like Berners-Lee's semantic web; he did not really develop Pygmalion or Frankenstein rhetorically.
(86-87) in the same way that the invention of printing led to the constitution of national copyright libraries that would coordinate and organize the production of knowledge in each country, so the Internet is in need of an info-structure of centers which, through their coordinated efforts, can fulfill the following five tasks. . . . What I am suggesting is that the Internet is like a new country, with a growing population of millions of well-educated citizens, and that as such it does not need a highway patrol, but will have to provide itself with info-structures like a Virtual National Library system, which could be as dynamic as the world of information, if it wants to keep track of its own cultural achievements in real time, and hence be able to advance into the third millennium in full control of its own potential. . . . Depending on how we meet the challenge, future generations will consider us as new Pygmalions or as old Frankensteins.
The digital domain
Infosphere, databases and hypertexts
The paradox of the growth of knowledge: from the chicken and the egg to the needle in a haystack
Impressive series of quotations from Augustine to Nietzsche, crystallizing on Faust as does Kittler.
(91-92) We live in a time when the loss of information is not an option, but occurs, one hopes, only as a rare and unfortunate disaster. During the Middle Ages it was simply a natural occurrence that people had to put up with. . . . Printed books soon started to produce a boundless sea of knowledge and transformed information into a territory that required precise maps. It was going to be a region in which only expert people would be able to find their way.
Biological perspective of fright reaction.
(95) The flowering of a world of learning through the long period from Francis Bacon's motto plus ultra to Kant's illuministic reply sapere aude, had brought about Plato's revenge. New knowledge could obviously be found; centuries of successful accumulation prove it unequivocally. Yet the new world represented by the human encyclopedia had become as uncontrollable and impenetrable as the natural one, and a more sophisticated version of Meno's paradox could now be formulated. . . . Knowledge, as history and culture, is a strictly human business. It was left to the modern mind to discover that it can get lost, frightened and overwhelmed even in its own intellectual environment.
Insert the cathedral-versus-bazaar response to this Balkanizing explosion of nonhuman intelligence emerging from operational infospheres.
Admission: I have been normalizing Floridi's spellings, such as specialized for specialised, yet here it feels strange correcting an experience original not even to Floridi, who is Italian, but to that nonexistent, virtual subject perspective shared by all secondary readers, including us.
(96) The pyramid became a cathedral, with its secret rooms, its high peaks, nave and aisles, proving that the printed medium could cope with the extension of knowledge only vertically, by producing more and more specialized guides and then further guides of guides, in an endless hierarchy. . . . The empire of knowledge eventually found that one way to survive its own growth was by breaking down into smaller kingdoms of disciplines. The Balkanization of knowledge had begun.
“Everything must be transformed into an Encyclopaedia” (Novalis)
The managerial function, in place of the designed function, is the crucial impact of ICT on human beings.
(97) The managerial function of computers, rather than their immediate mathematical applications, has represented the real revolution in our culture, and it is only by relating it to the history of the infosphere that the impact of ICT on our way of living and thinking can begin to be fully appreciated.
Lyotard predicts that databases become nature, valued for their managerial abilities rather than for the model of thinking implied in their operation; connect to Derrida's archive.
(97) ICT is, then, the cultural answer to the terrific growth undergone by our system of knowledge since the Renaissance. . . . Jean-François Lyotard ventured the following forecast in 1979 (see Lyotard 1984): [quoting] Databases are tomorrow's encyclopedia. They exceed the capacities of any user. They represent "nature" for the postmodern humanity.
(98) In search of its own individuality, autonomy and stability, the single mind can firmly establish itself at the crossroads of Being and Culture only by epistemically emancipating itself both from the world of things surrounding it and from the infosphere it inherits from the past, including the world of previously codified knowledge. However, in thus establishing its own individual autonomy, each single mind cannot help contributing to the growth of the infosphere, thus also generating new knowledge to which future generations will need to react.
(99) ICT is a technology that can make “history”, the time of the written word, so open and light as to enable us to interact with other people and the infosphere both “anachronistically”, as if we were in a prehistorical time (Plato), and historically, i.e. in Aristotle's sense of cumulative investigation.
What is a database system?
(100) The whole database itself is then expected to be integrated . . . well-ordered . . . sharable . . . variably accessible . . . software independent . . . upgradable.
of database systems
(104) The essential elements of the relational data model were devised in 1970 by Ted Codd, and since the 1980s relational DBMS have become the dominant standard in corporate computing. . . . The common fields, used to link the data of a table T1 to the data of a table T2, are known as primary keys in T1 and foreign keys in T2, and at least in theory, a well-designed and fully normalized relational database should store only such keys more than once, while all the other data should be entered only once. In our simple example, primary keys could be the titles of Sextus' works.
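Codd's primary/foreign key arrangement can be sketched in a few lines. The tables and sample rows below are my own toy illustration in the spirit of Floridi's Sextus example (not his actual data), using Python's built-in sqlite3:

```python
import sqlite3

# In-memory database: work titles serve as primary keys in T1 (works)
# and as foreign keys in T2 (passages), echoing Floridi's example.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE works (
    title TEXT PRIMARY KEY,   -- primary key in T1
    genre TEXT)""")
con.execute("""CREATE TABLE passages (
    work_title TEXT REFERENCES works(title),  -- foreign key in T2
    book INTEGER,
    excerpt TEXT)""")
con.execute("INSERT INTO works VALUES ('Outlines of Pyrrhonism', 'epistemology')")
con.execute("INSERT INTO passages VALUES ('Outlines of Pyrrhonism', 1, 'Scepticism is an ability...')")

# In a fully normalized design only the key is stored twice; every other
# datum is entered once, and a join reunites them at query time.
row = con.execute("""SELECT w.title, w.genre, p.book
                     FROM works w JOIN passages p
                     ON w.title = p.work_title""").fetchone()
print(row)  # ('Outlines of Pyrrhonism', 'epistemology', 1)
```

This is a sketch of the normalization principle only; a real schema would use surrogate integer keys rather than titles.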
(105) Here, it may be useful to distinguish between two kinds of software:
Retrieval, concordance, parsing are machine operations engaging human texts.
. . . Thus flexible software allows the user to transform her hard disk into a flexible and useful database and this, by the way, represents one of the main advantages of word-processing one's own . . .
(105) (2) Concordance software and parsers: Concordancers are applications that allow the user to study a text by reorganizing it independently of its sequential order, according to analytic criteria such as combinations of keywords, word-forms or concepts.
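A minimal concordancer of the kind described (a keyword-in-context listing that reorganizes a text independently of its sequential order) might look like this sketch; the function name and sample text are mine:

```python
import re

def concordance(text, keyword, width=25):
    """Keyword-in-context (KWIC) listing: pull every occurrence of a
    keyword out of its sequential order, with surrounding context."""
    lines = []
    pattern = r'\b%s\b' % re.escape(keyword)
    for m in re.finditer(pattern, text, re.IGNORECASE):
        left = text[max(0, m.start() - width):m.start()]
        right = text[m.end():m.end() + width]
        lines.append(f"{left:>{width}}[{m.group()}]{right:<{width}}")
    return lines

sample = ("Knowledge, as history and culture, is a strictly human business. "
          "It was left to the modern mind to discover that knowledge can get lost.")
for line in concordance(sample, "knowledge"):
    print(line)
```

Real concordancers add lemmatization and word-form queries; this shows only the core reorganization step.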
Data, information and knowledge: an erotetic approach
Erotetic approach: a logic of questions and answers as an ontological and epistemological scheme that structures reality in terms of levels or layers trending from the common to the exceptional: datum, information, knowledge.
Attaching machine semiosis to queries (for Floridi, likely database queries) prepares for a remediation of the Platonic critique of writing by acknowledging the shared basis of all text, then suggesting unthought digital affordances.
From the point of view of our erotetic model, a datum can then be defined as an answer without a question: 12 is a sign that makes a difference, but it is not yet informative, for it could be the number of the astrological signs, the size of a pair of shoes or the name of a bus route in London, we do not know which. Computers certainly treat and "understand" data; it is controversial whether there is any reasonable sense in which they can be said to understand information. . . . To become informative for an intelligent being, a datum must be functionally associated with a relevant query. . . . In our erotetic model, information becomes knowledge only if it is associated with the relevant explanation of the reasons why things are the way they are said to be. . . . We need to remember that information is a common phenomenon, knowledge a rare exception.
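The datum/information/knowledge progression of the erotetic model can be mocked up as nested structures; the class names and the example query are my own labels, not Floridi's:

```python
from dataclasses import dataclass

# Toy rendering of the erotetic model: a datum is an answer without a
# question; information functionally associates the datum with a relevant
# query; knowledge adds the explanation of why things are as they are.
@dataclass
class Datum:
    value: str        # "12": a sign that makes a difference, not yet informative

@dataclass
class Information:
    datum: Datum
    query: str        # the relevant question the datum answers

@dataclass
class Knowledge:
    information: Information
    explanation: str  # the reasons why things are the way they are said to be

d = Datum("12")
i = Information(d, "How many astrological signs are there?")
k = Knowledge(i, "The zodiac divides the ecliptic into twelve equal segments.")
print(k.information.query)  # the bare "12" only informs relative to this query
```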
(107) The analogical document without the digital is inflexible and risks becoming a monotonous parrot, Plato docet. But the digital, freed from the iron cage of the analogical, is spineless and can often deafen the understanding with its overwhelming noise, as shown by the Internet. . . . However, training ourselves in the proper use of digital resources is only half the solution. There is also an increasing need to produce conceptual tools that combine the solidity of the analogical with the openness of the digital, mental-energy-saving resources which may turn out friendly towards our intellectual habitat, green intellectual tools, as it were.
The hyperbolic space of the infosphere and the fifth element
Interesting argument reaching an infosphere model similar to mine with electronic devices; the absence of statistical applications in typical textbases is symptomatic of a conceptual vacuum.
(108-109) A database may contain four types of data: . . . primary data . . . metadata . . . operational data . . . derivative data. . . . In all these examples, we have made the infosphere speak about itself through a quantitative and comparative analysis. Unfortunately, textbases are still published as reference tools only, and only occasionally do they make possible some elementary ideometric analyses (for example, they never include statistical applications).
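An elementary ideometric analysis of the kind Floridi finds missing from textbases (derivative data computed statistically over primary data) can be sketched as follows; the tiny "textbase" and the type/token statistic are illustrative assumptions of mine:

```python
from collections import Counter

# Primary data: the texts themselves (a toy textbase, not a real corpus).
textbase = {
    "Meditations": "I think therefore I am I doubt",
    "Monadology":  "the monad is a simple substance the monad",
}

# Derivative data: word frequencies, produced by quantitative analysis
# of the primary data rather than read off from any single passage.
derivative = {title: Counter(text.lower().split())
              for title, text in textbase.items()}

# A simple comparative statistic across the textbase: type/token ratio.
for title, counts in derivative.items():
    ratio = len(counts) / sum(counts.values())
    print(title, round(ratio, 2))
```

The point of the sketch is only that such derivative data are "about the infosphere": they answer questions no one posed when the primary texts were entered.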
Cyberspace as ether due to isomorphism to combinations of data types percolating within databases; also a source, a place or space, a location, a support from which machine cognition emerges as epiphenomenal field effects. Does comprehension really depend on our ability to picture the twelve-sided shape derived from his formula?
(109-110) Since to each type of datum corresponds a type of information and hence a type of knowledge, we now have twelve forms of contents. The infosphere consists of a countless number of DIKs. Is there any geometrical conceptualization of this model? To begin with, we may compare each DIK to a dodecahedron (a polyhedron made of twelve regular pentagons). . . . We have seen that the infosphere is a conceptual environment in which there can be no empty space, either within or without (a dataless infospace is a contradiction, as the concept of unextended empty space was for Descartes). This means that the DIKs fully tessellate the infosphere (fill the n-dimensional space in the same way as stones may cover a pavement). . . . There are five regular polyhedrons, and in Platonist literature four of them were associated with the structure of the four natural elements (earth = cube, fire = tetrahedron, air = octahedron, water = icosahedron) while the fifth, the dodecahedron, was sometimes interpreted as the geometrical structure of the fifth element. What some Platonists taught was ether (the source is Epinomis 981b-e, but see also Timaeus 55c and 58d), we now have defined as DIK contents. Metaphorically, we can define cyberspace as the ethereal zone.
The aesthetic and the ontological interpretation of databases
(110) The aesthetic approach can be limited by its naïve realism (in which only physical entities and their perceptible properties are genuinely and authentically real) and undervalues the crucial importance of procedural and derivative DIKs. It mistakes the infosphere for a copy of a copy, instead of considering it, more correctly, the constitutive blueprint of any physical implementation, and hence what truly determines the shaping of any future reality.
(110) This is the view shared by a second approach to databases which, still in Platonic terms, we may connote as ontological or hyperrealist. . . . From this perspective, the infosphere is the authentic reality that underlies the physical world, and as such it has a normative or a constructionist function. . . . Thus, the infosphere comes first in the logical order of realities, and it is the necessary conceptual environment that provides the foundation for any meaningful understanding of the surrounding world, one of its many possible implementations.
(111) There was a direct correspondence between ordering principles and possible questions. Computerized databases have changed all this, for it is now possible to query the digital domain and shape it according to principles that are completely different from those whereby the primary data were initially collected and organized.
Derivative data from ideometrical analysis versus fortuitous deformations as the contribution of computing to human philosophy, positioning Floridi among theorists at the level of McGann, seeking to stake out the territory of the philosophy of information versus that of the latter; academic philosophy crossing texts and technology territory, enumerating scientometric historiography, lexicography, stylometry, linguistic statistics, and so on, for revealing information about the infosphere, which has also become a way of answering humanities questions. Infosphere as the difficult-to-examine media ecology of Kittler.
What we may call ideometry: the critical study of such significant patterns resulting from a comparative and quantitative analysis of the extensional field of codified information, that is, clusters of primary data, metadata and procedural data from data-banks, textual corpora or multimedia archives used as extensional sources.
(112) Ideometry is perfectly in line with the development of the history of thought, which through time becomes progressively more and more self-reflective and metatheoretical. As a cluster of multidisciplinary methods, it has become popular in many humanistic, linguistic and social disciplines since the 1960s.
Information about the infosphere research goal of ideometric analysis.
(113) We do not convert printed texts into electronic databases in order to read them better or more comfortably. For this task the book is and will remain unsurpassed. We collect and digitize large corpora of texts in order to be able to subject them to ideometric analysis and extract data they contain only on a macroscopic level. In this way, we can reach further information about the infosphere itself. . . . Corpora of electronic texts and multimedia sources are the laboratory for ideometrical analysis.
The commodification of information and the growth of the infosphere
(113) Information, as well as the processes by which it is manufactured, can easily be commodified. Why this is so is often a matter of disagreement, owing to widespread confusion between five different phenomena: digitalization, cross-medialization, disembodiment, reification and hence commodification of information.
(114) These five stages of development explain why it is a mistake to interpret the information society as a communication society, as Negroponte seems to be suggesting from a massmediological perspective (Negroponte 1995): the transformation is much more radical, because it represents an ontological evolution of the post-industrial society into a manufacturing society in which what is manufactured is no longer material res extensa but consists of immaterial information.
Rich and poor in the information economy
Constructive semiosis symptomatic of ontological rather than aesthetic interpretation of databases.
(115) What really matters, in the information society, is education and the full mastering of the several languages that ontically constitute the fifth element, from one's own natural idiom to mathematics, from English to ICT, since languages are the very fabric of the infosphere. . . . Yet meaningful signs (including those belonging to natural languages) do not clothe a pre-existing infosphere, they constitute it, and a long intellectual tradition, going from von Humboldt to Chomsky, has called attention to this fundamentally constructionist nature of semiotic codes, whereby languages ontologically incorporate reality, rather than just describing it.
ICT practical problems and computer ethics
Textual analysis: a constructionist approach
Floridi offers literalism to computing on a platter of philosophies of information rather than computing or programming, for a constructionist approach to textual analysis; bring computing into humanities teaching by soldering circuits and programs. For example, literary texts as conceptual mechanisms, read visually while public radio sings Jingle Bells in Spanish, at the boundary of human perception (reading and hearing concurrently). My function is to make [clearly] daily instantiation good another 10,000 years of 100/1000 is the Mayan calendar. I am not sure what he means by "with Lego." Is that the plastic toy or some programming language? It is easy to completely miss the point by inferring the wrong background information. It is through this deep analysis that electronics and the humanities interoperate.
(117) Interpreting a literary text from a constructionist perspective means, therefore, to be able to normalize and categorize its DIK-elements (what they are and how they are to be re-modelled or re-conceptualized as related sets of properties), to discover their mutual relations, their internal processes, their overall structure, and to provide a textual model in which it becomes evident how far the purposes of the text-tool are fulfilled. If programs can be copyrighted (see for example the USA 1976 Copyright Act, amended by the Congress in 1980), then conversely literary texts can also be seen as conceptual mechanisms, complex systems that may exhibit holistic and emergent properties or have a high degree of fault tolerance (do the Meditations collapse because of the Cartesian circle?) and redundancy, and that may be broken after all, or in need of repair. For a constructionist, rational criticism is a way of showing how far a text can support external and internal solicitations, and reading is like working with Lego.
Hypertext as information retrieval system
Long Bush quote compared to Nelson summary because he defines many key terms and concepts, including a direct link to the cyborg in the memory supplement and associative indexing, which links to Barthes, literally instantiating the arbitrariness of signification; makes the important point that information moves toward the user instead of forcing the user to embrace the machine, which solves the retrieval problem of enormous masses of information like books in libraries, making all information ready-at-hand, easily recalled from long-term memory via external machinery, not just static signs.
(118) [quoting Bush] A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory. . . . It affords an immediate step, however, to associative indexing, the basic idea of which is a provision whereby any item may be caused at will to select immediately and automatically another.
(119) The retrieval problem is a communication problem, which can be solved by devising an information system that no longer requires the user to move towards information (towards the notes at the bottom of a page or at the back of a book, towards the document referred, towards the library and so forth) but, rather, organizes information so that whatever is needed can be made immediately and constantly available, coming towards the reader, as it were.
A standard definition of hypertext
Barthes lexia plus hyperlinks and anchors plus interactive dynamic interface machinery.
A text is a hypertext if and only if it is constituted by 1. a discrete set of semantic units which, in the base cases, offer a low cognitive load. . . . 2. a set of associations. . . . 3. an interactive and dynamic interface.
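The three-clause definition invites a toy data-structure rendering: lexias as discrete semantic units, associations as links, and a traversal function standing in for the interactive interface (all names here are mine, not Floridi's):

```python
# Lexias: discrete semantic units with low cognitive load (clause 1).
lexias = {
    "L1": "Hypertext is constituted by discrete semantic units...",
    "L2": "...bound by a set of associations (links and anchors)...",
    "L3": "...and presented through an interactive, dynamic interface.",
}
# Associations: directed links between lexias (clause 2).
links = {"L1": ["L2"], "L2": ["L3", "L1"], "L3": []}

def traverse(start, choose=lambda options: options[0], limit=5):
    """Stand-in for the interface (clause 3): follow reader-chosen
    links, so the reader's choices, not page order, determine the path."""
    path, node = [start], start
    while links[node] and len(path) < limit:
        node = choose(links[node])
        path.append(node)
    return path

print(traverse("L1"))  # ['L1', 'L2', 'L3']
```

Passing a different `choose` function models a different reader; the same lexias yield different texts.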
(120) (1) The electronic fallacy: hypertext is a uniquely computer-based concept.
Argues that digital electronics is conceptually irrelevant for understanding hypertexts, contrary to platform studies and theorists of machine embodiment such as Bogost, Kirschenbaum, Chun.
Digital electronics, although practically vital for their implementation, is by and large conceptually irrelevant for their understanding.
(121) (2) The literary fallacy: hypertext began primarily as a narrative technique and hence it is essentially a new form of literary style.
(121) The expressionist fallacy: hypertext has arisen as and should be considered primarily a writing-pushed phenomenon.
Readerly traversal of the infosphere biases hypertext as a consumption rather than a designer phenomenon: does this support the claim that most philosophy of computing arises from phenomenology, like Clark's extended mind?
Danger of mental laziness ignoring style and excellence of print era literary composition: good example is textbook with many breakout sections.
Overall, hypertext is therefore more correctly seen as primarily a reading-pulled phenomenon, rather than a writing-pushed one. This is what makes it a truly postmodern "technique": the empowered reader, rather than the writer, determines the format of the text, and in this case the stress is not only on the Cartesian-Kantian problem of how we should construct the future space of information (textuality as the result of a recording/writing problem), but also on how we should be moving within the now-constructed infosphere (hypertextuality as the result of a retrieving/reading problem).
(123) In its normative aspect, a writing-centered conception of hypertext also runs the risk of enhancing a certain mental laziness. . . . The great contribution of hypertext to philosophical writing seems to be that of having made clear that now authors can choose to be linear. On the one hand, paraphrasing Beckett, I write linearly "pour m'appauvrir", "to impoverish myself": it is the minimalist aesthetics of a Bauhaus type of geometrical linearity. On the other hand, the availability of hypertextual writing is no justification for baroque, disconnected, loose, overparatactic or disordered contents.
(123) (4) The “politically correct” fallacy: with hypertext, the reader is in complete control of whatever contents or functions are available and hence is no longer subject to the writers' authority.
(124-125) In practice, the hypertext author can make more than one semantic universe available for the reader, but certainly not every universe. The reader's navigation is willfully constrained by the reader as well as by the writer around an axial narrative, and it is usually very clear what is text and what is paratext.
Practical limitations of degrees of freedom and possibilities of creative interaction; Aarseth and Ryan make similar points with myth of Aleph.
The degree of creative freedom that hypertexts offer to the reader remains practically limited.
(125-126) (5) The obsession with the rhetoric of syntax: hypertext is non-linear writing and challenges the bookish assumption that contents have to be presented in a linear fashion.
Linearity defined as either syntactically sequential, semantically sequential, transmitted/communicated serially, or accessed/retrieved serially.
(126) An information system can already be connoted as linear if it satisfies at least one of the following conditions.
Simultaneous token transmission of multiple signifier types as the preferred form of multi-level linearity, rather than a disruption of the one-at-a-time unit processing characteristic of traditional human intelligence, of which my favorite example is Aquinas.
The reader still perceives (or tries to perceive) items serially, with a beginning, an end, a linear development and a sense of unity. Strictly speaking, then, no medium can truly transcend all forms of linearity without seriously affecting its intelligibility. . . . Not even hypermedia can transmit more than one token of a specific type of signifier per time without creating confusion in the receiver, but they can transmit tokens of more than one type of signifier simultaneously, a text with images and music, for example, and this kind of multi-level linearity may well be taken to be a way of transcending linearity also in the sense of (c) [t/c serial].
(127) (6) The mimetic fallacy: hypertext mimics the associative nature of the human mind and therefore is better suited to its activities.
(127) (7) The methodological fallacy: hypertexts will replace printed books.
Hypertext: the structure of the infosphere
Ontic meaning: computer technology is all about ontics, instantiating poststructuralist and postmodern theoretical ideas.
Four good points about externalization of memory, touching on science of timelessness (Castells).
First, we can now abandon the common view that hypertext (the conceptual structure, not the actual products) is simply an epistemological concept. . . . As the system of relations connecting the DIKs, hypertext is the very backbone of the infosphere and significantly contributes to its meaningfulness in an ontical sense, i.e. by helping to constitute it.
(129) Second, the space of reason and meaning – including the narrative and symbolic space of human memory – is now externalized in the hypertextual infosphere, and this brings about four more consequences concerning the rhetoric of spatiality.
Individual person in cyberspace as an evolving hypertext with privileged access, related to the Humean theory of personality; supports involvement of the technological nonconscious to form Clark's extended cognition and Hayles's cognitive-embodied processes. Floridi suggests the new subjectivity reverses the specialist, compartmentalizing trend of modernity for a managerial model similar to Jenkins and Hayles, by mitigation through automation, the memex memory augmentation, the original promise of living writing from the Phaedrus; diatropic, horizontal interdisciplinarity compares well to Hayles's and Jenkins's conclusions.
(130) Finally, it is important to realize that a philosophy of mind may investigate the nature of the self, and the consequent problems of identity and dualism, from the information perspective just introduced, by developing a Humean theory of personality in terms of narrative development of a region of the infosphere. . . . The individual person . . . becomes a unique element in the infosphere, a steadily evolving hypertext, which is kept together and remains identifiable through its evolution thanks to the special access and awareness it enjoys with respect to its own states, the uniformity of its own reflective narrative.
Conclusion: a Renaissance mind?
Can digital natives stressing epistemic managerial functions implemented by ICT develop Renaissance minds (like Floridi himself) if an essential component is also close reading, explored in Hayles How We Think?
(130-131) It is to be hoped that one of the direct effects of the transformation both in the ontology and in the organizing logic of the infosphere may be an inversion of a secular tendency towards the diffusion and reinforcement of specialization. . . . Thus, the necessity of stressing the importance of the epistemic managerial functions implemented by ICT rests also on the crucial role that the latter may have in weakening the concept of compartmentalization. We have seen that the age of the book, providing a rigidly structured context, invited vertical specialization, not only in science but also in the humanities. On the other hand, the computer as ordinateur – a word that in the eighteenth century signified the Supreme Being, recalling the Dieu horloger of Voltaire – may now promote forms of horizontal interdisciplinarity and multi-disciplinarity which could soon start bearing fruit. The more open and flexible the whole space of knowledge becomes to the individual mind, the more "diatropic" its approach can be. . . . This is one of the reasons why the computer age has also been described as a return of the Renaissance mind.
Artificial intelligence: a light approach
Floridi, Luciano. Philosophy and Computing: An Introduction. London: Routledge, 1999. Print.