Notes for Neal Stephenson, In the Beginning . . . was the Command Line

Key concepts: blinking twelve problem, cruft, graphical user interface, interface culture, mediated experience, mindshare dominance, operating system, source code, technosphere, temporal arbitrage.

Related theorists: Mark Bauerlein, Bill Gates, Steve Jobs, Steven Johnson, Douglas Rushkoff, Richard Stallman, Linus Torvalds, Yuri Takhteyev, Sherry Turkle.

Stephenson offers a subjective essay grounded in years of using and programming many personal computer operating systems; as a science fiction writer sensitive to metaphors, he stakes a claim as a philosopher of computing.

(3) What the hell is going on here? And does the operating system business have a future, or only a past? Here is my view, which is entirely subjective; but since I have spent a fair amount of time not only using, but programming, Macintoshes, Windows machines, Linux boxes, and the BeOS, perhaps it is not so ill-informed as to be completely worthless. This is a subjective essay, more review than research paper, and so it might seem unfair or biased compared to the technical reviews you can find in PC magazines. But ever since the Mac came out, our operating systems have been based on metaphors, and anything with metaphors in it is fair game as far as I'm concerned.


Analogy between cars and operating systems, with major vendors as dealerships.

(5) The analogy between cars and operating systems is not half bad, and so let me run with it for a moment, as a way of giving an executive summary of our situation today.
(5) Imagine a crossroads where four competing auto dealerships are situated.


Written language, as strings of phonetic symbols, was the easiest thing to convert to bits and the first way to communicate with a computer, the technique having been developed with Morse code telegraphy, though people raised on GUIs may be surprised.

(12) People who have only interacted with computers through graphical user interfaces such as the MacOS or Windows—which is to say, almost everyone nowadays who has ever used a computer—may have been startled, or at least bemused, to hear about the telegraph machine that I used to communicate with a computer in 1973. . . . Written language is the easiest of all because, of course, it consists of strings of symbols to begin with. If the symbols happen to belong to a phonetic alphabet (as opposed to, say, ideograms), converting them into bits is a trivial procedure, and one that was nailed, technologically, in the early nineteenth century, with the introduction of Morse code and other forms of telegraphy.
(12) We possessed a human/computer interface a hundred years before we had computers.

Before Macintosh introduced GUI, Victorian technologies used to communicate with computers.

A book with ellipsis built into its title that provides connecting arguments for chapter one.

(14) In effect, we still used Victorian technologies to communicate with computers until about 1984, when the Macintosh was introduced with its Graphical User Interface.


Getting a sense of the computer point of view through examining the HTML source code of browser content or tcpdump.

(15) The quickest way to get a taste of this is to fire up your web browser, visit a site on the Net, and then select the View/Document Source menu item.
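The same peek behind the curtain is available from the command line itself. A minimal sketch (the file name is illustrative; the commented-out lines assume network access, and tcpdump additionally assumes root and an interface name):

```shell
# Write a tiny page, then view its source the way View/Document Source would:
printf '<html><body><h1>Hello</h1></body></html>\n' > page.html
cat page.html

# Against a live site, fetch the raw HTML the browser normally renders away:
# curl -s https://example.com | head -n 20

# Or drop a level lower and watch the packets themselves (root required):
# tcpdump -i eth0 -A 'tcp port 80'
```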


Mac and Windows users appear to reflect distinct artistic and capitalist ideologies, but all become accustomed to using GUIs.

(32) Mac partisans want to believe in the image of Apple purveyed in those ads, and in the notion that Macs are somehow fundamentally different from other computers, while Windows people want to believe that they are getting something for their money, engaging in a respectable business transaction.
(32) The upshot is that millions of people got accustomed to using GUIs in one form or another.


After distinguishing closed from free software, concludes that the operating system market is a death trap, dependent on a codependent relationship between customers who want to believe and vendors adding features to retain loyalty.

(40) The operating system market is a death trap, a tar pit, a slough of despond. There are only two reasons to invest in Apple and Microsoft. (1) Each of these companies is in what we would call a codependency relationship with their customers. The customers Want To Believe, and Apple and Microsoft know how to give them what they want. (2) Each company works very hard to add new features to their OSes, which works to secure customer loyalty, at least for a little while.


Command line offered by Unix, now GNU/Linux (Stallman long at work on GNU before Torvalds wrote the first kernel version), separates the OS from the GUI; text-only layer of the technosphere, a privileged human-machine interface that can be widely enjoyed through floss.

GUI understood as vast suite of code in addition to rest of old-fashioned operating system functions.

(41) Unix is the only OS remaining whose GUI (a vast suite of code called the X Window System) is separate from the OS in the old sense of the phrase. This is to say that you can run Unix in pure command line mode if you want to, with no windows, icons, mouses, etc. whatsoever, and it will still be Unix and capable of doing everything Unix is supposed to do. But the other OSes—MacOS, the Windows family, and BeOS—have their GUIs tangled up with the old-fashioned OS functions to the extent that they have to run in GUI mode, or else they are not really running.
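That separation is still visible on a modern GNU/Linux system, where the GUI is just another program launched on top of the text-mode OS. A sketch (the systemd target names are an assumption of current distributions; both commands shown for illustration, not for running blind):

```shell
# Boot to a plain text console, no GUI at all (requires root):
systemctl set-default multi-user.target

# Later, from that command line, start the X GUI as an ordinary program:
startx
```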

Reliance on GUI complicates the operating system; implies by contrast that an effective, minimal, text-only command line keyboard interface benefits humans.

(41) When OSes are free, OS companies cannot compete on price, and so they compete on features. This means that they are always trying to outdo each other writing code that, until recently, was not considered to be part of an OS at all: stuff like GUIs.

Interesting argument that integrating browser into OS adds value to salable product.

(42) If browsers are free, and OSes are free, it would seem that there is no way to make money from browsers or OSes. But if you can integrate a browser into the OS and thereby imbue both of them with new features, you have a salable product.

Ethical consideration to limit territorialization of operating system; apply to smartphone evolution.

(42) The real question is whether every new technological trend that comes down the pike ought to be used as a crutch to maintain the OS's dominant position.

Technosphere like biosphere between inhospitable extremes of concretization and impractical possibility as description of existential milieu of operating systems; compare also to theorization by Thrift, Kitchin, and Dodge.

(43) Companies that sell OSes exist in a sort of technosphere. Underneath is technology that has already become free. Above is technology that has yet to be developed, or that is too crazy and speculative to be productized just yet.

Internet provides fossil record of prior versions of operating systems and applications, the concretized lower bounds of technosphere.

(43) The fossil record—the La Brea Tar Pit—of software technology is the Internet. Anything that shows up there is free for the taking (possibly illegal, but free).

Companies avoid tar pit by using research efforts to probe upper bounds of technosphere; everyday users can learn about the milieu by experimenting with alternatives to the defaults like Linux and BeOS.

(44) The danger is that in their obsession with staying out of the fossil beds, these companies will forget about what lies above: the realm of new technology. In other words, they must hang on to their primitive weapons and crude competitive instincts, but also evolve powerful brains. . . . I have learned much more about Microsoft by using the Linux operating system than I ever would have done by using Windows.

Microsoft makes money by temporal arbitrage, betting on time of new technologies gaining market share then becoming free.

(45) Microsoft is making money by taking advantage of differences in the price of technology in different times. Temporal arbitrage, if I may coin a phrase, hinges on the arbitrageur knowing what technologies people will pay money for next year, and how soon afterwards those same technologies will become free.

Philosophy addresses long runs, like when all copyrights expire; noting that major corporate players hire philosophers and creative engineers, jumping among temporary corporate seminars seems like a place for philosophers of computing to operate.

Philosophical argument that seriously considering obligation to develop GUI with traditional functions of OS addressed to Microsoft and Apple.

(45) The question is whether this makes sense in the long run. If Microsoft is addicted to OSes and Apple is to hardware, then they will bet the whole farm on their OSes so that customers will not switch to the cheaper alternatives, and maintaining the image that, in some mysterious way, gives those customers the feeling that they are getting something for their money.


Stephenson claims inspiration observing shopping habits and Disney World, though credits Steven Johnson, who already wrote a book called Interface Culture.

(47) Directly in front of me [at Main Street USA] was a man with a camcorder. . . . Rather than go see a real small town for free, he had paid money to see a pretend one, and rather than see it with the naked eye, he was watching it on television.

Preference for mediated experiences fits postmodern perspective; Stephenson ties to success of GUIs as does Turkle, suggesting that Disney should create an operating system.

(47) Americans' preference for mediated experiences is obvious enough, and I'm not going to keep pounding it into the ground. . . . But it clearly relates to the colossal success of GUIs, and so I have to talk about it some. Disney does mediated experiences better than anyone. If they understood what OSes are, and why people use them, they could crush Microsoft in a year or two.

Argues the word is the only nonfungible method of encoding thoughts because it is a conversation, in contrast to visual representations, as the command line brings you closer to the machine; contrast to Hayles and others presenting examples of literature that involve typography itself as visual representation.

(50) Disney is in the business of putting out a product of seamless illusion—a magic mirror that reflects the world back better than it really is. But a writer is literally talking to his or her readers, not just creating an ambience or presenting them with something to look at. Just as the command line interface opens a much more direct and explicit channel from user to machine than the GUI, so it is with words, writer, and reader.
(50) The word, in the end, is the only system of encoding thoughts—the only medium—that is not fungible, that refuses to dissolve in the devouring torrent of electronic media.

Disney visitors not interested in absorbing ideas from books through written media; compare to Bauerlein.

(51) If I can risk a broad generalization, most of the people who go to Disney World have zero interest in absorbing new ideas from books. This sounds snide, but listen: they have no qualms about being presented with ideas in other forms. Disney World is stuffed with environmental messages now, and the guides at Animal Kingdom can talk your ear off about biology.

Disney presents an entire culture, like the medieval cathedral, rather than dialog with individual artists, and thus seems creepy for lacking the translation of its content into explicit words; likewise the laborious quasi-oral tradition of command line interface man pages is lost in the expensively designed GUI.

(52) The cathedral as a whole is awesome and stirring in spite, and possibly because, of the fact that we have no idea who built it. When we walk through it, we are communing not with individual stone carvers but with an entire culture.
(52) Disney World works the same way. . . . But it's easy to find the whole environment a little creepy, because something is missing: the translation of all its content into clear explicit written words, the attribution of the ideas to specific people. You can't argue with it.
(52) And this is precisely the same as what is lost in the transition from the command line interface to the GUI.
(52) Disney and Apple/Microsoft are in the same business: short-circuiting laborious, explicit verbal communication with expensively designed interfaces.

Universal acknowledgment of failed intellectualism and overly complicated world rationalizes use of nonverbal media; compare failure of intellectualism to Bauerlein contention that adults are too indulgent.

(53) Part of it is simply that the world is very complicated now—much more complicated than the hunter-gatherer world that our brains evolved to cope with—and we simply can't handle all the details.
(53) But more importantly, it comes out of the fact that during this century, intellectualism failed, and everyone knows it.
(53) We seem much more comfortable with propagating those values to future generations nonverbally, through a process of being steeped in media.

Written word is unique as digital medium easily manipulated by humans.

(54) The written word is unique among media in that it is a digital medium that humans can, nonetheless, easily read and write.

Global trend to eradicate cultural differences along with need to suspend judgment, like indulgence of low performing youth, leads to suspicion of and hostility toward authority.

(55) It is obvious, to everyone outside of the United States, that our arch-buzzwords—multiculturalism and diversity—are false fronts that are being used (in many cases unwittingly) to conceal a global trend to eradicate cultural differences.
(55) The lesson most people are taking home from the twentieth century is that, in order for a large number of different cultures to coexist peacefully on the globe (or even in a neighborhood) it is necessary for people to suspend judgment in this way. Hence (I would argue) our suspicion of, and hostility toward, all authority figures in modern culture.

Stephenson concludes, as does Bauerlein, that there is no real culture left; we are the feckless, dumbest generation.

(56) The problem is that once you have done away with the ability to make judgments as to right and wrong, true and false, etc., there's no real culture left.
(56-57) Anyone who grows up watching TV, never sees any religion or philosophy, is raised in an atmosphere of moral relativism, learns about civics from watching bimbo eruptions on network TV news, and attends a university where postmodernists vie to outdo each other in demolishing traditional notions of truth and quality, is going to come out into the world as one pretty feckless human being.

Two tiered cultural system like Wells Morlocks and Eloi inverted, the latter majority steeped in electronic media maintained by the book reading former minority; compare to Rushkoff and Lanier.

(58) Contemporary culture is a two-tiered system, like the Morlocks and the Eloi in H. G. Wells's The Time Machine, except that it's been turned upside down. . . . The Morlocks are in the minority, and they are running the show, because they understand how everything works. The much more numerous Eloi learn everything they know from being steeped from birth in electronic media directed and controlled by book-reading Morlocks.

Dim comprehension through an interface is better than none.

(59-60) It simply is the case that we are way too busy, nowadays, to comprehend everything in detail. And it's better to comprehend dimly, through an interface, than not at all. Better for ten million Eloi to go on the Kilimanjaro Safari at Disney World than for a thousand cardiovascular surgeons and mutual fund managers to go on “real” ones in Kenya. . . . The specter of a polity controlled by the fads and whims of voters who actually believe that there are significant differences between Bud Lite and Miller Lite, and who think that professional wrestling is for real, is naturally alarming to people who don't. But then countries controlled via the command line interface, as it were, by double-domed intellectuals, be they religious or secular, are generally miserable places to live.


GUI as new semiotic layer between people and machines fits diachrony in synchrony model.

Abdication of responsibility and surrender of power to the operating system because people want things to be easier.

(61) Back in the days of the command line interface users were all Morlocks who had to convert their thoughts into alphanumeric symbols and type them in, a grindingly tedious process that stripped away all ambiguity, laid bare all hidden assumptions, and cruelly punished laziness and imprecision. Then the interface makers went to work on their GUIs and introduced a new semiotic layer between people and machines. People who use such systems have abdicated the responsibility, and surrendered the power, of sending bits directly to the chip that's doing the arithmetic, and handed that responsibility and power over to the OS. . . . We want things to be easier. How badly we want it can be measured by the size of Bill Gates's fortune.

Operating system as intellectual labor saving device translating vague intentions into bits, taking over functions formerly considered the province of humans.

(62) The OS has (therefore) become a sort of intellectual labor-saving device that tries to translate humans' vaguely expressed intentions into bits. In effect we are asking our computers to shoulder responsibilities that have always been considered the province of human beings—we want them to understand our desires, to anticipate our needs, to foresee consequences, to make connections, to handle routine chores without being asked, to remind us of what we ought to be reminded of while filtering out noise.

Analogies developed to describe upper level functions through conventions like menus, buttons, windows; promiscuous metaphor mixing exemplified by the electronic document.

(62) At the upper (which is to say, closer to the user) levels, this is done through a set of conventions—menus, buttons, and so on. These work in the sense that analogies work: they help Eloi understand abstract or unfamiliar concepts by likening them to something known. But the loftier word “metaphor” is used.
(63) There is massively promiscuous metaphor-mixing going on here, and I could deconstruct it till the cows come home, but I won't. Consider only one word: “document.”

GUIs use bad metaphors to make computing easier; we are buying into the assumption that metaphors are a good way to deal with the world rather than precise description exemplified in the command line.

(64) So GUIs use metaphors to make computing easier, but they are bad metaphors. Learning to use them is essentially a word game, a process of learning new definitions of words such as “window” and “document” and “save” that are different from, and in many cases almost diametrically opposed to, the old.
(64) What we're really buying is a system of metaphors. And—much more important—what we're buying into is the underlying assumption that metaphors are a good way to deal with the world.

GUIs are now general tools encountered in many devices, promoting metaphors to method of world interpretation.

(65) So we are now asking the GUI to do a lot more than serve as a glorified typewriter. Now we want it to become a generalized tool for dealing with reality.

Interfaces must be consistent or the blinking twelve problem arises as it did with early VCRs.

(67-68) It's no longer acceptable for engineers to invent a wholly novel user interface for every new product, as they did in the case of the automobile, partly because it's too expensive and partly because ordinary people can only learn so much. . . . But because the VCR was invented when it was—during a sort of awkward transitional period between the era of mechanical interfaces and GUIs—it just had a bunch of pushbuttons on the front, and in order to set the time you had to push the buttons in just the right way. . . . Computer people call this “the blinking twelve problem.”
(68) The blinking twelve problem has moved on to plague other technologies.

GUI use promotes belief that hard things can be made easy; combine with Turkle robotic moment, Bauerlein media cocoon, and inspirations for Rushkoff ten commands to arrive at conception of average postpostmodern network dividual cyborg.

(69) By using GUIs all the time we have insensibly bought into a premise that few people would have accepted if it were presented to them bluntly: namely, that hard things can be made easy, and complicated things simple, by putting the right interface on them.



Investigating Apple Macintosh Programmers Workshop revealed recreation of Unix interface at center of GUI.

(74) The first thing that Apple's hackers had done when they'd gotten the MacOS up and running—probably even before they'd gotten it up and running—was to recreate the Unix interface [in MPW], so that they would be able to get some useful work done. At the time, I simply couldn't get my mind around this, but, apparently as far as Apple's hackers were concerned, the Mac's vaunted graphical user interface was an impediment, something to be circumvented before the little toaster even came out onto the market.

Stephenson jumped to Unix after being disappointed by Apple and Microsoft failures.

(76) When my PowerBook broke my heart, and when Word stopped recognizing my old files, I jumped to Unix.

Decisions made by IBM and Microsoft at dawn of PC era resulted in abundance of cheap hardware from which Linux powered GNU arose.

(79-80) The availability of all this cheap but effective hardware was an unintended consequence of decisions that had been made more than a decade earlier by IBM and Microsoft. . . . This free-for-all approach to hardware meant that Windows was unavoidably clunky compared to MacOS. But the GUI brought computing to such a vast audience that volume went way up and prices collapsed.

Cultural price of Mac was that its closed design discouraged hacking, whereas Microsoft inspired parts bazaar became primordial soup for Linux based operating systems to self assemble.

(80) But the price that we Mac owners had to pay for superior aesthetics and engineering was not merely a financial one. There was a cultural price too, stemming from the fact that we couldn't open up the hood and mess around with it. Doug Barnes was right. Apple, in spite of its reputation as the machine of choice of scruffy, creative hacker types, had actually created a machine that discouraged hacking, while Microsoft, viewed as a technological laggard and copycat, had created a vast, disorderly parts bazaar—a primordial soup that eventually self-assembled into Linux.


Unix exemplifies mega tool used for every operation, and scorn for lesser operating systems by children of hackers.

(85) Unix is the Hole Hawg of operating systems, and Unix hackers—like Doug Barnes and the guy in the Dilbert cartoon and many of the other people who populate Silicon Valley—are like contractors' sons who grew up using only Hole Hawgs. They might use Apple/Microsoft OSes to write letters, play video games, or balance their checkbooks, but they cannot really bring themselves to take those operating systems seriously.


Learning Unix is hard, and comprehension comes through many small epiphanies, coming to understand processes to which you have been subject all along.

(86) Unix is hard to learn. The process of learning it is one of multiple small epiphanies. Typically you are just on the verge of inventing some necessary tool or utility when you realize that someone else has already invented it, and built it in, and this explains some odd file or directory or command that you have noticed but never really understood before.

Unix embodies epic oral history of hacker subculture, a profound philosophical insight in a book about computer operating systems.

(88) Windows 95 and MacOS are products, contrived by engineers in the service of specific companies. Unix, by contrast, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture. It is our Gilgamesh epic.

Unix in its very structure resembles oral narrative, and is so widely known that it can recreate itself autochthonously, as if a separate intelligence beyond our own were playing consciousness with us.

(88) What made old epics like Gilgamesh so powerful and so long-lived was that they were living bodies of narrative that many people knew by heart, and told over and over again—making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished, improved and, over time, incorporated into the story. Likewise, Unix is known, loved, and understood by so many hackers that it can be re-created from scratch whenever someone needs it.

Credit to both Stallman and Torvalds yet reiterate unexpected, unplanned growth of floss attributed to supply of cheap substrates.

(90) To write code at all, Torvalds had to have cheap but powerful development tools, and these he got from Stallman's GNU project.
(90) And he had to have cheap hardware on which to write that code. . . . The only reason Torvalds had cheap hardware was Microsoft.

Bizarre trinity of Torvalds, Stallman, Gates as premier philosopher kings of computing, computer science, software, computer programming.

(90) In trying to understand the Linux phenomenon, then, we have to look not to a single innovator but to a sort of bizarre Trinity: Linus Torvalds, Richard Stallman, and Bill Gates.


Spending time using Linux to learn about native OS like visiting foreign countries to learn more about America.

(91) When the traveler returns home and takes stock of the experience, he or she may have learned a good deal more about America than about the country they went to visit.
(91-92) For the same reasons, Linux is worth trying. It is a strange country indeed, but you don't have to live there; a brief sojourn suffices to give some flavor of the place and—more importantly—to lay bare everything that is taken for granted, and all that could have been done differently, under Windows or MacOS.

Linux a self organizing net subculture based on evolving body of widely shared source code.

(93) Linux per se is not a specific set of ones and zeros, but a self-organizing Net subculture. The end result of the collective lucubrations is a vast body of source code, almost all written in C (the dominant computer programming language). “Source code” just means a computer program as typed in and edited by some hacker.

Durability of ASCII text files, which have no typographical frills, transitory formats, or markup, one lesson from using Linux.

(94) ASCII text files, in other words, are telegrams, and as such they have no typographical frills. But for the same reason they are eternal, because the code never changes, and universal, because every text-editing and word-processing software ever written knows about this code.
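The stability Stephenson praises is easy to verify from the command line: each ASCII character has had the same fixed byte value since the 1960s, on any machine. A small sketch using the standard od utility:

```shell
# Dump the bytes of the string 'CAT' as hexadecimal:
printf 'CAT' | od -An -tx1
# prints the hex bytes: 43 41 54
```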

Hackers live in the saddle like Mongols, using and adjusting their own tools.

(95) Editor, compiler, and linker are to hackers what ponies, stirrups, and archery sets were to the Mongols. Hackers live in the saddle and hack on their own tools, even while they are using them, to create new applications.


Having been written by many, Linux lacks central policies for program messages, which are honestly exposed by the command line where they are elsewhere hidden by a polished GUI; predominance of English despite global contributor base relates to Takhteyev.

(104) Linux is not capable of having any centrally organized policies dictating how to write error messages and documentation, and so each programmer writes his own. Usually they are in English, even though tons of Linux programmers are Europeans. . . . If something bad has happened because the software simply isn't finished yet, or because the user screwed something up, this will be stated forthrightly. The command line interface makes it easy for programs to dribble out little comments, warnings, and messages here and there.
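The mechanism behind that forthright dribbling is ordinary Unix plumbing: results go to stdout, comments and warnings to stderr, and either stream can be read or silenced on its own. A minimal sketch (the function and messages are invented for illustration):

```shell
# A program emits its result on stdout and a frank warning on stderr:
emit() {
  echo "result: 42"
  echo "warning: config file missing, using defaults" >&2
}

emit 2>warnings.log   # keep the result on screen, capture the dribble
cat warnings.log
```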

Commercial OS stance toward admitting errors and sharing honest user feedback like Communist stance on poverty.

(106) Commercial OSes have to adopt the same official stance toward errors as Communist countries had toward poverty.
(106) This posture, which everyone knows to be absurd, is not limited to press releases and ad campaigns. It informs the whole way these companies do business and relate to their customers.
(107) The business is expanding fast enough that it's still much better to have billions of chronically annoyed customers than millions of happy ones.

Linux culture accepts feedback and encourages rapid resolution by maintainers.

(108) When something goes wrong with Linux, the error is noticed and loudly discussed right away. Anyone with the requisite technical knowledge can go straight to the source code and point out the source of the error, which is then rapidly fixed by whichever hacker has carved out responsibility for that particular program.

Pay Per Incident support model of Microsoft sustains illusion of rational business transaction.

(111) In the world of open source software, bug reports are useful information. Making them public is a service to other users, and improves the OS. . . . In the commercial OS world, however, reporting a bug is a privilege that you have to pay lots of money for.
(112) What Microsoft is selling through Pay Per Incident isn't technical support so much as the continued illusion that its customers are engaging in some kind of rational business transaction.

Kafkaesque relationship between commercial OS vendors and customers seeking help enforces asymmetric division between suppliers and users.

Stephenson argues that people will not pay for per incident support, and by extension for the whole operating system itself, so the market is not sustainable the more commercial OSes adopt community practices common in open source.

(116) This approach to dealing with one's customers was straight out of the Central European totalitarianism of the mid-twentieth century. The adjectives “Kafkaesque” and “Orwellian” come to mind. It couldn't last, any more than the Berlin Wall could, and so now Microsoft has a publicly available bug database. It's called something else, and it takes a while to find it, but it's there.
(116) They have, in other words, adapted to the two-tiered Eloi/Morlock structure of technological society.
(116-117) No one is going to cough up $95 to report a problem when chances are good that some other sucker will do it first, and that instructions on how to fix the bug will then show up, for free, on a public site. And as the size of the bug database grows, it eventually becomes an open admission, on Microsoft's part, that their OS has just as many bugs as their competitors'. There is no shame in that; but it puts Microsoft on an equal footing with the others and makes it a lot harder for their customers—who Want to Believe—to believe.


GNU tty screen reminds computer user of complexity beneath GUI, like the skulls writers kept on their desks did their mortality.

(119) It used to be fashionable for a writer to keep a human skull on his desk as a reminder that he was mortal, that all about him was vanity. The tty screen reminds me that the same thing is true of slick user interfaces.

Virtue of small utility programs run on the command line; high overhead of pure GUI changes programming environment such that small utility programs get swallowed up in omnibus packages like Microsoft Office.

(123) The ability to run these little utility programs on the command line is a great virtue of Unix, and one that is unlikely to be duplicated by pure GUI operating systems.
(123-124) GUIs tend to impose a large overhead on every single piece of software, even the smallest, and this overhead completely changes the programming environment. Small utility programs are no longer worth writing. Their functions, instead, tend to get swallowed up into omnibus software packages.

Drawback of omnibus Walmart approach is feature clutter.

(124) The most serious drawback to the Wal-Mart approach is that most users only want or need a tiny fraction of what is contained in these giant software packages. The remainder is clutter, dead weight.

Praise for Visual Basic built into Microsoft Office as a way to spawn more hacking by offering a simple, accessible programming interface reminiscent of early programming experience and the Unix command line, albeit at the application level.

(124-125) The other important thing to mention, here, is that Microsoft has included a genuinely cool feature in the Office package: a Visual Basic programming package. Basic is the first computer language that I learned, back when I was using the paper tape and the teletype. By using Visual Basic . . . you can write your own little utility programs. . . . And so it is quite possible that this feature of Office will, in the end, spawn more hacking than GNU.


BeOS came closest in 1999 to the ideal of a well-designed GUI with a command-line alternative, an ideal now well met by Ubuntu and other Linux distributions.

The combination of a usable GUI and a command-line option in modern GNU/Linux operating systems seems to have afforded my experiments with critical programming toward a philosophy of computing.

(129) The ideal OS for me would be one that had a well-designed GUI that was easy to set up and use, but that included terminal windows where I could revert to the command line interface, and run GNU software, when it made sense. A few years ago, Be Inc. invented exactly that OS. It is called the BeOS.


Be is a story of failure of great idea.

(130) Many people in the computer business have had a difficult time grappling with Be, Incorporated, for the simple reason that nothing about it seems to make any sense whatsoever. It was launched in late 1990, which makes it roughly contemporary with Linux. From the beginning it has been devoted to creating a new operating system that is, by design, incompatible with all the others (though, as we shall see, it is compatible with Unix in some very important ways). . . . It is famous for not being famous; it is famous for being doomed.

No virtue in crappy old operating systems themselves, so they hold lesser value for later study in the context of learning the history of computing.

(133) Crappy old OSes have value in the basically negative sense that changing to new ones makes us wish we'd never been born.

Crappy OSes are accumulations of crufty designs; Stephenson does not discuss this, but consider the implications for the bad narratives they produce.

(133-134) Like an upgrade to an old building, cruft always seems like a good idea when the first layers of it go on—just routine maintenance, sound, prudent management.

Intentional ground-up, object-oriented design for BeOS, an appropriate story for the 1990s to accompany the personal computer stories of the 1980s.

BeOS object oriented messaging software entities.

(135) The great idea behind BeOS was to start from a clean sheet of paper and design an OS the right way.
(138) At any rate, BeOS has an extremely well-thought-out GUI built on a technological framework that is solid. It is based from the ground up on modern object-oriented software principles. BeOS software consists of quasi-independent software entities called objects, which communicate by sending messages to each other.
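The object-and-message architecture Stephenson describes can be sketched in miniature. The actual BeOS API is a C++ framework (the Be Application Kit); the class and message names below are hypothetical illustrations of the principle, not that API:

```python
class MessagePort:
    """Toy stand-in for a BeOS-style message port: objects never call each
    other directly, they post messages to a queue the receiver later drains."""
    def __init__(self):
        self.queue = []

    def post(self, what, **data):
        # Enqueue a message: a symbolic name plus a payload of named fields.
        self.queue.append((what, data))

    def drain(self, handler):
        # Deliver queued messages, in order, to the receiving object's handler.
        while self.queue:
            what, data = self.queue.pop(0)
            handler(what, data)


class Window:
    """Quasi-independent software entity that reacts only to messages."""
    def __init__(self):
        self.port = MessagePort()
        self.title = ""

    def handle(self, what, data):
        if what == "SET_TITLE":
            self.title = data["title"]


# One object asks another to change state by messaging it, not by reaching
# into its internals:
w = Window()
w.port.post("SET_TITLE", title="Terminal")
w.port.drain(w.handle)
```

The design point is the decoupling: the sender needs to know only the message protocol, not the receiver's implementation, which is what makes the entities "quasi-independent."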

Stephenson sold on terminal interface and POSIX compatibility amid BeOS GUI.

(138-139) For this user, a big selling point of BeOS is the built-in Terminal application, which enables you to open up windows that are equivalent to the xterm windows in Linux. In other words, the command line interface is available, if you want it. And because BeOS hews to a certain standard called POSIX, it is capable of running most of the GNU software.

BeOS could attract artists and creative hackers who gravitated to Macs in late eighties.

(140) During the late eighties, the MacOS was, for a time, the OS of cool people—artists and creative-minded hackers—and BeOS seems to have the potential to attract the same crowd now.

Modern OSes depend on availability of hardware specific code; no mention of concerns dear to Kittler like protected mode and trusted computing.

(141) Because the hardware market has become so vast and complicated, what really determines an OS's fate is not how good the OS is technically, or how much it costs, but rather the availability of hardware-specific code.


Microsoft currently dominates the mindshare competition, such that software makers and hardware makers write applications and drivers for it; the logic of the market-driven de facto standard that Gates praises is countered only by emergent, distributed sharing communities like GNU/Linux, for no single company can compete and government-based efforts have been shunned.

(143-144) What is really going on is that Microsoft has seized, for the time being, a certain type of high ground: they dominate in the competition for mindshare, and so any hardware or software maker who wants to be taken seriously feels compelled to make a product that is compatible with their operating systems. Since Windows-compatible drivers get written by the hardware makers, Microsoft doesn't have to write them; in effect, the hardware makers are adding new components to Windows, making it a more capable OS, without charging Microsoft for the service. It is a very good position to be in. The only way to fight such an opponent is to have an army of highly competent coders who write and distribute equivalent drivers, which Linux does.

Antitrust framers did not consider mindshare dominance.

(144) Here, instead, the dominance is inside the minds of people who buy software. Microsoft has power because people believe it does.
(145) Mindshare dominance is, in other words, a really odd sort of beast, something that the framers of our antitrust laws couldn't possibly have imagined. It looks like one of these modern, wacky chaos-theory phenomena, a complexity thing, in which a whole lot of independent but connected entities (the world's computer users), making decisions on their own, according to a few simple rules of thumb, generate a large phenomenon (total domination of the market by one company) that cannot be made sense of through any kind of rational analysis.

Mindshare dominance can also be toppled under same chaotic conditions under which it arises, making Microsoft nervous.

(146) But there's no way to predict when people will decide, en masse, to reprogram their own brains. This might explain some of Microsoft's behavior, such as their policy of keeping eerily large reserves of cash sitting around, and the extreme anxiety that they display whenever something like Java comes along.


If God were an engineer answering the celestial help line, most would be told that life is tough; life cannot be reduced to a mediated user interface.

(151) What would the engineer say, after you had explained your problem and enumerated all of the dissatisfactions in your life? He would probably tell you that life is a very hard and complicated thing; that no interface can change that; that anyone who believes otherwise is a sucker; and that if you don't like having choices made for you, you should start making your own.

Stephenson, Neal. In the Beginning . . . was the Command Line. New York: HarperCollins, 1999. Print.