Notes for Scott Rosenberg, Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software
Key concepts: agile software development, attribute object model, automatic software, bazaar-style, bootstrapping, cathedral mode, code review, continuous integration, cowboy coder, dogfooding, edge cases, extreme programming, forcing function, geek, Hungarian notation, intentional programming, item object model, late binding, Lego hypothesis, Lisp, object-oriented programming, open source software development methodology, pattern language, pervasive heterogeneity, phenotropic software, post-symbolic communication, pragmatic minimalism, Python, Smalltalk, software disaster genre, software patterns movement, software time, source code, spiral model, Squeak, stamping, structured programming, superdistribution, version tracking systems, waterfall model, wiki.
Related theorists: Christopher Alexander, David Allen, John Backus, Kent Beck, Tim Berners-Lee, Robert Biddle, Barry Boehm, Robert Britcher, Frederick Brooks, Nicholas Carr, Larry Constantine, Alan Cooper, Brad Cox, Ward Cunningham, W. Edwards Deming, Phillip J. Eby, Douglas Engelbart, Francis Fukuyama, Richard Gabriel, Antonio Gramsci, Andy Hertzfeld, Douglas Hofstadter, Watts Humphrey, Bill Joy, Mitch Kapor, Alan Kay, Donald Knuth, Ray Kurzweil, Jaron Lanier, John McCarthy, James Noble, David Lorge Parnas, Eric Raymond, Joel Spolsky, Richard Stallman, Gerald Jay Sussman, Michael Toy, Linus Torvalds, Guido van Rossum, Gerald Weinberg.
Space between the way machines and humans count and think, leading to yearnings for replacing the entire software edifice: any point in trundling out Heidegger What is Called Thinking?
(7) In the binary digital world of computers, all information is
reduced to sequences of zeros and ones. But there's a space between
zero and one, between the way the machine counts and thinks and the
way we count and think. When you search for explanations for
software's bugs and delays and stubborn resistance to human desires,
that space is where you'll find them.
(10) Some dream of ripping down the entire edifice of today's software and replacing it with something new and entirely different. Others simply yearn for programs that will respond less rigidly to the flow of human wishes and actions, for software that does what we want and then gets out of our way, for code that we can count on.
(15) Take one such unknown, place it next to all the other similar unknowns in Chandler, multiply them by one another, and you have the development manager's nightmare: a “black hole” in the schedule, a time chasm of indeterminate and perhaps unknowable dimensions.
(15) Black holes, snakes, dragons—the metaphors all daubed a layer of mythopoetic heroism over the most mundane of issues: how to schedule multiple programmers so that work actually got done.
(16) The earliest and best diagnosis of the problem of software time can be found in a 1975 book by Frederick Brooks, The Mythical Man-Month.
(17) “Adding manpower to a late software project makes it later.”
(17) In practice, Brooks found, nearly all software projects require only one-sixth of their time for the writing of code and fully half their schedule for testing and fixing bugs. But it was a rare project manager who actually planned to allocate developers' time according to such a breakdown.
Challenge, even for a single developer, of communicating with future selves, following Brooks's Law.
(18) Brooks's Law implies that the ideal size for a programming team is one—a single developer who never has to stop to communicate with a colleague.
Tantalizing prospects of open source development methodology to repeal Brooks Law.
[The open] source software development methodology [from] which [Mitch] Kapor's foundation took its name simply did not exist in the days of The [Mythical Man-Month]. And more than any other development since then, open source has tantalized the programming world with the prospect of repealing [Brooks's Law].
(21) The software concepts that united computing's isolated archipelagoes into one global network arose not in the offices of profit-seeking entrepreneurs but from the publish-and-share mind-set of idealistic researchers working in universities and publicly sponsored research centers.
(21) That world—an environment of geekish enthusiasms and cooperative ideals—experienced a sort of waking to self-consciousness in the 1990s. . . . They took their inspiration and tools from two central figures: Richard Stallman and Linus Torvalds.
Kernel as focal core of digital brain; GNU brain stem?
[Stallman and] his cohorts labored for years in relative obscurity trying to build the pieces of GNU—a project to create a free version of the Unix operating system that predominated in university computing centers. . . . [Torvalds'] Linux provided GNU with the central component it was missing, the operating system's “kernel,” the focal core of the digital [brain].
(22) Proponents of open source like to draw the distinction between “free as in free beer” and “free as in free speech”: Not all open source software products cost nothing, but all open source software is free to be examined, adapted, and reused.
(24) Programmers are motivated and led toward their best work by a desire to accomplish something that pleases them or fulfills a personal need.
Cathedral versus bazaar modes.
(24) In the Unix world, programmers had long been accustomed to freely sharing their source code. But for important work, cathedral mode had long remained the default, for both free and commercial software. . . . Torvalds and Linux changed that, demonstrating that a “promiscuous,” bazaar-style approach could produce big, useful software that kept getting better.
Raymond suggests network-powered open peer review breaks Brooks paradox, but not notable for bringing new products to users faster.
[Raymond argued] that this new style of network-powered open peer review had broken the back of Frederick Brooks's cruel paradox.
(27) The new bazaars of the open source movement have changed computing in many ways, but they are not notable for bringing new products to users any faster than the old cathedral builders did.
(30) Part of the difference, [Michael] Toy felt, lay in Chandler's novelty. It was not a rewrite like so many open source projects; it had grand ambitions.
Toy and Hertzfeld grand ambitions to execute Chandler project using best open source practices.
Compare this characterization of Hertzfeld style with depiction in Lammers.
(30) At OSAF meetings [Andy] Hertzfeld's voice was the most consistent in pushing the developers to stop designing and start coding—or at least to start coding without waiting for the ground to cool.
THE SOUL OF AGENDA
(36-37) The papers in Kapor's pocket—reminders, things to do, ideas, recommendations, and so on—weren't so easy to categorize. He wanted to be able to dump them into the computer on the fly and worry about organizing them later. And he needed to count on being able to find them easily and quickly.
Goals of Kapor Agenda very broad, descendant of Engelbart NLS and hyperscope, inspiring projects like Chandler but still unmet.
Agenda broke new ground in the no-man's-land that separated the strict realm of computer logic and the fuzzy ambiguities of human reality. The principles that drove its creators (Kapor worked with Jerry Kaplan, who went on to found the pen-computing start-up GO, and Edward J. Belove) were ambitious: Users should be able to input their data without worrying about the structure of the software that will store it; users should be able to extend and modify the structure of the data easily, adding new categories without losing any information; users should be able to build new ways of viewing their data and to manipulate and change the data through those views that they created themselves.
(37) You can gauge how bold these goals were by seeing how little of the software that we now employ, two decades later, meets them.
(39) Whatever shape his new software would take, Kapor decided, as he pondered the inadequacies of Microsoft Exchange and began to dream of inventing something to put in its place, it would have to conjure the soul of Agenda.
(43) If you watch those videos [of the 1968 demo], you'll learn that Engelbart's “oNLine System,” or NLS, was, among other things, a PIM.
Computer programmers as ideal target group for Engelbart's bootstrapping ("improving the improvement process") has relevance to critical programming.
But the real purpose of NLS was to help Engelbart's programmers program better. In the 1962 essay that laid out his plan of research into the augmentation of human intelligence, Engelbart explained why computer programmers were the most promising initial target [group].
(44) To Engelbart, bootstrapping meant “an improving of the improvement process.”
Engelbart bootstrapping coevolution of human and machine (Kemeny; Hayles).
(45) Unlike later computer innovators who elevated the term “usability” to a mantra, Engelbart didn't place a lot of faith in making tools simple to learn. . . . His vision was of “coevolution” between man and machine: The machine would change its human user, improving his ability to work, even as the human user was constantly improving the machine.
(46) This tension between ease and power, convenience and subtlety, marks every stage of the subsequent history of software.
(46) Mitch Kapor always cited Engelbart as one of his inspirations, and Agenda was in a sense a descendant of NLS.
Epic struggles of actual programming work ignored by proponents; see page 58 for statement about materiality of code.
(47) The picture of digital progress that so many ardent boosters paint ignores the painful record of actual programmers' epic struggles to bend brittle code into functional shape.
Disaster stories in both government and private industry, documented in 1995 CHAOS Report.
But don't jump to the conclusion that government is the problem here; the record in private industry offers little solace. The corporate landscape is littered with disaster stories, too. . . . Though details differ, the pattern is depressingly repetitive: Moving targets. Fluctuating goals. Unrealistic schedules. Missed deadlines. Ballooning costs. Despair. Chaos.
(50) In 1995, a Massachusetts-based consulting firm called the Standish Group released a study of software project failures that it called the CHAOS Report.
(50) Whatever progress the industry has made, more than two-thirds of the time it is still failing to deliver.
(51) In fact, though, there is an entire software disaster genre; there are shelves of books with titles like Software Runaways and Death March that chronicle the failures of one star-crossed project after another.
Software disaster genre such as Britcher Limits of Software.
(51) The genre's definitive work to date is The Limits of Software, a disjointed but impassioned book by an engineer named Robert Britcher.
Invokes Gramsci calling for pessimism of intellect, optimism of will as epitomized in software creators.
(54) If you want to change the world, the Italian radical Antonio Gramsci famously declared, you need “pessimism of the intellect, optimism of the will.” Today's software creators are improbable heirs to that binary mind-set.
PROTOTYPES AND PYTHON
(57) When we move some aspect of our lives into software code, it's easy to be seduced by novel possibilities while we overlook how much we may be giving up.
(58) Software is abstract and therefore seems as if it should be infinitely malleable. And yet, for all its ethereal flexibility, it can be stubbornly, maddeningly intractable, and it is constantly surprising us with its rigidity.
Materiality of code as situated constraints manifest in implications of early design choices of languages and technologies (Ramsay), evidenced by long history of struggles in software development.
(58) That paradox kicks in at the earliest stages of a programming project when a team is picking the angle of attack and choosing what languages and technologies to use. These decisions about the foundations of a piece of software, which might appear at first to be lightweight and reversible, turn out to have all the gravity and consequence of poured concrete.
(59) One use case for the OSAF project that emerged early on, and would keep reappearing, was organizing a big personal music collection.
Hitching ride on existing code is a style, as attempted for OSAF with RDF.
You could model just about anything in a simple three-part format that looked something like the subject-verb-object arrangement of a simple English sentence:
<this> <has-relationship-with> <that>
Then they discovered that the answer they'd come up with had already been outlined and at least partially implemented by researchers led by Tim Berners-Lee, the scientist who had invented the World Wide Web a dozen years before. Berners-Lee had a dream he called the Semantic Web, an upgraded version of the existing Web that relied on smarter and more complex representations of data. The Semantic Web would be built on a technical foundation called RDF, for Resource Description Framework. RDF stores all information in “triples”--statements in three parts that declare relationships between things.
(60) There was no need to reinvent the RDF wheel; maybe OSAF could just hitch a ride on it.
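The triple idea can be sketched in a few lines. The store and query function below are hypothetical illustrations of RDF's subject-predicate-object model, not the actual RDF stack OSAF considered; the music-collection example echoes the use case mentioned above.

```python
# Every fact is a three-part statement: <subject> <predicate> <object>.
triples = [
    ("track42", "has-title", "So What"),
    ("track42", "has-artist", "Miles Davis"),
    ("track42", "part-of", "Kind of Blue"),
]

def query(store, subject=None, predicate=None, obj=None):
    """Return all triples matching the fields that are not None."""
    return [
        (s, p, o) for (s, p, o) in store
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# Everything known about track42:
print(query(triples, subject="track42"))
```

Because every statement has the same shape, new kinds of relationships can be added without changing any schema — the flexibility that made RDF attractive for Chandler's loosely structured data.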
Source code as a distinct type of text born with Fortran.
(67) In Fortran, laborious sequences of assembly language procedures were summarized in brief commands. The human programmer would write a sequence of these commands—source code; then a kind of uberprogram running on the computer called a compiler would translate those commands into object code in the machine's own language.
Van Rossum claims Python much more efficient at accomplishing tasks with less code than C or C++.
(71) [Guido] Van Rossum says that a Python program can usually accomplish the same task as a C or C++ program using three, five, or even ten times less code.
Resemblances between programming languages and their creators; include Van Rossum in survey of language creators.
(72) Just as dogs often come to resemble their owners, it seems that programming languages end up reflecting the temperaments and personalities of their creators in some subtle ways.
(75) Object-oriented techniques organize programs not around sequential lines of commands but instead around chunks of code called objects that spring to life and action when other objects call on them. . . . (This terminology sometimes gives discussions of object-oriented programming the strange ring of Marxist theory cut with genetic science.)
(76) But in practice, though the object-oriented approach gave programmers a leg up as they constructed ever more complex edifices of code, it did not open a road to the marketers' utopia where programmers could write components once and reuse them anywhere.
Version tracking systems necessary condition for open source development.
(80) Version tracking systems like CVS had made it possible for far-flung groups of programmers to work simultaneously on the same code base without stepping on one another's toes; they had made open source development possible.
[NOVEMBER 2002-AUGUST 2003]
Applications based on individual computers, old applications guys, versus Internet based as programming styles; direct comparison to collaborative work in composition and digital media.
(85-86) Kapor knew that, like Hertzfeld and John Anderson, he was an “old applications guy,” as he put it. He had always worked with programs that computer users ran individually on their own computers. But Chandler was going to be different: Sharing data over a network was central to its promise. Montulli and Totic offered the perspective of programmers who had spent nearly a decade creating software that depended on the Internet for its critical [functions].
(88) And he knew just what they needed: a system for “object persistence.” Rather than store Chandler's data in the RDF subject-verb-object “triple” format, Anderson wanted the program's data and Python code stored together as objects (in the “object-oriented programming” sense of the term) that Chandler's coders could easily grab, manipulate, and save again.
(93) In the time it would take OSAF's programmers to study ZODB, figure out whether it suited their purposes, and tinker with it to get it to fit, Totic figures, they could build their own object persistence code directly on top of Berkeley DB.
Build, buy, or borrow archetypal trilemma of software reuse: Postmodern Programmers Noble and Biddle Lego Hypothesis.
Anderson's “I've found the perfect solution” versus Totic's “We could do it ourselves just as easily”: Here, once more, was the archetypal dilemma of software reuse. Build or borrow?
(93) James Noble and Robert Biddle, two scholars in New Zealand who sometimes write together under the sobriquet The Postmodern Programmers, dub this version the Lego Hypothesis.
Pervasive heterogeneity foils realization of Lego hypothesis.
(94) When they peered under the hood of real programs, Noble and Biddle observed what they called “pervasive heterogeneity”: Everywhere you looked, the only constant was that nothing was constant.
Cox superdistribution failed hope for automated market of reliable software components discussed by Larry Constantine in Peopleware Papers.
[Cox hoped superdistribution would] create incentives for the evolution of a bustling automated market for reliable software components, but once more, his ideas did not [take hold].
(97) Larry Constantine, author of a popular column in the 1990s called “The Peopleware Papers,” offered one pragmatic explanation for why programmers did not flock to Cox's ideas.
(98) The twin revolutions of open source development and the Internet have certainly begun to change that habit [of wanting to program everything]. Google has shortened the process of finding things to a duration that even the programmer's two-minute-and-twenty-seven-second attention span can handle.
Information challenge of keeping up with software libraries according to Ward Cunningham.
(99) “Keeping up with what's available in the libraries,” says programming expert Ward Cunningham, “is the number one information overload challenge.”
Acronym yahoo as well as reference to Swift, who is often invoked by digital humanities theorists.
When two Stanford grad students started up Yahoo! in 1994, the name was a smart-alecky acronym for Yet Another Hierarchical Officious [Oracle].
(102) Here is one of the paradoxes of the reusable software dream that programmers keep rediscovering: There is almost always something you can pull off the shelf that will satisfy many of your needs. But usually the parts of what you need done that your off-the-shelf code won't handle are the very parts that make your new project different, unique, innovative—and they're why you're building it in the first place.
Lore of cowboy coders who are heroes to programmers, nightmares to managers.
(111) There are decades of lore in the software industry about the “cowboy coder,” the programmer who resists rules, prefers solitude, and likes to work on the edge. To a lot of managers, cowboy coders are a nightmare; to a lot of programmers, they are heroes.
Items and attributes as basic object models; item at heart of Chandler data model.
(112) The heart of Chandler's data model would, they reaffirmed, be [the item].
(113) The difference between a more object-oriented item approach and RDF's attribute-based technique, the developers sometimes said, was like the difference between realms of physics: Items were like atoms; attributes were like subatomic particles.
MANAGING DOGS AND GEEKS
Software management always dealing with slider-like adjustments to cost, schedule, features, quality.
(120) But a manager gets to sit down at the console and move those sliders around only if a project is organized enough to respond predictably to decisions about cost and schedule and features or [quality].
(125) On his personal weblog he [Toy] interrogated himself about his liberal Christian faith, and on the new blog he started at OSAF [Blogotomy], he posted thoughts on the process of managing programmers.
Irony that writing software resistant to measurement, leading to management techniques like SWAG and MBWA (Humphrey, Brooks).
One great irony in the management of software projects is that despite the digital precision of the materials programmers work with, the enterprise of writing software is uniquely resistant to measurement. . . . There is no reliable relationship between the volume of code produced and the state of completion of a program, its quality, or its ultimate value to a user.
(128) Most software managers, well aware of these difficulties, end up improvising. There is a list of what needs to be done, subdivided into a series of tasks, and there is some method of keeping track of which of those tasks is (more or less) completed. Fully aware of the perils and paradoxes of software time, the manager will still expect individual programmers to try to estimate—or at least SWAG (take a Silly, Wild-Assed Guess)--how long each remaining task will take.
(129) But MBWA, as the tech industry's acronym-mongers soon dubbed the idea, doesn't translate well to the software realm: The work is simply not visible to the wandering managerial eye. No one has expressed this difficulty with more matter-of-fact precision than Watts Humphrey—a high priest of software management who led the IBM software team in the 1960s after Frederick Brooks's departure, and then went on to found the Software Engineering Institute at Carnegie Mellon and to father a whole alphabet soup's worth of software development methodologies.
Evolution of word geek for person finding relationship with computers easier than other humans.
If, as Kemeny feared, a combination of deficiencies in technical training and good intentions characterized the past few decades, then the arrival of geeks on the cultural scene should raise less alarm than the hordes of mostly inept users who fill out the middle and lower classes alongside them; such is the opinion of Langdon Winner, for example, who seeks to dispel the rhetoric he calls mythinformation.
Roughly a decade ago the word geek [passed] into common usage to describe the kind of person who finds it easier to have relationships with computers than with other human [beings].
(130) You can trace the evolution of the word geek through successive editions of the Jargon File, also known as the New Hacker's Dictionary, a compendium of programmer language and lore curated by Eric Raymond of “The Cathedral and the Bazaar.”
(131) What they all share is a passion for specialized knowledge and a troubled relationship—at best clumsy, at worst hostile—with everyone who lacks that passion.
(133) At the extreme end of the spectrum, the behavioral profile of the programmer—avoiding eye contact, difficulty reading body language, obsession with technical arcana—blurs into the symptom list of a malady known as Asperger's syndrome.
Suggestion that programmers hear machine frequencies.
(134) Some observers of the programming tribe have suggested that in order to commune more closely with the machines they must instruct, many programmers have cut themselves off from aspects of their humanity. But the Asperger's/autism parallel suggests that, more likely, those programmers were themselves already programmed to hear machine frequencies as well as or better than human wavelengths. That can help them write effective code and design efficient algorithms. But it puts them at some disadvantage in understanding how to shape a program so it accomplishes its human user's goals. It also sets them up for trouble when it comes to the simple need to communicate with anyone else who isn't also a geek.
Weinberg Psychology of Computer Programming; see Hayles and Turkle for positive and pessimistic conceptions of synaptogenesis arising from human computer symbiosis.
It [Gerald Weinberg's The Psychology of Computer Programming] was the first attempt to study the burgeoning new discipline with the tools of the anthropologist and psychologist. One of the things Weinberg found was that despite their reputation as loners, [programmers] needed the chance to talk to one another—the more informal the setting, the better.
(136) In the decades since Weinberg's writing, managers have gone from clumsily wrecking such informal mechanisms [like the water cooler] to clumsily trying to encourage them. . . . But in the same time span, programmers have gone far beyond architects and managers: They have invented a profusion of technologies for staying in touch with one another, extending the software cosmos with multiple new genres of tools for coordinating a team's work.
(136) The original digital groupware was Doug Engelbart's NLS, the 1960s-era template for so much of what would unfold in the personal computing revolution to follow.
Alexander Pattern Language basis for attempts to apply approach to programming like Portland Pattern Repository wiki started by Cunningham; promise of wiki for web-based collaboration as substitute for official project management tool.
[Christopher Alexander derived] a sort of grammar of construction by observing common elements or patterns in successful buildings. The software pattern-language people aimed to apply the same approach to programming.
(139) Cunningham's pioneering wiki, the Portland Pattern Repository, grew over a decade to about thirty thousand pages. It inspired a whole wiki movement. . . . Wikis seemed to offer a quick-and-dirty shortcut to the promised land of Web-based collaboration.
Early use of Bugzilla for coordination shifted to OSAF developing their own tracking tool, a common stage in growth of many projects and organizations.
(142) For six months Bugzilla remained OSAF's official project management tool, but as willingness to use it grew increasingly sporadic, Morgan Sagen began working on a homegrown Status Manager for the team—a Web-based tool that would streamline the process for entering tasks and viewing them sorted by person, by project, by time, and by status.
(142) In software management, coordination is not an afterthought or an ancillary matter; it is the heart of the work, and deciding what tools and methods to use can make or break a project.
GETTING DESIGN DONE
Edge cases involve concepts alien to nonprogrammers that constitute much of the digital minutiae concealed by end-user application interface.
Creating end-user application software—software intended for use by mere mortals—means anticipating myriad combinations of human actions and machine reactions.
(148) They spend the bulk of their working hours wrestling with digital minutiae, and their reflexes have already been trained in the customs of the systems they build. Concepts they take for granted are often entirely alien to nonprogrammers; users' assumptions may well be foreign to them.
(148) Programmers call these edge cases, and they are often where bugs hide.
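A classic edge case in miniature: a naive average function works on typical input but hides a bug at the edge, the empty list. The example is generic, not taken from Chandler.

```python
def average(values):
    # The edge case: without this guard, an empty list raises
    # ZeroDivisionError — exactly the kind of input a programmer's
    # trained reflexes can forget to anticipate.
    if not values:
        return 0.0
    return sum(values) / len(values)

print(average([2, 4, 6]))   # the typical case
print(average([]))          # the edge case where the bug would hide
```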
Kapor software design manifesto invokes ancient Roman Vitruvius design principles of firmness, commodity, delight.
In 1990, at the PC Forum gathering of computer industry luminaries, Kapor first delivered the text of his “Software Design [Manifesto].”
(149) Reaching back to ancient Rome, Kapor proposed applying to software the architecture theorist Vitruvius's principles of good design: firmness—sound structure, no bugs; commodity--”A program should be suitable for the purposes for which it was intended”; delight--”The experience of using the program should be a pleasurable one.”
Data-driven CPIA a second order stored program concept, encapsulating program blocks in the same data repository as the user data; see discussion of late binding.
The program would store its blocks as data in the Chandler repository itself. This “data-driven” design would theoretically make it easier to change the behavior of a block; instead of writing new program code, you could just make and store a change in the [data].
(158) CPIA [Chandler Presentation and Interaction Architecture] was a specific instance of the Lego Land dream of reusable software parts.
Big-bang integration versus continuous integration for distributed changes to shared source code.
(161) Most projects today embrace the idea of continuous integration: The programmers always keep their latest code checked in to the main trunk of the code tree, and everyone is responsible for making sure that their new additions haven't thrown a spanner into the works. Later on, OSAF would end up achieving a higher level of continuous integration, but for 0.2 the process was more like what software-development analysts call “big-bang integration”: all the programmers try to integrate their code at the end, and everything breaks.
David Allen GTD philosophy guiding design of Chandler as trusted system.
Soon after starting work at OSAF, [Mimi] Yin attended a daylong seminar given by David Allen, a productivity coach whose book Getting Things Done was establishing near-cult status among programmers. . . . But Yin was the person at OSAF who would take a systematic look at how the ideas of GTD might help shape Chandler.
(165) GTD proposes that we can stop feeling overwhelmed by our stuff and take charge of it by creating a “trusted system”--on paper or digitally, it doesn't matter.
(165-166) If you can do what needs to be done in two minutes or less, Allen advises, just do it. Otherwise, decide if it's something to file, discard, defer, or classify as part of a particular project with a next action.
(166) Older stuff would recede off the top of the screen into a storage area; deferred and future-scheduled items would get moved out of view at the bottom of the screen; and everything that needed to be processed would await the user's attention in the center.
Hertzfeld withdrew energy from OSAF to start folklore-dot-org, a tool combining blog and wiki enabling groups to share stories; any relation to folkvine?
(168) As Hertzfeld withdrew some energy from OSAF in the latter part of 2003, he began a new project of his own: At a Web site called Folklore.org, he built a little software tool that borrowed aspects of both blogs and wikis to enable groups to contribute and share stories.
Early history of Chandler revealed disappointing pace fitting norm of other software projects.
In the annals of software history, Chandler's disappointing pace is not the exception but the norm.
(174) Yet even if you took Torvalds's advice—even if you started small, kept your ambitions in check, thought about details, and never, ever dreamed of the big picture—even then, Torvalds said, you shouldn't plan on making fast progress.
(176) Three computers now sit under his desk, one each for the Windows, Mac, and Linux versions of the program. Scripts running on these boxes constantly look for new “check-ins” of Chandler code; each time a developer checks in a change or addition to the source code repository, these computers test it to make sure that it hasn't “broken the build”--that the new changes, combined with all the existing Chandler code, haven't caused the resulting programming assemblage to fail one or more of Sagen's tests.
Multiple platform automated test system linked to Tinderbox status indicator.
The three sentinel computers send the output of their tests to a program called Tinderbox, which publishes the results to a Web page with three graphs that constantly display the current state of the [build].
(180) Writing the spec, a document that lays out copiously detailed instructions for the programmer, is a necessary step in any software building enterprise where the ultimate user of the product is not the same person as the programmer.
(185) A city may not be a tree, as Alexander said, but nearly every computer program today really is a tree—a hierarchical structure of lines of code.
(185) “The tree” is the informal name for the directory of source code where developers check in their work.
Kapor forcing function revealed by sketching overall design, from which focal points for decisions emerge; most challenging demand of software development is communicating abstractions unambiguously.
(186) The simple act of sketching what this view would contain and show, and what it wouldn't, became a focal point for decisions—what Kapor liked to call a forcing function.
Story of Denman MacBasic failure that led to AppleScript.
(188) [Donn] Denman kept working on MacBasic, but before it was ready, Microsoft had released its own Macintosh-based Basic—one that the Apple programmers felt was inferior and poorly integrated with the Mac's new design. Meanwhile, Apple's deal with Microsoft to license the Basic that ran on the Apple II was up for renewal. In return for a new license, which Bill Gates knew Apple badly wanted, Microsoft demanded that Apple shut down Denman's MacBasic. Apple sold the code to Microsoft for a dollar.
Lure of Basic and scripting systems for the programming challenged.
(188) Ultimately, he stayed at Apple and worked on AppleScript, another system for nonprogrammers. . . . He had been itching to return to his passion: scripting systems for the programming challenged.
Ontological problems in computational worlds of kind-ness of ambiguous item, which challenges assumptions of fixed variable typing and early binding, objects addressed by stamping.
(189) Like the human body's undifferentiated stem cells, notes would begin life with the potential to grow in different directions. This design aimed to liberate the basic act of entering information into the program from the imprisoning silos. It also made room for Yin's proposed solution to the item mutability problem: The mechanism users would employ to specify the “kind-ness” of an item would be called stamping.
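The stem-cell metaphor can be made concrete with a short sketch. The class and attribute names below are illustrative, not Chandler's actual object model: the point is only that an item starts undifferentiated and acquires kind-specific attributes at runtime, rather than being locked into one type when it is created.

```python
# Hypothetical sketch of "stamping": an item begins kind-less and can be
# stamped with one or more kinds while the program runs.

class Item:
    def __init__(self, text):
        self.text = text
        self.kinds = set()
        self.attributes = {}

    def stamp(self, kind_name, default_attrs):
        """Add a kind to this item, merging in that kind's default attributes."""
        self.kinds.add(kind_name)
        for name, value in default_attrs.items():
            self.attributes.setdefault(name, value)

note = Item("Call Andi about the repository")
note.stamp("Task", {"priority": "normal", "done": False})
note.stamp("Event", {"start_time": None})
print(sorted(note.kinds))   # the same item is now both a Task and an Event
```

This is the opposite of the "imprisoning silos" design: the note was entered first and categorized later, and nothing prevents it from belonging to several kinds at once.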
Contrast between forgiving flexibility of human languages and hazards of descriptive ambiguity in software development such as namespace clashes.
(192) Human language is more forgiving: One word can mean more than one thing. This flexibility provides a deep well of nuance and beauty; it is a foundation of poetry. But it leads only to trouble when you are trying to build software. As OSAF's developers struggled to transform the innovations in Chandler, such as stamping, from sketch to functioning code, they repeatedly found themselves tripped up by ambiguity. Over and over they would end up using the same words to describe different things.
Iconic presence of whiteboards for temporary visualization of incorporeal, invisible elements beyond windows and text of UI.
(195) Beyond the windows and text of a user interface, most elements
of software are incorporeal and invisible. There is nothing to point
to. So talking about them is unexpectedly difficult. This is one
reason the whiteboard is such an iconic presence in any space where
software is labored over; it provides a canvas for laying out the
abstract processes of a complex program in ways that allow people to
point to what they're talking about.
(196) Carefully chosen names avoid the confusion of “namespace clashes” or “collisions”—when a term means one thing in one context but something else in another. . . . Names that meant one thing when the programmer's work began end up meaning something different once a thousand bugs have been fixed.
Hungarian notation a naming technique to reduce ambiguity; see discussion in Lammers.
(197) In Hungarian notation, the programmer appends a prefix to every variable name that gives anyone reading the code important clues about what sort of variable it is.
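The flavor of Hungarian notation can be shown in a few lines. The prefixes below are one common convention (the technique originated in C at Microsoft, under Charles Simonyi); showing it in Python is purely illustrative, and the variable names are invented for the example.

```python
# Hungarian notation sketch: a prefix on each variable name encodes what
# sort of value it holds, so a reader gets type clues without looking up
# the declaration. Prefix conventions vary between teams.

szUserName = "mkapor"       # sz: zero-terminated string (a C-era prefix)
cItems = 42                 # c:  a count
fVisible = True             # f:  a flag (boolean)
rgScores = [88, 92, 75]     # rg: a "range" (array/list)

def format_header(szTitle, cColumns):
    # The prefixes alone tell you szTitle is a string and cColumns a count.
    return f"{szTitle} ({cColumns} columns)"

print(format_header("Inbox", 3))
```

The payoff Rosenberg describes is exactly this: anyone reading `cItems` deep inside unfamiliar code knows it is a count, not a string or a flag, without scrolling back to its definition.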
Communicating abstractions unambiguously most challenging software development demand.
(198) Communicating abstractions unambiguously—from programmer to machine, from programmer to programmer, and from program to user—is the single most challenging demand of software development.
(200) They devised a new concept called the “mix-in kind.” A mix-in kind defined a set of attributes associated with a kind; for a task they might include “Priority” and “Done Status.”
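Ordinary Python mix-in classes give a rough feel for the "mix-in kind" idea, though Chandler's actual mechanism was its own attribute system, not Python inheritance; the class names here are invented for illustration.

```python
# Sketch of the "mix-in kind" concept: each mix-in contributes the set of
# attributes associated with one kind, and an item's full attribute set is
# the union of the kinds mixed into it.

class TaskMixin:
    priority = "normal"
    done_status = False

class EventMixin:
    start_time = None

class TaskEventItem(TaskMixin, EventMixin):
    """An item that is both a task and an event inherits both attribute sets."""
    pass

item = TaskEventItem()
print(item.priority, item.done_status, item.start_time)
```

As in the text, "Priority" and "Done Status" arrive together because they belong to the task kind, while the event kind contributes its own attributes independently.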
Code review also an ambiguous term.
(200) The term code review can mean anything from an informal monitor-side chat to a weeks-long bureaucratic gantlet involving multiple layers of code inspection.
STICKIES ON A WHITEBOARD
Dogfooding on less extreme pole of improvement continuum than bootstrapping.
(209) The goal of bootstrapping was to rev up a feedback loop so that today you would use the tool you invented yesterday to build a better one for tomorrow; dogfooding, by contrast, had the more modest and pragmatic aim of speeding up bug-finding and bug-fixing by shoving developers' noses into their products' flaws.
Programmers bring prior enthusiasms and expertise to new problems, which can lead to mismatches as well as free ride on hobbyhorses: example of Dusseault work on WebDAV at IETF.
(211) At the IETF, Dusseault was involved in work on a new standard called WebDAV (for Web-based Distributed Authoring and Versioning), an extension to the basic protocol of the Web that, as its official Web site explains, “allows users to collaboratively edit and manage files on remote Web servers.”
(213) WebDAV works by extending HTTP—the protocol that Web servers and browsers use to talk to each other—adding new commands that allow users to edit files on a remote server.
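One of the commands WebDAV adds to HTTP is PROPFIND, which lists a resource's properties (it is defined in RFC 4918, along with LOCK, MKCOL, and others). The sketch below just assembles a PROPFIND request as text; the host and path are made up, and nothing is sent over the network.

```python
# Sketch of a WebDAV request: WebDAV reuses HTTP's message format but adds
# new methods. PROPFIND and the Depth header are real WebDAV features
# (RFC 4918); the server name and path here are fictional.

def propfind_request(host: str, path: str, depth: int = 1) -> str:
    body = (
        '<?xml version="1.0" encoding="utf-8"?>\n'
        '<propfind xmlns="DAV:"><allprop/></propfind>'
    )
    return (
        f"PROPFIND {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Depth: {depth}\r\n"
        f"Content-Type: application/xml\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
        f"{body}"
    )

print(propfind_request("dav.example.com", "/notes/"))
```

Because the new methods ride on ordinary HTTP syntax, existing Web infrastructure (proxies, servers, authentication) can carry WebDAV traffic with modest changes, which is exactly the "extending HTTP" strategy the passage describes.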
(213-214) Programmers always bring their preexisting enthusiasms and expertise to a new problem. At worst this can lead to mismatches of the “when you have a hammer, everything looks like a nail” variety; at best it means that when you bring new people into a project, you get a free ride on their hobbyhorses.
Kapor imagined user empowerment means server-free environment.
(214) [quoting Kapor] “My and OSAF's original position was,
electricity is good, therefore everyone should have their own power
plant. Unconsciously, I always imagined that user empowerment somehow
meant a server-free or server-light environment.”
(214-215) [quoting Kapor] “So many of the people who are thought leaders in open source value freedom and initiative, and those values have been very tied up with this American frontier myth of self-sufficiency. . . . it turns out that the reality of open source and the Internet is much more collaborative than the narrow libertarian P2P ethic.”
Stickies on whiteboard planning tool, compare to Heim clustering, often reenacting archetypal struggle between product and development managers.
(226) She clears the left third of the long whiteboard at the front of the room and draws four vertical columns on it, labeled 0.4, 0.5, 0.6, 0.7. Then she draws two more horizontal lines, dividing the columns into top, middle, and bottom, corresponding to each of OSAF's three development groups—apps, services, and repository.
(226) Dusseault, Parlante, and Mooney start marking up stickies and affixing them to the whiteboard. They confer about names, placement, and whether to divide some extra-difficult tasks into two stickies.
(228) Lam and Dusseault were reenacting an archetypal dialogue: the tug-of-war between product manager and development manager.
(239) Somebody must have figured this stuff out. Somewhere there is a map with a path out of the software-time tar pit.
Structured programming methods revealed code organization easier than organizing people and their work.
[Structured programming made] its recommendations from a defensive crouch, trying to protect fallible programmers from their own flaws.
(241) It turned out that improving how you organize code was a cakewalk compared with improving how you organize people and their work.
Humphrey Capability Maturity Model measurement of quality of software development organizations.
Humphrey's success at enforcing schedule discipline at IBM stood on two principles: Plans were mandatory. And plans had to be realistic.
(244) At SEI, Humphrey and his colleagues created the Capability Maturity Model (CMM) as a kind of yardstick for judging the quality of software development organizations.
Team Software Process and Personal Software Process criticize autocratic management styles by encouraging organization and self-management by individuals and small teams, inspired by Deming.
TSP [Team Software Process] and PSP [Personal Software Process] criticize “autocratic management styles” and encourage individual developers and small teams to seize control of their own destiny by taking responsibility for planning and quality control, sharing information, and “dynamically rebalancing” their workloads as needed.
(246) The CMM, TSP, and PSP all drew inspiration from the ideas of manufacturing quality expert W. Edwards Deming, who argued that quality should not be an afterthought but ought to be built into every stage of a production process. In software this means the CMM is hostile to the “code and fix” tradition, where programmers produce bug-filled products, testers find bugs, and then programmers go back and fix them.
(246) But the CMM and its related methodologies have yet to make a major dent in the world of business software or desktop computing in the United States.
(247) In practice, “fix bugs first” works fine until the people who are waiting for the finished product grow impatient. Then the principle falls by the wayside in a scramble to deliver working, if imperfect, code.
(248) While counting bugs is plainly a valuable exercise—it's better than not counting them—it tends to throw critical system failures and picayune flaws into the same bucket. It also ends up encouraging programmers to perfect existing products rather than build new things.
Cunningham and Beck software patterns movement recorded experiences as narratives for solving particular problems rather than coming up with best practices; compare to heuristic modeling in AI.
(250) The software patterns movement, whose leaders included wiki inventor Ward Cunningham and a programmer named Kent Beck, imagined a new kind of lifeline for them, a less prescriptive approach to software methodology. Instead of laying down fixed principles of best practices, they recorded their experiences in brief narratives. “Faced with this kind of problem,” they would say, “we've found this pattern of programming to be useful.” The patterns movement approached software development as a craft.
Claim of patterns movement that physical act of moving index cards helped with software design.
(250) For instance, to tame the complexity of object-oriented coding, Cunningham and Beck proposed that programmers design a new program by laying out index cards—one per software object—on a table. . . . “We were surprised at the value of physically moving the cards around.”
Waterfall approach the typical project model of 1970s.
(251) For decades the organization of the typical project followed the “waterfall model.” The waterfall approach—the label first surfaced in 1970—divided a project into an orderly sequence of discrete phases, like requirements definition, design, implementation, integration, testing, and deployment.
Boehm mid-1980s spiral model of iterations of mini-waterfalls allows for more feedback.
(251) The waterfall model gradually acquired the bad reputation it deserved. In the mid-eighties, [Barry] Boehm defined an alternative known as the “spiral model,” which broke development down to “iterations” of six months to two years—mini-waterfalls dedicated to producing working code faster and allowing feedback from use of the resulting partially completed product to guide the next iteration.
Rapid Application Development methodology popular in 1990s emphasizes quick prototyping, aggressive cycles, reliance on computerized tools to handle mundane tasks.
(251) In the nineties, the software industry's methodology devotees adopted the banner of Rapid Application Development (RAD), which promised to speed up the delivery of finished software through quick prototyping, more aggressive iteration cycles, and reliance on new tools that let the computer itself handle some of programming's more mundane tasks.
Agile Software Development values individuals and interactions over processes, working software over documentation, customer collaboration over contracts, responding over following plans.
The meeting found a more virile name for the movement—Agile Software Development—and produced a manifesto that reads in its entirety:
We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more.
Extreme Programming pushes accepted methods to their limits, breaking projects down into narratives developers explain solution to customer feature requests.
[The Extreme] label mostly refers to the way it adopts a set of widely accepted methods and then pushes them to their limits.
(253) It mandated breaking projects down into “stories.” Each story represents a feature request that the customer lays out for the developers in a narrative that explains what the program should do.
Spolsky skeptical of Big-M methodologies, comparing that kind of software development to fast food production; twelve question Joel Test for rating organizations.
[Spolsky is] highly skeptical of what he calls “Big-M methodologies.” . . . “It's pretty obvious to me that a talented chef is not going to be happy making burgers at McDonald's, precisely because of McDonald's rules.”
(257-258) The Joel Test asks the following dozen questions:
Do you use source control?
Can you make a build in one step?
Do you make daily builds?
Do you have a bug database?
Do you fix bugs before writing new code?
Do you have an up-to-date schedule?
Do you have a spec?
Do programmers have quiet working conditions?
Do you use the best tools money can buy?
Do you have testers?
Do new candidates write code during their interview?
Do you do hallway usability testing?
“A score of 12 is perfect,” Spolsky wrote, “11 is tolerable, but 10 or lower and you've got serious problems. The truth is that most software organizations are running with a score of 2 or 3, and they need serious help, because companies like Microsoft run at 12 full-time.”
Rare success, when it does occur, often by-product of restraint; high praise for 37 Signals development pragmatic minimalism methods.
Very often, in those rare cases, success is a by-product of
iron-willed restraint—a choice firmly made and vociferously
reasserted at every challenge to limit a project's scope. Where you
find software success stories, you invariably find people who are
good at saying no. . . . Either way, the perspective is less “small
is beautiful” than “big is dangerous.”
(262) Just as 37 Signals had extracted the Rails framework from the Basecamp code, it extracted a design philosophy from the Basecamp experience, encoded in a handy series of aphorisms: “Less software.” “Say no by default.” “Find the right people.” “Don't build a half-assed product, build half a product.” These are buzz phrases designed for quick consumption via presentation slides, but together they constitute a coherent approach to software development—call it pragmatic minimalism.
Google method of small, ad hoc project teams working tight deadlines producing narrowly focused Web-based products, incrementally improved based on feedback and field experience, coupled with decree to spend 20 percent of time on personal projects.
(263) Small teams assembled ad hoc for each new project, worked on tight deadlines, and rolled out narrowly focused Web-based products, which they then improved incrementally based on user feedback and field experience. Google also told its programmers to devote one-fifth of their work time to personal projects.
Carr argues, following Fukuyama end of history, that software history is over and just a matter of perfecting heavyweight methodologies; compare to early dreams of automatic programming.
Like Francis Fukuyama, the Hegelian philosopher who famously declared “the end of history” when the Berlin Wall fell and the Soviet Union imploded, [Nicholas] Carr argues, essentially, that software history is over, done. We know what software is, what it does, and how to deploy it in the business world, so there is nothing left but to dot the i's and bring on the heavyweight methodologies to perfect it.
(265) But of all the capital goods in which businesses invest large sums, software is uniquely mutable. . . . And so every piece of software that gets used changes as people decide they want to adapt it for some new purpose.
(266) If you believe that we already know everything we want from software, then it's natural to believe that with enough hard work and planning, we can perfect it—and that's where we should place our energies. Don't even think about new features and novel ideas; focus everyone's energies on whittling down every product's bug list until we can say, for the first time in history, that most software is in great shape.
(267-268) Software development is often compared to the construction industry, but the analogy breaks down in one respect. . . . Therefore, once somebody has written a program that does what you need it to do, it's always cheaper to buy that software than build something from scratch.
Rosenberg contributes a law confounding Carr: software is easy to make, except when you want it to do something new, with corollary that the only software worth making does something new.
(268) Since every writer about software sooner or later ends up offering a law under his own name, the time has come for me, with all due humility, to present Rosenberg's Law: Software is easy to make, except when you want it to do something new. And then, of course, there is a corollary: The only software that's worth making is software that does something new.
ENGINEERS AND ARTISTS
(271) In the decades since that conference, these two words, software and engineering, have become fused at the hip, like government and bureaucracy or fast and food. . . . But despite such bureaucratic rearguard actions, the term has become universal, only raising questions about whether the parallel between building structures and building code makes sense.
(273-274) “Masterpiece Engineering” could not have been more prescient in laying out the central fault line that has marked all subsequent discussions of the Software Crisis and its solutions—between those who see making software as a scientific process, susceptible to constant improvement, perhaps even perfectible, and those who see it as a primarily creative endeavor, one that might be tweaked toward efficiency but never made to run like clockwork.
1968 NATO software engineering conference prescient of next four decades of software development subjects and controversies, deserving study like the Macy Conferences.
(274) The two 130-page reports of the NATO software engineering conferences foreshadow virtually all the subjects, ideas, and controversies that have occupied the software field through four subsequent decades.
Etymology of engineer invokes ingeniousness and making things skillfully along with modern sense of applying scientific principles.
Engineering is often defined as the application of scientific principles to serve human needs. But it also brings creativity to bear on those scientific principles, dragging them out of pristine abstraction into the compromised universe of our frustrations and wants. The word derives (via a detour through medieval French) from the same Latin root that gave us ingenious; it refers to the ability to make things skillfully.
(277) If we could only discover dependable principles by which software operates, we could transcend the overpowering complexity of today's Rube Goldberg-style programs and engineer our way out of the mire of software time.
Common grail of automatic software since invention of compiler by Hopper.
(277) For many on this quest, the common grail has been the idea of “automatic software”—software that nonprogrammers can create using commands in simple English that the computer can be made to understand. This dream was born at the dawn of the computer era when Rear Admiral Grace Hopper and her colleagues invented the compiler.
Simonyi Intentional Software applying WYSIWYG to act of programming itself; compare to Lammers.
For years, [Charles] Simonyi led research at Microsoft into a field called “intentional programming.” In 2002, he left the company that had made him a billionaire and funded a new venture, Intentional Software, with the goal of transforming that research into a real-world product.
(279) Simonyi wants to give these subject matter experts a set of tools they can use to explain their intentions and needs in a structured way that the computer can understand. . . . That set of definitions, that model, is then fed into a generator program that spits out the end-product software.
(280) As a young man, Simonyi led the development of Bravo, the first word processing program that functioned in what programmers now call WYSIWYG fashion (for “what you see is what you get,” pronounced “wizzy wig”).
Concern that Intentional Software will demand nonprogrammer experts create machine-readable models in absence of natural, flexible communication, raising old problem of natural language processing.
(280) Simonyi's Intentional Software is, in a way, an attempt to apply the WYSIWYG principle to the act of programming itself. But Simonyi's enthusiastic descriptions of the brave new software world his invention will shape leave a central question unanswered: Will Intentional Software give the subject matter experts a flexible way to express their needs directly to the machine—or will it demand that nonprogrammer experts submit themselves to the yoke of creating an ultra-detailed, machine-readable model?
Compare leaky abstractions to problems with text encoding (McGann on OHCO hypothesis).
In an essay titled “The Law of Leaky Abstractions,” Joel Spolsky wrote, “All non-trivial abstractions, to some degree, are leaky.” . . . For programmers it means that new tools and ideas that bundle up some bit of low-level computing complexity and package it in a new, easier-to-manipulate abstraction are great, but only until they break. Then all that hidden complexity leaks back into their work.
(284) For a programmer the lesson might be that stacks of turtles, or layers of abstractions, don't respond well to the failure of even one small part. They are, to use a word that is very popular among the software world's malcontents, brittle. When stressed, they don't bend, they break.
Late binding bridges gulf between compiled, interpreted, and even more dynamic programming methods like the CPIA; have to go back to Lisp for a good example.
Does lack of late binding weaken claim that C++ is the most philosophical programming language?
(285) Late binding is a term in computer science that refers to programming languages' capacity to provide programmers with more flexibility. Late-bound programs can be changed on the fly, while they're running; you can even build them so that they can change themselves while they're running. While some of the popular programming languages of today—Java to some degree, Python (Chandler's language) to a greater degree—are considered to have a late-binding nature, the core languages underlying much of the edifice of modern software, C and C++, do not. . . . To find nimbler languages that embody the essence of late-binding dynamic power, [Alan] Kay says, you have to go back decades—to Lisp, the “list processor” language invented by artificial intelligence researcher John McCarthy in the late 1950s, and to Smalltalk, the language Kay himself conceived at Xerox's Palo Alto Research Center in the seventies.
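Python, Chandler's language, makes the late-binding idea easy to demonstrate: because method lookups happen at call time, a running program can rebind a method and every existing object immediately sees the change. The class and method names below are invented for illustration.

```python
# Late binding sketch: in Python, names are resolved at run time, so a
# method can be replaced "on the fly" and already-created objects pick up
# the change. In C or C++ the call target would have been fixed at
# compile (or link) time.

class Greeter:
    def greet(self):
        return "hello"

g = Greeter()
print(g.greet())                     # -> hello

# Rebind the method while the program runs; the existing object sees it,
# because the lookup g.greet happens anew at each call.
Greeter.greet = lambda self: "bonjour"
print(g.greet())                     # -> bonjour
```

This is the flexibility the passage credits to Lisp and Smalltalk: the program can modify its own behavior without stopping, recompiling, and relaunching.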
In place of late binding or other language improvements, most programmers remain in thrall of compile cycle.
(285-286) Instead, today's programmers remain in the thrall of the “compile cycle.” . . . You had to wait thirty or even sixty seconds for the program to launch before you could see the results of a change in the code. Those seconds added up to a lot of waiting time.
Kay historical analogy to building pyramids brick by brick: does not scale.
(286-287) Kay loves to use historical analogies when he talks about software. . . . “Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.”
(287) You can build big things this way, Kay says, but it “doesn't scale.”
(288) Kay maintains that the software discipline today is actually somewhere in its Middle Ages—“We don't have to build pyramids, we can build Gothic cathedrals, bigger structures with less material”—but that the commercial software world remains stuck in pyramid mode.
(288) Ultimately, he says, we need to stop writing software and learn how to grow it instead.
Growing software instead of writing it; Kay version of OOP as bundling code and data together.
Kay's original vision for object-oriented programming was grander than just the idea of organizing code into reusable routines, which the software industry ultimately embraced. Kay-style OOP aimed for a total rethinking of one foundation of our software universe: today's near-universal separation of program from data, procedural code from stored information. Instead, imagine a system in which the information and the code needed to interpret or manipulate it travel together in one bundle, like a cell traveling with its packet of DNA.
(289) Bundling procedures and data in cell-like portable objects isn't on most programmers' agendas.
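Kay's cell analogy can be sketched in a few lines: the data stays inside the object with the code that knows how to interpret it, and the outside world interacts only by sending messages. This is an illustration of the idea, not a claim about Smalltalk's implementation; all names are invented.

```python
# Sketch of Kay-style bundling: the value travels with the code needed to
# interpret it, like a cell with its packet of DNA. Outsiders never touch
# the raw data; they send messages and the object decides how to respond.

class TemperatureReading:
    def __init__(self, celsius):
        self._celsius = celsius            # the data stays inside the "cell"

    def receive(self, message):
        """Respond to a message; unknown messages are simply ignored."""
        if message == "fahrenheit":
            return self._celsius * 9 / 5 + 32
        if message == "celsius":
            return self._celsius
        return None

r = TemperatureReading(20)
print(r.receive("fahrenheit"))
```

The contrast with mainstream practice is the one Rosenberg notes: most code passes around bare data (a float, a row in a database) and hopes every consumer interprets it the same way, rather than shipping the interpretation along with the value.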
Squeak is open source incarnation of Smalltalk targeted for children to discover new development methods.
(290) Among other things, for the last decade he has labored on Squeak, a latter-day, open source incarnation of Smalltalk for children. Since we are still trying to discover the basics of software engineering, Kay's logic goes, let's give the next generation a taste of alternatives.
For Backus and Lanier the von Neumann stored program architecture has become concretized as if an act of God.
Programming, [John] Backus said, had grown out of the ideas of John von Neumann, the mathematician who, at the dawn of computing in the 1940s, devised the basic structure of the “stored program” of sequentially executed instructions. But those ideas had become a straitjacket.
(291) [Jaron] Lanier says that we have fallen into the trap of thinking of arbitrary inventions in computing as “acts of God.”
Lanier argues computing models of protocol period derived from problems of communications, and need to be supplanted by phenotropic interaction of surfaces, which sounds more like Deleuze and Guattari body without organs.
[quoting Lanier 2003 essay on problem of “Gordian software”]
“Some things in the foundations of computer science are
fundamentally askew,” Lanier concluded, and proceeded to trace
those problems back to “the metaphor of the electrical
communications devices that were in use” at the dawn of computing.
Those devices all “centered on the sending of signals down wires”
or, later, through the ether: telegraph, telephone, radio, and TV.
All software systems since, from the first modest machine-language
routines to the teeming vastness of today's Internet, have been
“simulations of vast tangles of telegraph wires.” Signals travel
down these wires according to protocols that sender and receiver have
agreed upon in advance.
(293) Why not build software around the same principle of pattern recognition that human beings use to interface with reality?
(293) “When you de-emphasize protocols and pay attention to patterns on surfaces, you enter into a world of approximation rather than perfection,” Lanier wrote.
(293) Lanier calls this idea “phenotropic software” (defining it as “the interaction of surfaces”).
Collective fall from innocence of initial thrill of programming according to Lanier.
(294) In Lanier's view, the programming profession is afflicted by a sort of psychological trauma, a collective fall from innocence and grace that each software developer recapitulates as he or she learns the ropes.
Lanier dreaming in code as post-symbolic communication; compare to Hayles technological nonconscious and Berry interpretation of Serres parasite.
He says his critique of software is part of a lifelong research project aimed at harnessing computers to enable people to shape visions for one another, a kind of exchange he calls “post-symbolic communication,” which he likens to dreaming—a “conscious, waking-state, intentional” dreaming shared with others.
(296) Intelligent voices can be found on both sides of this fence. Robert Britcher, in his The Limits of Software, takes the pessimistic view to its tragic limit.
Brooks says give up looking for silver bullet; Sussman claims new engineering principles are needed.
Frederick Brooks tells us to give up hunting for a silver bullet: Software's complexity is not a removable quality but an “essential property.” Still, he leaves room for incremental progress made “stepwise, at great effort.”
(297) Others hold on to the dream of fundamental reform or revolution. MIT computer scientist Gerald Jay Sussman wrote: “Computer science is in deep trouble. . . . We need a new set of engineering principles that can be applied to effectively build flexible, robust, evolvable, and efficient systems.”
Parnas 1985 essays among great documents of software history lacks attention to these controversies.
One of the great documents of software history is a brief set of essays by David Lorge Parnas published in 1985. The drab title, “Software Aspects of Strategic Defense Systems,” offers no hint of the controversy that birthed it.
(298) If “thinking things out in the order that the computer will execute them” is how programmers work, yet doing so is ultimately beyond their capability, how is it that we end up with any working software at all?
Programming is writing, symbolic cognition.
(298) People write programs. . . . Despite the field's infatuation with metaphors like architecture and bridge-building and its dabbling in alternative models from biology or physics, the act of programming today remains an act of writing—of typing character after character, word after word, line after line.
Why not study great works and the artists who made them, following Gabriel; especially given difficulty leading programmers like Joy have at writing books, clear invitation for texts and technology methodologies.
[Bill Joy was] one of the leading programmers of his generation; he wrote much of the free Berkeley version of Unix, devised some of the critical underpinnings of the early Internet, and helped create Java. After leaving Sun Microsystems, which he cofounded, he attempted to write a book. But in 2003 he told the New York Times that he was putting the project aside. It was one thing to write code for a compiler to interpret or a computer to execute; writing for other people was simply too hard.
(299) Yet the programming field could learn much from the writing world, argues Richard Gabriel, a veteran of Lisp and object-oriented programming worlds who is now a Distinguished Engineer at Sun. . . . “They study great works of poetry. Do we do that in our software engineering disciplines? No. You don't look at the source code for great pieces of software.”
(300) But a bigger reason, Gabriel argues, is that much of the software in use today can't be studied; its code is locked away for commercial reasons. (Unsurprisingly, Gabriel is a believer in the open source movement.)
Gabriel feels software developers not challenged to present their work for peer criticism as much as literary writers and poets.
(300) He discovered that we ask more work of students who want to become writers and poets than of those who aim to become software developers: They must study with mentors, they must present their work for regular criticism by peers in workshops, and they're expected to labor over multiple revisions of the same work.
Perhaps mere executability and requirements criteria of object code overshadows interest in reading source code and insisting on quality of revisions; only now are programmers writing about their work in web sites and blogs, which has become the distributed informal site for communication like the vending machines venerated by Weinberg.
Here is an entry for critical programming studies beyond examining source code comments and programs themselves that may resemble writing about writing popular in composition studies.
And yet something extraordinary happened to the software profession
over the last decade. Programmers started writing personally,
intently, voluminously, pouring out their inspirations and
frustrations, their insights and tips and fears and dreams, on Web
sites and in blogs. . . . Yet it is changing the field—creating, if
not a canon of great works of software, at least an informal
literature around the day-to-day practice of programming. The Web
itself has become a distributed version of the vending-machine-lined
common room that Gerald Weinberg wrote about in The
Psychology of Computer Programming:
an informal yet essential place for coders to share their knowledge.
(301) Maybe the problem is insoluble. Or maybe it isn't a problem at all but, rather, simply a manifestation of the uniqueness of programming as a human activity.
Knuth emphasizes the art of programming and readability for others over science, a clear implication for humanities study.
You can find significant evidence supporting such a conclusion in the
life and work of Donald Knuth—programmer,
teacher, and author of a series of books that are widely viewed as
the bibles of his profession.
(302) Knuth chose to name his books The Art of Computer Programming—not The Science of Computer Programming.
(304) His work on TeX and Metafont led Knuth to draw precisely the opposite conclusion from Bill Joy: Writing software is “much more difficult” than writing books, he declared.
(305) As a landmark author who also devoted a decade to tackling—and solving—a fiendish practical problem in software, Knuth is probably better qualified than any other living human being to compare the relative difficulties of writing books and writing code.
(306) Knuth's proposal emphasizes writing code that is comprehensible to human beings, under the thinking that sooner or later programmers other than the author will need to understand it, and that such code will end up being better structured, too.
(307) Well-commented code is one hallmark of good programming practice; it shows that you care what you're doing, and it is considerate to those who will come after you to fix your bugs. But comments also serve as a kind of back channel for programmer-to-programmer communication and even occasionally as a competitive arena or an outlet for silliness.
Critical code studies connection to analyzing profanity in comments in Linux source code and leaked Windows 2000 source code, yet a weak focus when the putative goal is to understand code.
(307) A Norwegian programmer named Vidar Holen lovingly maintains a
Web page labeled “Linux kernel swear words.” On it he charts over
time the levels of profanity in comments within the Linux source code.
(308) Comments sometimes serve not just as explanatory notes but as emotional release valves. Despite decades of dabbling with notions of automatic programming and software engineering, making software is still painful. Anguished programmers sometimes just need to say “fuck.”
(309) Among the surprises they found were many comments in which the Microsoft programmers berated themselves, their tools, their colleagues, and their products.
THE ROAD TO DOGFOOD
[NOVEMBER 2004-NOVEMBER 2005]
(310) Once more they set themselves a goal for the cycle: Chandler 0.5 would deliver a “dogfoodable calendar,” a shareable calendar that they, at least, could begin to use.
(311) Eventually, she [Yin] believed, Chandler users would rely on her proposed dashboard view—a single screen of items flowing from past to present to future—as the nerve center of their daily information management. But the dashboard had been put on hold; the programmers needed to concentrate on getting the calendar to work.
(311) On first glance, Chandler's basic three-pane structure resembled that of countless other programs, including Microsoft Outlook.
Design based on imagining mental model users would develop, in context of Yin evaluating Chandler dashboard views, anticipating synaptogenesis.
(312) Yin started from ideas about how users would want to organize their workflows; the developers began by imagining the “mental model” the user would develop about the program's functions.
Programming style influenced by deep experience with Java appeared deficient to a Python expert, result of a fellow expert doing code study.
After spending some time studying Chandler's code base, [Phillip J.]
Eby posted to his blog a lengthy entry titled “Python Is Not Java.”
(314) There followed a long list of technical recommendations for how to use Python like a “true Pythonista” rather than a newbie.
(316) His point was undeniable—that key Chandler developers, no matter how much coding they had under their belts, were Python newbies who simply weren't taking full advantage of the language's features and were sometimes tripping over them instead.
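Eby's specific recommendations aren't reproduced in these notes, but a hypothetical sketch of the kind of habit a “Python Is Not Java” critique targets: Java-style accessor boilerplate where plain attributes are the Python idiom.

```python
# Hypothetical illustration (not drawn from Eby's actual post):
# Java-trained programmers often write explicit getters and setters
# in Python, where plain attributes (or properties) are the idiom.

class JavaStyleTask:
    def __init__(self, title):
        self._title = title

    def get_title(self):           # Java-style accessor, unidiomatic in Python
        return self._title

    def set_title(self, title):
        self._title = title


class PythonicTask:
    def __init__(self, title):
        self.title = title         # just use the attribute directly
    # If validation is ever needed, a @property can be added later
    # without changing any calling code.


task = PythonicTask("write 0.5 release notes")
task.title = "write 0.6 release notes"   # no setter required
```

The point is not that accessors are wrong everywhere, but that Python's properties make the defensive boilerplate unnecessary up front.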
(317) Once more Eby turned to his blog to report on his work. He described the process he adopted to create Spike: He had borrowed a method from Extreme Programming called “test-driven development,” in which programmers write the test that evaluates the success of a program function before they write the function itself. Then, at every stage of work, they can be sure that their code still works and does what was originally intended.
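The test-first cycle Rosenberg describes can be shown in a few lines of Python (a hypothetical example, not Eby's actual Spike tests): the test is written before the function it evaluates, then rerun at every stage of work.

```python
# Step 1 (red): write the test before the function it exercises.
# At this point the test fails, because slugify does not exist yet.
def test_slugify():
    assert slugify("Dogfoodable Calendar") == "dogfoodable-calendar"
    assert slugify("  Chandler  0.5 ") == "chandler-0.5"

# Step 2 (green): write the simplest code that makes the test pass.
def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

# Step 3: rerun the test at every stage to confirm the code still
# works and does what was originally intended.
test_slugify()
```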
Relative simplicity of server-side versus client-facing development, because of diminished concern for user needs and edge cases arising from unpredictable actions by users.
(318) It helped that Cosmo employed only one developer; that meant it didn't have to pay a lot of coordination costs. But it was also true that writing server software has always had certain advantages compared with writing software for users. A server is a program that deals almost entirely with other programs and machines; it rarely needs to communicate directly with human beings. And when it does, the human being it needs to talk to—when it is being initially configured, for instance, or when it hits a snag—is usually a pro, a system administrator or programmer who is already fluent in the server's own dialect.
Compare getting started with a large codebase to entering the work of a prolific theorist like Derrida.
(320) Chandler was now 1.5 million
lines of code, most of which had been incorporated from other
projects like wxWidgets and Twisted. There were about 130,000
Chandler-specific lines of Python code that OSAF developers had
written. Getting started at the project was, as an OSAF summer intern
put it, like moving to a new city and trying to find your way around.
(320) Among other geekish pursuits, PyCon holds “sprints”--quick immersion workshops in which programmers take on narrowly defined projects and see how far they can get in two or three days.
Chandler had become an open source project largely managed by women, despite stereotypes and the reality of commercial software development.
(322) Not by design but maybe not
entirely coincidentally, it had become an open source project largely
managed by women.
(322) Efforts to explain the disparity risk both invoking stereotypes and profaning sacred cows. But the most thoughtful students of the matter point out that the social bias against women tends to trump all other factors.
(322) Today, to walk into the management meeting of a software project and encounter a group of female faces is still an exotic experience.
(323) The first thing [Philippe] Bossut did on taking over the apps team reins was to make a spreadsheet listing every single task facing every single developer on 0.6 along with a SWAG (Silly, Wild-Assed Guess) estimating the task's length. Then he would add up all the tasks and see how much time 0.6 would really take. This was different in form but similar in essence to what Dusseault had done with stickies and what Michael Toy had tried to do years before when he asked his developers to estimate their task completion times in Bugzilla.
(324) This was the kind of problem that open source made easier to solve: IBM offered a free library that supplied all the specifics you needed about languages and localities, and whether Uruguayans prefer to start their week on Sunday or Monday.
(325) If Chandler 0.6 was going to provide first adopters with a calendar they could actually use, another basic feature had to be added: recurring events. . . . Paper calendars had never provided such a service, but it was exactly what people expected software to do effortlessly.
(327) A clue to why the deceptively simple-looking matter of event recurrence should prove so difficult can be found in the word recur itself.
(327) Recurring things repeat, but recursive things repeat in a special way: They loop back into themselves. They are tail-eating snakes.
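A minimal sketch, assuming a hypothetical weekly rule, of why recurrence is deceptively hard: the rule itself is open-ended, and real calendars also need per-occurrence exceptions that a flat list of dates cannot express. This is an illustration, not Chandler's implementation.

```python
from datetime import date, timedelta

def weekly_occurrences(start, weekday, count):
    """Expand a 'repeats weekly on <weekday>' rule into concrete dates.

    The rule is open-ended, so software must cap the expansion
    (here with `count`) or generate occurrences lazily.
    """
    d = start
    while d.weekday() != weekday:   # advance to the first matching day
        d += timedelta(days=1)
    dates = []
    for _ in range(count):
        dates.append(d)
        d += timedelta(weeks=1)
    return dates

# Every Tuesday (weekday 1), starting 2005-11-01: first four occurrences.
meetings = weekly_occurrences(date(2005, 11, 1), 1, 4)

# The hard part is what a simple list cannot express: "skip just the
# Nov 15 meeting" or "move only one occurrence to Wednesday" --
# exceptions that force the data model to track both the rule
# and its per-occurrence overrides.
exceptions = {date(2005, 11, 15)}
held = [d for d in meetings if d not in exceptions]
```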
Does recognition of the design problems in dealing with recurring events reveal something about the philosophical assumptions behind the Chandler project? Likewise, Kay felt McCarthy's elegant definitions of the eval and apply functions for Lisp reverberate with the essence of the language, or of programming itself.
Alan Kay likes to point to McCarthy's “half-page of code at the
bottom of page 13 of the Lisp 1.5 manual” and praise it as the
“Maxwell's equations of computing”--concentrated, elegant
statements that distilled the field's fundamental principles just as
James Clerk Maxwell's four equations had laid out the essential
workings of electricity and magnetism at the dawn of the machine age.
On the page that Kay cited, which provides definitions of two
functions named “eval” and “apply,” McCarthy essentially
described Lisp in itself.
“This,” Kay says, “is the whole world of programming in a few
lines that I can put my hand over.”
(330) It turns out that the dream of an automatic program “verifier”--a program that could examine any other program and determine whether that program provides the correct output for any given input—is doomed to remain a dream. The question is undecidable.
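The undecidability claim is Turing's halting problem, and the diagonal argument behind it can be sketched in runnable form by substituting a deliberately fallible stand-in for the impossible perfect verifier:

```python
# Turing's diagonal argument, sketched. Suppose a perfect verifier
# halts(f, x) could predict whether f(x) halts. This stub stands in
# for it; any fixed answer it gives can be defeated.
def halts(f, x):
    return False          # a necessarily fallible stand-in verifier

def trouble(f):
    """Do the opposite of whatever halts() predicts about f(f)."""
    if halts(f, f):
        while True:       # predicted to halt -> loop forever
            pass
    return "halted"       # predicted to loop -> halt immediately

# The stand-in predicted trouble(trouble) would never halt, yet it does.
# A verifier answering True would be defeated the same way, by looping.
prediction = halts(trouble, trouble)   # False
outcome = trouble(trouble)             # "halted"
contradiction = (prediction is False) and (outcome == "halted")
```

No matter how halts() is implemented, trouble() inverts its verdict about itself, so no implementation can be correct on every input.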
Douglas Hofstadter's Law: it always takes longer than you expect, even when you take into account Hofstadter's Law.
In planning my project I had failed to take into account Hofstadter's
Law, the recursive principle to which Douglas Hofstadter gave
his name: It always takes longer than you expect,
even when you take into account Hofstadter's Law.
(332) But writing open source code for three years was not the same thing as building an open source community. As Chandler 0.6 neared completion, Ted Leung sent Mitch Kapor a report assessing OSAF's successes and failures in its open source efforts. He found that of approximately 4,400 total bugs logged in Bugzilla to date, around 100 had been filed by people outside OSAF. And there had been only a handful of actual code contributions from outsiders.
Failure of Chandler to avoid software development tar pit forces judgment of open source ideals.
(333) And being open source hadn't prevented Chandler from ending up
in the same agonizing time warp as countless other ambitious software
projects. Maybe Eric Raymond's “The Cathedral and the Bazaar” had
been wrong, and Linus's Law (“Given enough eyeballs, all bugs are
shallow”) didn't transcend Brooks's Law after all. Or perhaps OSAF,
for all its transparency, had so far failed to meet Raymond's
requirement for success with a bazaar-style open source project—that
it must recognize, embrace, and reward good ideas from outsiders.
(334) Even this self-described “old applications guy” was now on the verge of accepting that Web-delivered, browser-based software had begun to overtake the desktop-based programs of personal computing's golden age. From now on, he [Kapor] declared, he would most likely adopt a Web interface for any new project from the start.
Alan Cooper, creator of Visual Basic, details software industry sins in The Inmates Are Running the Asylum, the primary one being not understanding what it means to be done, hence the anxiety of open-ended tasks noted by David Allen.
In 1999, Alan Cooper—a
software developer who created much of the Visual Basic programming
language in the early nineties and is now a prominent software design
advocate—published a fierce book titled The
Inmates Are Running the Asylum, which
provides a rap sheet of the software industry's sins. In it Cooper
wrote, “Software development lacks one key element—an
understanding of what it means to be 'Done'.”
(337) But there is another consequence of software development's halting problem, one that is less pragmatic than existential. David Allen, the Getting Things Done guru, talked about the “gnawing sense of anxiety” suffered by knowledge workers who face mountains of open-ended tasks.
Kapor admits a web interface is the likely starting point for any new project, although old software tends to work.
(339) Old code rarely offers trendy graphics or flavor-of-the-month features, but it has one considerable advantage: It tends to work.
Art of making software like sending vision through atomizer, reassembling packets of data.
(344) But software development takes simple elegant visions and
atomizes them, separating them into millions of implementation
details and interface choices and compromises. The art of making
software well is, in a sense, the ability to send a vision through
that atomizer in such a way that it can eventually be put back
together—like the packets of data that separate to cross the
Internet and get reassembled into coherent messages for your computer.
(345) By the time this book is published, OSAF will have released perhaps two more versions of the program.
(345) Cubranic's email didn't just propose the idea; it contained about 180 lines of Python code that implemented it [features of Ecco PIM/outliner preferred by the author].
A LONG BET
[2005-2029 AND BEYOND]
(348) Why can't we build software the way we build bridges? Not, of course, the way California is building bridges; people who ask this question are dreaming of a rigorous discipline founded on reliable formulas. But until and unless we can devise a “physics of software” to match the calculations of mass and motion and energy that govern engineering in the physical world, we probably can't do that. Nonetheless, the technical difficulties of software development have grudgingly yielded to incremental improvement. The same thing happened in the world of bridge building. Today, our competence in this realm is something we take for granted; we have forgotten (or repressed) the long history of bridge failures and collapses through the nineteenth century and into the early twentieth, when new technologies and immature practices created a record of disaster and loss that in many ways dwarfs our era's software disaster woes.
Prospect of perfection dissolves when information systems touch human beings' free will and unpredictability (Lanier), making software engineering different from bridge building.
(349) “Ultimately, information systems only give value when they
touch human beings,” Jaron Lanier says. And when they do touch
human beings, the prospect of perfection dissolves.
(349) Software's essential difficulty, then, is the toll that human free will and unpredictability exact on technological progress.
(352) As the project's first big-splash Long Bet, Kapor wagered $20,000 (all winnings earmarked for worthy nonprofit institutions) that by 2029 no computer or “machine intelligence” will have passed the Turing Test.
Kurzweil believes the acceleration of technology enhancements related to computing power, storage capacity, and network speed points toward a critical moment when the human brain is technologically emulated.
(352) [Ray] Kurzweil's belief in a machine that could ace the Turing Test was one part of his larger creed—that human history was about to be kicked into overdrive by the exponential acceleration of Moore's Law and a host of other similar skyward-climbing curves. As the repeated doublings of computational power, storage capacity, and network speed start to work their magic, and the price of all that power continues to drop, according to Kurzweil, we will reach a critical moment when we can technologically emulate the human brain, reverse-engineering our own organic processors in computer hardware and software. At the same time, biotechnology and its handmaiden, nanotechnology, will be increasing their powers at an equally explosive rate.
Kurzweil singularity in late 2020s will radically transform human experience; what becomes of the human side of the symbiosis, will we be dreaming in code or merely overdetermined by it?
(353) Kurzweil predicts that artificial intelligence will induce a singularity in human history. When it rolls out, sometime in the late 2020s, an artificial intelligence's passing of the Turing Test will be a mere footnote to this singularity's impact—which will be, he says, to generate a “radical transformation of the reality of human experience” by the 2040s.
(357) An identical version of these notes, with active hyperlinks, is located at http://www.dreamingincode.com.
Rosenberg, Scott. Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software. New York: Crown Publishers, 2007. Print.