Notes for Katie Hafner and Matthew Lyon Where Wizards Stay Up Late: The Origins of the Internet
Key concepts: batch, checksum, distributed network, domain name system, Ethernet, interface message processors, packet-switching, protocol, real-time, Request For Comments, store-and-forward network, symbiosis, synchronizer bug, time-sharing, watchdog timer.
Related theorists: Paul Baran, Wes Clark, Steve Crocker, Will Crowther, Donald Davies, Engelbart, Galloway, Licklider, Tom Marill, Bob Metcalfe, Papert, Larry Roberts.
The Fastest Million Dollars
(13) The presence of three different computer terminals in [Robert] Taylor's Pentagon office reflected IPTO's strong connection to the leading edge of the computer research community, resident in a few of the nation's top universities and technical centers.
(14) The research agency was to be a fast-response mechanism closely tied to the president and secretary of defense, to ensure that Americans would never again be taken by surprise on the technological frontier.
(15-16) Eisenhower was the first president to host a White House dinner specifically to single out the scientific and engineering communities as guests of honor, just as the Kennedys would later play host to artists and musicians.
(17) One aspect of his career at P&G that he [Neil McElroy] was most proud of was the amount of money the company had devoted to research. He believed in the value of unfettered science, in its ability to produce remarkable, if not always predictable, results.
(18) One of the principal attractions of the research agency concept, in McElroy's mind, was the ability it would give him to manage the fierce competition within DOD over R&D programs and budgets.
(20) In the post-Sputnik panic, the race for outer space cut a wide swath through American life, causing a new emphasis on science in the schools, worsening relations between Russia and the United States, and opening a floodgate for R&D spending.
(22) Almost overnight, while Johnson drummed for a military presence in space, the space projects and missile programs were stripped away from ARPA and transferred over to NASA or back to the services, leaving ARPA's budget whittled to a measly $150 million. ARPA's portfolio was gutted, its staff left practically without any role.
(22) The staff of ARPA saw an opportunity to redefine the agency as a group that would take on the really advanced “far-out” research.
(23) A golden era for ARPA was just beginning. [Jack P.] Ruina brought a relaxed management style and decentralized structure to the agency. Details didn't interest him; finding great talent did. He believed in picking the best people and letting them pick the best technology.
(24) In the early 1950s computing meant doing arithmetic fast.
(25) [Ken] Olsen's idea for an interactive computer had come from a pioneering group of computer researchers at MIT. A different, slightly younger group there came up with another dramatic concept in computing that was beginning to catch on, particularly in academic institutions. They called it “time-sharing,” and it had obvious appeal as an alternative to the slow and awkward traditional method of “batch” processing.
Changing attitude to value of direct access and time-sharing through spending time programming; Licklider prescient in potential for amplifying range of human intelligence through symbiosis with computers.
The appreciation of time-sharing was directly proportional to the
amount of direct access one had to the computer. And usually that
meant that the more you programmed, the better you understood the
value of direct access.
(27) Licklider was far more than just a computer enthusiast, however. For several years, he had been touting a radical and visionary notion: that computers weren't just adding machines. Computers had the potential to act as extensions of the whole human being, as tools that could amplify the range of human intelligence and expand the reach of our analytical powers.
SAGE example of Licklider symbiosis, machine as problem-solving partner; fitting that it was so large people walked inside it.
Based on a large IBM computer, SAGE was so mammoth that its operators
and technicians literally walked inside the machine. . . . In fact,
SAGE was one of the first fully operational, real-time interactive
computer systems. Operators communicated with the computer through
displays, keyboards, switches, and light guns. Users could request
information from the computer and receive an answer within a few seconds.
(31) SAGE was an early example of what Licklider would later call “symbiosis” between humans and machines, where the machine functions as a problem-solving partner.
(34) Lick's thoughts about the role computers could play in people's lives hit a crescendo in 1960 with the publication of his seminal paper “Man-Computer Symbiosis.”
(36) Lick belonged to a small group of computer scientists who believed that people could be much more effective if they had at their fingertips a computer system with good displays and good databases.
(37) The principal charter he'd been given was to come up with uses for computers other than as tools for numerical scientific calculations. Lick developed new programs partly as a reaction against some of the applications the Defense Department had in mind for large computers.
(38) Six months after his arrival at ARPA, Lick wrote a lengthy memo to the members of the Intergalactic Network in which he expressed his frustration over the proliferation of disparate programming languages, debugging systems, time-sharing system control languages, and documentation schemes. In making the case for an attempt at standardization, Lick discussed the hypothetical problem of a network of computers.
(41) By building a system of electronic links between machines, researchers doing similar work in different parts of the country could share resources and results more easily.
Book gives evidence that the origins of the Internet lie in the interoperability and communication concerns of Taylor and Herzfeld, not so much in the ability to sustain nuclear attack.
(42) If the network idea worked, Taylor told Herzfeld, it would be possible for computers from different manufacturers to connect, and the problem of choosing computers would be greatly diminished. Herzfeld was so taken with that possibility that those arguments alone might have been enough to convince him. But there was another advantage, centering on the question of reliability. It might be possible to connect computers in a network redundantly, so that if one line went down a message could take another path.
A Block Here, Some Stones There
(44) In those days, software programs were one-of-a-kind, like original works of art, and not easily transferred from one machine to another. Taylor was convinced of the technical feasibility of sharing such resources over a computer network, though it had never been done.
Peer collaboration among networked resources; AT&T not interested.
But the networking idea marked a significant departure from
time-sharing. . . . The idea of one computer reaching out to tap
resources inside another, as peers in a collaborative organization,
represented the most advanced conception yet to emerge from . . .
(48) Even before his first day at ARPA, Roberts had a rudimentary outline of the computer network figured out. Then, and for years afterward as the project grew, Roberts drew meticulous network diagrams, sketching out where the data lines should go, and the number of hops between nodes.
(51) AT&T, of course, had absolute hegemony when it came to the telephone network. But the systematic conveyance of information predated Ma Bell by at least a few thousand years.
(51) The telegraph was a classic early example of what is called a “store-and-forward network.”
(52) There was almost no way to bring radical new technology into the Bell System to coexist with the old. . . . Not surprisingly, then, in the early 1960s, when ARPA began exploring an entirely new way of transmitting information, AT&T wanted no part of it.
(53) In the early 1960s, before Larry Roberts had even set to work creating a new computer network, two other researchers, Paul Baran and Donald Davies—completely unknown to each other and working continents apart toward different goals—arrived at virtually the same revolutionary idea for a new kind of communications network. The realization of their concepts came to be known as packet-switching.
(54) Soon after Baran had arrived at RAND, he developed an interest in the survivability of communications systems under nuclear attack.
(58) Baran's idea constituted a third approach to network design. He called his a distributed network. Avoid having a central communications switch, he said, and build a network composed of many nodes, each redundantly connected to its neighbor. His original diagram showed a network of interconnected nodes resembling a distorted lattice, or fish net.
(59) He concluded that a redundancy level as low as 3 or 4—each node connecting to three or four other nodes—would provide an exceptionally high level of ruggedness and reliability.
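Baran's redundancy claim can be sketched in code. The following toy model (my construction, not anything from the book) builds a small "fish net" grid in which each node links to its three or four neighbors, then uses breadth-first search to show that traffic can still detour around destroyed nodes:

```python
# Sketch of Baran's distributed-network idea: a 4x4 "fish net" grid,
# each node redundantly connected to its neighbors, survives the loss
# of individual nodes because messages can route around them.
from collections import deque

def grid_network(n):
    """Build an n x n lattice; each node links to its horizontal and
    vertical neighbors (a redundancy level of 3 to 4)."""
    links = {}
    for r in range(n):
        for c in range(n):
            nbrs = []
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < n and 0 <= cc < n:
                    nbrs.append((rr, cc))
            links[(r, c)] = nbrs
    return links

def reachable(links, src, dst, dead):
    """Breadth-first search: can a message route around dead nodes?"""
    if src in dead or dst in dead:
        return False
    seen, frontier = {src}, deque([src])
    while frontier:
        node = frontier.popleft()
        if node == dst:
            return True
        for nbr in links[node]:
            if nbr not in seen and nbr not in dead:
                seen.add(nbr)
                frontier.append(nbr)
    return False

net = grid_network(4)
# Knock out two interior nodes; corner-to-corner traffic still gets through.
print(reachable(net, (0, 0), (3, 3), dead={(1, 1), (2, 2)}))  # True
```

With only two redundant links per corner node, isolating a corner takes exactly two hits—which is why Baran argued for three or four links per node.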
Baran, who did think about nuclear survivability of networks, proposed distributed network diagram, message blocks, and adaptive routing.
Baran's second big idea was still more revolutionary. Fracture the
messages too. By dividing each message into parts, you could flood
the network with what he called “message blocks,” all racing over
different paths to their destination. Upon their arrival, a receiving
computer would reassemble the message bits into . . .
(61) What Baran envisioned was a network of unmanned switches, or nodes—stand-alone computers, essentially—that routed messages by employing what he called a “self-learning policy at each node, without need for a central, and possibly vulnerable, control point.” He came up with a scheme for sending information back and forth that he called “hot potato routing,” which was essentially a rapid store-and-forward system working almost instantaneously, in contrast with the old post-it-forward teletype procedure.
(61-62) In Baran's model, each switching node contained a routing table that behaved as a sort of transport coordinator or dispatcher. . . . The continuous updating of the tables is also known as “adaptive” or “dynamic” routing.
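The routing-table update described above can be sketched as follows. This is a hypothetical, simplified illustration of the adaptive idea—each node revising its best-known delay to every destination from what its neighbors advertise—not the actual logic of Baran's design:

```python
# Hypothetical sketch of "adaptive" routing: a node recomputes its
# best-known delay to each destination as the delay to a neighbor plus
# that neighbor's advertised delay onward. Names are illustrative.
INF = float("inf")

def update_table(my_table, neighbor_tables, link_delay):
    """Merge neighbors' advertised delays into this node's table,
    keeping whichever route to each destination is cheapest."""
    new = dict(my_table)
    for nbr, table in neighbor_tables.items():
        for dest, delay in table.items():
            candidate = link_delay[nbr] + delay
            if candidate < new.get(dest, INF):
                new[dest] = candidate
    return new

# Node A is 1 delay unit from B and 4 from C; B advertises a 1-unit
# path onward to C, so routing via B beats A's direct link to C.
table_a = {"A": 0}
table_a = update_table(
    table_a,
    neighbor_tables={"B": {"B": 0, "C": 1}, "C": {"C": 0}},
    link_delay={"B": 1, "C": 4},
)
print(table_a["C"])  # 2
```

Repeating this exchange continuously is what makes the routing "dynamic": when a line slows or fails, the advertised delays change and traffic shifts to better paths on its own.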
(63) By 1965, five years after embarking on the project, Baran had the full support of RAND, and that August RAND sent a formal recommendation to the Air Force that a distributed switching network be built, first as a research and development program, and later as a fully operational system.
(64) The Air Force, determined not to let the plan die on the drawing boards, decided that it should proceed without AT&T's cooperation. But the Pentagon decided to put the newly formed Defense Communications Agency (DCA), not the Air Force, in charge of building the network. Baran pictured nothing but trouble.
(64-65) In London in the autumn of 1965, just after Baran halted work on his project, Donald Watts Davies, a forty-one-year-old physicist at the British National Physical Laboratory (NPL), wrote the first of several personal notes expounding on some ideas he was playing with for a new computer network much like Baran's. . . . By the following spring, confident that his ideas were sound, he gave a public lecture in London describing the notion of sending short blocks of data—which he called “packets”—through a digital store-and-forward network.
Davies motivated by matching network to characteristics of new computer-generated data traffic patterns.
The motivation that led Davies to conceive of a packet-switching
network had nothing to do with the military concerns that had driven
Baran. . . . The irregular, bursty characteristics of
computer-generated data traffic did not fit well with the uniform
channel capacity of the telephone system. Matching the network design
to the new types of data traffic became his main motivation.
(67) Davies' choice of the word “packet” was very deliberate. . . . Before settling on the word, he asked two linguists from a research team in his lab to confirm that there were cognates in other languages.
Mapping It Out
Origin of protocol by Marill as message sending procedures.
[Tom] Marill referred
to the set of procedures for sending information back and forth as a
“protocol,” prompting a colleague to inquire, “Why do you use that
term? I thought it referred to diplomacy.”
(73) Just before the meeting ended, Wes Clark passed a note up to Roberts. It read, “You've got the network inside out.”
Subnetworks with identical nodes leaving internetworking to what became the router and gateway devices.
(73) The way Clark
explained it, the solution was obvious: a subnetwork with small,
identical nodes, all interconnected.
(74) Clark's idea was to spare the hosts that extra burden and build a network of identical, nonshared computers dedicated to routing.
Delineation of real-time computing problems at edge of human perceptibility (10-20 ms).
(75) (Anything that takes more than 10 to 20 milliseconds, the point at which delays become humanly perceptible, is not considered real-time.) Strictly speaking, the ARPA network was to be a store-and-forward system. But data would zip in and out of the nodes so quickly, and the response time from a human perspective would be so rapid, that it qualified as a real-time problem.
Roberts christened interface message processors (IMPs) as intermediate computers controlling the network.
He [Roberts] called the intermediate computers that would control the
network “interface message processors,”
or IMPs, which he pronounced “imps.” They were to perform the
functions of interconnecting the network, sending and receiving data,
checking for errors, retransmitting in the event of errors, routing
data, and verifying that messages arrived at their intended
destinations. A protocol would be established for defining just how
the IMPs should communicate with host computers.
(77) Roberts thought the network should start out with four sites—UCLA, SRI, the University of Utah, and the University of California at Santa Barbara—and eventually grow to around nineteen.
(78) Part of the strength of the NLS [oNLine System] was its usefulness in creating digital libraries and in storing and retrieving electronic documents. Engelbart also saw NLS as a natural way to support an information clearinghouse for the ARPA network. After all, if people were going to share resources, it was important to let everyone know what was available.
(79) This idea that maintaining reliability should be incumbent on the subnetwork, not the hosts, was a key principle.
(79) By the end of July, 1968, Roberts had finished drafting the request for proposals. He sent it out to 140 companies interested in building the Interface Message Processor. The document was thick with details of what the network should look like and what the IMPs would be expected to do. It was a rich piece of technical prose, filled with an eclectic mix of ideas.
(81) More than a dozen bids were submitted, resulting in a six-foot stack of paper.
(81) Raytheon officials answered ARPA's remaining technical questions and accepted the price.
(81) So it surprised everyone when, just a few days before Christmas, ARPA announced that the contract to build the Interface Message Processors that would reside at the core of its experimental network was being awarded to Bolt Beranek and Newman, a small consulting firm in Cambridge, Massachusetts.
The Third University
(85) The presence of an accessible computer inspired a change in the company. Everyone began thinking up things that could be done with it.
(86) The richly academic atmosphere at BBN earned the consulting firm a reputation as “the third university” in Cambridge.
(87) Among the computer researchers were Wally Feurzeig and Seymour Papert, who were working on educational applications. Papert was a consultant to BBN for about four years in the late 1960s. While there, he conceived of and made the first rough design of a programming language that would be accessible to school-age children. The idea was adopted as a research theme by BBN's education group, which Feurzeig ran, and the language came to be called LOGO.
(99) Central to the design of the network was the idea that the subnet of IMPs should operate invisibly.
(100) By the time the proposal was finished, it filled two hundred pages and cost BBN more than $100,000, the most the company had ever spent on such a risky project.
(102) When news reached Massachusetts Senator Edward Kennedy's office just before Christmas that a million-dollar ARPA contract had been awarded to some local boys, Kennedy sent a telegram thanking BBN for its ecumenical efforts and congratulating the company on its contract to build the “Interfaith Message Processor.”
Head Down in the Bits
(104) Who but a few government bureaucrats or computer scientists would ever use a computer network? It wasn't as if computing had a mass market like the television networks or the phone company.
(107) (And guys they were. In keeping with the norms of the time, with the exception of Heart's secretary, the people who designed and built the ARPA network were all men. Few women held positions in computer science. Heart's wife, Jane, had quit her programming job at Lincoln to raise their three children.)
(107) Functionality mattered now, not elegance or beauty. . . . The inner strength of Heart's team was its restraint, its maturity. This was no place for signature craftsmanship.
(111) To program in assembly language was to dwell maniacally on the mechanism. It was easy for programmers to lose sight of the larger goal and difficult to write short, economical programs. Unskilled programmers often ended up writing assembly-language programs that wandered aimlessly, like some hapless Arctic expedition in a blizzard.
(115) Giving ample authority to people like Roberts was typical of ARPA's management style, which stretched back to its earliest days. It was rooted in a deep trust of frontline scientists and engineers.
(116) The IMP Guys had to write the host-to-IMP specification before the hosts could start building anything.
(117) Reliability was, in a sense, the founding principle of the network. The ARPA network would always reflect Frank Heart's own steady character.
(119) Digital error correction rests upon the basic idea of the “checksum,” a relatively small number that is calculated from the data at its source, then transmitted along with the data, and recalculated at the destination.
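The checksum idea as described can be sketched in a few lines. This toy version uses a simple byte sum folded into 16 bits—an illustration of the principle, not BBN's actual algorithm:

```python
# Minimal sketch of the checksum idea: a small number computed from the
# data at its source, transmitted with it, and recalculated at the
# destination. A mismatch reveals corruption in transit.
def checksum(data: bytes) -> int:
    # Toy algorithm: sum all bytes, folded into 16 bits.
    return sum(data) & 0xFFFF

def send(data: bytes):
    return data, checksum(data)          # transmit data plus its checksum

def receive(data: bytes, claimed: int) -> bool:
    return checksum(data) == claimed     # recompute and compare

payload, ck = send(b"hello, network")
assert receive(payload, ck)              # intact: checksums match
assert not receive(b"hellx, network", ck)  # a flipped byte is detected
```

A simple sum cannot catch every possible error (two offsetting changes cancel out), which is why production checksums use stronger functions; the transmit-and-recompute pattern is the same.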
(123) BBN had agreed with Roberts that the IMPs wouldn't perform any host-to-host functions.
(123) Between Roberts and BBN it was settled: The IMP would be built as a messenger, a sophisticated store-and-forward device, nothing more. Its job would be to carry bits, packets, and messages: To disassemble messages, store packets, check for errors, route the packets, and send acknowledgments for packets arriving error-free; and then to reassemble incoming packets into messages and send them up to the host machines—all in a common language.
(123) The host sites would have to get their disparate computers to talk to each other by means of protocols they agreed on in advance.
IMP Number 0
Critical programming studies theme of dealing with bugs as a natural part of development, exemplified in Barker testing IMP Number 0.
(124) [Ben] Barker
had built a tester and had written some debugging code. He was
looking forward to working out whatever bugs the machine had.
Undoubtedly there would be something that would need fixing, because
there always was; bugs were part of the natural process of computer development.
(125) The trouble was that in furnishing Honeywell with a set of fairly generic block diagrams, BBN assumed that Honeywell's familiarity and expertise with its own machines would enable the computer manufacturer to anticipate any peculiar problems with BBN's requested modifications to the model 516. . . . But instead of working out the essential details in the blueprints, Honeywell had built BBN's machine without verifying that the BBN-designed interfaces, as drawn, would work with the 516 base model.
(126) Armed with an oscilloscope, a wire-wrap gun, and an unwrap tool, Barker worked alone on the machine sixteen hours a day. The circuitry of the computer relied on pin blocks, or wire-wrapped boards, that served as the central connection points to which wires, hundreds upon hundreds of wires, were routed.
Designed remote monitoring becomes part of protocological society (in manner different from panopticon).
[W]ariness about the hordes of curious graduate students drove BBN to
conceptualize even greater measures of protection for the IMPs. In
time, among the most creative things Heart's team did was invent ways
of obtaining critical operating data from the network IMPs—reading
the machines' vital signs—unobtrusively from a distance. . . .
Heart's group envisioned someday being able to look across the
network to know whether any machine was malfunctioning or any line
was failing or perhaps if anyone was meddling with the . . .
(128) The original request from ARPA had specified dynamic routing without offering a clue as to how to make it work.
(128-129) [Will] Crowther's dynamic-routing algorithm was a piece of programming poetry.
(129) The subnet would accept one message at a time on a given link. Messages waiting to enter the subnet were stored in a memory buffer (a waiting area inside the machine), in a queue. The next message wasn't sent until the sending IMP received an acknowledgment (similar to a return receipt) saying that the previous message had arrived error-free at the destination host.
(129) RFNM [Ready for Next Message] was a congestion-control strategy designed to protect the IMPs from overload, but at the expense of reducing service to the host. Kahn had studied the problem of congestion control even before the ARPA network project began. His reaction to Crowther's solution was that the links and RFNMs would still allow fatal congestion of the network's arteries.
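The RFNM discipline described above amounts to what is now called stop-and-wait flow control. Here is a toy model of that gating behavior—class and method names are my own invention, not the IMP implementation:

```python
# Hedged sketch of RFNM-style gating: messages queue in a buffer, and
# the next one is released only after a Ready-for-Next-Message arrives
# acknowledging the previous one.
from collections import deque

class StopAndWaitSender:
    def __init__(self):
        self.buffer = deque()      # messages waiting to enter the subnet
        self.outstanding = None    # message awaiting its RFNM

    def submit(self, message):
        """Queue a message; send it immediately if the link is free."""
        self.buffer.append(message)
        return self._try_send()

    def _try_send(self):
        # Only send when no message is outstanding.
        if self.outstanding is None and self.buffer:
            self.outstanding = self.buffer.popleft()
            return self.outstanding
        return None

    def ready_for_next_message(self):
        """The RFNM arrives: release the next queued message, if any."""
        self.outstanding = None
        return self._try_send()

s = StopAndWaitSender()
assert s.submit("msg1") == "msg1"   # first message goes out at once
assert s.submit("msg2") is None     # second must wait for the RFNM
assert s.ready_for_next_message() == "msg2"
```

The model also makes Kahn's objection concrete: this protects each individual link from overload, but nothing here prevents many senders from converging on, and congesting, the same interior node.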
Importance of unauthorized software tools, built by Heart's team despite his disapproval.
(131-132) Heart scotched Kahn's suggestion that they use a simulation. Heart hated to see his programming team spend time on simulations or on writing anything but operational code. They were already becoming distracted by something else he disliked—building software tools. . . . So no one ever asked; they just did it, building tools when they thought it was the right thing to do, regardless of what Heart thought. This was software they would eventually need when the time came to test the system. All were customized pieces of programming, specifically designed for the ARPA project.
(133) Synchronizer bugs are rare. But when they occur and the synchronizer fails to respond properly to an interrupt, the consequences are profoundly disturbing to the machine's total operation.
(134) The unpredictability made synchronizer bugs among the most frustrating of bugs because of the absence of any recognizable pattern to the resulting crashes.
(135) The fleeting trace was perhaps the only telltale sign that the crashes were caused by a timing problem: a synchronizer stalled in a metastable condition for a few nanoseconds too long. It was the computer equivalent of the one split second of confusion or indecision by a race car driver that ends suddenly in a fatal crash.
Do It to It Truett
Thrill of understanding power of loop to control execution of lengthy sequence with a few instructions underscores special feature of code empowering autonomous machines.
“I remember being thrilled when I finally understood the concept of
a loop,” [Steve] Crocker recalled,
“which enabled the computer to proceed with a very lengthy sequence
of operations with only a relatively few instructions.”
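Crocker's realization can be shown in miniature—a couple of instructions standing in for a thousand operations:

```python
# The power of a loop: a few instructions drive a very lengthy
# sequence of operations (here, a thousand additions).
total = 0
for n in range(1, 1001):
    total += n
print(total)  # 500500
```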
(139) He majored in math but soon got hooked on serious computing.
(143) The host-to-IMP interface had to be built from scratch each time a new site was established around a different computer model. Later, sites using the same model could purchase copies of the custom interface.
The Search for Protocols
(144) To avoid sounding too declarative, he labeled the note “Request for Comments” and sent it out on April 7, 1969. Titled “Host Software,” the note was distributed to the other sites the way all the first Requests for Comments (RFCs) were distributed: in an envelope with the lick of a stamp. RFC Number 1 described in technical terms the basic “handshake” between two computers—how the most elemental connections would be handled.
RFC initiated by Crocker set precedent for open cooperative means of evolving technical standards of protocological society (Galloway).
The language of the RFC was warm and welcoming. The idea was to promote
cooperation, not ego. The fact that Crocker kept his ego out of the
first RFC set the style and inspired others to follow suit in the
hundreds of friendly and cooperative RFCs that followed. . . . For
years afterward (and to this day) RFCs have been the principal means
of open expression in the computer networking community, the accepted
way of recommending, reviewing, and adopting new technical standards.
(145) The RFC, a simple mechanism for distributing documentation open to anybody, had what Crocker described as a “first-order effect” on the speed at which ideas were disseminated, and on spreading the networking culture.
Comparison between manuscript flyleaf (original Greek protocol) and packet header.
(145-146) The very word “protocol” found its way into the language of computer networking based on the need for collective agreement among network users. For a long time the word has been used for the etiquette of diplomacy and for certain diplomatic agreements. But in ancient Greek, protokollon meant the first leaf of a volume, a flyleaf attached to the top of a papyrus scroll that contained a synopsis of the manuscript, its authentication, and the date. Indeed, the word referring to the top of a scroll corresponded well to a packet's header, the part of the packet containing address information.
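The flyleaf analogy can be made concrete: like the protokollon summarizing a scroll, a header prefixed to the data carries the addressing information the network reads. The layout below is purely illustrative, not the ARPANET's actual packet format:

```python
# Illustrative packet structure: the header fields (source,
# destination, sequence) are the "flyleaf"; the payload is the scroll.
from dataclasses import dataclass

@dataclass
class Packet:
    source: str       # where the packet came from
    destination: str  # where it is going
    sequence: int     # its place in the original message
    payload: bytes    # the data itself, opaque to the network

p = Packet(source="UCLA", destination="SRI", sequence=0, payload=b"LOGIN")
assert p.destination == "SRI"  # routing reads the header, never the payload
```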
Platform orientation shifting from mainframe master-slave hegemony to peers called for development of protocols; protocols like two-by-four of standardized, distributed construction the goal of Network Working Group.
The computers themselves were extremely egocentric devices. The
typical mainframe of the period behaved as if it were the only
computer in the universe. . . . Everything connected to the main
computer performed a specific task, and each peripheral device was
presumed to be ready at all times for a fetch-my-slippers type of
command. (In computer parlance, this relationship is known as
master-slave communication.) . . . The goal in devising the
host-to-host protocol was to get the mainframe machines talking as
peers, so that either side could initiate a simple dialogue and the
other would be ready to respond with at least an acknowledgment of
the other machine's existence.
(147) The computer equivalent of a two-by-four was what the Network Working Group was trying to invent.
(147) The protocol design philosophy adopted by the NWG broke ground for what came to be widely accepted as the “layered” approach to protocols.
(147) The job of the lower layer was simply to move generic unidentified bits, regardless of what the bits might define.
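The layering principle stated above can be sketched in a few lines. The function names are invented for illustration; the point is only that the lower layer never interprets what it carries:

```python
# Sketch of the layered approach: the lower layer moves opaque,
# unidentified bits; meaning is added and removed by the layer above.
def lower_layer_send(bits: bytes) -> bytes:
    """Transport generic bits without peeking at their content."""
    return bits  # a real layer would frame, route, and deliver them

def upper_layer_send(text: str) -> bytes:
    """Give the bits meaning (here, text encoding) before handing down."""
    return lower_layer_send(text.encode("utf-8"))

def upper_layer_receive(bits: bytes) -> str:
    """Recover the meaning the sending side's upper layer applied."""
    return bits.decode("utf-8")

assert upper_layer_receive(upper_layer_send("hello")) == "hello"
```

Because the lower layer is indifferent to content, applications like remote log-in and file transfer could later be built on the same foundation without changing it.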
(148) As the talks grew more focused, it was decided that the first two applications should be for remote log-ins and file transfers.
A Real Network
Irony that the first network program acted as a dumb terminal, explained by the observation that new technologies are typically promoted for their ability to do things we already understand, their content being other technologies.
(154) There is no
small irony in the fact that the first program used over the network
was one that made the distant computer masquerade as a terminal. All
that work to get two computers talking to each other and they ended
up in the very same master-slave situation the network was supposed
to eliminate. Then again, technological advances often begin with
attempts to do something familiar. Researchers build trust in new
technology by demonstrating that we can use it to do things we already understand.
(156-157) After another year of meetings and several dozen RFCs, in the summer of 1970 the group reemerged with a preliminary version of a protocol for basic, unadorned host-to-host communications. When the “glitch-cleaning committee” finished its work a year later, the NWG at last produced a complete protocol. It was called the Network Control Protocol, or NCP.
(159) Above all, the esoteric concept on which the entire enterprise turned—packet-switching—worked. The predictions of utter failure were dead wrong.
Hacking Away and Hollering
(161) Computing continued to be the one line in the agency's budget that didn't turn downward at the beginning of the 1970s.
(162) But when an IMP was installed at BBN in the early spring of 1970, suddenly there was a way to ship data and status reports electronically from the West Coast IMPs directly back to BBN. . . . The BBN team had designed into the IMPs and the network the ability to control these machines from afar.
Improved telephone line trouble detection utilizing network monitoring tools.
(163) The engineers at BBN relished opportunities to spook the telephone company repair people with their ability to detect, and eventually predict, line trouble from afar.
Watchdog timer example of cybernetic self-corrective behavior.
Heart's team had designed the IMPs to run unattended as much as
possible, bestowing on the IMPs the ability to restart by themselves
after a power failure or crash. The “watchdog timer”
was the crucial component that
triggered self-corrective measures in the IMPs.
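The watchdog mechanism quoted above can be modeled in a few lines. This is a toy illustration of the idea—a countdown the healthy program keeps resetting, whose expiry triggers a restart—with all names invented for the sketch:

```python
# Toy watchdog timer: if the running program fails to reset the
# countdown before it expires, the machine restarts itself.
class WatchdogTimer:
    def __init__(self, timeout_ticks):
        self.timeout = timeout_ticks
        self.remaining = timeout_ticks
        self.restarts = 0

    def pet(self):
        # A healthy program resets the countdown periodically.
        self.remaining = self.timeout

    def tick(self):
        # Called on every clock tick; expiry triggers self-correction.
        self.remaining -= 1
        if self.remaining <= 0:
            self.restarts += 1          # stand-in for an IMP restart
            self.remaining = self.timeout

dog = WatchdogTimer(timeout_ticks=3)
for _ in range(2):
    dog.tick()
dog.pet()                 # program still alive: countdown resets
for _ in range(3):
    dog.tick()            # program hangs: timer expires
assert dog.restarts == 1
```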
(168-169) One of the NCC's primary tasks was to issue software upgrades and reload IMP operating programs when necessary. The operators used a cleverly cooperative scheme by which every IMP downloaded the software from a neighbor.
(173) If development of the network was to proceed at a steady pace now, closer coordination would be necessary between BBN's effort to introduce the Terminal IMP and the Network Working Group's effort to develop protocols.
(174) FTP was the first application to permit two machines to cooperate as peers instead of treating one as a terminal to the other.
Showing It Off
(176) The ARPA network, however, was virtually unknown everywhere but in the inner sancta of the computer research community. . . . No one had come up with a useful demonstration of resource-sharing; the protocols to make it work weren't even in place. The ARPA network was a growing web of links and nodes, and that was it—like a highway system without cars.
(178) Roberts, ever so correctly, had foreseen the likelihood that scheduling a highly visible public demonstration for October 1972 would build pressure on the community to mobilize and make sure the network would be functioning flawlessly by that date.
(182) Some of the most ingenious demonstrations involved English-language conversational programs. These were elaborate programs constructed to engage a user in a verbal dialogue with a machine. There were four programs on display, two of which offered an especially fascinating glimpse into interactive computing.
Remote computer chat between PARRY and Doctor.
(183) Just a few
weeks before the ICCC demonstration, PARRY indeed met the Doctor for
an unusual conversation over the ARPANET, in an experiment
orchestrated at UCLA. It perhaps marked the origin, in the truest
sense, of all computer chat. There was no human intervention in the
dialogue. PARRY was running at Stanford's artificial-intelligence
lab, the Doctor was running on a machine at BBN, and at UCLA their
input and output were cross-connected through the ARPANET, while the
operators sat back and watched.
(185-186) The ICCC demonstration did more to establish the viability of packet-switching than anything else before it. As a result, the ARPANET community gained a much larger sense of itself, its technology, and the resources at its disposal. For computer makers, there was the realization that a market might emerge.
(188) There weren't any formal rules restricting use of the ARPANET by those with authorized access. Kleinrock's razor retrieval caper wasn't the first time anyone had pushed past official parameters in using the network. People were sending more and more personal messages.
(189) The ARPANET was not intended as a message system. In the minds of its inventors, the network was intended for resource-sharing, period.
(189) Electronic mail would become the long-playing record of cyberspace.
(189-190) As cultural artifact, electronic mail belongs in a category somewhere between found art and lucky accidents. . . . Using the ARPANET as a sophisticated mail system was simply a good hack. In those days hacking had nothing to do with malicious or destructive behavior; a good hack was a creative or inspired bit of programming.
(190-191) The art of computer programming gave them room for endless riffs, and variations on any theme. One of the main themes became electronic mail.
(193) ARPA managers noticed that e-mail was the easiest way to communicate with the boss, and the fastest way to get his quick approval of things.
E-mail as favorite hack of new network and element in evolution of management style.
Roberts would relay questions to experts on the topic at hand, who in turn bounced the questions off their graduate students. Twenty-four hours and a flurry of e-mail later, the problem had usually been solved several times over. “The way Larry worked was the quintessential argument in favor of a computer network,” Lukasik said.
(197) But because the struggle over e-mail standards was one of the first sources of real tension in the community, it stood out.
(199) The diversity of nonstandard systems on the Net caused problems even with something as apparently trivial as Tomlinson's @ sign. The @ sign dispute was long-running, and there were many sides to it.
(200) The tiff made clear that Tenex sites, led by BBN, formed a dominant culture on the network, while the “minority” sites, with their diverse operating systems, posed a potentially rebellious countermovement. Thus were planted the roots of a protracted conflict that continued into the ensuing decade and became known in the community as the header wars.
(201) Header troubles were also rooted in human disagreement over how much and what kind of information should be presented at the tops of the messages. People differed widely over how much header information they cared to deal with when looking at their mail.
(205) MSG was the original “killer app”—a software application that took the world by storm.
Brain change through use of Vittal e-mail programs (Hayles synaptogenesis).
Vittal's MSG and his ANSWER command made him a legendary figure in e-mail circles. “It was because of Vittal that we all assimilated network mail into our spinal cords,” recalled Brian Reid.
(205) More than just a great hack, MSG was the best proof to date that on the ARPANET rules might get made, but they certainly didn't prevail. Proclamations of officialness didn't further the Net nearly so much as throwing technology out onto the Net to see what worked. And when something worked, it was adopted.
Adventure and Quasar: The Open Net and Free Speech
(207) When Adventure was done, Woods created a guest account on the computer at the Stanford AI Lab to let people play, and swarms of guests logged in. Adventure spread like hula hoops, as people sent the program to one another over the network. Because Crowther had written it in FORTRAN, it could be adapted to many different computers with relative ease. Both Crowther and Woods encouraged programmers to pirate the game and included their e-mail addresses for anyone looking for help installing, playing, or copying the game.
(207-208) People grew bleary-eyed searching for treasure into the small hours of the morning. “I've long ago lost count of the programmers who've told me that the experience that got them started using computers was playing Adventure,” Woods said. The game inspired hundreds of knockoffs, which eventually spawned an entire industry.
(210) Using U.S. Government facilities to cast aspersions on a corporation, they said, could backfire on the ARPA research community. They urged their peers to impose careful self-censorship, to report only facts of technical interest to the community. Not everyone agreed, and with that the MsgGroup got embroiled in a soul-searching exchange.
(210) Reid had begun to notice that the Message Group was like a social club.
(211) The passion in defense of free speech was matched by an equally strong will to self-protection; the way to protect the network itself was not to attract unwanted supervision by the government.
(212) In the private sector, companies were poised for the concept of electronic-mail service to take off. . . . Costs were heading down, and some analysts projected a “devastating” impact on the U.S. Postal Service's first-class business.
Misguided proposal for hybrid electronic message system by Carter administration.
(213) By 1979, President Carter was supporting a post office proposal to offer a limited kind of electronic message service to the nation. The hybrid scheme worked more like a telegram service than a state-of-the-art electronic communication system.
(213) The USPS, like AT&T earlier, never really broke free of the mindset guarding its traditional business, probably because both were monopolistic entities.
New cultural reference points developing in e-mail communities.
(215) In many ways the ARPANET community's basic values were traditional—free speech, equal access, personal privacy. However, e-mail also was uninhibiting, creating reference points entirely its own, a virtual society, with manners, values, and acceptable behaviors—the practice of “flaming” for example—strange to the rest of the world.
(215) The acidic attacks and level of haranguing unique to on-line communication, unacceptably asocial in any other context, was oddly normative on the ARPANET.
(216) FINGER didn't allow you to read someone else's messages, but you could tell the date and time of the person's last log-on and when last he or she had read mail. Some people had a problem with that.
(217) One of the MsgGroup's eminent statesmen, Dave Crocker, sometimes probed the Net with a sociologist's curiosity.
(217) On April 12, 1979, a rank newcomer to the MsgGroup named Kevin MacKenzie anguished openly about the “loss of meaning” in this electronic, textually bound medium.
(218) What did Shakespeare know? ;-) Emoticons and smileys :-), hoisted by the hoi polloi no doubt, grew in e-mail and out into the iconography of our time.
(218) Instead of chords, white noise seemed to gradually overtake the MsgGroup.
(218) The MsgGroup was perhaps the first virtual community.
(218) The romance of the Net came not from how it was built or how it worked but from how it was used.
A Rocket on Our Hands
(225) The challenge for the International Network Working Group was to devise protocols that could cope with autonomous networks operating under their own rules, while still establishing standards that would allow hosts on the different networks to talk to each other.
(226) By the end of 1973, Cerf and Kahn had completed their paper, “A Protocol for Packet Network Intercommunication.” They flipped a coin to determine whose name should appear first, and Cerf won the toss. The paper appeared in a widely read engineering journal the following spring.
(226) Under the framework described in the paper, messages should be encapsulated and decapsulated in “datagrams,” much as a letter is put into and taken out of an envelope, and sent as end-to-end packets. These messages would be called transmission-control protocol, or TCP, messages. The paper also introduced the notion of gateways, which would read only the envelope so that only the receiving hosts would read the contents.
(227) The overall idea behind the new protocol was to shift the reliability from the network to the destination hosts.
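The envelope-and-contents division in the notes above can be sketched in a few lines. This is a toy illustration of the design principle only, not the protocol from the Cerf-Kahn paper; every function name and the simple checksum are invented for the example:

```python
# Toy sketch of the Cerf-Kahn idea on pp. 226-227: gateways read only the
# "envelope" (header), while the destination host alone checks the contents.
# Reliability thus lives at the edges, not inside the network.

def encapsulate(src, dst, payload):
    """Wrap a message in a datagram: header (the envelope) plus payload and checksum."""
    return {"header": {"src": src, "dst": dst},
            "payload": payload,
            "checksum": sum(payload.encode()) % 65536}

def gateway_route(datagram, routes):
    """A gateway inspects only the header, never the payload."""
    return routes[datagram["header"]["dst"]]

def host_receive(datagram):
    """The destination host, not the network, verifies integrity."""
    ok = sum(datagram["payload"].encode()) % 65536 == datagram["checksum"]
    return datagram["payload"] if ok else None  # real TCP would trigger retransmission

routes = {"B": "link-to-B"}
d = encapsulate("A", "B", "hello")
assert gateway_route(d, routes) == "link-to-B"  # routed without reading contents
assert host_receive(d) == "hello"               # verified only at the destination
```

The point of the sketch is the division of labor: corrupting the payload in transit would be caught by `host_receive`, not by any gateway along the path.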
Changes at DARPA
(233) One of the first problems Lick faced upon his return was an awkward one involving BBN, his onetime employer. BBN was refusing to release the IMP source code—the original operating program written by the IMP Guys five years earlier.
(234) While keeping source code proprietary is usually the prerogative of the company that develops it, the IMP source code was different, for it had been developed by BBN with federal funds.
(234-235) Finally, in direct response to Crocker's threat, BBN agreed to provide the code to whoever asked for it, charging a nominal handling fee. “This was just an early version of much more serious intellectual property rights issues that emerged across the industry over the next few decades,” Crocker said.
(236) During a break from a meeting Cerf chaired at ISI to discuss TCP in early 1978, Cerf, Postel, and Danny Cohen, a colleague of Postel's at ISI, got into a discussion in a hallway. “We were drawing diagrams on a big piece of cardboard that we leaned up against the wall in the hallway,” Postel recalled. When the meeting resumed, the trio presented an idea to the group: break off the piece of the Transmission-Control Protocol that deals with routing packets and form a separate Internet Protocol, or IP.
(237) With a clean separation of the protocols, it was now possible to build fast and relatively inexpensive gateways, which would in turn fuel the growth of internetworking. By 1978, TCP had officially become TCP/IP.
Ethernet name suggested by Metcalf references ether medium of nineteenth-century physicists.
(240) In May 1973 [Bob] Metcalf suggested a name, recalling the hypothetical luminiferous medium invented by nineteenth-century physicists to explain how light passes through empty space. He rechristened the system Ethernet.
(241) But the ARPANET was threatening to split the community of computer researchers into haves and have-nots.
(241) More important, an exodus of computing talent from academia to industry had caused a nationwide fear that the United States would not be able to train its next generation of computer scientists.
(243) In the summer of 1980, Landweber's committee came back with a way to tailor the architecture of CSNET to provide affordable access to even the smallest lab. They proposed a three-tiered structure involving ARPANET, a TELENET-based system, and an e-mail-only service called PhoneNet. Gateways would connect the tiers into a seamless whole.
(243) By June 1983, more than seventy sites were on-line, obtaining full services and paying annual dues. At the end of the five-year period of NSF support in 1986, nearly all the country's computer science departments, as well as a large number of private computer research sites, were connected. The network was financially stable and self-sufficient.
(244) Because this growing conglomeration of networks was able to communicate using the TCP/IP protocols, the collection of networks gradually came to be called the “Internet,” borrowing the first word of “Internet Protocol.”
(244) Gateways were the internetworking variation on IMPs, while routers were the mass-produced version of gateways, hooking local area networks to the ARPANET.
(245) The borders of the government-sponsored Internet began to dissolve. And gradually the Internet came to mean the loose matrix of interconnected TCP/IP networks worldwide.
(245) The model of CSNET convinced NSF of the importance of networking to the scientific community. The professional advantages to be gained from the ability to communicate with one's peers were incalculable.
(245) The creation in 1985 of five supercomputer centers scattered around the United States offered a solution. Physicists and others were agitating for a “backbone” to interconnect the supercomputer centers. The NSF agreed to build the backbone network, to be called NSFNET. At the same time, the NSF offered that if the academic institutions in a geographic region put together a community network, the agency would give the community network access to the backbone network.
(245) In response, a dozen or so regional networks were formed around the country.
TCP/IP versus OSI
(246) The Defense Department had endorsed TCP/IP, but the civilian branch of the government had not. And there was mounting concern that the National Bureau of Standards would decide to support an emergent rival standard for network interconnection called the OSI Reference Model.
(247) A battle of sorts was forming along familiar lines, recalling the confrontation between AT&T and the inventors of packet-switching during the birth of ARPANET. On the OSI side stood entrenched bureaucracy, with a strong we-know-best attitude, patronizing and occasionally contemptuous.
(247) As far as the Internet crowd was concerned, they had actually implemented TCP/IP several times over, whereas the OSI model had never been put to the tests of daily use, and trial and error.
TCP/IP developed in collaborative community of emerging protocological society where standards are discovered versus mandated (including open documentation, UNIX operating system, and Ethernet); OSI in bureaucratic committees of disciplinary society (proprietary, closed models).
(247-248) Cerf and others argued that TCP/IP couldn't have been invented anywhere but in the collaborative research world, which was precisely what made it so successful, while a camel like OSI couldn't have been invented anywhere but in a thousand committees.
(248) On January 1, 1983, the ARPANET was to make its official transition to TCP/IP. Every ARPANET user was supposed to have made the switch from the Network Control Protocol to TCP/IP.
(250) One key development in determining the outcome between TCP/IP and OSI turned out to be the popularity of the UNIX operating system, which had been developed at AT&T's Bell Laboratories in 1969.
(250) Berkeley UNIX with TCP/IP would be crucial to the growth of the Internet. When Sun included network software as part of every machine it sold and didn't charge separately for it, networking exploded.
(250) It further mushroomed because of Ethernet.
(251) Perhaps what TCP/IP had to recommend it most was the fact that it was unerringly “open.” . . . The ARPANET, and later the Internet, grew as much from free availability of software and documentation as from anything else.
(252) Sorting out the Frodos of the Internet wasn't unlike sorting out the Joneses of Cleveland or the Smiths of Smithville. Where one lived, precisely, was important in differentiating who one was. For years, sorting this out was among the most troublesome, messiest issues for the Internet, until at last a group chiseled out a workable scheme, called the domain name system, or DNS.
Hierarchical tree-branching structure of domain name system becomes critical topic for Galloway.
(253) “Tree-branching” was the guiding metaphor. Each address would have a hierarchical structure. From the trunk to the branches, and outward to the leaves, every address would include levels of information representing a progression, a smaller, more specific part of the network address.
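The tree-branching metaphor can be made concrete with a small sketch. This is a toy illustration of how a dot-separated name encodes hierarchy, not the real DNS resolution algorithm; the sample names are chosen for the example:

```python
# Toy illustration of the "tree-branching" metaphor on p. 253: each
# dot-separated label is one level of the hierarchy, read from the trunk
# (rightmost, most general) out to the leaves (leftmost, most specific).

def hierarchy(name):
    """Return a domain name's levels from trunk to leaf."""
    return list(reversed(name.split(".")))

# "cs.stanford.edu": first the .edu branch, then stanford, then the cs leaf
assert hierarchy("cs.stanford.edu") == ["edu", "stanford", "cs"]
```

Reading the address right-to-left is what lets each level of the tree delegate naming authority to the level below it, so no single registry has to know every host.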
Pulling the Plug
Discovery of standards rather than decree as model for technological change.
(254) By virtue of its quiet momentum, TCP/IP had prevailed over the official OSI standard. Its success provided an object lesson in technology and how it advances. “Standards should be discovered, not decreed,” said one computer scientist in the TCP/IP faction.
(256) By the end of 1989, the ARPANET was gone. The NSFNET and the regional networks it had spawned became the principal backbone.
Aspects of the control-society internet dividual are influenced by and reflected in the personal styles of the male participants featured in the book. The attitude of releasing early and often while designing for self-correcting error tolerance represents a risk-taking profile that may differ from other styles, such as making fewer releases, or the method of testing boundaries mentioned in studies of Tetris play.
(258) The Net of the 1970s had long since been supplanted by something at once more sophisticated and more unwieldy. Yet in dozens of ways, the Net of 1994 still reflected the personalities and proclivities of those who built it. Larry Roberts kept laying pieces of the foundation to the great big rambling house that became the Internet. Frank Heart's pragmatic attitude toward technical invention—build it, throw it out on the Net, and fix it if it breaks—permeated Net sensibility for years afterward. Openness in the protocol process started with Steve Crocker's first RFC for the Network Working Group, and continued into the Internet. While at DARPA, Bob Kahn made a conspicuous choice to maintain openness. Vint Cerf gave the Net its civility. And the creators of the Net still ran the Internet Society and attended meetings of the Internet Engineering Task Force.
(259) The company had missed its greatest opportunity when it failed to enter the market for routers—of which IMPs were the progenitors. BBN failed to see the potential in routers much as AT&T had refused to acknowledge packet-switching.
It is easy to engage in critical analysis of the concentration of men, and harder to keep in mind that IMPs are the logical predecessors of routers; to wonder whether students ought to study them closely; and to note that money can be made both by servicing the innards and by using them to market other wares, which opens the space beyond this small set of initiators who were lucky to participate so directly.
(263) The multiple paternity claims to the Internet (not only had each man been there at the start but each had made a contribution that he considered immeasurable) came out most noticeably that afternoon during a group interview with the Associated Press. . . . “How about women?” asked the reporter, perhaps to break the silence. “Are there any female pioneers?” More silence.
(263) Three years earlier, the NSF had lifted restrictions against commercial use of the Internet, and it was becoming clear that you could get rich not just by inventing a gateway to the Net but by taking business itself onto the Net.
Are there ancient texts attesting to similar fortunate positions, such as certain works of Plato that feature successful writers?
(265) Heart ended on a high note. “Only a small fraction of the technically trained population get a shot at riding a technological rocket, and then get to see that revolution change the world.” The networking revolution, Heart said, would rank among a small number of the most important technological changes of the century.
Important for scholars of the history of software and of technology-advancing organizations to have access to archives, funding, and assistance from librarians; that BBN even had a lead librarian, who took the initiative that led to the writing of the book, confirms the point made by Campbell-Kelly and others about how slim the chances are of capturing early history, much of which remains forgotten in archives or has been destroyed.
(287) This book grew out of an idea that originated with engineers at Bolt Beranek and Newman. Memories were growing fuzzy in late 1993, when we first started thinking about doing a book, and Frank Heart and others were interested in having BBN's considerable role in the creation of the original ARPANET recorded. Not only did the company open its archives to us and cooperate in every way but it helped fund the project as well, while agreeing to exercise no control over the content of the book. Marian Bremer, then BBN's head librarian, made the initial phone call that led to the book.
Hafner, Katie, and Matthew Lyon. Where Wizards Stay Up Late : The Origins Of The Internet. New York : Simon & Schuster, 1996. Print.