Notes for Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine

Key concepts: conditional branch, human/computer interaction, stored-program computer, technology, technology intercept strategy, time-sharing, virtual memory.


Related theorists: Babbage, Edmund C. Berkeley, Burks, Vannevar Bush, Perry Crawford, De Prony, Gee, Goldstine, Grace Murray Hopper, Kemeny, John Mauchly, von Neumann.

PREFACE TO THE SLOAN TECHNOLOGY SERIES

Books of this genre, serious attempts at narrating a minimally biased history of the evolution of state-of-the-art best practices, form the foundation of critical programming and philosophy of computing studies; follow them with the insider perspectives of software managers and software architects informed by substantial professional experience, including Brooks and Lammers.

(vii) Technology is the application of science, engineering, and industrial organization to create a human-built world.
(vii) The aim of the series is to convey both the technical and human dimensions of the subject: the invention and effort entailed in devising the technologies and the comforts and stresses they have introduced into contemporary life.


Acknowledgments
(ix) This book has its origins in the vision of the Alfred P. Sloan Foundation that it is important for the public to understand the technology that has so profoundly reshaped Western society during the past century.


Introduction
(2) Today, research scientists and atomic weapons designers still use computers extensively, but the vast majority of computers are employed for other purposes, such as word processing and keeping business records. How did this come to pass? To answer this question, we must take a broader view of the history of the computer as the history of the information machine.
(3) The old technologies had three shortcomings: They were too slow in doing their calculations, they required human intervention in the course of a computation, and many of the most advanced calculating systems were special-purpose rather than general-purpose devices.
(3) The basic function specifications of the computer were set out in a government report written in 1945, and these specifications are still largely followed today. . . . One is the improvement in components, leading to faster processing speed, larger information storage capacity, improved price performance, better reliability, less required maintenance, and the like.
(3-4) The second type of innovation was the mode of operation . . . high-level programming languages, real-time computing, time-sharing, networking, and graphically oriented human-computer interfaces.
(4) We have organized the book in four sections. The first covers the way computing was handled before the arrival of electronic computers. The next two sections describe the mainframe computer era, roughly from 1945 to 1980, with one section devoted to the computer's creation and the other to its evolution. The final section discusses the origins of the new computing environment of the personal computer and the Internet.

Predicts a deeper understanding of computers, beyond their broad definition as information machines, through the emergence of synthetic historical scholarship epitomized by texts and technology studies.

(6) Our work falls in the present generation of scholarship based on the broader definition of the information machine, with strong business and other contextual factors considered in addition to technical factors. We anticipate that within the next decade, a new body of historical scholarship will appear that will enable someone to write a new synthetic account that will deepen our understanding of computers in relation to consumers, gender, labor, and other social and cultural issues.


PART ONE
Before the Computer

1 When Computers Were People
(9) The electronic computer can be said to combine the roles of the human computer and the human clerk.
(9-10) However, logarithmic and trigonometric tables were merely the best-known general-purpose tables. By the late eighteenth century, specialized tables were being produced for several different occupations: navigational tables for mariners, star tables for astronomers, life insurance tables for actuaries, civil engineering tables for architects, and so on. All these tables were produced by human computers, without any mechanical aid.
(10) When Astronomer Royal Maskelyne died in 1811—Hitchins had died two years previously—the Nautical Almanac “fell on evil days for about 20 years, and even became notorious for its errors.”

CHARLES BABBAGE AND TABLE-MAKING
(10-11) During this period Charles Babbage became interested in the problem of table-making and the elimination of errors in tables. . . . Realizing that Cambridge (and England) had become a mathematical backwater compared to continental Europe, Babbage and two fellow students organized the Analytical Society, which succeeded in making major reforms of mathematics in Cambridge and eventually the whole of England.
(11) In 1819 Babbage made the first of several visits to Paris, where he made the acquaintance of a number of leading members of the French scientific academy, such as the mathematicians Pierre-Simon Laplace and Joseph Fourier, with whom he formed lasting friendships. It was probably during this visit that Babbage learned of the great French table-making project organized by Baron Gaspard de Prony. This project would show Babbage a vision that would determine the future course of his life.
(11) It was by far the largest table-making project the world had ever known, and de Prony decided to organize it much as one would organize a factory.
(12) De Prony organized his table-making “factory” into three sections. The first section consisted of half a dozen eminent mathematicians—including Adrien Legendre and Lazare Carnot—who decided on the mathematical formulas to be used in the calculations. Beneath them was another small section—a kind of middle management—that, given the mathematical formulas to be used, organized the computations and compiled the results ready for printing. Finally, the third and largest section, which consisted of sixty to eighty human computers, did the actual computation. The computers used the “method of differences,” which required only the two basic operations of addition and subtraction, and not the more demanding operations of multiplication and division. Hence the computers were not, and did not need to be, educated beyond basic numeracy and literacy. In fact, most of them were hairdressers who had lost their jobs because “one of the most hated symbols of the ancient regime was the hairstyles of the aristocracy.”
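
The method of differences reduces a whole table to repeated addition, which is why unskilled human computers (and later the Difference Engine) could produce it. A minimal sketch of mine, not the book's, tabulating f(n) = n^2 from its starting value and constant second difference:

```python
# Method of differences: for a polynomial of degree k, the k-th differences
# are constant, so the entire table follows by addition alone.
def difference_table(initial, steps):
    """initial = [f(0), first difference, ..., constant k-th difference]."""
    regs = list(initial)                 # one "register" per difference order
    table = []
    for _ in range(steps):
        table.append(regs[0])            # current tabular value
        for i in range(len(regs) - 1):   # fold each difference into the one below
            regs[i] += regs[i + 1]
    return table

# f(n) = n^2: value 0, first difference 1, constant second difference 2
print(difference_table([0, 1, 2], 8))    # [0, 1, 4, 9, 16, 25, 36, 49]
```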
(12) Although the Bureau was producing mathematical tables, the operation was not itself mathematical. It was fundamentally the application of an organizational technology, probably for the first time outside a manufacturing or military context, to the production of information.
(13) The laborers in Adam Smith's imaginary pin-making factory would soon be replaced by a pin-making machine. Babbage decided that rather than emulate de Prony's labor-intensive and expensive manual table-making organizations, he would ride the wave of the emerging mass-production technology and invent a machine for making tables.
(13) Conceptually the Difference Engine was very simple: It consisted of a set of adding mechanisms to do the calculations and a printing part.
(14) Unfortunately, the engineering was more complicated than the conceptualization.
(14) Babbage was now battling on two fronts: first, designing the Difference Engine; and second, developing the technology to build it.
(15) By 1833, Babbage had produced a beautifully engineered prototype Difference Engine that was too small for real table-making and lacked a printing unit, but showed beyond any question the feasibility of his concept.
(15) Raising the specter of the Analytical Engine was the most spectacular political misjudgment of Babbage's career; it fatally undermined the government's confidence in his project, and he never obtained another penny.

CLEARING HOUSES
(16) The Bankers' Clearing House was an organization that processed the rapidly increasing number of checks being used in commerce.
(18) Babbage clearly recognized the significance of the Bankers' Clearing House as an example of the “division of mental labor,” comparable with de Prony's table-making project and his own Difference Engine.
(18) Alongside this physical and highly visible transport infrastructure grew a parallel, unseen information infrastructure known as the Railway Clearing House, which was modeled very closely on the Bankers' Clearing House.
(18) Another important example was the Central Telegraph Office founded in 1859 to overcome the problem that it was not economical to have telegraph lines connecting every town in the land with every other.

Packet switching resembles telegraphy system of sorters, pigeon holes, and messengers; does Hayles discuss in How We Think?

(18-19) Row upon row of telegraphists communicated messages with all parts of the nation and abroad. Messenger boys constantly scuttled through the rows of telegraphists collecting telegrams as they arrived, delivering them to a team of women sorters. The sorters placed the telegrams in pigeon holes—one for each of the hundreds of destination towns. More messenger boys emptied the pigeon holes and delivered the telegrams to the telegraphists for onward transmission.
(19) By the 1860s another class of data-processing bureaucracy was beginning to develop in association with the “thrift movement.” . . . It was this relative prosperity combined with the thrift movement that created the market for savings banks and industrial insurance companies in the second half of the nineteenth century.
(20) Yet here was an extremely sophisticated organizational technology that can still be seen underlying the structure of any modern corporation.

HERMAN HOLLERITH AND THE 1890 CENSUS
(20) The population census was established by an Act of Congress in 1790 to determine the “apportionment” of members of the House of Representatives.
(21) Over 21,000 pages of census reports were produced for the 1880 census, which took some seven years to process. This unreasonably long time provided a strong motive to speed up the census by mechanization or any other means that could be devised.
(22) While Hollerith was not a deep thinker like the polymath Babbage, he was practical where Babbage was not. Hollerith also had a strong entrepreneurial flair, so that he was able to exploit his inventions and establish a major industry.
(22) Hollerith's key idea was to record the census return for each individual as a pattern of holes on punched paper tape or a set of punched cards, similar to the way music was recorded on a string of punched cards on fairground organettes of the period. It would then be possible to use a machine to automatically count the holes and produce the tabulations.
(24) A punching clerk—doing what was optimistically described as “vastly interesting” work—could punch an average of seven hundred cards in a six and a half hour day. Female labor was heavily used for the first time in the census, which a male journalist noted “augurs well for its conscientious performance” because “women show a moral sense of responsibility that is still beyond the average.” Over 62 million cards were punched, one for each citizen.

Hollerith census machines used tabulator and sorter for punched cards.

(24-25) Each census machine consisted of two parts: a tabulating machine, which could count the holes in a batch of cards, and the sorting box, into which cards were placed by the operator ready for the next tabulating operation. . . . When the press was forced down on the card, a pin meeting the solid material was pushed back into the press and had no effect. But a pin encountering a hole passed straight through, dipped into a mercury cup, and completed an electrical circuit. This circuit would then be used to add unity to one of forty counters on the front of the census machine. The circuit could also cause the lid of one of the twenty-four compartments of the sorting box to fly open—into which the operator would place the card so that it would be ready for the next phase of the tabulation.
(25) As each card was read, the census machine gave a ring of a bell to indicate that it had been correctly sensed.
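
The pin, mercury cup, counter, and sorting-box cycle amounts to counting holes and routing cards. A minimal sketch of mine as a caricature of one tabulating pass; the hole meanings and the Python notation are assumptions, not the book's:

```python
from collections import Counter, defaultdict

def tabulate(cards, sort_position):
    counters = Counter()                 # the dials on the front of the machine
    sorting_box = defaultdict(list)      # compartments keyed by routing decision
    for card in cards:
        for hole in card:                # a pin passing through a hole closes a
            counters[hole] += 1          # circuit and adds unity to a counter
        sorting_box[sort_position in card].append(card)
    return counters, sorting_box

# hypothetical hole meanings: "M"/"F" for sex, "IL" for state of residence
cards = [{"M", "IL"}, {"F", "IL"}, {"F"}]
counts, box = tabulate(cards, sort_position="IL")
print(counts["IL"], "cards counted for Illinois")                 # 2
print(len(box[True]), "cards routed to the Illinois compartment") # 2
```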

Hollerith machines sabotaged by workers to provide a break.

(26) The trouble was usually that somebody had extracted the mercury from one of the little cups with an eye-dropper and squirted it into a spittoon, just to get some un-needed rest.
(26) The Census Bureau used the Hollerith system not only to reduce the cost of the census, but also to improve the quality and quantity of information, and the speed with which it was produced.

AMERICA'S LOVE AFFAIR WITH OFFICE MACHINERY
(26) In the closing decades of the nineteenth century, office equipment, in both its most advanced and its least sophisticated forms, was almost entirely an American phenomenon.
(27) The fact is that America was gadget-happy and was caught by the glamor of the mechanical office. . . . This attitude was reinforced by the rhetoric of the office systems movement.
(27-28) Systematizers set about restructuring the office—introducing typewriters and adding machines, designing multipart business forms and loose-leaf filing systems, replacing old-fashioned accounting ledgers with machine billing systems, and so on. The office systematizer was the ancestor of today's information-technology consultant.
(28) There is thus an unbroken line of descent from the giant office-machine firms of the 1890s to the computer makers of today.


2 The Mechanical Office
(29) To understand the development of the computer industry, and how this apparently new industry was shaped by the past, one must understand the rise of the office machine giants in the years around the turn of the century. This understanding is necessary, above all, to appreciate how IBM's managerial style, sales ethos, and technologies combined to make it perfectly adapted to shape and then dominate the computer industry.
(29) Today, we use computers in the office for three main tasks. There is document preparation . . . information storage . . . financial analysis and accounting.
(29) These were precisely the three key office activities that the business machine companies of the late nineteenth century were established to serve.

THE TYPEWRITER
(30) Hence the major attraction of the typewriter was that typewritten documents could be read effortlessly at several times the speed of handwritten ones.
(33) Selling for an average price of $75, typewriters became the most widely used business machines, accounting for half of all office appliance sales.
(34) By the 1920s office work was seen as predominantly female, and typing as universally female.
(34) But now we can see that it is important to the history of computing in that it pioneered three key features of the office machine industry and the computer industry that succeeded it: (1) the perfection of the product and low-cost manufacture; (2) a sales organization to sell the product; and (3) a training organization to enable workers to use the technology.

THE RANDS
(35) Vertical filing systems used only a tenth of the space of the box-and-drawer filing systems they replaced, and made it much faster to access records.

THE FIRST ADDING MACHINES
(36) The first commercially produced adding machine was the Arithmometer developed by Thomas de Colmar of Alsace at the early date of 1820.
(37) The bookkeeper of the 1880s was trained to add up a column of four-figure amounts mentally almost without error, and could do so far more quickly than using an Arithmometer.
(38) William S. Burroughs made the first successful attack on the second critical problem of the office adding machine: the need to print its results.

Social need for adding machines arose with the progressive, withholding tax law, just as the Social Security Act would later require punched-card machinery.

(39) A major impetus to development of the U.S. adding machine industry occurred in 1913 with the introduction of a new tax law that adopted progressive tax rates and the withholding of tax from pay.
(39) Between the two world wars Burroughs moved beyond adding machines and introduced full-scale accounting machines.

THE NATIONAL CASH REGISTER COMPANY
(41) [John H.] Patterson understood that the cash register needed constant improvement to keep it technically ahead of the competition. In 1888 he established a small “inventions department” for this purpose. This was probably the first formal research and development organization to be established in the office machine industry, and it was copied by IBM and others.
(42) NCR's greatest legacy to the computer age, however, was the way it shaped the marketing of business machines and established virtually all of the key sales practices of the industry.

THOMAS WATSON AND THE FOUNDING OF IBM
(49) The “rent-and-refill” nature of the punched-card business made IBM virtually recession-proof. Because the punched-card machines were rented and not sold, even if IBM acquired no new customers in a bad year, its existing customers would continue to rent the machines they already had, ensuring a steady year-after-year income.
(49) The second source of IBM's financial stability was its punched-card sales.
(50) Technical innovation was the third factor that kept IBM at the forefront of the office machine industry between the two world wars.
(51) Under the Social Security Act of 1935, it became necessary for the federal government to maintain the employment records of the entire working population of 26 million people. IBM, with its overloaded inventories and with factories in full production, was superbly placed to benefit.


3 Babbage's Dream Comes True
(54) The idea of the Analytical Engine came to Babbage when he was considering how to eliminate human intervention in the Difference Engine by feeding back the results of a computation, which he referred to as the engine “eating its own tail.” Babbage took this simple refinement of the Difference Engine, and from it evolved the design of the Analytical Engine, which embodies almost all the important functions of the modern digital computer.
(55) In the Analytical Engine, numbers would be brought from the store to the arithmetic mill for processing and the results of the computation would be returned to the store.

Is the dark age of mechanical digital computing, attributed to the failure of Babbage, the historical-fiction pivot for steampunk?

(59-60) As it was, Babbage's failure undoubtedly contributed to what L. J. Comrie called the “dark age” of digital computing. Rather than follow where Babbage had failed, scientists and engineers preferred to take a different, nondigital path—a path that involved the building of models, but which we now call analog computing.

THE TIDE PREDICTOR AND OTHER ANALOG COMPUTING MACHINES
(60-61) The most important analog computing technology of the nineteenth century, from an economic perspective, was the mechanical tide predictor. . . . In 1876 the British scientist Lord Kelvin invented a useful tide-predicting machine.
(61) This was the overriding problem with analog computing technologies—a specific problem required a specific machine.
(61) A whole new generation of analog computing machines were developed in the 1920s to help design the rapidly expanding U.S. electrical power system.
(63) [Vannevar] Bush's differential analyzer, built between 1928 and 1931, was not a general-purpose computer in the modern sense, but it addressed such a wide range of problems in science and engineering that it was far and away the single most important computing machine developed between the wars.

A WEATHER-FORECAST FACTORY
(65) At the end of the war in 1918, [Lewis Fry] Richardson returned to his work at the Meteorological Office and completed his book, which was published in 1922. In it he described one of the most extraordinary visions in the history of computing: a global weather-forecast factory.

SCIENTIFIC COMPUTING SERVICE
(67) [Leslie John] Comrie's great insight was to realize that one did not need special-purpose machines such as differential analyzers; he thought computing was primarily a question of organization. For most calculations, he found that his “calculating girls” equipped with ordinary commercial calculating machines did the job perfectly well.
(68) Perhaps Comrie's major achievement at the Nautical Almanac Office was to bring punch card machines—and therefore, indirectly, IBM—into the world of numerical computing.

THE HARVARD MARK I
(72) The following month, December 1937, [Howard Hathaway] Aiken polished up his proposal, in which he referred to some of Babbage's concepts—in particular, the idea of using Jacquard cards to program the machine—and sent it to IBM. This proposal is the earliest surviving document about the Mark I, and it must remain something of an open question as to how much Aiken derived from Charles Babbage. In terms of design and technology, the answer is clearly very little; but in terms of a sense of destiny, Aiken probably derived a great deal more.
(73) Unfortunately, the calculator was incapable of making what we now call a “conditional branch”—that is, changing the progress of a program according to the results of an earlier computation. This made complex programs physically very long. . . . If Aiken had studied Babbage's—and especially Lovelace's—writings more closely, he would have discovered that Babbage had already arrived at the concept of a conditional branch.
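
The missing concept is easy to show in modern notation. A minimal sketch of mine, not from the book: without a conditional branch every repetition must be spelled out instruction by instruction, while with one a short program loops back on itself.

```python
def sum_without_branching():
    # Mark I style: the program is as long as the computation itself
    total = 0
    total += 1
    total += 2
    total += 3
    total += 4
    return total

def sum_with_branching(n):
    # stored-program style: a conditional branch re-uses the same few instructions
    total, i = 0, 1
    while i <= n:          # test the result so far, then jump back or fall through
        total += i
        i += 1
    return total

print(sum_without_branching(), sum_with_branching(4))   # 10 10
```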

First book on digital computing was Aiken's manual for the Mark I, although the Mauchly memorandum is credited as the real starting point of the electronic computer project, and von Neumann's report was the first on the stored-program computer.

(74) After the dedication and the press coverage, there was intense interest in the Harvard Mark I from scientific workers and engineers wanting to use it. This prompted Aiken and his staff to produce a 500-page Manual of Operation of the Automatic Sequence Controlled Calculator, which was effectively the first book on digital computing ever published.
(76) Babbage could never have envisioned that one day electronic machines would come onto the scene with speeds thousands of times faster than he ever dreamed. This happened within two years of the Harvard Mark I being completed.


PART TWO
Creating the Computer

4 Inventing the Computer

THE MOORE SCHOOL

THE ATANASOFF-BERRY COMPUTER

ECKERT AND MAUCHLY

Must appreciate how beliefs about likely processing speeds influence speculation on design and uses of computer technologies.

(86) By August 1942 [John] Mauchly's ideas on electronic computing had sufficiently crystallized that he wrote a memorandum on The Use of High Speed Vacuum Tubes for Calculating. In it he proposed an “electronic computer” that would be able to perform calculations in 100 seconds that would take a mechanical differential analyzer 15 to 30 minutes, and that would have taken a human computer “at least several hours.” This memorandum was the real starting point for the electronic computer project.

ENIAC AND EDVAC: THE STORED-PROGRAM CONCEPT
(88) All together, in addition to its 18,000 tubes, the ENIAC would include 70,000 resistors, 10,000 capacitors, 6,000 switches, and 1,500 relays.
(88) The ENIAC was Eckert and Mauchly's project. Eckert, with his brittle and occasionally irascible personality, was “the consummate engineer,” while Mauchly, always quiet, academic, and laid back, was “the visionary.”
(91) In short, there were three major shortcomings of the ENIAC: too little storage, too many tubes, too lengthy reprogramming.
(91-92) Von Neumann was captivated by the logical and mathematical issues in computer design and became a consultant to the ENIAC group to try to help them resolve the machine's deficiencies and develop a new design. This new design would become known as the “stored-program computer,” on which virtually all computers up to the present day have been based. All this happened in a very short space of time.
(92) From this point on, while construction of the ENIAC continued, all the important intellectual activity of the computer group revolved around the design of ENIAC's successor: the EDVAC, for Electronic Discrete Variable Automatic Computer.
(92-93) Goldstine has likened the stored-program concept to the invention of the wheel: It was simple—once he had thought of it. This simple idea would allow for rapid program set-up by enabling a program to be read into the electronic memory from punched cards or paper tape in a few seconds; it would be possible to deliver instructions to the control circuits at electronic speeds; it would provide two orders of magnitude more number storage; and it would reduce the tube content by 80 percent. But most significantly, it would enable a program to treat its own instructions as data. Initially this was done to solve a technical problem associated with handling arrays of numbers, but later it would be used to enable programs to create other programs—laying the seeds for programming languages and artificial intelligence.
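
A toy stored-program machine, my sketch rather than the EDVAC design: instructions and data share one memory, so the running program can rewrite the address field of its own ADD instruction to walk through an array, the number-handling trick the passage alludes to.

```python
N = 3             # number of array elements (illustrative)
mem = [
    ("ADD", 5),   # 0: acc += mem[5]; this operand is rewritten each pass
    ("BUMP", 0),  # 1: add 1 to the operand of the instruction stored at address 0
    ("LOOP", 0),  # 2: mem[4] -= 1; if the counter is still > 0, jump to address 0
    ("HLT", 0),   # 3: stop
    N,            # 4: loop counter, an ordinary data word
    10, 20, 30,   # 5-7: the array to be summed
]

def run(mem):
    pc, acc = 0, 0
    while True:
        op, arg = mem[pc]
        pc += 1
        if op == "HLT":
            return acc
        if op == "ADD":
            acc += mem[arg]
        elif op == "BUMP":                  # instructions are just data words,
            o, a = mem[arg]                 # so the program can rewrite them
            mem[arg] = (o, a + 1)
        elif op == "LOOP":
            mem[4] -= 1
            if mem[4] > 0:
                pc = arg

print(run(mem))   # 60
```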

Invitation to study contested history of development of electronic computer as Hayles does cybernetics.

(93) During these meetings, Eckert and Mauchly's contributions focused largely on the delay-line research, while von Neumann, Goldstine, and Burks concentrated on the mathematical-logical structure of the machine. Thus there opened a schism in the group between the technologists (Eckert and Mauchly) on the one side and the logicians (von Neumann, Goldstine, and Burks) on the other, which would lead to serious disagreements later on.
(93) Von Neumann designated these five units as the central control, the central arithmetic part, the memory, and the input and output organs.
(93) Another key decision was to use binary to represent numbers.
(94) By the spring of 1945, the plans for EDVAC had evolved sufficiently that von Neumann decided to write them up. His report, entitled A First Draft of a Report on the EDVAC, dated 30 June 1945, was the seminal document describing the stored-program computer. . . . Von Neumann's sole authorship of the report seemed unimportant at the time, but it later led to him being given sole credit for the invention of the modern computer.

Compare the urge to disseminate the bootstrapping knowledge needed to create computers, as a will of the technological unconscious, to the later cycle in which money making trumped openness, making corporations and wealthy individuals rivals of the largest established human and machine ensembles; today an intellectual deoptimization dominates, settled on path-of-least-resistance default philosophies of computing.

(95) Although originally intended for internal circulation to the Project PY group, the EDVAC Report rapidly grew famous, and copies found their way into the hands of computer builders around the world. This was to constitute publication in a legal sense, and it eliminated any possibility of getting a patent. For von Neumann and Goldstine, who wished to see the idea move into the public domain as rapidly as possible, this was a good thing; but for Eckert and Mauchly, who saw the computer as an entrepreneurial business opportunity, it was a blow that would eventually cause the group to break up.

THE ENGINEERS VERSUS THE LOGICIANS

THE MOORE SCHOOL LECTURES
(98) Apart from its gargantuan size, the feature the media found most newsworthy was its awesome ability to perform 5,000 operations in a single second.
(98) The Moore School Lectures, as the course later became known, took place over eight weeks from 8 July to 31 August 1946. The list of lecturers on the course read like a Who's Who of computing of the day. It included luminaries such as Howard Aiken and von Neumann, who made guest appearances, as well as Eckert, Mauchly, Goldstine, Burks, and several others from the Moore School who gave the bread-and-butter lectures.

MAURICE WILKES AND EDSAC
(100) [F. C.] Williams decided that the key problem in developing a computer was the memory technology. . . . With a single assistant to help him, he developed a simple memory system based around a commercially available cathode-ray tube. With guidance from Newman, who “took us by the hand” and explained the principles of the stored-program computer, Williams and his assistant built a tiny computer to test out the new memory system. . . . The date was Monday, 21 June 1948, and the “Manchester Baby Machine” established incontrovertibly the feasibility of the stored-program computer.
(102) From the outset, he [Maurice Wilkes] decided that he was interested in having a computer, rather than trying to advance computer engineering technology. He wanted the laboratory staff to become experts in using computers—in programming and mathematical applications—rather than in building them.
(104) On 6 May 1949, a thin ribbon of paper containing the program was loaded into the computer; half a minute later the teleprinter sprang to life and began to print 1, 4, 9, 16, 25 . . . The world's first practical stored-program computer had come to life, and with it the dawn of the computer age.


5 The Computer Becomes a Business Machine
(105) What really happened in the 1950s, as suggested by the title of this chapter, is that the computer was reconstructed—mainly by computer manufacturers and business users—to be an electronic data-processing machine rather than a mathematical instrument.
(106) There were actually three types of firms that entered the computer industry: electronics and control equipment manufacturers, office machine companies, and entrepreneurial start-ups.

“MORE THAN OPTIMISTIC”: UNIVAC AND BINAC
(109) The most ambitious feature of the UNIVAC was the use of magnetic-tape storage to replace the millions of punched cards used in the Census Bureau and other businesses.
(110) The Prudential's computer expert was a man named Edmund C. Berkeley, who was to write the first semipopular book on computers, Giant Brains, published in 1949.
(111) The first American stored-program computer to operate, the BINAC was never a reliable machine.

IBM: EVOLUTION, NOT REVOLUTION
(112) IBM had several electronics and computer development projects in the laboratories in the late 1940s; what delayed them being turned into products was the uncertainty of the market.
(113) IBM institutionalized its attitude to electronics and computers through the slogan “evolution not revolution.” By this, it meant that it would incorporate electronics into existing products to make them faster, but they would not otherwise be any different.
(115) Computer historians have often failed to realize the importance of the [IBM Card Programmed Calculator] CPC, not least because it was called a “calculator” instead of a “computer.” Watson insisted on this terminology because he was concerned that the latter term, which had always referred to a human being, would raise the specter of technological unemployment.

UNIVAC COMES TO LIFE

Hopper foremost female computer professional and promoter of advanced programming techniques.

(121) The programming team was initially led by Mauchly, but was later run by a programmer recruited from the Harvard Computation Laboratory, Grace Murray Hopper, who would become the driving force behind advanced programming techniques for commercial computers and the world's foremost female computer professional.

Election-night role of a mock-up UNIVAC predicting the outcome for Eisenhower was a key introduction of computers to the general public.

(123) The appearance of the UNIVAC on election night was a pivotal moment in computer history. Before that date, while some people had heard about computers, very few had actually seen one; after it, the general public had been introduced to computers and had seen at least a mock-up of one. And that computer was called a UNIVAC, not an IBM.

IBM'S BIG PUSH
(125) However, this failed to take account of the fact that computers were hot news, and business magazines were buzzing with stories about electronic brains for industry and commerce. Cost effectiveness was no longer the only reason, or even the most important reason, for a business to buy a computer.
(126) Instead of the mercury delay lines chosen for the UNIVAC, IBM decided to license the Williams Tube technology developed for the Manchester University computer in England.
(126) Another advantage over the UNIVAC was that IBM's computers were modular in construction—that is, they consisted of a series of “boxes” that could be linked together on site.
(126-127) Although it was still a laboratory prototype at this time, IBM mounted a crash research program to develop core memory into a reliable product.
(127) Yet it was not the large-scale 700 series that secured IBM's leadership of the industry, but the low-cost Magnetic Drum Computer.

Mastery of marketing and long-term planning by IBM through dissemination of its computers in higher education to produce the next generation of workers trained on them; a good example of a social factor influencing history more than the technological capabilities of the devices, a topic developed later with respect to real-time processing.

(127) With an astute understanding of marketing, IBM placed many 650s in universities and colleges, offering machines with up to a 60 percent discount provided courses were established in computing. The effect was to create a generation of programmers and computer scientists nurtured on IBM 650s, and a trained workforce for IBM's products. It was a good example of IBM's mastery of marketing, which was in many ways more important than mastery of technology.
(128) In the memorable phrase of the computer pundit Herb Grosch, in losing its early lead to IBM, Remington Rand “snatched defeat from the jaws of victory.”

THE COMPUTER RACE
(129) Thus, by the end of the 1950s the major players in the computer industry consisted of IBM and a handful of also-rans: Sperry Rand, Burroughs, NCR, RCA, Honeywell, GE, and CDC. Soon journalists would call them IBM and the seven dwarves.


6 The Maturing of the Mainframe: The Rise and Fall of IBM

THE BREAKTHROUGH MODEL 1401
(131) The turning point for IBM was the announcement of the model 1401 computer in October 1959. The 1401 was not so much a computer as a computer system.
(133) IBM also had to find a technological solution to the programming problem. The essential challenge was how to get punched card-oriented business analysts to be able to write programs without a huge investment in retraining them and without companies having to hire a new temperamental breed of programmer. The solution that IBM offered was a new programming system called Report Program Generator (RPG).
(133) IBM developed entire program suites for the industries that it served most extensively, such as insurance, banking, retailing, and manufacturing. These application programs were very expensive to develop, but because IBM had such a dominant position in the market it could afford to “give” the software away, recouping the development costs over tens of hundreds of customers.
(134) How did IBM get its forecast so wrong? The 1401 was certainly an excellent computer, but the reasons for its success had very little to do with the fact that it was a computer. Instead, the decisive factor was the new type 1403 “chain” printer that IBM supplied with the system.

IBM AND THE SEVEN DWARVES

REVOLUTION, NOT EVOLUTION: SYSTEM/360
(137-138) The biggest problem, however, was not in hardware but in software. Because the number of software products IBM offered to its customers was constantly increasing, the proliferation of computer models created a nasty gearing effect. Given m different computer models, each requiring n different software products, a total of m × n programs had to be developed and supported. . . . For IBM, a compatible range promised to be the electronic Esperanto that would contain the software problem.
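
The gearing effect is simple arithmetic; a sketch with illustrative numbers of my own, not figures from the book:

```python
# Gearing effect: incompatible models force every program to be redeveloped
# per model; a compatible family needs each program only once.
m_models, n_products = 6, 20
print("incompatible line:", m_models * n_products, "programs")   # 120
print("compatible family:", n_products, "programs")              # 20
```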
(139) The decision to produce a compatible family was not so clear-cut as it appears in hindsight, and there was a great deal of agonizing at IBM.
(140) The New Product Line was one of the largest civilian R&D projects ever undertaken.
(141) Simply keeping the machine designs compatible between the geographically separate design groups was a major problem.
(141) All told, direct research costs were around $500 million. But ten times as much again was needed for development—to tool up the factories, retrain marketing staff, and re-equip field engineers.
(142) However, all internal debate about the announcement strategy effectively ceased in December 1963 when Honeywell announced its model 200 computer. . . . Honeywell 200s could run IBM programs without reprogramming and, using a provocatively named “liberator” program, could speed up existing 1401 programs to make full use of the Honeywell 200's power.
(143) The computer industry and computer users were stunned by the scale of the announcement. While an announcement from IBM had long been expected, its tight security had been extremely effective, so that outsiders were taken aback by the decision to replace the entire product line.
(144) System/360 has been called “the computer that IBM made, that made IBM.” The company did not know it at the time, but System/360 was to be its engine of growth for the next thirty years.

THE DWARVES FIGHT BACK
(144-145) But in technological terms, System/360 was no more than competent. . . . For example, the proprietary electronics technology that IBM had chosen to use, known as Solid Logic Technology (SLT), was halfway between the discrete transistors used in second-generation computers and the true integrated circuits of later machines.

Lack of time-sharing support in System/360 major design flaw.

(145) Perhaps the most serious design flaw in System/360 was its failure to support time-sharing—the fastest-growing market for computers—which enabled a machine to be used simultaneously by many users.
(146) RCA was the only mainframe company to go IBM-compatible in a big way.
(146) Hence, the second and less risky strategy to compete with System/360 was product differentiation—that is, to develop a range of computers that were software-compatible with one another but not compatible with System/360. This was what Honeywell did.
(146) The third strategy to compete with IBM was to aim for niche markets that were not well satisfied by IBM and in which the supplier had some particular competitive advantage.

THE FUTURE SERIES
(148) In June 1970 the evolutionary successor to the 360 range was announced as System/370. The new series offered improved price performance through updated electronic technology: True integrated circuits were used in place of the SLT modules of System/360, and magnetic-core storage was replaced with semiconductor memory. The architecture was also enhanced to support more effectively time-sharing and communications-based on-line computing. The technique of “virtual memory” was also introduced, using a combination of software and hardware to increase the amount of working memory in the computer (and therefore permit the execution of much larger programs).
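
Virtual memory can be caricatured as an address-translation table consulted on every access; a minimal sketch under assumptions of my own (page size, table contents), not IBM's actual System/370 mechanism:

```python
PAGE = 4096                                     # illustrative page size

page_table = {0: ("ram", 7), 1: ("disk", 42)}   # virtual page -> where it lives

def translate(vaddr):
    page, offset = divmod(vaddr, PAGE)
    kind, frame = page_table[page]
    if kind == "disk":
        # a page fault: the operating system would copy the page into a free
        # frame and update the table before retrying the access
        raise RuntimeError(f"page fault on virtual page {page}")
    return frame * PAGE + offset                # real (physical) address

print(hex(translate(0x0123)))   # virtual page 0 maps to frame 7 -> 0x7123
```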
(148) While the life of the 360-370 line was being extended by these modest improvements, IBM's main R&D resources were focused on the much more challenging Future Series.
(149) FS was indeed a remarkably advanced architecture, and this entailed its own problems. Even today there is no major computer range that has the functionality of FS as it was proposed in the early 1970s.
(149) But more than anything, it was the growing importance of the existing software investment of IBM and its users that made a new architecture less feasible by the day and sealed the fate of FS.

Suggest that the contradiction of the personal computer and Internet age is the knowledge gap between problem solving by use and by programming, resulting from the hysteresis of a generation of users acculturated to closed-source, interface-level competencies.

(150) The maturing of the mainframe has produced one of the great technological contradictions of the twentieth century: As semiconductor components improve in size and speed by a factor of two every year or two, they are used to build fossilized computer designs and run software that is groaning with age. The world's mainframes and the software that runs on them have become like the aging sewers beneath British city streets, laid in the Victorian era. Not a very exciting infrastructure, and largely unseen—but they work, and life would be totally different without them.

THE DECLINE OF THE IBM EMPIRE
(151) IBM's current malaise has frequently been attributed to the rise of the personal computer, but the facts do not support this contention. The scene was set for IBM's downfall in the mid-1970s, with the commodification of the mainframe—that is, by the mainframe becoming an article that could be produced by any competent manufacturer. This had the inevitable result of reducing the generous profit margins IBM had enjoyed for sixty years.
(153) Only IBM guaranteed a complete solution to business problems, and an IBM salesman was all too likely to remind a data-processing manager that no one ever got fired by hiring from IBM. This was a patronizing attitude that came close to condescension, and often resulted in a love-hate relationship between IBM and its customers.


PART THREE
Innovation and Expansion

7 Real Time: Reaping the Whirlwind
(157) No previous office technology had been able to achieve this speed, and the emergence of real-time computers had the potential for transforming business practice.

JAY FORRESTER AND PROJECT WHIRLWIND
(159) [Perry] Crawford was perhaps the first to appreciate real-time digital control systems, long predating the development of practical digital computers. As early as 1942 he had submitted a master's thesis on Automatic Control by Arithmetic Operations, which discussed the use of digital techniques for automatic control.
(161) As did everyone leading an early computer project, Forrester soon discovered that the single most difficult problem was the creation of a reliable storage technology.
(163) Crawford foresaw a day when real-time computing would control not just military systems but whole sectors of the civilian economy—such as air traffic control.

THE SAGE DEFENSE SYSTEM
(167) The value to the nation of the core memory spin-off could by itself be said to have justified the cost of the entire Whirlwind project.

Distributed control with SAGE Direction Centers; subindustry grew to develop and implement its basic technologies.

(167) The SAGE system that was eventually constructed consisted of a network of twenty-three Direction Centers distributed throughout the country.
(168) The real contribution of SAGE was thus not to military defense, but through technological spin-off to civilian computing. An entire subindustry was created as industrial contractors and manufacturers were brought in to develop the basic technologies and implement the hardware, software, and communications.

SABRE: A REVOLUTION IN AIRLINE RESERVATIONS

Technology intercept strategy joined system planning and expected advances; compare to military planning of scheduled breakthroughs discussed by Edwards.

(172-173) Simultaneously an idealized, integrated, computer-based reservation system was planned, based on likely future developments in technology. This later became known as a “technology intercept” strategy. The system would be economically feasible only when reliable solid-state computers with core memories became available. Another key requirement was a large random-access disk storage unit, which at that time was a laboratory development rather than a marketable product.
(175) The airline reservations problem was unusual in that it was largely unautomated in the 1950s and so there was a strong motivation for adopting the new real-time technology. Longer-established and more traditional businesses were much slower to respond.

THE UNIVERSAL PRODUCT CODE
(177) Even greater benefits would come if manufacturers, wholesalers, and retailers all used the same, universal product code. This would allow the whole of the food industry to harmonize their computer hardware and software development, in effect becoming one integrated operation. This implied an extraordinary degree of cooperation among all these organizations, and the implementation of the UPC was thus as much a political achievement as a technical one.
(178) One of the first was to agree upon a product code of ten digits—five digits representing the manufacturer and five representing the product.
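
A toy split of that ten-digit layout into its manufacturer and product fields, purely my illustration rather than the full UPC specification:

```python
# Hypothetical ten-digit code: first five digits = manufacturer, last five = product.
def split_product_code(code):
    assert len(code) == 10 and code.isdigit()
    return code[:5], code[5:]

print(split_product_code("1234567890"))   # ('12345', '67890')
```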
(179) The bar code principle has diffused very rapidly to cover not just food items but most packaged retail goods (note the ISBN bar code on the back cover of this book). The bar code has become an icon of great economic significance.
(180) In a sense, there are two systems co-existing, one physical and one virtual. The virtual information system in the computers is a representation of the status of every object in the physical manufacturing and distribution environment—right down to an individual can of peas.


8 Software

New type of rhetoric required for computer programming.

(181) The fundamental difficulty of writing software was that, until computers arrived, human beings had never before had to prepare detailed instructions for an automaton—a machine that obeyed unerringly the commands given to it, and for which every possible outcome had to be anticipated by the programmer.

PROGRAMMING LANGUAGES: FORTRAN AND COBOL

Fascinating to find Hopper simultaneously crossing philosophy of computing and feminist discourse where she is explicitly grouped among those not claiming to be a feminist.

(187) Probably no one did more to change the conservative culture of 1950s programmers than Grace Hopper, first programmer for the Harvard Mark I in 1943, then with UNIVAC. For several years in the 1950s she barnstormed around the country, proselytizing the virtues of automatic programming at a time when the technology delivered a good deal less than it promised.
(191) COBOL was particularly influenced by what one critic described as Hopper's “missionary zeal for the cause of English language coding.”

SOFTWARE ENGINEERING
(200)

Misconception of early programmers as long-haired men (male humans) when many were women, noting Grace Hopper evangelizing learning programming languages and four who started a business.

(201) The Garmisch conference began a major cultural shift in the perception of programming. Software writing started to make the transition from being a craft for a long-haired programming priesthood to becoming a real engineering discipline.


9 New Modes of Computing

(207) Improved machines and software enabled more sophisticated applications and many primitive batch-processing systems became real time, but data processing still consisted of computing delivered to naïve users by an elite group of systems analysts and software developers.
(207) The personal computer grew out of an entirely different culture of computing, which is the subject of this chapter. This other mode of computing is associated with computer time-sharing, the BASIC programming language, Unix, minicomputers, and new microelectronic devices.


PART FOUR
Getting Personal

10 Shaping of the Personal Computer

(233) Perhaps its most serious distortion is to focus on a handful of individuals, portrayed as visionaries who clearly saw the future and made it happen: Apple Computer's Steve Jobs and Microsoft's Bill Gates figure prominently in this genre. By contrast, IBM and the established computer firms are usually portrayed as dinosaurs: slow-moving, dim-witted, deservedly extinct. When it comes to be written, the history of the personal computer will be much more complex than this. It will be seen to be the result of a rich interplay of cultural forces and commercial interests.

VISICALC

Importance of early computer games producing a new generation of programmers who developed understanding of HCI (Gee).

(250) Computer games are often overlooked in discussions of the personal-computer software industry, but they played an important role in its early development. Programming computer games created a corps of young programmers who were very sensitive to what we now call human/computer interaction. The most successful games were ones that needed no manuals and gave instant feedback.


11 The Shift to Software

(260) Entrants to the new industry needed not advanced software-engineering knowledge but the same kind of savvy as the first software contractors in the 1950s: creative flair and the technical knowledge of a bright undergraduate. The existing software companies simply could not think or act small enough: their overhead costs did not allow them to be cost-competitive with software products for personal computers.

GRAPHICAL USER INTERFACE

GUI user-friendliness was the next step in broadening computer use, following Kemeny's vision of BASIC programming on time-sharing systems.

(264) For the personal computer to become more widely accepted and reach a broader market, it had to become more “user-friendly.” During the 1980s user-friendliness was achieved for one-tenth of computer users by using a Macintosh computer; the other nine-tenths could achieve it through Microsoft Windows software. Underlying both systems was the concept of the graphical user interface.

STEVE JOBS AND THE MACINTOSH

Product differentiation marketing strategy by Apple may have contributed to shift toward postmodern preferences Turkle articulates.

(274) John Sculley recognized that Apple had a classic business problem that called for a classic solution: product differentiation.

12 From the World Brain to the World Wide Web

To Campbell-Kelly the history of the modern computer as an information machine concludes with commerce, recreation, and socializing seeming to have replaced the initial excitement of access to knowledge that the Internet offered.

(284) However, the use of the Internet that has created the most excitement is its potential for giving ordinary people access to the world's store of knowledge through the “World Wide Web.”



Campbell-Kelly, Martin and William Aspray. Computer: A History of the Information Machine. New York: Basic Books, 1996. Print.