Notes for Mario Aloisio “The Calculation of Easter Day, and the Origin and Use of the Word Computer”
Key concepts: computer, computing machine, computus, divide-and-conquer, division-of-labor.
Etymology and history of the use of the word computer. Coined in English from the French, first used to denote measurements of short time intervals. Neglects early Roman use. First machine computer appears in Swift's fiction, a fantasy. Divide and conquer and division of labor. Definition of computing as calculating in accordance with effective methods, with the machine doing so automatically in a succession of operations with intermediate storage.
Related theorists: Alighieri, Bacon, Borst, Dionysius Exiguus, Heidegger, Lancelot Hogben, Kittler, Napier, Pascal, Jonathan Swift.
Summary
Computer a suitable word for a Heideggerian hermeneutic phenomenological account, complemented by rigorous etymological and historical accounts like this one, noting this study does not stretch back to classical Latin usage.
(42)
Like so many English words, computer
derives
from Latin and therefore traces its origins back many centuries.
(42)
In the introductory chapter of his book The Ordering of Time,
historian Arno Borst makes the point that few people are aware of the
true origin of the word computer.
. . . there has been—at least until quite recently—scarcely any
literature that properly explains the etymology of computer.
Etymology and history of the use of the word computer: Borst traces computare, “to reckon up” or “to count on one's fingers,” to early Roman times.
(42)
According to Borst, the word computare,
which meant “to reckon up,” or “to count on one's fingers,”
was already in use in early Roman times. This word frequently
accompanied the word numerare,
which had a similar meaning. Later, the word calculare
was
added to indicate counting of numbers with beads (or pebbles).
(42)
Although this is generally correct, the Latin word computus (sometimes
compotus or compotos) may well have been the one giving rise to the
word computer, as it was in widespread use in Europe throughout the
Middle Ages. . . . The first
specific meaning was coined by a Sicily-based writer who used it to
denote “the astrological interpretation of computed and observed
planetary orbits,” a practice prevalent among pagans at that time.
Jokes about the lack of planning by the Nicene council: its confusing definition of Easter Day aided the development of computing, a curious parallel to the way the need for ballistic tables aided the development of electromechanical computers.
(42-43)
The probable reason why computus acquired widespread use has to do
with ecclesiastical history, that relating to Easter. When the Nicene
council, convened by Constantine in AD 325, laid down the rules
(actually just adopted an already established method) for determining
the date of Easter, it certainly did not anticipate the confusion
that would ensue for centuries to follow. . . . However, the general
consensus among Christians was that Easter should be celebrated on a
Sunday and, importantly, on the Sunday after the feast of the Jewish
Passover. Passover is based on the lunar cycle; consequently, the
date of Easter was inextricably linked with the moon. To calculate
this date therefore required almost the impossible: an accurate
determination in advance of the movements of the sun, earth, and
moon.
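The rule the council adopted is, in effect, an algorithm. A minimal sketch of the later Gregorian computus (the anonymous Meeus/Jones/Butcher formulation, not Dionysius's Julian-calendar tables) illustrates the kind of calendrical arithmetic involved; the function name is ours:

```python
def gregorian_easter(year):
    """Date of Easter Sunday as (month, day) in the Gregorian calendar.

    Anonymous Gregorian computus (Meeus/Jones/Butcher formulation),
    shown only to illustrate the arithmetic; Dionysius worked with
    Julian-calendar tables, not this later method.
    """
    a = year % 19                         # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)              # century and year-within-century
    d, e = divmod(b, 4)                   # leap-century terms
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30    # epact-like term: age of the moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7  # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451      # correction for late full moons
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

# gregorian_easter(2024) → (3, 31), i.e. 31 March 2024
```

The date jumps around precisely because, as the quotation notes, it couples the solar year (Sundays) to the lunar cycle (Passover).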
(43) One of the first persons to have possessed a thorough
knowledge of time and the calendar was the Scythian monk Dionysius
Exiguus who,
in 525, was instructed by Pope John I to determine the date of Easter
for the following year. This abbot not only calculated the Easter
date for one year, but went on to draw up a table of future Easter
dates covering a period of some 95 years. Traditionally, the
computation of Easter was the realm of the Alexandrian fathers who
had treated this practice with some secrecy, as though it belonged
only to a gifted few. The Roman Catholic Church, probably not wanting
to depend entirely on the Eastern Orthodox Church regarding this
matter, and aware of Dionysius' competence, therefore sought to go
its own way by seeking this bright fellow's services.
(43) Dionysius
gave us our Anno Domini (AD) system of dating. Cassiodorus,
Dionysius's friend, was the first to officially use it in his
Computus
Paschalis,
a textbook describing the basic skills of time reckoning. From now
on, for centuries to follow, computus essentially meant “the
calculation of Easter.”
Compotiste and abaciste of late medieval times
(43)
The meaning of computare as “to recount” was suggested by Bishop
Gregory of Tours as well as by Bede, from the custom that
“uneducated” people, when asked to give information in terms of
numbers would, instead, recount stories.
(43) With the advent of
the astrolabe in the 10th
century,
the abacus may have become even more popular because the use of the
astrolabe required calculations for which the abacus, if not the
ideal reckoning tool, was certainly handy. The astrolabe was one of
the first elaborate and accurate instruments for sighting the
position of celestial objects, and for this reason may be considered
among the earliest analog devices.
Astrolabe as early analog device.
(44)
The word augrim descends from algorism,
itself a corruption of the Latinized version of the name of the
9th-century Muslim mathematician al-Khwarizmi.
(44) In the second half of the 13th century, Roger Bacon wrote his
famous work titled Compotus, a treatise on the science of time. . . .
He also noted that to achieve better results regarding the calendar,
one could now no longer work with whole numbers as the earlier
compotiste did.
He insisted that Christians should not look ignorant before Muslims
who had one of the most accurate lunar calendars.
(44) In this same period, the word conto in Italian still meant
astronomical time-reckoning as did, more or less, the word conteour.
However, Dante Alighieri wrote a collection of love poems in which
conto was used in a different context. It suggested the relationship
between two lovers—not physically, but in terms of monetary
accounting, how lovers reckon and balance income and expenditure. It
subsequently found its way into neighboring countries, as compte in
French and Konto in
German. The papal chancellery helped complete the change to Latin
when it created the office of the taxator or computator so that papal
bulls could be charged and registered. In English, the word computist
was also used in the 16th and 17th centuries to refer to a keeper of
accounts (that is, an accountant).
First use: Compute, Computation, and Computers
First English use made up from French to denote measurements of short time intervals.
(44)
In his time, many new English words were made up, and the use of the
modern English word compute is probably no exception. It goes back to
Chaucer's time, when the French word compte was used in an English
text to denote the measurements of short time intervals.
(44) Computacion, which was also spelled the modern way (computation)
from the beginning, appeared frequently in 17th-century texts that
involved dates. . . . The earliest reference to the word computer in
the English literature is probably by the physician Thomas Browne in
1646. In his Pseudodoxia Epidemica, he used its plural form to refer
to the persons involved in time calculations, so that the word
computers was in fact used instead of the then more popular Latin
word compotiste.
First machine computer a fictional fantasy in Swift's Gulliver's Travels.
(44-45) The word computer made another appearance some half a century later when it was used by the satirist Jonathan Swift. . . . This “skillful Computer” was an informed person who had arrived at this conclusion by applying the “rules of Arithmetick.” In part III of Gulliver's Travels, Swift refers to another “computer” with the aid of which anyone would be able to master all the arts and sciences. This must be one of the earliest instances when the word was used—by the same author and in a short space of time—to refer both to a machine and a person.
Era of logarithms and the early 'computists'
(45) Napier was also one of the first persons to attempt
“mechanizing” mathematical calculations.
(45) The mathematician and historian Lancelot Hogben, in his classic
book Mathematics for the Million, mentions two needs that may have
contributed to the development of
mentions two needs that may have contributed to the development of
logarithms (and ultimately to the design and manufacture of
calculating machines). The first has to do with the preparation of
trigonometric tables for use in navigation. The second is related to
accounting, namely the lengthy calculations required when reckoning
compound interest upon investments.
Logarithms, and later calculating machines, were likely developed for trigonometric tables for navigation (especially determining longitude) and for compound-interest reckoning in accounting (see Campbell-Kelly and Aspray).
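Both of Hogben's motivating needs reduce to heavy multiplication, which logarithms convert into addition. A small sketch of the compound-interest case, with `math.log10` standing in for a printed log table (the values are illustrative, not from the article):

```python
import math

# Compound interest A = P * (1 + r)^n requires n hand multiplications;
# with a log table it becomes one lookup, one multiply, and one anti-log.
P, r, n = 100.0, 0.05, 30  # illustrative principal, rate, and years

log_A = math.log10(P) + n * math.log10(1 + r)  # addition replaces powering
A = 10 ** log_A                                # anti-logarithm "lookup"

A_direct = P * (1 + r) ** n  # modern direct computation, for comparison
```

The two results agree to floating-point precision; for a 17th-century calculator, the left-hand route replaced thirty multiplications with one addition and one multiplication.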
(45) Following a number of notable
fatal accidents, various governments eventually decided to award
prize money to anyone who came up with an accurate, practical, and
reliable method of solving the longitude problem.
(45) For a long
time, the calculators' main calculating aids remained the logarithms
(and, for less exact work, the slide rules), for although a number of
machines had been conceived and built, none was practical and
reliable. The general public showed little interest in these
Rechnungs-Maschinen or Rechenmaschinen [Calculating Machines], and it
was not until the 1820s that Leibniz-type machines, which were among
the best then available, began to be manufactured in large
quantities.
Human computer era
(45) When Blaise Pascal built his machine d'arithmetique in 1642, it
was meant to relieve the calculateur of the task of using the jetons,
or counting beads.
(46) It is certain, however, that the word computer to refer to a
human had become popular by the beginning of the 20th century and
retained this meaning for a few decades thereafter.
(46) By
the late 1700s, it was becoming apparent that the numerical solutions
to some of these mathematical problems could not possibly be
completed by one person in a realistic time frame, although the task
could be achieved if the problem was appropriately prepared, broken
down, and given to several people to work on.
(46) One of the
earliest small groups to work in this manner included three
astronomers who, in the summer of 1758, worked for nearly five months
on the orbit of comet Halley to predict its perihelion passage.
(46)
Observatories began to employ personnel whose sole task was purely
numerical calculation.
Divide and conquer and division of labor becoming key characteristic of computing; see citation by Kittler of Hasslacher on discretization.
(46) The divide-and-conquer strategy and the division-of-labor principle had shown their worth. Moreover, as a result, the words calculator and computer became firmly established. For a good two centuries their meaning remained synonymous, referring only to the human being.
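The table-making workflow described above can be sketched in a few lines. Everything here is illustrative (the chunking, the log-table task, the function names), not the 1758 astronomers' actual procedure:

```python
import math

def compute_chunk(args):
    """One 'computer's' assignment: tabulate log10 over a slice of arguments."""
    return [(x, math.log10(x)) for x in args]

# The overseer prepares and breaks down the problem...
values = list(range(1, 101))
chunk_size = 25
chunks = [values[i:i + chunk_size] for i in range(0, len(values), chunk_size)]

# ...and each chunk goes to a different (human) computer; the partial
# tables are then merged back into the full table.
table = []
for chunk in chunks:
    table.extend(compute_chunk(chunk))
```

The division of labor works because each entry is independent of the others; the same property is what later made the work mechanizable.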
New definitions
(46)
During World War I, for example, many computers were employed on both
sides of the war to perform tasks related to ballistics, surveying,
navigation, and cartography. Also, because most of the men went to
war, this period marked an increase in women computers.
(47) Three
significant things happened in the years leading to the digital
computer that initially started differentiating between the words
computer and calculator, and subsequently completely changed their
meaning. First, a number of adding machines were becoming popular. .
. . Second, the art and science of human computation was being
professionalized. . . . Third, the progress in electronics, combined
with that in the theoretical field of computer science—which led to
the introduction of portable “scientific” calculators and digital
computers—ultimately changed the role of those employed in the
field and created new titles for both machine and person that were to
stick.
(47) From an etymological point of view, probably the most
noteworthy period was that circa 1930-1960, when the words computer
and calculator were beginning to take on a new sense, but had not yet
lost their old meaning.
Definition of computing as calculating in accordance with effective methods, machine doing so automatically in succession of operations with intermediate storage; then transition from computing machine to computer.
(47) As early as the 1920s, the term computing machine had been used
for any machine that did the work of a human computer, that is, that
calculated in accordance with effective methods. With the onset of
the first digital computers, this term gradually gave way to
computer.
(47) Computer, however, would refer to a
machine capable of carrying out automatically a succession of
operations of this kind and of storing the necessary intermediate
results.
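A tiny illustration of that definition, a succession of operations with stored intermediate results: Horner's rule for evaluating a polynomial, where each step reuses the intermediate held in `acc`. The function name and example are ours, not the article's:

```python
def evaluate_polynomial(coeffs, x):
    """Evaluate a polynomial (highest-degree coefficient first) at x."""
    acc = 0  # intermediate storage
    for c in coeffs:       # succession of operations, carried out automatically
        acc = acc * x + c  # each step reuses the stored intermediate result
    return acc

# 2x^2 + 3x + 1 at x = 4 → 45
```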
(47) The last generation of human computers retired in
about 1970, which probably explains the change in the dictionary
entries.
(47) In Germany, the computer continued to be called Rechner; in
France, it was referred to initially as calculateur and later as
ordinateur; and in Italy they called it calcolatore, a word that was
once reserved for the human computer.
(47-48)
Considering what modern computers are now capable of doing, the word
that describes them has, paradoxically, almost become a misnomer.
Aloisio, Mario. “The Calculation of Easter Day, and the Origin and Use of the Word Computer.” IEEE Annals of the History of Computing 26.3 (2004): 42-49. Academic Search Premier. Web. 6 Dec. 2012.