than their share of attention in history of mathematics courses, they are either ignored or grossly
misrepresented even in ‘historically’ minded calculus textbooks, where one or both feature, with
portraits, as icons and founders, as Shelley Costa has pointed out:
When writing of Newton and Leibniz, 20th-century authors of calculus textbooks tend to reduce their history to
method and notation while exalting them as insightful, majestic intellectual forebears, perpetuating a mathematical
mystique that rewards genius and ignores context. (Costa, n.d.)
While what seems important from a modern point of view is the easy access to powerful results,
equally significant at the time was the questionable legality of the procedure. At least since the time
of the Greeks, mathematics had rested its claims to certainty on rules of precision in reasoning.
The new methods treated these rules with a degree of indifference from which they have never
fully recovered. The major problem was the use (which was essential) of infinitely small quantities
or ‘infinitesimals’ which were either zero or not zero, depending on where you were in the
argument. It had, reasonably, been assumed that the Law of Contradiction (‘if something is X it
is not also not-X’) operated in mathematics as elsewhere. The following quote from the first (and
most important) early calculus textbook shows that its power was slipping:
Postulate 1. Grant that two quantities, whose difference is an infinitely small quantity, may be taken (or used)
indifferently for each other: or (which is the same thing) that a quantity, which is increased or decreased only by
an infinitely small quantity, may be considered as remaining the same. (L’Hôpital 1696, cited Fauvel and Gray,
extract 13.B.6)


The confusion underlying this extract was fundamental to the calculus, and was essential for its
development. To make it explicit, as Bishop Berkeley was to do 40 years later, if two quantities which
differ by an infinitesimal are the same, then what is the infinitesimal there for at all? More simply, if
dx (as Leibniz called it) is infinitesimal, it differs by an infinitesimal from 0, and so is equivalent
to 0. Is it then 0 or not?
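
To make the double standard concrete, here is the standard Leibnizian computation of the slope of the curve y = x², written out in modern notation (an illustrative reconstruction, not a quotation from the period texts):

\[
\frac{(x+dx)^2 - x^2}{dx} \;=\; \frac{2x\,dx + (dx)^2}{dx} \;=\; 2x + dx,
\qquad\text{and then}\qquad 2x + dx = 2x.
\]

In the first two equalities dx must not be zero, since one divides by it; in the last, Postulate 1 licenses treating 2x + dx as ‘remaining the same’ as 2x, so dx is discarded as if it were zero. The correct answer, 2x, is reached only by making both incompatible assumptions in turn.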
The point is, of course, that any mediocre person can break the laws of logic, and many
do. What Newton and Leibniz did was to formalize the breakage as a workable system of
calculation which both of them quickly came to see was immensely powerful, even if they
were not entirely clear about what they meant. The new methods built on Descartes’s geometry;
that geometry had raised a number of important questions about how to find tangents
to curves, their lengths, and their areas, and the calculus was to provide the means of finding
rapid solutions. Mathematics became, in a sudden transition, both easy and difficult; easy
for the circle of initiates who learned how to use the method, and difficult for the outsiders
who could understand neither what was being done nor how it was justified. ‘There goes a
man hath writt a book which neither he nor any body else understands’ remarked a sceptical
Cambridge undergraduate of Newton²; and the contradictory dogmas of the early calculus perhaps
mark the origin of a widening split between the world of the ‘serious’ mathematician and the
amateur.
Already it may be clear that this chapter differs from the preceding ones in covering a much
shorter period; instead of hundreds of years, we are dealing with the relatively short time which
separates the 1660s (when the calculus did not exist) from the 1720s (when it was on its way to
becoming the dominant method for answering a wide range of mathematical questions). Indeed,
appropriately for the science of the infinitely small, the historians often seem to be focusing

2. Cited Iliffe (1995, p. 174).
