RELATIVITY OF STANDARDS OF MATHEMATICAL RIGOR

From a broad standpoint, rigor in any field of endeavor,
and particularly in the scientific fields, means adherence
to procedures that have been generally
accepted as leading to correct conclusions. Thus the
statement, “It has been rigorously established that
Norsemen reached the shores of North America before
Columbus,” can be taken to mean that documentary,
archaeological, or other evidence has been produced
which conforms to the standards of acceptance set by
the group to which the statement is addressed. Such
a group might be a society of professional historians,
in which case the term “rigorously established” implies
that the evidence offered as a basis for the assertion
conforms to the standards set by professional historians;
for example, the evidence might be documentary
materials whose validity is considered acceptable by such
a group. If addressed to a group of anthropologists,
the evidence might consist of archaeological materials
whose authenticity meets the standards recognized as
acceptable by this group.

Moreover, it is possible that the statement is accept-
able to a group of historians, but not to professional
anthropologists; or even to one group of historians and
not to another. An example of the latter kind could
concern the validity of a certain alleged miracle, which
might be established quite rigorously according to the
standards of a group of church historians, but not to
those of a lay historical group.

Furthermore, rigor is a function not just of the group
involved, but also of time. Standards of rigor notoriously
change with the passage of time. What would have
been considered rigorous by scientists of the year 1800
would certainly not meet the standards set by the
professional scientists of 1900. On the other hand,
standards of rigor do not necessarily become more
stringent with time, since cultures rise and fall, and
standards set by one culture may be forgotten and have
to be recreated or replaced by succeeding cultures. The
classical case of this kind may be found in connection
with the decay of the Hellenic culture and the gradual
ascendancy of its successors.

It may be expected, too, that standards of rigor will
sometimes become the subject of profound discussion
amongst members of the group concerned. Examples of
this kind are frequently brought to public attention
in connection with the marketing of new drugs, where
the standards of rigor governing pretesting are fre-
quently and bitterly debated between the manufacturers'
chemists and those of government agencies. It is prob-
ably not generally realized that similar instances occur
even in mathematics, a field popularly known as the
“most exact” of the sciences; and in which no motives
of a pecuniary nature becloud the issues, as is often
the case when commercial interests are involved. A
classic story in mathematical circles relates that one
of the contemporaries of David Hilbert, late professor
of mathematics at the University of Göttingen, Ger-
many, exclaimed upon reading a short and elegant
proof that Hilbert had given, “This is theology, not
mathematics!”—indicating an opinion that the proof
did not conform to accepted mathematical standards.
And this same Hilbert, who was one of the leading
mathematicians of the first third of the twentieth cen-
tury, became engaged in a prolonged debate with the
famous Dutch mathematician, L. E. J. Brouwer, over
what constitutes rigorous methods of proof in mathe-
matics (see below). Such debates are not rare; they
have recurred throughout the
history of mathematics.


The development of the concept of rigor in mathe-
matics provides a most instructive and revealing story,
which can be told without becoming involved in eso-
teric technicalities and which has meaningful parallels
in other fields of learning. As one of the oldest sciences,
and especially one in which the concept of rigor has
achieved mature formulations, mathematics has tradi-
tionally been most concerned with standards of rigor;
and the stages through which mathematical rigor has
passed, with attention to cultural influences (internal
and external), give a superb example of the evolution
of a concept (rigor) which, in spite of the paucity of
ancient documents, can be observed virtually from its
inception to the present.

Presumably such a concept as rigor was at first only
intuitive, not a consciously realized ideal. The Sume-
rian-Babylonian mathematics was the earliest for which
historical records have been found, although it was not
a separate “discipline” such as it became in the later
European cultures. In it a number of mathematical
formulas and procedures which later became standard
were developed, as well as a system of numerals almost
as sophisticated as our present-day decimal system.
Methods for solving algebraic equations had also been
developed along with a number of geometric formulas.
Most surprising among the latter was the famous “Py-
thagorean theorem,” relating the square on the hypot-
enuse of a right triangle to the squares on the other
two sides—traditionally attributed to the Pythagorean
school which flourished over a millennium later in the
Greek culture. Such materials presumably imply the
development of some kind of standards according to
which these algebraic and geometric ideas became
admissible for those uses (usually commercial) to which
they were put. The nature of these standards is un-
known, but there is no evidence as yet available that
they were as advanced as the methods that developed
in the later Hellenic civilization. They were probably
of an intuitive, traditional nature, although they could
also have embraced certain pragmatic and diagram-
matic tests. For example, if an ancient Sumerian “Ein-
stein” were faced with a problem involving the deter-
mination of the quantity of material needed to erect
a certain structure, he might have found a formula for
the purpose. Then presumably this formula would not
have gained acceptance by his contemporaries without
his first convincing them in some way of its validity.
It can be surmised that this would have been accom-
plished merely by showing that the method
“worked”—that is, that it gave the desired amount of
material, or a reasonable approximation thereto. Or,
if the problem were of a geometric character, demon-
stration of the validity of the formula might have been
accomplished by certain visual methods consisting of
counting pebble arrangements, or of geometric pat-
terns displaying “obvious” properties. This is con-
jecture, of course; but since the earliest methods used
by their Greek successors consisted of just such tests
for validity, and since there were cultural contacts
between the later Babylonians and the early Greeks,
it seems a not improbable hypothesis (Neugebauer
[1957], Ch. II).

The course of Greek mathematics, thanks to the
extant traces of the unusual intellectual atmosphere in
which it developed, is somewhat less conjectural. Spe-
cifically, its development within a philosophical milieu
influential in both the Greek and succeeding cultures
resulted in the preservation of more important written
records. Moreover, the circumstances of its evolution
contain suggestions of the manner in which cultural
influences, both environmental and intrinsic, promoted
its development toward increased rigor. This first be-
comes noticeable in the Pythagorean school of the sixth
and fifth centuries B.C. The geographical location of
this school was Croton, in the southeastern section of
Italy. In nearby Elea, the Eleatic school of philosophy
was centered, and one of its foremost exponents,
Parmenides, was apparently associated for a time with
the Pythagorean school of mathematics. Usually, when
two cultural entities meet and mingle, diffusion of ideas
from each to the other occurs. In this case, the cultural
entities were the Pythagorean school of mathematics
and the Eleatic system of philosophy. The cosmological
system conceived by Parmenides was evidently influ-
enced by Pythagorean points of view; on the other
hand, the Pythagoreans could have become acquainted
with the dialectic of the Eleatics, one of whose features
was indirect argument (Szabó, 1964).

If such was the case, we have here one of the earliest
examples of concepts external to mathematics com-
bining with intrinsic mathematical needs to produce
a method promoting greater rigor of proof. Up to this
time, Pythagorean methods of proof had not advanced
much further than the primitive diagrammatic methods
termed “visual.” By arranging objects, such as pebbles,
in simple geometrical arrays, a number of elementary
formulas had been discovered by direct observation.
In other instances, the use of superposition—moving
one geometric configuration into coincidence with
another—was employed. Some have conjectured that
the first proofs of the Pythagorean Theorem were
accomplished in this way. While such methods were
well adapted to the discovery of simple arithmetic and
geometric facts, they were not as conclusive as the
deductive methods which came into use and which
were possibly influenced by adaptation of the Eleatic
dialectic to mathematical proofs. The previous primi-
tive methods could never have sufficed to prove certain
geometric facts which are completely inaccessible to
visual methods, as, for example, the existence of in-
commensurable line segments; i.e., line segments for
which there exists no common unit of measurement,
such as the side and diagonal of a square. We can
conjecture that the Pythagoreans began to suspect the
existence of incommensurable segments and realized
the inadequacy of their traditional proof methods. If
such was the case, there was thereby set up an intrinsic
motivation to find a more rigorous type of reasoning.
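
To make the contrast concrete, the kind of formula accessible to such visual, pebble-counting methods may be illustrated in modern notation (the symbols, and the particular example, are illustrative and not part of the original discussion). Arranging successive odd numbers of pebbles as nested L-shaped borders of a square makes it plain, for any array actually laid out, that

\[
1 + 3 + 5 + \cdots + (2n - 1) = n^2 .
\]

A figure of this sort renders the formula convincing case by case, but it cannot, even in principle, establish a negative assertion about all conceivable units of measure, such as the incommensurability just mentioned.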

It appears likely that the proof of incommensura-
bility of the side of a square with its diagonal was one
of the earliest, if not the earliest, to make appeal to
the dialectic method. For this proof, it was necessary
to show that there cannot exist any unit of length, no
matter how small, that will exactly measure both the
side of a square and its diagonal. A geometric fact of
this kind cannot be handled by visual methods, since
the stipulation “no matter how small” places it beyond
the range of human perception. However, if the as-
sumption that some sufficiently small unit of length
exactly measures both the side and the diagonal of a
square can be shown to lead to contradiction, then one
may conclude that such a unit of
length cannot possibly exist; i.e., that the side and
diagonal are incommensurable. (Later, the basis for this
type of argument was formulated by Aristotle in the
Law of Contradiction: “Contradiction is impossible”
or more explicitly, “No proposition can be both true
and false”; and the Law of the Excluded Middle:
“Every proposition is either true or false.” Thus, the
proposition that there exists a common unit of measure
for the side and diagonal of a square is either true or
false; and since its truth is untenable, having been
shown to imply contradiction, it must be false.)
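
In modern notation (an anachronistic convenience; the Greeks reasoned with ratios of magnitudes rather than with square roots), the argument just described runs roughly as follows. Suppose a common unit existed; then the diagonal d and side s of the square would satisfy

\[
\frac{d}{s} = \frac{p}{q} \quad \text{(in lowest terms)}, \qquad d^2 = 2s^2 \;\Longrightarrow\; p^2 = 2q^2 .
\]

Then $p^2$ is even, so $p$ is even, say $p = 2r$; substituting gives $4r^2 = 2q^2$, hence $q^2 = 2r^2$ and $q$ is even as well, contradicting the assumption that $p$ and $q$ have no common factor. Since the assumption of a common unit implies a contradiction, the Law of Contradiction, $\neg(P \wedge \neg P)$, and the Law of the Excluded Middle, $P \vee \neg P$, together yield the falsity of that assumption.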

Such arguments are called “indirect” forms of
proof—later called “reductio ad absurdum.” They
produce a conviction not attainable by visual argu-
ments, which are always open to the objection that
they cover only particular cases and may be the result
of illusory perceptions. Consequently they soon became
standard in Greek mathematics, not to be matched in
quality of rigor by visual methods. Indeed, it soon
became the rule that a mathematical formula or method
was no longer to be accepted merely because it always
seemed to work in particular cases (in Plato's dialogues,
Socrates frequently rejects a definition of a concept
like justice by enumeration of particular cases falling
under it, and demands an essential or universal prop-
erty); it had to be proved by a logical argument such
as one of the indirect type. Rigorous proof came to
be synonymous with proof by logic.

Although not all historians agree on the details of
the above interpretation of the available historical
literature, the evidence strongly implies that as a result
of a need internal to mathematics combined with the
existence of a philosophical dialectic in the culture
external to mathematics, greater rigor was achieved
in Greek mathematics. Moreover, this was possibly not
the only case in which mathematical rigor was in-
debted to influences external to mathematics. For ex-
ample, Zeno of Elea, a pupil of Parmenides, had been
led by his work on the extension of his master's philos-
ophy to a series of paradoxes which were ultimately
recognized to be of fundamental importance to mathe-
matics, in that they raised questions concerning the
continuous character of the straight line. For instance,
if the line is made up of points, and a point has no
length, then how can a line have length? Zeno also
argued that motion in a straight line would be impossi-
ble since an object could never get from one point
to another. Again, historians differ over whether
Zeno's work was in fact influential
in the development of ancient mathematical thought,
but it may have been in the effort to get around such
difficulties that mathematicians came to realize that
the vague intuitive conceptions on which geometry had
been based must be replaced by an explicit set of
assumptions which embodied the intuitive “facts” on
which proofs could be based. The fourth-century (B.C.)
mathematician Eudoxus was most prominently iden-
tified with this accomplishment, and it is generally
agreed that a considerable part of Euclid's Elements
stems directly from Eudoxus' work. In the Elements
the basic assumptions are called “axioms” and “postu-
lates,” and the proofs display the mature form of which
the indirect method was the first example. These
proofs, ultimately called proofs by logical deduction,
demonstrate that by “taking thought” alone, one can
establish the validity of an assertion covering infinitely
many special cases. Another type of reasoning, impor-
tant heuristically (as a method of discovery), by “anal-
ysis,” used the device of first assuming the truth of the
assertion to be proved in order to ascertain its conse-
quences; if these consequences consisted of basic as-
sumptions (axioms or postulates) or previously proved
assertions, then it was sometimes possible to reverse
the process by showing that the consequences had the
desired assertion as one of their consequences.
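
A simple illustration of “analysis” in this sense, using a modern algebraic example rather than a Greek geometric one, is the proof that the arithmetic mean of two nonnegative numbers is at least their geometric mean. Assuming the desired assertion,

\[
\frac{a+b}{2} \ge \sqrt{ab} \qquad (a, b \ge 0),
\]

and drawing out its consequences leads to $(a+b)^2 \ge 4ab$, that is, to $(a-b)^2 \ge 0$, which is already known. Since each step can be reversed (both sides being nonnegative when they are squared), the chain read backwards constitutes a proof, the “synthesis,” of the original assertion.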

The Greek philosopher Aristotle made a noteworthy
study of logical deduction, arriving at general frame-
works for the methods involved which were applicable
to all fields of study, not just to geometry. He proposed
a general definition of a demonstrative science which
became a model for centuries of later scientific work.
According to this definition, a demonstrative science
should consist of a collection of basic assumptions, and
of the theorems which these assumptions imply (Beth,
1959). The process of implication should utilize the
various forms of logical deduction set forth in Aris-
totle's study of argumentative methods.

So far as rigor is concerned, little further significant
progress was made until the nineteenth century, when
a combination of circumstances, bearing a curious
resemblance to those which seem to have brought
about an increase in rigor during the Greek era, came
into play. These circumstances developed in the fol-
lowing manner.

During the period which followed the Greek decline,
mathematics underwent an extensive development and
evolution in both symbolic and conceptual content. In
arithmetic, the remarkable number system of the
Sumerian-Babylonian culture evolved essentially into
the decimal system used today. Although the numerals
used by the Babylonians were cumbersome (due, per-
haps, to the necessity of adapting them to the
use of the stylus and baked clay media), their place
value system in which the “value” of a single digit
depended on its position (“place”) within the numeral
was the same as that used in the decimal system. (It
lacked a true zero, but this was clearly evolving by
the end of the Babylonian era.) However, the symbolic
algebra which we now use was a product of the later
European cultures. And (in the seventeenth century)
it was the imposition of this algebra on the geometry
bequeathed by the Greeks which resulted in analytic
geometry, and enabled Newton and Leibniz to crystal-
lize their ideas on the calculus. Although Newton and
Leibniz are popularly credited with creating the cal-
culus, what they essentially did was to synthesize, in
symbolic form, concepts that had been developed by
a host of predecessors going back to the Greeks (Boyer,
1949; Rosenthal, 1951; Bochner, 1966). This achieve-
ment was a breakthrough whose motivation lay at
least as much in the search for a medium in which to
express natural laws as in the desire to bolster the
purely symbolic aspects of mathematics: in short, in a
combination of cultural and intrinsic mathematical
stresses.
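
The place-value principle just mentioned can be illustrated with a transliterated example (an editorial convenience, not a reproduction of the cuneiform): the value of each digit depends on the power of the base attached to its position, so that

\[
1834 = 1\cdot 10^3 + 8\cdot 10^2 + 3\cdot 10 + 4, \qquad (1,\,25,\,30)_{60} = 1\cdot 60^2 + 25\cdot 60 + 30 = 5130 .
\]

The Babylonian scheme differs from the decimal one chiefly in its base of sixty and, for most of its history, in the lack of a symbol for an empty position.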

However, the success of the symbolic machinery set
up by Newton and Leibniz was so great that it went
beyond the conceptual background; symbols and oper-
ations with them were created for which no one could
give a satisfactory meaning, although results achieved
with them generally justified their invention. They
passed the pragmatic test but flunked the conceptual.
As a result, that vaunted rigor for which mathematics
had been praised from the time of the Greeks was now
lacking, and there ensued a field day for philosophical
critics (such as the renowned Bishop Berkeley, who
called Newton's infinitesimals “the ghosts of departed
quantities”), not to mention the uncomfortable feeling
that the mathematical defenders of the new calculus
could not escape.

Actually, this lack of conceptual justification was not
a new phenomenon in mathematics in those areas
where the conditions laid down by Aristotle for a
demonstrative science had not been met. Consider the
ordinary arithmetic of the integers, for example; no
satisfactory conceptual background had ever been fur-
nished for it. But this caused little concern and there
is little evidence that anyone was aware of the lack
until quite recent times. True, some qualms were occa-
sioned by the introduction of negative numbers, which
for centuries had been toyed with but rejected as “ficti-
tious,” even after their use became common in the
seventeenth century. The conceptual basis for the
nonnegative integers was purely intuitive, but they had
been in use for untold centuries and had achieved
cultural acceptability—that is, as meeting the demands
of the rigor of the day. But the extension to negative
numbers was purely formal—a symbolic achievement
embodying such operational rules as the laws of signs,
but otherwise having no conceptual justification.
Moreover, no axiomatic basis satisfying Aristotle's con-
ditions was given for them until the late nineteenth
and early twentieth centuries (Landau, 1951).
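
The operational rules referred to here are typified by the familiar laws of signs, stated below in modern notation as an illustration (the formulation is not taken from the text):

\[
(-a) + (-b) = -(a+b), \qquad (-a)\,b = -(ab), \qquad (-a)(-b) = ab .
\]

Nothing in the intuitive conception of counting dictates these rules; within the purely formal scheme their justification was simply that the ordinary laws of arithmetic should continue to hold.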

A similar situation prevailed concerning complex
numbers of the form a + bi (where i stands for the
“imaginary” √-1) encountered in elementary prob-
lems such as the solution of quadratic and cubic equa-
tions (a and b being “real” numbers). Arithmetical
operations with these numbers were success-
fully carried out for several centuries, although a satis-
factory conceptual background was not provided until
the twentieth century. Intuitive bases of a geometric
nature did develop for them much earlier, but by that
time geometry was coming to be no longer accepted
as a basis for numerical theories.
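
The operations in question can be stated in a line or two (a modern restatement, added for illustration): treating $i$ as a symbol subject only to the rule $i^2 = -1$, one computes

\[
(a+bi) + (c+di) = (a+c) + (b+d)i, \qquad (a+bi)(c+di) = (ac - bd) + (ad + bc)i .
\]

These rules were applied with complete success for generations before any account was offered of what, conceptually, the symbol $i$ might stand for.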

Thus the introduction of a new symbolic apparatus
like the calculus should “logically” not have caused
such concern, so long as it passed the pragmatic
test—which it certainly did. Of course it lacked the
long traditional background possessed by the natural
numbers, but this was also true, possibly to a lesser
extent, in the case of the negative integers and the
complex numbers. However, an idea had become
prominent which, although not strictly new in mathe-
matics, had nevertheless not caused much concern
since Eudoxus devised his theory of proportion. This
was the concept of the infinite. It intruded into all the
basic conceptions offered as an explanation of the new
calculus, and occurred in two opposing forms: the
“infinitely small” and the “infinitely great.” Attempts
at clarifying the basic concepts of the calculus, such
as that of the derivative of a function, made appeal
to the “infinitely small,” or “little zeroes,” and were
quite unconvincing (even, one sometimes suspects, to
those who devised them). And although the axiomatic
method of the Greeks enjoyed quite a vogue at the
time (Leibniz had used it in arguments of a political
and military nature), notably in social and philo-
sophical theories (as, for instance, by Spinoza), there
seems to have been little effort to use it as a means
for giving the calculus a firm basis.
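
For comparison, the formulation that eventually displaced the “little zeroes” (given here in its modern form as an illustration; it belongs to the line of development running through Cauchy and Weierstrass discussed below) defines the derivative as a limit,

\[
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},
\]

where the limit statement itself means: for every $\varepsilon > 0$ there is a $\delta > 0$ such that

\[
0 < |h| < \delta \;\Longrightarrow\; \left| \frac{f(x+h) - f(x)}{h} - f'(x) \right| < \varepsilon .
\]

No appeal is made to infinitely small quantities; only ordinary numbers, arbitrarily small but finite, appear.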

Although appeal to the axiomatic method had to
await the latter part of the nineteenth century, certain
notable contributions to the rigorous development of
the calculus were made earlier. Chief among these was
that of A. Cauchy, whose Cours d'analyse... (1821)
gave the basic ideas of the calculus a quite rigorous
treatment, making no appeal to such vague notions as
“little zeroes.” In other fields of mathematics, the
realization was growing that the axiomatic method
offered an acceptable path to greater rigor. This was
helped by the accompanying realization that the num-
ber systems which had achieved mathematical accept-
ance either through tradition or by special needs were
not the only ones that could be devised. Similarly, the
geometry of Euclid was not the only type of geometry
that provided a consistent description of physical
space. The result of such considerations was the incep-
tion of new algebras and geometries, all rigorously
defined by means of the axiomatic method in the
Aristotelian tradition. Although these developments
had many consequences, the one of greatest importance
for present purposes was the casting into prominence
of the problem of consistency. How could one be
assured that all these algebraic and geometric systems,
frequently mutually incompatible (as, e.g., Euclidean
and non-Euclidean geometries), were within them-
selves consistent systems? For certainly if a mathe-
matical system harbored contradiction, then it could
not have been rigorously developed. In this way, rigor
and consistency began to be associated; that which is
rigorously structured ought not to be inconsistent, and
systems that turn out to be consistent must ipso facto
be the result of rigorous formulation.

In contrast, the calculus was still based on fuzzy
notions of a number system which, in addition to the
ordinary integers and fractions, contained irrationals
such as √2, π, etc. This number system, known techni-
cally as the real number system, had grown as new
accretions were needed. With the introduction of ana-
lytic geometry, it had been given a more satisfactory
intuitive background through association with the
straight line. By selecting an arbitrary point P on some
fixed line L as a representative of zero, the points to
one side of P were associated with the positive real
numbers in increasing order, and those to the other
side with negative numbers, each negative number
being the same distance from P as its positive counter-
part on the other side of P (Figure 1). It became intui-
tively evident that to each point of L corresponded
a unique real number in this manner, and that in
problems of the calculus appeal could be made to this
linear structure, considered as equivalent to the system
of real numbers. Proofs were given which made use
of this geometric concept and it gradually became clear
that the amount of geometric intuition employed in
the proofs of theorems of the calculus was exceeding
the limits imposed by new standards of rigor. This was
made all the more evident by the fact that many of
the geometric facts used to substantiate numerical
statements were the same “facts” that had
seemed so evident to the Greeks that they had never
been adequately established in geometry. In short, they
had no firm basis either numerically or geometrically.

This unsatisfactory state of affairs became all the
more pronounced as the calculus gradually grew into
what is now termed classical analysis, which embodied
not only the advanced ideas owed to the successors
of Leibniz and Newton, but also a theory whose foun-
dation was the system of complex numbers geometri-
cally represented by the points of a Euclidean plane.
This growth of analysis was not just an internal evolu-
tion, influenced only by mathematical considerations,
but was in large measure due to the needs of physical
theories (Bochner, 1966). Of great importance was the
work of the French mathematical physicist Baron Joseph
Fourier (1768-1830), who was not a professional
“pure” mathematician—but was one who, in the opin-
ion of one historian (Bell [1945], p. 292), “had almost
a contempt for mathematics except as a drudge of the
sciences.” Being quite uninhibited by such qualms as
would have (and did—see Bell [1937], pp. 197-98)
beset a pure mathematician, he proceeded to set up
mathematical tools whose chief virtue was apparently
that they were suited to the needs of such studies as
the theory of heat. In particular they involved infinite
processes which had little rigorous foundation and
which stretched to its limits that geometric intuition
upon which mathematical analysts were wont to rely.

As so often happens, mathematicians found them-
selves confronted, much as in the case of the basic
notions of the calculus (which had by now been essen-
tially cleared up by Cauchy), with new symbolic and
operational apparatuses which could not be ignored.
It was not just that they seemed to prove their worth
in applications to physical theories—if this were their
only compensating feature, they might well have been
left to the whimsies of physics—but they rapidly
offered ways in which to treat purely mathematical
problems as well as suggestions for new concepts or
expansions of already existing concepts (such as that
of function). And to accept them meant, again, to find
a rigorous foundation for them.

Thus the growth of mathematical analysis brought
mathematics to face much the same types of problems
as had confronted the ancient Greeks and which were
solved by such innovations as Eudoxus' theory of pro-
portion. In particular, it was necessary to replace the
largely intuitive conception of the structure of the real
number system by a precisely formulated axiomatic
system which would serve as a satisfactory base upon
which to found analysis. Such a foundation would, one
hoped, settle once and for all whether the
types of series, and the functions related thereto, “worked”
in applications merely because of accidental circumstances, or
whether they could be shown, by logical deduction
from the new foundation, to be mathematically sound.

The solution of this problem was found, as
anyone familiar with the way in which mathematics
evolved would expect, by several independent investi-
gators (Méray, Dedekind, Weierstrass, G. Cantor).
Although their solutions were not precisely the same,
they turned out to be equivalent (in the sense that each
could be derived from the others). And one now re-
joiced in the feeling that the one apparently remaining
insecure part of mathematics had been given a secure
foundation; and mathematics could resume its course
presumably assured of having once again achieved a
rigor safe from all criticism.
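
An indication of what such a foundation looks like may be given by sketching one of the equivalent constructions mentioned, Dedekind's (the notation is modern and added for illustration). A real number is identified with a “cut” in the rationals: a set $A \subset \mathbb{Q}$ that is nonempty, is not all of $\mathbb{Q}$, contains with each member every smaller rational, and has no greatest member. For example,

\[
\sqrt{2} \;\longleftrightarrow\; A = \{\, r \in \mathbb{Q} : r \le 0 \ \text{or}\ r^2 < 2 \,\} .
\]

Order, addition, and multiplication of cuts are defined in terms of the rationals alone, and the completeness of the real number system becomes a provable theorem rather than an article of geometric faith.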

But this feeling of security was not to last long. As
usually happens when mathematics makes a great ad-
vance, new insights are achieved regarding concepts
that have long been taken for granted. A mathe-
matician of ancient Greece, for instance, knew per-
fectly well that a line joining a point exterior to a circle
to a point interior to the circle would have to intersect
the periphery of the circle; it was self-evident, and
needed no justification. Nevertheless, the time arrived
(during the nineteenth century) when it was forced
upon one that justification really was necessary if the
demands of modern rigor were to be met. Similarly,
one had no qualms in speaking of a “collection” of
numbers or geometric entities; for instance, no one
would object to speaking of the collection—the term
“set” is more in vogue today—of numbers that were
solutions of an algebraic equation. Correspondingly,
one might speak of the set of all even numbers, or
the set of all odd numbers. True, the latter sets each
contain infinitely many numbers, whereas the number
of solutions of an algebraic equation is finite. Never-
theless, the use of the words “set” and “collection”
was felt to be the same as their use in the physical
world. To speak of a collection of people or a set of
chairs is an ordinary usage of the natural language.
And although mathematics had become increasingly
symbolized over the centuries, employment of the
natural language (as in the statements of the axioms
and theorems of Euclidean geometry) continued to be
acceptable.

However, this apparently innocent use of the notion
of a collection turned out to be another case of a
concept borrowed from the general culture and put
to use in mathematics in ways never before dreamed
of. Not only was it used to define such a basic notion
as number (theretofore taken for granted, but whose
extension to numbers for infinite sets plainly demanded
definition), but it lay at the heart of the foundation
of mathematical analysis. It was inevitable that a study
of the concept for its own sake would become neces-
sary, and this was finally undertaken by the German
mathematician Georg Cantor during the latter half of
the nineteenth century. Symptomatic of the lack of
interest or concern generally felt by mathematicians
of his time, however, was the fact that most of Cantor's
contemporaries at first considered his researches
neither mathematically justified nor even “good”
mathematics. Some of his colleagues considered that
Cantor was transgressing the bounds of what could be
called “mathematics.” Fortunately Cantor persisted in
his researches, and not only did they lead to a full-
blown theory of great inherent interest, but its appli-
cations to such problems as those bequeathed by Four-
ier proved unexpectedly fruitful. By the end of the
century, his ideas were coming to be generally ac-
cepted, and the Theory of Sets was well on the way
to becoming an established mathematical discipline.

About the same time, the German mathematician
and logician G. Frege was turning his attention to the
problem of furnishing a rigorous foundation for the
arithmetic of integers. He was convinced that all
mathematics could be derived from logic and thus
rendered free of all criticism regarding its lack of rigor.
In undertaking to show this, he did not hesitate to use the notion
of set, which he apparently felt to be itself rooted in
logic. From a somewhat different point of view, both
Dedekind and the Italian logician Peano (ca. 1890)
gave an axiomatic foundation for the system of natural
numbers from which, again using set theory, the real
number system could be derived.
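
In a standard modern rendering (Peano's own symbolism and choice of starting point differ in detail), the axioms for the natural numbers may be stated thus:

\[
\begin{aligned}
&\text{(1) } 1 \text{ is a natural number;}\\
&\text{(2) every natural number } n \text{ has a successor } n';\\
&\text{(3) } 1 \text{ is not the successor of any natural number;}\\
&\text{(4) if } m' = n' \text{, then } m = n;\\
&\text{(5) a set of natural numbers that contains } 1 \text{, and contains } n' \text{ whenever it contains } n,\\
&\phantom{\text{(5) }}\text{contains all natural numbers.}
\end{aligned}
\]

The fifth axiom is the principle of mathematical induction, and it is here that the notion of set enters; from these axioms the integers, the rationals, and finally the real numbers can be constructed in succession.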

As a result of these researches, the mathematical
world came to consider, around the turn of the century,
that mathematics had at last been placed on a rigorous
foundation, and that all criticism of the foundations
of analysis had been met. Symptomatic of the general
feeling were the words of the renowned French math-
ematician, Henri Poincaré, in an address at the Inter-
national Congress of Mathematicians of 1900: “We
believe that we no longer appeal to intuition in our
reasoning.... Now, in analysis today, if we take the
pains to be rigorous, there are only syllogisms or ap-
peals to the intuition of pure number that could possi-
bly deceive us. It may be said that today absolute rigor
is attained” (Bell [1945], Ch. 13; also see Poincaré
[1946], pp. 210-22).

It is doubtful whether Poincaré could have been aware,
at the time he uttered these words, that contradiction
had already been discovered in the theory of sets
(communication between various national mathe-
matical groups was rather poor at that time). In the
unrestricted use of set-theoretic methods in the realm
of the infinite, contradiction had been, and was still
being, found.
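
The best known of these contradictions, though it is not named here, is the one Russell communicated to Frege in 1902: if the notion of set is unrestricted, one may form the set of all sets that are not members of themselves,

\[
R = \{\, x : x \notin x \,\},
\]

and then $R \in R$ holds if and only if $R \notin R$. Other paradoxes of the period, such as those of Burali-Forti and of Cantor himself, arose from attempting to treat the totality of all ordinal or cardinal numbers as a completed set.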

The earliest attempt to meet the situation was to
call again upon the axiomatic method for help. The
first set of axioms for the theory of sets was given in
1908 by the German mathematician Ernst Zermelo.
Thus the apparently innocent notion of set, universally
used in common discourse, and having come into
mathematics because of the use of the natural language,
became the central concept of a mature mathematical
theory, deserving of axiomatic foundation in the same
way that geometry had been axiomatized by the
Greeks. And much as the Greeks succeeded in avoiding
the difficulties posed by the discovery of incommensu-
rable magnitudes, so did the axiom system of Zermelo
promise to avoid the contradictions to which the un-
restricted notion of set had led. Unfortunately there
was no guarantee that it would suffice to avoid all
possible contradictions; that is, there appeared no way
of proving Zermelo's system consistent, even though
the axioms in themselves seemed to restrict the theory
of sets sufficiently to avoid contradiction. One could
no longer assert, consequently, that mathematics had
attained that absolute rigor which Poincaré had cited.
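
The character of the restriction can be indicated briefly (what follows is a rough paraphrase of one of Zermelo's axioms, not a statement of his full system): a property may be used to form a set only by separating out elements of a set already at hand,

\[
\{\, x \in A : \varphi(x) \,\},
\]

never the unrestricted totality $\{x : \varphi(x)\}$. If Russell's property $x \notin x$ is applied within a given set $A$, the result is only the harmless conclusion that the set so formed is not a member of $A$, and the paradox does not arise.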

Concurrently with the axiomatization of the theory
of sets, other approaches were made to the problem
of giving mathematics a rigorous foundation, and for
a time three distinct “schools of thought” emerged
(Wilder [1965], Chs. 8-11). One of these, associated
with the name of Bertrand Russell, but actually pre-
sented in the monumental Principia Mathematica
(1910-13) of A. N. Whitehead and B. Russell, followed
a path based conceptually on Frege's ideas and sym-
bolically upon Peano's work. The central thesis of the
Whitehead-Russell doctrine was again that mathe-
matics could be founded on logic. But it developed
that in order to build a secure theory of number, free
from contradiction, axioms had to be introduced which
had not only never been part of classical logic, but
were obviously framed solely to suit the needs of
mathematics. Moreover, they did not have the charac-
ter of universality that one might expect of an axiom
of logic, but were clearly manufactured to meet a
special situation. Consequently, although the White-
head-Russell “school” acquired a sizable following for
a time, it had only a limited life. Nevertheless, the
central theme—that mathematics is derivable from
logic—persisted, and the Principia Mathematica has
continued to be a source of both inspiration and sym-
bolic modes for workers in the foundations of mathe-
matics and logic.

In particular, the so-called “Formalist School,”
starting under the leadership of the great German
mathematician David Hilbert, adopted a symbolism
obviously inspired by that of the Principia Mathematica.
However, the motivating philosophy of this
school was not that mathematics is derivable from
logic, but that all of mathematics could be formulated
in a symbolic framework which, although formally
meaningless, could be interpreted by mathematical
concepts and shown to be consistent. More specifically,
it was Hilbert's idea to set up certain axioms using
symbols alone and no words from the natural language,
along with a set of rules which, although not an in-
trinsic part of the symbolic system and couched in the
natural language, would specify how theorems could
be derived from the axioms. The object of this program
was to show that a symbolic system could be set up
which would, when interpreted by mathematical con-
cepts, give all of mathematics, and which could be
shown never to yield the formula for a contra-
diction. In this way, it was hoped that absolute rigor
could be established.
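
The flavor of the program may be conveyed by a toy example (purely illustrative; Hilbert's actual systems were incomparably richer). Take as symbols the two marks $\ast$ and $\vert$; as the single axiom the string $\ast\vert$; and as the single rule: from any string already derived, derive the string obtained by appending one further mark $\vert$. The “theorems” are then

\[
\ast\vert, \quad \ast\vert\vert, \quad \ast\vert\vert\vert, \quad \ldots
\]

Within the system these strings mean nothing; interpreted, they can be read as the numbers 1, 2, 3,.... A statement about the system, for instance that no derivable string contains the mark $\ast$ twice, can be proved by elementary reasoning about the rule itself. It was by “metamathematical” reasoning of just this elementary kind that Hilbert hoped to show that a formal system adequate for all of mathematics could never yield the formula expressing a contradiction.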

Meanwhile a distinctly different and radical ap-
proach to the problem of rigor was being promoted
by the Dutch mathematician L. E. J. Brouwer (who
was influenced by the ideas of the nineteenth-century
mathematician L. Kronecker). Brouwer maintained
that mathematical concepts are intuitively given and
that language and symbolism are necessary only for
communicating these concepts. The intuition basic to
mathematics, according to Brouwer, is that of stepwise
progression as in the passage of time, conceived as one
instant following another; for mathematics, the basic
intuition gives the sequence of natural numbers: 1, 2,
3,.... All mathematics must be constructed on the
basis of this sequence. In particular, “existence” of a
mathematical concept depends upon such a con-
struction; appeal to the logical Law of the Excluded
Middle to prove the existence of a mathematical entity
(by showing that the assumption of its nonexistence
leads to contradiction) is not acceptable, for example.
Brouwer called the resulting philosophy “Intuitionism.”
According to its tenets the complete set of real
numbers does not exist, since it cannot be built up from
the natural numbers without using certain axioms of
the theory of sets which are not constructive and hence
are not admissible to the Intuitionist. The contra-
dictions encountered in the “orthodox” mathematics
are due not to the use of the infinite per se, but to
the “unjustified” extension of the laws of logic from
the finite to the infinite. By using constructive methods
only, these contradictions are avoided.
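
A standard illustration of the kind of argument the Intuitionist rejects (the example is a modern one, not drawn from Brouwer) is the proof that some irrational number raised to an irrational power is rational. Consider

\[
\alpha = \sqrt{2}^{\sqrt{2}} .
\]

By the Law of the Excluded Middle, $\alpha$ is either rational or irrational. If rational, the pair $(\sqrt{2}, \sqrt{2})$ already furnishes an example; if irrational, then

\[
\alpha^{\sqrt{2}} = \sqrt{2}^{\sqrt{2}\cdot\sqrt{2}} = \sqrt{2}^{\,2} = 2,
\]

and the pair $(\alpha, \sqrt{2})$ furnishes one. The argument shows that an example exists without deciding which of the two pairs actually works; it is precisely such a nonconstructive “existence” that Intuitionism declines to accept.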

While the Intuitionists' contention that their methods
yielded an absolutely rigorous mathematics was appar-
ently correct, unfortunately only a portion of the
mathematics which had been built up during the pre-
ceding three centuries was attainable by these methods.
Acceptance of the Intuitionist path to absolute rigor
meant, then, giving up concepts which had not only
proved their usefulness but had become firmly im-
bedded in the culture. It is not surprising, therefore,
that Intuitionism attracted few converts, and that the
major part of the mathematical community looked for
another way out of the difficulties posed by the contra-
dictions.

Later attempts to establish an absolutely rigorous
mathematics, employing chiefly the methods of formal
axiomatics, have revealed that such a concept as abso-
lute rigor is apparently an ideal toward which to strive,
but one that is in practice unattainable except in cer-
tain limited domains. It is in much the same category
as such an intuitively conceived abstraction as abso-
lutely perfect linear measurement, which remains beyond
reach no matter how much more precise measuring
instruments become. This does not imply that
certain restricted portions of mathematics are not rig-
orously founded; quite the contrary. It applies chiefly
to those parts of mathematics in which the (infinite)
theory of sets is employed. Moreover, logic itself has
been revealed as only an intuitive cultural construct
which gives rise to the same kind of problems and
variations as mathematics when subjected to formal
symbolic analysis (Beth, 1959).

In the natural sciences such as physics, chemistry,
and zoology, at least in their experimental aspects, the
amount of rigor attainable is dependent upon technical
factors such as measuring devices, and will increase
as the related technology becomes more precise. Simi-
lar conclusions hold in the social sciences. Both cate-
gories of science—natural and social—tend toward
greater mathematization as they develop; and so long
as the portions of mathematics which they employ can
be shown to be rigorous, they will not be affected by
the types of difficulty still encountered in the parts of
mathematics dependent upon general set theory.

It must be recognized, too, that a sizable group of
mathematicians of Platonistic persuasion take the view
that mathematics simply has not yet advanced far
enough to be able to cope with such vexing questions
as arise in modern set theory; that the “truth” con-
cerning these is still a matter for investigation and that
their rigorous solutions are still attainable. The situa-
tion is much like that of a natural scientist who believes
that the “laws” of nature as presently formulated are
only an approximation to the true situation. Whether
this “true” situation will ever be dis-
covered, or even whether it can be formulated in
linguistic or mathematical terms if it does exist, he
cannot say. Similarly, the mathematician who feels that
rigorous mathematical truth does exist must admit, in
the present state of knowledge, that it may never be
possible to attain it.

BIBLIOGRAPHY

E. T. Bell, The Development of Mathematics (New York, 1945); idem, Men of Mathematics (New York, 1937).
E. W. Beth, The Foundations of Mathematics (Amsterdam, 1959).
S. Bochner, The Role of Mathematics in the Rise of Science (Princeton, 1966).
C. B. Boyer, The History of the Calculus and its Conceptual Development (New York, 1949).
E. G. H. Landau, Foundations of Analysis (New York, 1951).
O. Neugebauer, The Exact Sciences in Antiquity (Providence, 1957).
H. Poincaré, The Foundations of Science (Lancaster, Pa., 1946).
A. Rosenthal, “The History of Calculus,” American Mathematical Monthly, 58 (1951), 75-86.
A. Szabó, “The Transformation of Mathematics into Deductive Science and the Beginnings of its Foundation on Definitions and Axioms,” Scripta Mathematica, 27 (1964), Part I, 24-48A; Part II, 113-39.
R. L. Wilder, Introduction to the Foundations of Mathematics, 2nd ed. (New York, 1965).

RAYMOND L. WILDER

[See also Axiomatization; Continuity; Infinity; Number;
Pythagorean...; Relativity.]