
BIOLOGICAL MODELS

The making of models, so much a part of all the natural
sciences, and increasingly of the social sciences, has
been a central feature of the development of biology.
Indeed, modern biology springs from that ur-model,
the bête-machine, described in 1637 by René Descartes:

If there were a machine that had the organs and the external features of a monkey, or some other dumb animal, we would have no way at all of knowing that it was not, in every aspect, of the very same nature as those animals (Discours de la méthode, Part V).

For Descartes, even man would be indistinguishable
from such an automaton if it were not for his power
of communicating and apprehending complex thoughts
by means of speech. Even this exception to the machine
model of living organisms was challenged early in the
history of biology with the publication of La Mettrie's
L'Homme machine (1747).

The machine-animal, and its extension the machine-
man, are more than simply examples of model and
metaphor in biology; they are at the basis of all model-
making, for they are a statement of an underlying
relation between effects and causes in living organisms.
It was precisely the element of will and the infinite
variety of personal and idiosyncratic response to exter-
nal conditions that led Descartes to exempt man from
the constraints of such a model. Animal behavior
seemed to him stereotyped, without variety, and totally
predictable from external conditions, so that for beasts
the relation between cause and effect was unbroken.
Not so for man, whose essential nature lay in his free
will. It seems reasonable, on the other hand, that it
was La Mettrie's Jansenist training, with its heretical
denial of free will, that made possible his inclusion of
man in the realm of automata. As we shall see, the
Cartesian view of cause and effect, while an integral
aspect of model-making for a large part of modern
biology, has been replaced, for some problems, by a
weaker form of relationship among events. These are
the so-called “stochastic” theories in which the rela-
tionships between events are described by probability
statements rather than by exact one-for-one corre-
spondence. The influence of Cartesianism is very
strong, however, and stochastic theories are admitted
only slowly, grudgingly, and with a certain condescen-
sion toward those “inexact” fields of biology that seem
to require them. As in nineteenth-century physics, some
feel that uncertainty in a system is a reflection of
epistemological rather than ontological properties so
that with the aid of a “Laplace's demon” it would be
possible to reformulate stochastic theories in completely deterministic form. Thus, after 300 years,
Descartes' original metaphor maintains its powerful
influence on model-making in biology.

WHAT IS A MODEL?

Biologists, like other scientists, use the notion of
model in a host of ways. At one extreme they may
mean a scaled-up, three-dimensional wire and plastic
representation of a cell. At the other, they may mean
an abstract structure such as a “mathematical model”
of evolution which may consist only of a set of differ-
ential equations. In what sense can the wire and plastic
cell be said to be the same kind of structure as a set
of differential equations? And in what sense are they
like a mouse, which is said to be a “model organism”
for the study of certain physiological or genetical
problems?

The similarities of these models can best be under-
stood by beginning with the most abstract. The basic
theory of evolutionary genetics is well worked out,
based on a detailed knowledge of the mechanics of
inheritance and certain empirical information about
the biology of reproduction and survival of a variety
of organisms. This theoretical superstructure is suffi-
ciently complex that quantitative predictions of the
outcome of evolutionary changes cannot be made by
inspection. In order to find out what is entailed by the
theory in any particular case, a model is built. First,
the theory is abstracted and is framed in terms of a
logical flow diagram involving dummy variables and
logical relations among these variables. Then this logi-
cal structure is realized by a program for a computer,
or a set of matrices and matrix operators, or a system
of difference or differential equations. All these real-
izations are isomorphic with each other and with the
original logical structure.
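
As a concrete illustration, the sketch below realizes one such logical structure, the classical one-locus, two-allele selection recurrence of evolutionary genetics, as a difference equation in a short program. The fitness values and starting frequency are illustrative assumptions, not data from any particular study.

```python
def next_gen(p, w11, w12, w22):
    """One generation of viability selection on allele A at frequency p."""
    q = 1.0 - p
    w_bar = p*p*w11 + 2*p*q*w12 + q*q*w22   # mean fitness of the population
    return (p*p*w11 + p*q*w12) / w_bar      # frequency of A next generation

p = 0.01                                    # assumed starting frequency of A
for _ in range(100):
    p = next_gen(p, w11=1.0, w12=0.9, w22=0.8)
print(f"frequency of A after 100 generations: {p:.3f}")
```

The same recurrence could equally be realized as a matrix iteration or an analogue circuit; all such realizations are isomorphic with the abstract structure, which is the point of the passage above.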

A second possibility is that a series of resistors, ca-
pacitors, and other electric and electronic devices is
used to make a physical model such that these electri-
cal elements produce a set of electrical quantities
(current, voltages) that behave isomorphically with the
dummy variables of the abstract system. Alternatively,
a “model organism” may be employed, like the fruit
fly, Drosophila, which is thought to be a specific real-
ization of the same general principles as are expressed
in the abstract representation of the original theory.
In fact, with a physical analogue as complex as a
“model organism,” the explicit construction of the
abstract system that served as the pattern for the
mathematical realizations may never be made. Rather,
the model organism is assumed to embody the general
properties of the system being modelled and, in fact,
the general theory of evolutionary genetics supposes
that all organisms embody certain general relations
which are the subject of investigation. A model orga-
nism can then be used, and often is, in the absence
of a well worked-out theory of a biological process
on the assumption that general biological similarities
between the systems will guarantee isomorphism with
respect to the specific aspects under investigation.

The differences between the mathematical model,
the electronic model, and the model organism as real-
izations of the underlying abstractions, are of great
importance. The physical entities in the latter two
kinds of models carry with them certain intrinsic prop-
erties that are different from those in the original being
modelled. That is, these physical realizations are meta-
phorical, and their iconic elements can be a source of
serious difficulty. The physical realizations were chosen
because some set of their properties was isomorphic
with some set of properties of the original system or
theory. In the case of the electronic analogue, the
theory of capacitors, resistors, and vacuum tubes is so
well understood and the empirical properties of these
objects are so different from the system being modelled
that there is no danger of confusion from the meta-
phorical elements. That vacuum tubes glow, get hot,
break when jarred, make a tinkling sound when
knocked together, will in no way confound the biolo-
gist since he is unlikely to confuse these properties with
the properties of evolving organisms. In the case of
model organisms, however, the danger is very great,
because the metaphorical elements introduced by these
organisms are in some cases so subtly different from
the properties being modelled, that they cannot be
distinguished, yet they produce great distortions.

Moreover, since such metaphors are often introduced
without an explicit laying out of the abstract system
of which the model should be a realization, there is
no clear way of differentiating relevant from irrelevant
from obfuscating properties of the model. For example,
Drosophila was used for a long time as the model for
the genetic mechanism of sex determination in man,
because of a general similarity of genetic mechanisms
between flies and man. But this has turned out to be
completely wrong, and the conclusions that arose from
this model were erroneous. This danger does not arise
only from using complex organisms as realizations. The
“digital computer” model of the central nervous system
has been one of the most misleading and possibly
harmful examples of allowing metaphorical properties
to intrude. In this case such properties as redundancy
checks, topological relationship between physical ele-
ments and conceptual elements, and the bit structure
of information characteristic of electronic digital com-
puters, although all metaphorical, were taken to be
isomorphic with the elements of the central nervous
system, whereas they certainly are not. It is for this
reason that an explicit abstraction of the original sys-
tem, followed by a realization of that abstraction in
either an abstract or physical form is much preferable
to modelling by generalized and “felt” analogy.

There is some confusion in biology between “models
of” and “models for.” The isomorphisms with particu-
lar biological systems are “models of.” But models in
the sense of ideals or patterns, like the “model of a
modern major general,” are also found in the biological
literature. The essential difference is in their epistemo-
logical status. “Models of,” especially when they are
abstract, are not intended as contingent. They are ana-
lytic isomorphs of some phenomenon or system. They
may be good or bad models as they are perfectly or
imperfectly isomorphic, but they cannot be said to be
true or false. On the other hand, “models for” like the
logistic model of population growth or the Lotka-
Volterra model of species competition, or the gradient
model of development, are taken as statements about
the real world, as contingent, and are in fact assertions
about the way organisms really behave. Sometimes
such models (patterns) are introduced as examples of
how nature might be, but they very soon become reified
by biologists. Such “models” are most common in those
branches of biology where there is little or no theoret-
ical basis. In these cases it is quite proper to speak
of “testing the model” since it is not really a model
but a hypothesis. Unfortunately, confusion between
these two senses of “model” sometimes leads to an
attempt to test a “model of,” which always results in
a vacuous “confirmation” of the model. Since the
model is analytic, it must be, and always is, confirmed.
Some such “tests” of analytic models go on in popula-
tion biology, where a model organism is placed under
extremely well controlled experimental conditions so
that it realizes a mathematical structure that has been
solved. If the mathematics has been done competently,
the model, which is a living computer, confirms it. But,
of course, no “test” (except of the competence of the
investigator) has been performed.

FUNCTIONS OF MODELS

Model-making in biology serves much the same set
of functions as in any science. The special complexity
of many biological systems and the apparent diversity
of biological phenomena place a rather different
weight on various functions of modelling in biology
than in the physical or social sciences.

1. Models offer experimental convenience. Fruit
flies are easier to breed than man, and the giant axon
of the squid is easy to manipulate in neurophysiological
experiments. The very great delicacy of many biologi-
cal materials and especially the idiosyncrasies of each
species, make the search for “model organisms” in
which particular features are convenient for investi-
gation one of the outstanding features of research
strategy. Most important advances in biology depend
upon finding just the right “model organism” or “model
system” for investigation. The best known example is
the dependence on the use of bacteriophage for the
development of molecular genetics (Cairns, Stent, and
Watson, 1966). This case also shows how unique aspects
of the model system itself, its metaphorical content,
can distract attention from the central features of the
realization. A great deal of the research on bacte-
riophage is now concerned with the peculiar properties
of this parasite interacting with its host, properties that
are irrelevant or even misleading for general genetical
problems. At the present time a determined search is
under way for a model organism for the study of the
molecular and micro-anatomical basis of central nerv-
ous system function, based on an explicit list of desira-
ble model properties.

2. Second in importance for models in biology is
the function of computation. An increasing number of
biological theories are framed in terms of numerically
quantified variables. Even when the variables are
themselves qualitative (“on” vs. “off” in nerve firings,
“male” vs. “female,” “gene A” vs. “gene a”), many
theories are probabilistic and provide only probability
distributions for the states, even for fixed inputs. The
computation of such probability distributions cannot
be carried out by observing nature since only single
realizations of particular input sets exist. It is not pos-
sible, for example, to build and check a quantitative
theory of population extinction by observing actual
populations becoming extinct. Extinction is reasonably
rare in nature and no two populations have the same
starting conditions or environments. A theory of extinc-
tion is expressed in terms of a very large number of
variables including certain stochastic inputs from the
physical environment. Such theories are modelled by
analogue or digital computer programs in which there
is no metaphorical element and the model is isomor-
phic, with stochastic variables introduced to provide
an empirical probability distribution of results. An
alternative has been to create large numbers of con-
trolled populations in the laboratory or on islands.
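The following sketch shows the kind of computation described, under admittedly toy assumptions: a population whose members each leave a random number of offspring is run to a fixed horizon many times, and the empirical frequency of extinction is read off. The offspring distribution, starting size, and horizon are all assumed for illustration.

```python
import random

def extinct_within(generations, n0, offspring_dist, rng):
    """True if a population starting at n0 individuals dies out in time."""
    n = n0
    for _ in range(generations):
        n = sum(rng.choice(offspring_dist) for _ in range(n))
        if n == 0:
            return True
    return False

rng = random.Random(42)              # seeded: a pseudo-random, analytic rule
dist = [0, 0, 1, 2]                  # assumed offspring numbers (mean 0.75)
trials = 1000
hits = sum(extinct_within(50, 10, dist, rng) for _ in range(trials))
print(f"empirical extinction probability: {hits / trials:.2f}")
```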

3. Of lesser importance to biology is the function
of reification. In physics, inferred entities like electrons
and constructs like the photon are reified in macro-
scopic models, presumably because one cannot “un-
derstand” or “comprehend” them otherwise. Most of
the entities of biology are either macroscopic or can
be visualized with optical devices. As molecular biol-
ogy has grown, however, with its concept of gene as
molecule and with its preoccupation with the mechan-
ical interactions between molecules, there has been an
increase in macroscopic modelling. Molecular models
of metal or rubber in which atoms and chemical bonds
are represented by three-dimensional objects with ex-
actly cut angles and shapes are now common in biol-
ogy. Most of these models are built in order to “have
a look at” a complex biological molecule because it
is felt that somehow its three-dimensional structure will
provide some intuition about its function. Closely re-
lated to this kind of comprehension is a form of weak
hypothesis testing that accompanies reification. When
J. D. Watson and F. H. C. Crick were investigating
the molecular structure of DNA, they built metal real-
izations of their hypothetical structures, based on nu-
merical information from X-ray crystallography. A
number of those did not fit together too well (one is
described as a “particularly repulsive back-bone
model”) while the final, correct solution looked right
(“Maurice [Wilkins] needed but a minute's look at the
model to like it”). The fact that many of the structures
seemed strained and tortured while the one model had
an elegant and easy-fitting look was important in
reaching the final conclusion about the correct molec-
ular configuration (Watson, 1968).

4. Slowly, biological model-making is coming to
serve a function of unification. Biology has been
marked in the past by particularism, by the notion that
most generalizations are only trivially true, and that
what is truly interesting and unique about biological
systems is their variety and uniqueness, arising from
their complexity. Even a great generalization like
Darwinism allows for such a vast variety of forms of
natural selection and variation, that evolutionists for
many years concentrated on individual patterns of
evolutionary change. This has been even truer of ecol-
ogy, which has remained anecdotal and particularist
in the extreme. Abstract models usually framed in
logical and mathematical terms with little or no iconic
element, have come into use in an attempt to unify
large areas of biological investigation. Computer simu-
lations especially have shown that a model involving
only a few genes and a fairly simple set of assumptions
about the environment will predict a great variety of
possible evolutionary outcomes depending upon the
initial state of the population that is evolving. Models
of coupled harmonic oscillators appear to be predictive
of events in the central nervous system, embryonic
development, and physiological rhythms, and may in-
dicate an underlying general mechanism for all these
diverse phenomena.
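
A minimal sketch of such a model follows, using coupled phase oscillators (a common simplification of coupled harmonic oscillators) with assumed frequencies and coupling strength. When the frequency difference is small relative to the coupling, the pair locks into a fixed phase relation, the sort of entrainment invoked for physiological rhythms.

```python
import math

w1, w2 = 1.00, 1.15     # assumed natural frequencies
K, dt = 0.30, 0.01      # assumed coupling strength; Euler time step
th1, th2 = 0.0, 2.0     # arbitrary initial phases

for _ in range(200_000):                  # integrate the coupled equations
    d = th2 - th1
    th1 += (w1 + K * math.sin(d)) * dt
    th2 += (w2 - K * math.sin(d)) * dt

# When |w2 - w1| <= 2K the pair phase-locks; the sine of the locked
# phase difference approaches (w2 - w1) / (2K) = 0.25 here.
print(f"sin(phase difference) = {math.sin(th2 - th1):.3f}")
```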

5. Unification of diverse phenomena can be accom-
plished by increasing the complexity of models. A suffi-
ciently complex model will be homomorphic with
(structurally similar to) a vast variety of phenomena,
but trivially so. But models in biology are increasingly
being used for simplification as well. A new technique
in the investigation of problems in community ecology
is to make a series of models with fewer and fewer
entities, variables, and syntactical rules in an attempt
to find the “simplest” model that will give a satisfactory
account of the observations. While this seems a com-
monplace description of how science in general is done,
it has not been true of community ecology in the past.
The explicit program of R. MacArthur and R. Levins
(1967) to express the ecological niche in terms of a
very small number of abstract dimensions sufficient to
explain the numbers of organisms of different species
coexisting, is a radical departure in ecology and one
not universally approved. The opposite approach, that
of “systems analysis” (Watt, 1968) is to build a model
so complex that it approaches as closely as possible
an isomorphism (one to one correspondence) with the
natural systems. In part, this difference in approach
reflects a difference in intent. The systems analytic
model is designed for the control of particular pest
organisms, or the management of wildlife. As such it
is concerned with a particular organism in a particular
circumstance. The minimal model is esteemed chiefly
for its elegance and is viewed as a part of the general
program of natural science—to explain as much as
possible by as little as possible.

DETERMINISTIC AND STOCHASTIC MODELS

Descartes' machine has been the meta-model on
which most biological models have been patterned. It
is a clockwork machine designed so that a fixed input
will result in a fixed output. The input-output relation
is one-one or many-one, but never one-many. A per-
turbation at any point in the structure results in an
exactly predictable response (including no response) at
every other part of the structure. This meta-model is
widely accepted in biology, above all in molecular biology. The
program of molecular biology is identical with the
program described by Descartes in Part V of the Dis-
course on Method.
The present description of the action
of genes in controlling protein synthesis, which is the
core of molecular biology, is isomorphic with the de-
scription of an automobile factory including quality
control, inventory control, assembly lines, and the like.
There is even a conscious exclusion of ambiguities
when they appear in experimental data, because of the
a priori certainty that the correct model is a Cartesian
one. For example, the present picture of the action
of genes is that each triplet of bases in the DNA mole-
cule specifies a particular amino acid to be added to
the protein. The experimental results that established
the correspondence between triplet and amino acid
showed that a particular triplet could cause the incor-
poration of several amino acids in vitro, though one more
frequently than the others. It was assumed that this ambiguity was
an experimental artifact and that genes are more exact
than chemists. In general, molecular biologists deal
with all-or-none concepts, with switchings-on and
switchings-off of genes, with repression and de-repres-
sion. While the methods and data of this kind of biology
are quantitative and continuous, the interpretations are
qualitative and discrete. It is not accidental that statis-
tics and probability, the fitting of curves, and the
estimation of parameters, are not part of the apparatus
of modern molecular biology.
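
The many-one character of this mapping is easy to exhibit: several triplets specify the same amino acid, but no triplet specifies more than one. The fragment of the standard genetic code below is merely illustrative.

```python
# Fragment of the standard genetic code, DNA coding-strand triplets.
CODON_TABLE = {
    "TTT": "Phe", "TTC": "Phe",            # two codons, one amino acid
    "TTA": "Leu", "TTG": "Leu",
    "GCT": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",
}

def translate(dna):
    """Map a DNA sequence to amino acids, three bases at a time."""
    return [CODON_TABLE[dna[i:i + 3]] for i in range(0, len(dna), 3)]

print(translate("TTTTTAGCG"))   # ['Phe', 'Leu', 'Ala']
```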

A quite different world view is reflected in the
models constructed for the analysis of population phe-
nomena like evolution and ecology. The models are
Laplacean rather than Cartesian. Chance elements are
built into the models, although this does not imply that
the phenomena being modelled are themselves really
uncertain. That is, most biologists adhere to a deter-
ministic view of phenomena but assume that a large
number of small deterministic effects that cannot and
need not be analyzed, give an apparent indeterminacy
to phenomena at a higher level.

In stochastic or probabilistic models the correlative
statements connecting variables are of the form “If X
takes the value x then Y takes the value y with proba-
bility f(y|x).” That is, each rule becomes a table of
probabilities. of y given x, or a function for generating
those probabilities. Realizations of such models then
require the generation of “random” sequences, or, more
usually, pseudo-random sequences, which are indistin-
guishable from randomness by nearly any criterion, yet
are produced by an analytic rule.
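
A minimal sketch of such a realization, with an assumed two-state system and an assumed probability table: the conditional distribution f(y|x) is stored as a table and sampled with a seeded pseudo-random generator, itself an analytic rule of the kind just described.

```python
import random

F = {  # f(y|x): for each state x, the probability of each next state y
    "on":  {"on": 0.8, "off": 0.2},
    "off": {"on": 0.3, "off": 0.7},
}

rng = random.Random(1)   # seeded: an analytic rule whose output is
                         # practically indistinguishable from randomness

def step(x):
    """Draw the next state y with probability f(y|x)."""
    states = list(F[x])
    weights = [F[x][y] for y in states]
    return rng.choices(states, weights=weights)[0]

x = "off"
history = []
for _ in range(10):
    x = step(x)
    history.append(x)
print(history)
```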

Such stochastic models can then be set in motion
over and over again to produce empirically an array
of outcomes for any given input, since the random
sequence generator never repeats itself. For example,
in primitive populations, marriages are contracted
according to certain age, clan, and relationship prefer-
ences. These rules are not rigid, however, but can be
expressed as probabilities. If a village has a small pop-
ulation size, the subsequent history of marriages, births,
and deaths cannot be exactly given but various histories
may occur with different probabilities. It is relatively
simple to simulate the history of such a village in a
computer program and then to run hypothetical village
histories over and over again from which an array of
growth rates, pedigrees, total birth and death rates will
be produced. This array is a picture of the probability
distribution of outcomes predicted by the theory.
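
A sketch of such a village simulation follows, with assumed yearly birth and death probabilities and an assumed starting size, run repeatedly to produce the array of outcomes; none of the rates is taken from any real population.

```python
import random

def village_history(years, n0, birth_p, death_p, rng):
    """One possible demographic history; returns the final population."""
    n = n0
    for _ in range(years):
        births = sum(rng.random() < birth_p for _ in range(n // 2))
        deaths = sum(rng.random() < death_p for _ in range(n))
        n = max(n + births - deaths, 0)
    return n

rng = random.Random(7)
sizes = [village_history(100, 50, 0.30, 0.14, rng) for _ in range(500)]
print(f"mean final size {sum(sizes) / len(sizes):.1f}; "
      f"extinct in {sizes.count(0)} of 500 runs")
```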

While this kind of stochastic modelling is very com-
mon in population biology and ecology, it raises a
serious problem. Since a range of outputs will occur
for a given input, only the weakest kind of comparison
with nature is possible. There is only one primitive
tribe with a unique history. Is this history “explained”
by the theory that has been modelled, when that model
produces an array of results among which the observed
history may very well lie? What kind of observed
history would be at variance with a model that pro-
duces an array of results? About the best that can be
done in those areas of biology is to say that a given
observation in nature is reasonable under the theory,
that it is not surprising given the hypothetical struc-
ture. The method of making such judgments is the
method of statistical testing. The model is used to
construct a calculating engine which produces a prob-
ability distribution of outcomes, given the hypothesis.
The actual case is compared with this probability
distribution, and if it is very improbable, the hypothesis
is rejected as an explanation of the natural event. The
difficulty is that many theories, especially in evolu-
tionary genetics, when modelled, turn out to give such
a broad array of results that almost any observation
is compatible with them.
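
In outline, the procedure looks like this; the “model” here is a stand-in coin-flip process, and the observed value is assumed, purely to show the shape of the test.

```python
import random

rng = random.Random(3)

def simulate():
    """One realization of a toy stochastic model: successes in 100 trials."""
    return sum(rng.random() < 0.5 for _ in range(100))

# Build the empirical probability distribution of outcomes under the model,
# then ask how improbable the single observed history is.
outcomes = [simulate() for _ in range(10_000)]
observed = 67                    # the one real "history" (assumed)
p_value = sum(o >= observed for o in outcomes) / len(outcomes)
print(f"P(outcome >= {observed} | model) = {p_value:.4f}")
```

If the observed history falls far in the tail of the simulated distribution, the hypothesis is rejected as an explanation; if the distribution is very broad, as the text notes, almost nothing can be rejected.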

Modelling serves a function that becomes apparent
in biology, but not in the physical sciences. If theories
contain a stochastic element and if only unique real-
izations of particular cases occur, model building may
be used to show that no choice among theories can be
made at some levels of theory making.
These conditions,
which apply in much of population and evolutionary
biology, may also turn out to have the same effect in
social science.

BIBLIOGRAPHY

For a detailed categorization and bibliography of models in biology, see W. R. Stahl, “The Role of Models in Theoretical Biology,” Progress in Theoretical Biology, 1 (1967), 165-218. See also M. W. Beckner, The Biological Way of Thought (Berkeley, 1968), Ch. III; R. B. Braithwaite, Scientific Explanation (Cambridge, 1953), Chs. III, V; J. Cairns, G. S. Stent, and J. D. Watson, Phage and the Origins of Molecular Biology (Cold Spring Harbor, 1966); H. Freudenthal, ed., The Concept and the Role of the Model in Mathematics and Natural and Social Sciences (Dordrecht, 1961), 1-37, 163-94; R. MacArthur and R. Levins, “The Limiting Similarity, Convergence and Divergence of Coexisting Species,” The American Naturalist, 101 (1967), 377-85; A. Rosenblueth and N. Wiener, “The Role of Models in Science,” Philosophy of Science, 12 (1945), 316-21; J. D. Watson, The Double Helix (New York, 1968); K. E. F. Watt, Ecology and Resource Management (New York, 1968).

R. C. LEWONTIN

[See also Biological Homologies; Evolutionism; Genetic Continuity; Man-Machine; Recapitulation.]
