DETERMINISTIC AND STOCHASTIC MODELS
Descartes' machine has been the meta-model on which most biological models have been patterned. It is a clockwork machine designed so that a fixed input will result in a fixed output. The input-output relation is one-one or many-one, but never one-many. A perturbation at any point in the structure results in an exactly predictable response (including no response) at every other part of the structure. This meta-model is widely accepted in biology and molecular biology. The program of molecular biology is identical with the program described by Descartes in Part V of the Discourse on Method. The present description of the action of genes in controlling protein synthesis, which is the core of molecular biology, is isomorphic with the description of an automobile factory, including quality control, inventory control, assembly lines, and the like. There is even a conscious exclusion of ambiguities when they appear in experimental data, because of the a priori certainty that the correct model is a Cartesian one. For example, the present picture of the action of genes is that each triplet of bases in the DNA molecule specifies a particular amino acid to be added to the protein. The experimental results that established the correspondence between triplet and amino acid showed that a particular triplet could cause the incorporation of several amino acids in vitro, but one more than the others. It was assumed that this ambiguity was an experimental artifact and that genes are more exact than chemists. In general, molecular biologists deal with all-or-none concepts, with switchings-on and switchings-off of genes, with repression and de-repression. While the methods and data of this kind of biology are quantitative and continuous, the interpretations are qualitative and discrete. It is not accidental that statistics and probability, the fitting of curves, and the estimation of parameters are not part of the apparatus of modern molecular biology.
A quite different world view is reflected in the models constructed for the analysis of population phenomena like evolution and ecology. The models are Laplacean rather than Cartesian. Chance elements are built into the models, although this does not imply that the phenomena being modelled are themselves really uncertain. That is, most biologists adhere to a deterministic view of phenomena but assume that a large number of small deterministic effects, which cannot and need not be analyzed, give an apparent indeterminacy to phenomena at a higher level.
In stochastic or probabilistic models the correlative statements connecting variables are of the form “If X takes the value x then Y takes the value y with probability f(y|x).” That is, each rule becomes a table of probabilities of y given x, or a function for generating those probabilities. Realizations of such models then require the generation of “random” sequences or, more usually, pseudo-random sequences, which are indistinguishable from randomness by nearly any criterion, yet are produced by an analytic rule.
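As a concrete illustration, such a probabilistic rule can be written as a small table of probabilities f(y|x), and a realization of the model drawn from it with a pseudo-random generator. The sketch below is in Python; the states and the probability values are entirely hypothetical, chosen only to make the idea tangible.

import random

# A stochastic rule: for each value of X, a table of probabilities f(y|x).
# The states and probabilities here are hypothetical, purely illustrative.
RULE = {
    "x1": {"y1": 0.7, "y2": 0.2, "y3": 0.1},
    "x2": {"y1": 0.1, "y2": 0.6, "y3": 0.3},
}

def realize(x, rng):
    """Given X = x, draw a value of Y with probability f(y|x)."""
    table = RULE[x]
    return rng.choices(list(table), weights=table.values(), k=1)[0]

# Python's generator is exactly the kind of pseudo-random sequence described
# above: produced by an analytic rule, yet passing most tests of randomness.
# Seeding it makes a "random" realization reproducible.
rng = random.Random(42)
print([realize("x1", rng) for _ in range(10)])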
Such stochastic models can then be set in motion over and over again to produce empirically an array of outcomes for any given input, since the random sequence generator never repeats itself. For example, in primitive populations, marriages are contracted according to certain age, clan, and relationship preferences. These rules are not rigid, however, but can be expressed as probabilities. If a village has a small population size, the subsequent history of marriages, births, and deaths cannot be exactly given, but various histories may occur with different probabilities. It is relatively simple to simulate the history of such a village in a computer program and then to run hypothetical village histories over and over again, from which an array of growth rates, pedigrees, and total birth and death rates will be produced. This array is a picture of the probability distribution of outcomes predicted by the theory.
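A minimal sketch of such a simulation, again in Python, might look as follows. The initial population size and the per-person birth and death probabilities are invented for illustration, not taken from any real village; the point is only that the same rules, run repeatedly, yield an array of different histories.

import random

def village_history(n0, years, birth_rate, death_rate, rng):
    """One hypothetical realization of a small village's history.
    Rates are per-person, per-year probabilities (invented for illustration)."""
    population = n0
    births = deaths = 0
    for _ in range(years):
        b = sum(rng.random() < birth_rate for _ in range(population))
        d = sum(rng.random() < death_rate for _ in range(population))
        births += b
        deaths += d
        population = max(population + b - d, 0)
    return population, births, deaths

# Run the same model over and over; each run follows the same probabilistic
# rules but gives a different history, producing an empirical array of outcomes.
rng = random.Random(0)
outcomes = [village_history(50, 100, 0.03, 0.025, rng) for _ in range(1000)]
final_sizes = sorted(p for p, _, _ in outcomes)
print(final_sizes[0], final_sizes[len(final_sizes) // 2], final_sizes[-1])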
While this kind of stochastic modelling is very common in population biology and ecology, it raises a serious problem. Since a range of outputs will occur for a given input, only the weakest kind of comparison with nature is possible. There is only one primitive tribe with a unique history. Is this history “explained” by the theory that has been modelled, when that model produces an array of results among which the observed history may very well lie? What kind of observed history would be at variance with a model that produces an array of results? About the best that can be done in those areas of biology is to say that a given observation in nature is reasonable under the theory, that it is not surprising given the hypothetical structure. The method of making such judgments is the method of statistical testing. The model is used to construct a calculating engine which produces a probability distribution of outcomes, given the hypothesis. The actual case is compared with this probability distribution, and if it is very improbable, the hypothesis is rejected as an explanation of the natural event. The difficulty is that many theories, especially in evolutionary genetics, when modelled, turn out to give such a broad array of results that almost any observation is compatible with them.
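The testing procedure itself can be sketched in the same way: the model is run many times to build up the probability distribution of some summary statistic, and the single observed case is located within that distribution. Everything in the example below, the statistic, the rates, and the “observed” figure, is hypothetical.

import random

def simulate_statistic(rng):
    """One realization of a summary statistic under the hypothesis.
    Here a toy stand-in: the size of a population of 50 after 100 steps,
    with invented birth and death probabilities."""
    pop = 50
    for _ in range(100):
        births = sum(rng.random() < 0.03 for _ in range(pop))
        deaths = sum(rng.random() < 0.025 for _ in range(pop))
        pop = max(pop + births - deaths, 0)
    return pop

# The model as a "calculating engine": simulate the distribution of outcomes
# under the hypothesis, then ask how extreme the one observed case is.
rng = random.Random(1)
simulated = sorted(simulate_statistic(rng) for _ in range(1000))
observed = 95  # the single real history we have (a made-up figure here)

# Two-sided empirical tail probability of the observation under the model.
lower = sum(s <= observed for s in simulated) / len(simulated)
upper = sum(s >= observed for s in simulated) / len(simulated)
p_value = 2 * min(lower, upper)
print(f"empirical p-value: {p_value:.3f}")
# If the simulated array is very broad, this probability is rarely small,
# and the hypothesis can hardly ever be rejected -- the difficulty noted above.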
Modelling serves a function that becomes apparent in biology, but not in the physical sciences. If theories contain a stochastic element and if only unique realizations of particular cases occur, model building may be used to show that no choice among theories can be made at some levels of theory making. These conditions, which apply in much of population and evolutionary biology, may also turn out to have the same effect in social science.