Dictionary of the History of Ideas

Studies of Selected Pivotal Ideas
4. The Statistical Definition of Entropy. Boltzmann's H-Theorem, that is, his conclusion that, for nonequilibrium systems, H is a decreasing function in time, was bound to raise questions concerning the nature of irreversibility of physical systems and its compatibility with the principles of mechanics. Boltzmann, fully aware of these problems, tried therefore to base his conclusion on more general grounds by taking into consideration the relative frequencies of equilibrium states compared to nonequilibrium distributions. In 1877 he showed that if W denotes the number of states in which each molecule has a specified position and velocity (so-called "micro-states") which describe the same given macroscopic state defined by measurable thermodynamic variables like pressure or temperature (so-called "macro-states"), then the entropy of the system (gas) is proportional to the logarithm of W (Boltzmann, 1877a).
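In modern notation (a gloss added here, not Boltzmann's own: the proportionality constant k was written out and evaluated only later, by Planck) the 1877 result and the H of the H-Theorem read

\[ S = k \log W, \qquad H = \int f \log f \, dv , \]

where f is the velocity distribution of the molecules; up to such constants, H is the negative of the entropy of an ideal gas, so a decreasing H is an increasing S.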
The introduction of the logarithm followed from the fact that for two independent systems the total entropy is the sum of the individual entropies while the total probability is the product of the individual probabilities:

S = S₁ + S₂ = f(W₁) + f(W₂) = f(W₁W₂)

implies that S = constant log W.
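The step from this functional equation to the logarithm deserves one more line (a standard argument, supplied here for completeness, assuming f continuous or monotone): substituting g(x) = f(eˣ) turns the multiplicative relation into Cauchy's additive one,

\[ g(x + y) = f(e^{x} e^{y}) = f(e^{x}) + f(e^{y}) = g(x) + g(y), \]

whose only regular solutions are linear, g(x) = cx; hence f(W) = c log W.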
It is clear that a given macro-state can be realized by a large number of different micro-states, for the interchange of two molecules, for example, does not alter the density distribution in the least. If therefore the number W of micro-states corresponding to a given macro-state is regarded as a measure of the probability of occurrence of the thermodynamic state, this statistical conception of entropy provides an immediately visualizable interpretation of the concept: it measures the probability for the occurrence of the state; and the fact that in adiabatically closed systems S increases toward a maximum at thermodynamic equilibrium means that the system tends toward a state of maximum probability.
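How rapidly W grows, and why equalized distributions dominate, is easy to see numerically. The following sketch (an illustration added here, with arbitrary numbers, not part of the original text) counts the micro-states realizing a macro-state given by the occupation numbers (n₁, ..., nₖ) of k cells, W = N!/(n₁! ⋯ nₖ!):

```python
from math import factorial, log

def microstate_count(occupations):
    """W = N! / (n1! * n2! * ... * nk!): the number of ways to
    assign N labeled molecules to cells with the given occupation
    numbers, i.e., the number of micro-states of this macro-state."""
    w = factorial(sum(occupations))
    for n in occupations:
        w //= factorial(n)
    return w

# 20 molecules over 4 cells: the even macro-state is realized by
# vastly more micro-states, hence has the larger S = k log W.
for macro in [(20, 0, 0, 0), (14, 2, 2, 2), (5, 5, 5, 5)]:
    w = microstate_count(macro)
    print(macro, "W =", w, " log W =", round(log(w), 1))
```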
Finally, since ordered arrangements of molecules (e.g., when the molecules in one part of a container all move very fast, corresponding to a high temperature, and those in another part all move very slowly, corresponding to a low temperature) have a much smaller probability of occurrence than disordered or random arrangements, the increase of entropy signifies increase of disorganization or of randomness (equalization of temperature).
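This contrast can be made quantitative with the same kind of count (again a toy illustration added here: 100 labeled molecules, half "fast" and half "slow", in a box with two equal halves):

```python
from math import comb, log

N_FAST = N_SLOW = HALF = 50   # 100 molecules, 50 per half of the box

def W(fast_left):
    """Micro-states with `fast_left` fast molecules (and hence
    HALF - fast_left slow ones) in the left half of the box."""
    return comb(N_FAST, fast_left) * comb(N_SLOW, HALF - fast_left)

print("segregated (hot left, cold right):", W(50))   # exactly 1
print("evenly mixed                     :", W(25))   # ~1.6e28
print("entropy gained on mixing, in units of k:", log(W(25)))
```

On this count the equalized macro-state outnumbers the segregated one by some twenty-eight orders of magnitude; the approach to uniform temperature is simply the drift toward the overwhelmingly more probable arrangement.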