Consider two systems, $A$ and $A'$, which can interact by exchanging heat energy and doing work on one another. Let the system $A$ have energy $E$ and adjustable external parameters $x_1,\cdots,x_n$. Likewise, let the system $A'$ have energy $E'$ and adjustable external parameters $x_1',\cdots,x_n'$. The combined system $A^{(0)} = A + A'$ is assumed to be isolated. It follows from the first law of thermodynamics that

$$E + E' = E^{(0)} = {\rm constant}. \tag{198}$$
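In differential form, Eq. (198) states that $dE' = -dE$: any energy gained by system $A$, whether as heat or as work, is necessarily lost by system $A'$.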
Thus, the energy $E'$ of system $A'$ is determined once the energy $E$ of system $A$ is given, and vice versa. In fact, $E'$ could be regarded as a function of $E$. Furthermore, if the two systems can interact mechanically then, in general, the parameters $x'$ are some function of the parameters $x$. As a simple example, if the two systems are separated by a movable partition in an enclosure of fixed volume $V^{(0)}$, then

$$V + V' = V^{(0)} = {\rm constant}, \tag{199}$$

where $V$ and $V'$ are the volumes of systems $A$ and $A'$, respectively.
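Differentiating Eq. (199) makes the mechanical coupling explicit: $dV' = -dV$, so any displacement of the partition which enlarges one system shrinks the other by exactly the same amount.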
The total number of microstates accessible to $A^{(0)}$ is clearly a function of $E$ and the parameters $x_\alpha$ (where $\alpha$ runs from 1 to $n$), so $\Omega^{(0)} \equiv \Omega^{(0)}(E, x_1,\cdots,x_n)$. We have already demonstrated (in Sect. 5.2) that $\Omega^{(0)}$ exhibits a very pronounced maximum at one particular value of the energy $E = \tilde{E}$ when $E$ is varied but the external parameters are held constant. This behaviour comes about because of the very strong,

$$\Omega \propto E^{f}, \tag{200}$$

increase in the number of accessible microstates of $A$ (or $A'$) with energy. However, according to Sect. 3.8, the number of accessible microstates exhibits a similar strong increase with the volume, which is a typical external parameter, so that

$$\Omega \propto V^{f}. \tag{201}$$
It follows that the variation of $\Omega^{(0)}$ with a typical parameter $x_\alpha$, when all the other parameters and the energy are held constant, also exhibits a very sharp maximum at some particular value $x_\alpha = \tilde{x}_\alpha$. The equilibrium situation corresponds to the configuration of maximum probability, in which virtually all systems $A^{(0)}$ in the ensemble have values of $E$ and $x_\alpha$ very close to $\tilde{E}$ and $\tilde{x}_\alpha$. The mean values of these quantities are thus given by $\bar{E} = \tilde{E}$ and $\bar{x}_\alpha = \tilde{x}_\alpha$.
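To see, at least schematically, why this maximum is so sharp, suppose (as in the partition example above) that the only parameter is the volume, with $A$ occupying volume $V$ and $A'$ the remainder $V^{(0)} - V$. Equations (200) and (201) then suggest, very roughly, that

$$\Omega^{(0)}(V) \propto V^{\,f}\,(V^{(0)} - V)^{\,f'},$$

which, for $f$ and $f'$ of order Avogadro's number, is a fantastically narrowly peaked function of $V$, in exact analogy with the energy dependence discussed in Sect. 5.2.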
Consider a quasi-static process in which the system $A$ is brought from an equilibrium state described by $\bar{E}$ and $\bar{x}_\alpha$ to an infinitesimally different equilibrium state described by $\bar{E} + d\bar{E}$ and $\bar{x}_\alpha + d\bar{x}_\alpha$. Let us calculate the resultant change in the number of microstates accessible to $A$. Since $\Omega \equiv \Omega(E, x_1,\cdots,x_n)$, the change in $\ln\Omega$ follows from standard mathematics:

$$d\ln\Omega = \frac{\partial\ln\Omega}{\partial E}\,d\bar{E} + \sum_{\alpha=1}^{n}\frac{\partial\ln\Omega}{\partial x_\alpha}\,d\bar{x}_\alpha. \tag{202}$$
However, we have previously demonstrated that

$$\beta = \frac{\partial\ln\Omega}{\partial E}, \qquad \beta\,\bar{X}_\alpha = \frac{\partial\ln\Omega}{\partial x_\alpha} \tag{203}$$

[from Eqs. (186) and (197)], so Eq. (202) can be written

$$d\ln\Omega = \beta\left(d\bar{E} + \sum_\alpha \bar{X}_\alpha\,d\bar{x}_\alpha\right). \tag{204}$$
Note that the temperature parameter $\beta$ and the mean conjugate forces $\bar{X}_\alpha$ are only well-defined for equilibrium states. This is why we are only considering quasi-static changes in which the two systems are always arbitrarily close to equilibrium.
Let us rewrite Eq. (204) in terms of the thermodynamic temperature $T$, using the relation $\beta \equiv 1/(k\,T)$. We obtain

$$dS = \frac{d\bar{E} + \sum_\alpha \bar{X}_\alpha\,d\bar{x}_\alpha}{T}, \tag{205}$$

where

$$S \equiv k\,\ln\Omega. \tag{206}$$
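Spelling out the short step from Eq. (204) to Eq. (205): dividing Eq. (204) by $\beta$ and using $1/\beta = k\,T$ gives

$$k\,T\,d\ln\Omega = T\,d(k\ln\Omega) = d\bar{E} + \sum_\alpha \bar{X}_\alpha\,d\bar{x}_\alpha,$$

so Eq. (205) follows immediately once $S$ is identified with $k\ln\Omega$.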
Equation (205) is a differential relation which enables us to calculate the quantity $S$ as a function of the mean energy $\bar{E}$ and the mean external parameters $\bar{x}_\alpha$, assuming that we can calculate the temperature $T$ and mean conjugate forces $\bar{X}_\alpha$ for each equilibrium state. The function $S(\bar{E}, \bar{x}_\alpha)$ is termed the entropy of system $A$. The word entropy is derived from the Greek en + trepein, which means ``in change.'' The reason for this etymology will become apparent presently. It can be seen from Eq. (206) that the entropy is merely a parameterization of the number of accessible microstates. Hence, according to statistical mechanics, $S(\bar{E}, \bar{x}_\alpha)$ is essentially a measure of the relative probability of a state characterized by values of the mean energy and mean external parameters $\bar{E}$ and $\bar{x}_\alpha$, respectively.
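A rough numerical illustration (not part of the original argument) conveys the scale involved: since $k \simeq 1.38\times10^{-23}\,{\rm J/K}$, doubling the number of accessible microstates changes the entropy by only

$$\Delta S = k\,\ln 2 \simeq 9.6\times10^{-24}\,{\rm J/K},$$

so $\Omega$ must be almost unimaginably large, of order $e^{10^{23}}$, before $S$ reaches the values of a few ${\rm J/K}$ typical of laboratory-sized systems.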
According to Eq. (129), the net amount of work performed during a quasi-static change is given by

$$đW = \sum_\alpha \bar{X}_\alpha\,d\bar{x}_\alpha. \tag{207}$$

It follows from Eq. (205) that

$$dS = \frac{d\bar{E} + đW}{T} = \frac{đQ}{T}. \tag{208}$$
Thus, the thermodynamic temperature $T$ is the integrating factor for the first law of thermodynamics,

$$đQ = d\bar{E} + đW, \tag{209}$$

which converts the inexact differential $đQ$ into the exact differential $dS$ (see Sect. 4.5).
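As a familiar special case (a gas whose only external parameter is its volume $V$, so that the conjugate force is the mean pressure $\bar{p}$), these relations reduce to $đW = \bar{p}\,dV$ and

$$dS = \frac{d\bar{E} + \bar{p}\,dV}{T},$$

which is simply Eq. (205) specialized to such a system.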
It follows that the entropy difference between any two macrostates $i$ and $f$ can be written

$$S_f - S_i = \int_i^f \frac{đQ}{T}, \tag{210}$$
where the integral is evaluated for any process through which the system is brought quasi-statically via a sequence of near-equilibrium configurations from its initial to its final macrostate. The process has to be quasi-static because the temperature $T$, which appears in the integrand, is only well-defined for an equilibrium state. Since the left-hand side of the above equation only depends on the initial and final states, it follows that the integral on the right-hand side is independent of the particular sequence of quasi-static changes used to get from $i$ to $f$. Thus, $S_f - S_i$ is independent of the process (provided that it is quasi-static).
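As a simple illustration of Eq. (210), consider a quasi-static process carried out at constant temperature $T$. The temperature can then be taken outside the integral, giving

$$S_f - S_i = \frac{Q}{T},$$

where $Q$ is the total heat absorbed by the system during the process.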
All of the concepts which we have encountered up to now in this course, such
as temperature, heat, energy, volume, pressure, etc., have been fairly
familiar to us
from other branches of Physics.
However, entropy, which turns out to be of crucial importance
in thermodynamics, is something quite new. Let us consider the following
questions. What does the entropy of a system actually signify? What use is
the concept of entropy?