

Properties of Entropy

Entropy, as we have defined it, has some dependence on the resolution, $\delta E$, to which the energy of macrostates is measured. Recall that $\Omega(E)$ is the number of accessible microstates with energy in the range $E$ to $E+\delta E$. Suppose that we choose a new resolution, $\delta^\ast E$, and define a new density of states, $\Omega^\ast(E)$, which is the number of states with energy in the range $E$ to $E+\delta^\ast E$. It can easily be seen that

$\displaystyle \Omega^\ast(E) = \frac{\delta^\ast E}{\delta E}\,\Omega(E).$ (5.70)

It follows that the new entropy, $S^\ast = k\ln\Omega^\ast$, is related to the previous entropy, $S = k\ln\Omega$, via

$\displaystyle S^\ast = S + k\ln\left(\frac{\delta^\ast E}{\delta E}\right).$ (5.71)

Now, our usual estimate that $\Omega \sim E^{\,f}$ (see Section 3.8) gives $S \sim k\,f$, where $f$ is the number of degrees of freedom. It follows that even if $\delta^\ast E$ were to differ from $\delta E$ by a factor of order $f$ (i.e., by some twenty-four orders of magnitude), which is virtually inconceivable, the second term on the right-hand side of the previous equation is still only of order $k\ln f$, which is utterly negligible compared to $k\,f$. Hence,

$\displaystyle S^\ast = S$ (5.72)

to an excellent approximation, so our definition of entropy is completely insensitive to the resolution to which we measure energy (or any other macroscopic parameter).
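
To make the magnitudes concrete, here is a quick numerical check (a sketch in Python; the value $f \sim 10^{24}$ is simply the illustrative figure used above, and entropies are quoted in units of $k$):

import math

f = 1.0e24                    # illustrative number of degrees of freedom
ratio = f                     # delta*E/deltaE differing by a factor of order f

S = f                         # S ~ k f, in units of k (taking Omega ~ E^f)
correction = math.log(ratio)  # the resolution term k ln(delta*E/deltaE)

print(correction)             # ~ 55.3
print(correction / S)         # ~ 5.5e-23 -- utterly negligible

Even under this absurdly generous change of resolution, the shift in entropy is some twenty-two orders of magnitude smaller than the entropy itself.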

Note that, like the temperature, the entropy of a macrostate is only well defined if the macrostate is in equilibrium. The crucial point is that it only makes sense to talk about the number of accessible states if the systems in the ensemble are given sufficient time to thoroughly explore all of the possible microstates consistent with the known macroscopic constraints. In other words, we can only be sure that a given microstate is inaccessible when the systems in the ensemble have had ample opportunity to move into it, and yet have not done so. For an equilibrium state, the entropy is just as well defined as more familiar quantities such as the temperature and the mean pressure.

Consider, again, two systems, $A$ and $A'$, that are in thermal contact, but can do no work on one another. (See Section 5.2.) Let $E$ and $E'$ be the energies of the two systems, and $\Omega(E)$ and $\Omega'(E')$ the respective densities of states. Furthermore, let $E^{(0)}$ be the conserved energy of the system as a whole, and $\Omega^{(0)}$ the corresponding density of states. We have from Equation (5.2) that

$\displaystyle \Omega^{(0)}(E) = \Omega(E)\,\Omega'(E'),$ (5.73)

where $E' = E^{(0)} - E$. In other words, the number of states accessible to the whole system is the product of the numbers of states accessible to each subsystem, because every microstate of $A$ can be combined with every microstate of $A'$ to form a distinct microstate of the whole system. We know, from Section 5.2, that in equilibrium the mean energy of $A$ takes the value $\overline{E} = \tilde{E}$ for which $\Omega^{(0)}(E)$ is maximum, and the temperatures of $A$ and $A'$ are equal. The distribution of $E$ around the mean value is of order $\Delta^\ast E = \tilde{E}/\sqrt{f}$, where $f$ is the number of degrees of freedom. It follows that the total number of accessible microstates is approximately the number of states that lie within $\Delta^\ast E$ of $\tilde{E}$. Thus,

$\displaystyle \Omega^{(0)}_{\rm tot} \simeq \frac{\Omega^{(0)}(\tilde{E})}{\delta E}\,\Delta^\ast E.$ (5.74)
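
Both the product rule (5.73) and the sharpness of the peak can be checked directly in a toy model. The following sketch (Python; the two-state spin systems, their sizes, and the total energy are hypothetical choices for illustration, not anything specified in the text) takes two systems of $N$ spins in which each excited spin carries one unit of energy, so that $\Omega(E) = \binom{N}{E}$, and compares the measured width of the peak in $\Omega(E)\,\Omega'(E')$ with the estimate $\Delta^\ast E = \tilde{E}/\sqrt{f}$:

from math import comb, sqrt

def peak_stats(N, E0):
    # Eq. (5.73): the joint count for each partition of the total
    # energy E0 is Omega(E) * Omega'(E0 - E), with Omega(E) = C(N, E).
    w = [comb(N, E) * comb(N, E0 - E) for E in range(E0 + 1)]
    tot = sum(w)
    mean = sum(E * x for E, x in enumerate(w)) / tot
    var = sum((E - mean)**2 * x for E, x in enumerate(w)) / tot
    return mean, sqrt(var)

for N in (10, 100, 1000):
    mean, width = peak_stats(N, N)   # total energy E0 = N
    # E_tilde = N/2 and f = 2N for the combined system, so the
    # estimated width is E_tilde/sqrt(f)
    print(N, width, mean / sqrt(2 * N))

For $N = 1000$ the measured width already agrees with $\tilde{E}/\sqrt{f}$ to four significant figures, and the peak occupies an ever smaller fraction of the available energy range as $N$ grows.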

The entropy of the whole system is given by

$\displaystyle S^{(0)} = k\ln\Omega^{(0)}_{\rm tot} = k\ln\Omega^{(0)}(\tilde{E}) + k\ln\left(\frac{\Delta^\ast E}{\delta E}\right).$ (5.75)

According to our usual estimate, $\Omega \sim E^{\,f}$ (see Section 3.8), the first term on the right-hand side of the previous equation is of order $k\,f$, whereas the second term is of order $k\ln[\tilde{E}/(\sqrt{f}\,\delta E)]$. Any reasonable choice for the energy subdivision, $\delta E$, should be greater than $\tilde{E}/f$, otherwise there would be less than one microstate per subdivision. It follows that the second term is less than, or of order, $k\ln f$, which is utterly negligible compared to $k\,f$. Thus,

$\displaystyle S^{(0)} = k\ln\Omega^{(0)}(\tilde{E}) = k\ln[\Omega(\tilde{E})\,\Omega'(\tilde{E}')] = k\ln\Omega(\tilde{E}) + k\ln\Omega'(\tilde{E}')$ (5.76)

to an excellent approximation, giving

$\displaystyle S^{(0)} = S(\tilde{E}) + S'(\tilde{E}').$ (5.77)

It can be seen that the probability distribution for $E$, which is proportional to $\Omega^{(0)}(E)$, is so strongly peaked around its maximum value that, for the purpose of calculating the entropy, the total number of states can be replaced by the maximum number of states [i.e., $\Omega^{(0)}_{\rm tot} \sim \Omega^{(0)}(\tilde{E})$]. One consequence of this is that entropy possesses the simple additive property illustrated in Equation (5.77). Thus, the total entropy of two thermally interacting systems in equilibrium is the sum of the entropies of each system taken in isolation.
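
The same toy spin model (a hypothetical illustration, as above) shows why replacing the total count by the peak value is harmless: the relative difference between $\ln\Omega^{(0)}_{\rm tot}$ and $\ln\Omega^{(0)}(\tilde{E})$ shrinks steadily as the systems grow.

from math import comb, log

for N in (10, 100, 1000):
    # joint counts for two N-spin systems sharing E0 = N units of energy
    terms = [comb(N, E) * comb(N, N - E) for E in range(N + 1)]
    ln_total = log(sum(terms))   # ln(Omega_tot), i.e. S_tot in units of k
    ln_peak = log(max(terms))    # ln of the largest single term
    print(N, ln_total, ln_peak, (ln_total - ln_peak) / ln_total)

The relative difference falls from about 9% at $N = 10$ to about 0.2% at $N = 1000$; for a macroscopic $f \sim 10^{24}$ it is entirely negligible, which is precisely why the additive property (5.77) holds.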

