
Entropy and quantum mechanics

The entropy of a system is defined in terms of the number ${\mit\Omega}$ of accessible microstates consistent with an overall energy in the range $E$ to $E+\delta E$ via
\begin{displaymath}
S = k\ln{\mit\Omega}.
\end{displaymath} (249)

We have already demonstrated that this definition is utterly insensitive to the resolution $\delta E$ to which the macroscopic energy is measured (see Sect. 5.7). In classical mechanics, if a system possesses $f$ degrees of freedom then phase-space is conventionally subdivided into cells of arbitrarily chosen volume $h_0^{~f}$ (see Sect. 3.2). The number of accessible microstates is equivalent to the number of these cells in the volume of phase-space consistent with an overall energy of the system lying in the range $E$ to $E+\delta E$. Thus,
\begin{displaymath}
{\mit\Omega} = \frac{1}{h_0^{~f}} \int \cdots \int dq_1\cdots dq_f\,dp_1\cdots dp_f,
\end{displaymath} (250)

and hence
\begin{displaymath}
S = k \ln\left( \int \cdots \int dq_1\cdots dq_f\,dp_1\cdots dp_f\right) -
k\,f\ln h_0.
\end{displaymath} (251)

Thus, in classical mechanics the entropy is undetermined to an arbitrary additive constant which depends on the size of the cells in phase-space. In fact, $S$ increases as the cell size decreases. The second law of thermodynamics is only concerned with changes in entropy, and is, therefore, unaffected by an additive constant. Likewise, macroscopic thermodynamical quantities, such as the temperature and pressure, which can be expressed as partial derivatives of the entropy with respect to various macroscopic parameters [see Eqs. (242) and (243)] are unaffected by such a constant. So, in classical mechanics the entropy is rather like a gravitational potential: it is undetermined to an additive constant, but this does not affect any physical laws.
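This additive-constant argument is easily checked numerically. The sketch below (the phase-space volumes, the number of degrees of freedom, and the cell sizes are invented purely for illustration) evaluates Eq. (251) for two macrostates using two different cell sizes $h_0$, and confirms that the entropy *difference* between the macrostates is independent of $h_0$:

```python
import math

k = 1.380649e-23  # Boltzmann constant (J/K)

def entropy(phase_volume, f, h0):
    """Eq. (251): S = k ln(phase_volume) - k f ln(h0)."""
    return k * (math.log(phase_volume) - f * math.log(h0))

f = 3                        # degrees of freedom (illustrative)
V1, V2 = 1.0e-90, 5.0e-90    # phase-space volumes of two macrostates (arbitrary)

# Two arbitrary cell sizes: S itself shifts, but the difference does not.
for h0 in (1.0e-34, 1.0e-36):
    dS = entropy(V2, f, h0) - entropy(V1, f, h0)
    print(dS)  # same value for both h0, namely k ln(V2/V1)
```

The change of cell size shifts the entropy of every macrostate by the same constant $k\,f\ln(h_0/h_0')$, which cancels in any entropy difference.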

The non-unique value of the entropy comes about because there is no limit to the precision to which the state of a classical system can be specified. In other words, the cell size $h_0$ can be made arbitrarily small, which corresponds to specifying the particle coordinates and momenta to arbitrary accuracy. However, in quantum mechanics the uncertainty principle sets a definite limit to how accurately the particle coordinates and momenta can be specified. In general,

\begin{displaymath}
\delta q_i\, \delta p_i \geq h,
\end{displaymath} (252)

where $p_i$ is the momentum conjugate to the generalized coordinate $q_i$, and $\delta q_i$, $\delta p_i$ are the uncertainties in these quantities, respectively. In fact, in quantum mechanics the number of accessible quantum states with the overall energy in the range $E$ to $E+\delta E$ is completely determined. This implies that, in reality, the entropy of a system has a unique and unambiguous value. Quantum mechanics can often be ``mocked up'' in classical mechanics by setting the cell size in phase-space equal to Planck's constant, so that $h_0 = h$. This automatically enforces the most restrictive form of the uncertainty principle, $\delta q_i\, \delta p_i = h$. In many systems, the substitution $h_0 \rightarrow h$ in Eq. (251) gives the same, unique value for $S$ as that obtained from a full quantum mechanical calculation.

Consider a simple quantum mechanical system consisting of $N$ non-interacting spinless particles of mass $m$ confined in a cubic box of dimension $L$. The energy levels of the $i$th particle are given by

\begin{displaymath}
e_i = \frac{\hbar^2 \pi^2}{2\,m \,L^2}\left( n_{i1}^{~2}+n_{i2}^{~2}+n_{i3}^{~2}\right),
\end{displaymath} (253)

where $n_{i1}$, $n_{i2}$, and $n_{i3}$ are three (positive) quantum numbers. The overall energy of the system is the sum of the energies of the individual particles, so that for a general state $r$
\begin{displaymath}
E_r = \sum_{i=1}^N e_i.
\end{displaymath} (254)

The overall state of the system is completely specified by $3N$ quantum numbers, so the number of degrees of freedom is $f=3\,N$. The classical limit corresponds to the situation where all of the quantum numbers are much greater than unity. In this limit, the number of accessible states varies with energy according to our usual estimate ${\mit\Omega}\propto E^f$. The lowest possible energy state of the system, the so-called ground-state, corresponds to the situation where all quantum numbers take their lowest possible value, unity. Thus, the ground-state energy $E_0$ is given by
\begin{displaymath}
E_0 = \frac{ f\,\hbar^2 \pi^2}{2\,m\, L^2}.
\end{displaymath} (255)

There is only one accessible microstate at the ground-state energy (i.e., that where all quantum numbers are unity), so by our usual definition of entropy
\begin{displaymath}
S(E_0) = k\ln 1 = 0.
\end{displaymath} (256)

In other words, there is no disorder in the system when all the particles are in their ground-states.
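The state counting behind Eqs. (253)-(256) can be made concrete by brute-force enumeration. In the sketch below, energies are measured in the natural unit $\hbar^2\pi^2/2\,m\,L^2$, so that each particle contributes $n_{i1}^{\,2}+n_{i2}^{\,2}+n_{i3}^{\,2}$; the cutoff `n_max` is an arbitrary truncation of the (infinite) spectrum, chosen for illustration:

```python
import itertools

def state_count(N, E, dE, n_max=20):
    """Count microstates of N particles in a box with total energy
    (in units of hbar^2 pi^2 / (2 m L^2)) in the range [E, E + dE);
    see Eqs. (253)-(254)."""
    count = 0
    # Each particle carries three positive quantum numbers.
    for ns in itertools.product(range(1, n_max + 1), repeat=3 * N):
        Er = sum(n * n for n in ns)
        if E <= Er < E + dE:
            count += 1
    return count

# Single particle: ground-state energy is E0 = 3 (all quantum numbers unity),
# and there is exactly one such microstate, so S(E0) = k ln 1 = 0.
print(state_count(N=1, E=3, dE=1))     # prints 1: only (1,1,1)
print(state_count(N=1, E=100, dE=10))  # many more states at higher energy
```

At higher energies the count grows rapidly, in line with the classical estimate ${\mit\Omega}\propto E^f$.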

Clearly, as the energy approaches the ground-state energy, the number of accessible states becomes far less than the usual classical estimate $E^f$. This is true for all quantum mechanical systems. In general, the number of microstates varies roughly like

\begin{displaymath}
{\mit\Omega}(E) \sim 1 + C \,(E - E_0)^f,
\end{displaymath} (257)

where $C$ is a positive constant. According to Eq. (187), the temperature varies approximately like
\begin{displaymath}
T \sim \frac{E - E_0}{k\,f},
\end{displaymath} (258)

provided ${\mit\Omega} \gg 1$. Thus, as the absolute temperature of a system approaches zero, the internal energy approaches a limiting value $E_0$ (the quantum mechanical ground-state energy), and the entropy approaches the limiting value zero. This proposition is known as the third law of thermodynamics.
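The step from Eq. (257) to Eq. (258) is worth spelling out. Applying the definition of temperature [Eq. (187)] to Eq. (257), and neglecting the 1 because ${\mit\Omega}\gg 1$,
\begin{displaymath}
\frac{1}{k\,T} = \frac{\partial \ln{\mit\Omega}}{\partial E} \simeq
\frac{\partial}{\partial E}\left[\ln C + f \ln\,(E - E_0)\right] = \frac{f}{E - E_0},
\end{displaymath}
which rearranges to give Eq. (258).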

At low temperatures, great care must be taken to ensure that equilibrium thermodynamical arguments are applicable, since the rate of attaining equilibrium may be very slow. Another difficulty arises when dealing with a system in which the atoms possess nuclear spins. Typically, when such a system is brought to a very low temperature the entropy associated with the degrees of freedom not involving nuclear spins becomes negligible. Nevertheless, the number of microstates ${\mit\Omega}_s$ corresponding to the possible nuclear spin orientations may be very large. Indeed, it may be just as large as the number of states at room temperature. The reason for this is that nuclear magnetic moments are extremely small, and, therefore, have extremely weak mutual interactions. Thus, it only takes a tiny amount of heat energy in the system to completely randomize the spin orientations. Typically, a temperature as small as $10^{-3}$ kelvin above absolute zero is sufficient to randomize the spins.

Suppose that the system consists of $N$ atoms of spin $1/2$. Each spin can have two possible orientations. If there is enough residual heat energy in the system to randomize the spins then each orientation is equally likely. It follows that there are ${\mit\Omega}_s = 2^N$ accessible spin states. The entropy associated with these states is $S_0 = k\ln{\mit\Omega}_s = N\,k\ln 2 = \nu\, R\, \ln 2$, where $\nu$ is the number of moles of atoms. Below some critical temperature, $T_0$, the interaction between the nuclear spins becomes significant, and the system settles down in some unique quantum mechanical ground-state (e.g., with all spins aligned). In this situation, $S\rightarrow 0$, in accordance with the third law of thermodynamics. However, for temperatures which are small, but not small enough to ``freeze out'' the nuclear spin degrees of freedom, the entropy approaches the limiting value $S_0$, which depends only on the kinds of atomic nuclei in the system. This limiting value is independent of the spatial arrangement of the atoms, or the interactions between them. Thus, for most practical purposes the third law of thermodynamics can be written

\begin{displaymath}
S\rightarrow S_0~~~{\rm as~} T\rightarrow 0_{+},
\end{displaymath} (259)

where $0_{+}$ denotes a temperature which is very close to absolute zero, but still much larger than $T_0$. This modification of the third law is useful because it can be applied at temperatures which are not prohibitively low.
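As a quick numerical illustration of the limiting spin entropy, the sketch below evaluates $S_0 = k\ln 2^N = \nu\,R\ln 2$ for one mole of spin-$1/2$ nuclei (the constants are the standard CODATA values):

```python
import math

R = 8.314462618      # molar gas constant (J K^-1 mol^-1)
N_A = 6.02214076e23  # Avogadro's number (mol^-1)
k = R / N_A          # Boltzmann constant (J/K)

nu = 1.0             # one mole of spin-1/2 nuclei
N = nu * N_A         # number of spins

S0 = k * N * math.log(2)  # S0 = k ln(2^N) = nu R ln 2
print(S0)                 # ~ 5.76 J K^-1 mol^-1
```

Note that $S_0$ scales simply with the amount of material, as the text asserts: it knows nothing about the spatial arrangement of the atoms.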

Richard Fitzpatrick 2006-02-02