The third law of thermodynamics states as follows, regarding the properties of closed systems in thermodynamic equilibrium:
The entropy of a system approaches a constant value as its temperature approaches absolute zero.
This constant value cannot depend on any other parameters characterizing the closed system, such as pressure or applied magnetic field. At absolute zero (zero kelvins) the system must be in a state with the minimum possible energy. Entropy is related to the number of accessible microstates, and there is typically one unique state (called the ground state) with minimum energy.[1] In such a case, the entropy at absolute zero will be exactly zero. If the system does not have a well-defined order (if its order is glassy, for example), then some finite entropy may remain as the system is brought to very low temperatures, either because the system becomes locked into a configuration with non-minimal energy or because the minimum-energy state is non-unique. The constant value is called the residual entropy of the system.[2] Because entropy is a state function, the value it approaches near 0 K is an inherent property of the substance and of its particular configuration of particles, and it can in principle be determined by measurements made near 0 K.

The Nernst–Simon statement of the third law of thermodynamics concerns thermodynamic processes at a fixed, low temperature:
The entropy change associated with any condensed system undergoing a reversible isothermal process approaches zero as the temperature at which it is performed approaches 0 K.
Here a condensed system refers to liquids and solids. A classical formulation by Nernst (actually a consequence of the Third Law) is:
It is impossible for any process, no matter how idealized, to reduce the entropy of a system to its absolute-zero value in a finite number of operations.[3]
There also exists a formulation of the Third Law which approaches the subject by postulating a specific energy behavior:
If the composite of two thermodynamic systems constitutes an isolated system, then any energy exchange in any form between those two systems is bounded.[4]
History
The third law was developed by chemist Walther Nernst during the years 1906–12, and is therefore often referred to as Nernst's theorem or Nernst's postulate. The third law of thermodynamics states that the entropy of a system at absolute zero is a well-defined constant. This is because a system at zero temperature exists in its ground state, so that its entropy is determined only by the degeneracy of the ground state.
In 1912 Nernst stated the law thus: "It is impossible for any procedure to lead to the isotherm T = 0 in a finite number of steps."[5]
An alternative version of the third law of thermodynamics as stated by Gilbert N. Lewis and Merle Randall in 1923:
If the entropy of each element in some (perfect) crystalline state be taken as zero at the absolute zero of temperature, every substance has a finite positive entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances.
This version states that not only will ΔS reach zero at 0 K, but S itself will also reach zero, as long as the crystal has a ground state with only one configuration. Some crystals form defects which cause a residual entropy. This residual entropy disappears when the kinetic barriers to transitioning to one ground state are overcome.[6]
With the development of statistical mechanics, the third law of thermodynamics (like the other laws) changed from a fundamental law (justified by experiments) to a derived law (derived from even more basic laws). The basic law from which it is primarily derived is the statistical-mechanics definition of entropy for a large system:
\( S - S_0 = k_\text{B} \ln \Omega \)
where S is entropy, kB is the Boltzmann constant, and \( \Omega \) is the number of microstates consistent with the macroscopic configuration. The counting of states starts from the reference state of absolute zero, which corresponds to the entropy S0.
Explanation
In simple terms, the third law states that the entropy of a perfect crystal of a pure substance approaches zero as the temperature approaches zero. The alignment of a perfect crystal leaves no ambiguity as to the location and orientation of each part of the crystal. As the energy of the crystal is reduced, the vibrations of the individual atoms are reduced to nothing, and the crystal becomes the same everywhere.
a) Single possible configuration for a system at absolute zero, i.e., only one microstate is accessible. Thus S = k ln W = 0. b) At temperatures greater than absolute zero, multiple microstates are accessible due to atomic vibration (exaggerated in the figure). Since the number of accessible microstates is greater than 1, S = k ln W > 0.
The third law provides an absolute reference point for the determination of entropy at any other temperature. The entropy of a closed system, determined relative to this zero point, is then the absolute entropy of that system. Mathematically, the absolute entropy of any system at zero temperature is the natural log of the number of ground states times Boltzmann's constant kB = 1.38×10−23 J K−1.
The entropy of a perfect crystal lattice as defined by Nernst's theorem is zero provided that its ground state is unique, because ln(1) = 0. If the system is composed of one billion identical atoms, all lying within the matrix of a perfect crystal, the number of combinations of one billion identical things taken one billion at a time is Ω = 1. Hence:
\( S - S_0 = k_\text{B} \ln \Omega = k_\text{B} \ln 1 = 0 \)
The difference is zero, so the initial entropy S0 can be any chosen value as long as all other such calculations use it as the initial entropy. As a result, the initial entropy value S0 = 0 is selected for convenience.
\( S-S_{0}=S-0=0\)
\( S=0 \)
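As a minimal numerical illustration of this bookkeeping (the function name and the sample values of Ω below are chosen purely for illustration), the Boltzmann entropy can be evaluated directly for a few microstate counts:

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    # Entropy S = k_B ln(Omega) for a system with Omega accessible microstates.
    return K_B * math.log(omega)

# A perfect crystal with a unique ground state: Omega = 1, so S = 0.
print(boltzmann_entropy(1))   # 0.0 J/K
# A doubly degenerate ground state contributes only k_B ln 2,
# which is negligible on a macroscopic scale.
print(boltzmann_entropy(2))   # ~9.57e-24 J/K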
Example: Entropy change of a crystal lattice heated by an incoming photon
Consider a system consisting of a crystal lattice with volume V of N identical atoms at T = 0 K, and an incoming photon of wavelength λ and energy ε.
Initially, there is only one accessible microstate:
\( S_0 = k_\text{B} \ln \Omega = k_\text{B} \ln 1 = 0 . \)
Let us assume the crystal lattice absorbs the incoming photon. There is a unique atom in the lattice that interacts with and absorbs this photon. So after absorption, there are N possible microstates accessible to the system, each corresponding to one excited atom while the other atoms remain in the ground state.
The entropy, energy, and temperature of the closed system rise and can be calculated. The entropy change is:
\( \Delta S = S - S_0 = k_\text{B} \ln \Omega \)
From the second law of thermodynamics:
\( \Delta S = S - S_0 = \frac{\delta Q}{T}\)
Hence:
\( \Delta S = S - S_0 = k_\text{B} \ln \Omega = \frac{\delta Q}{T} \)
We assume N = 3 × 10²² and λ = 1 cm. Calculating the entropy change:
\( S - 0 = k_\text{B} \ln N = 1.38 \times 10^{-23} \times \ln(3 \times 10^{22}) = 70 \times 10^{-23}\,\mathrm{J}\,\mathrm{K}^{-1} \)
The energy change of the system as a result of absorbing the single photon whose energy is ε is:
\( \delta Q = \epsilon = \frac {hc}{\lambda} =\frac{6.62 \times 10^{-34}\,\mathrm{J}\cdot \mathrm{s} \times 3 \times 10^{8} \,\mathrm{m}\,\mathrm{s}^{-1}}{0.01 \,\mathrm{m}}=2 \times 10^{-23} \,\mathrm{J}\)
The temperature of the closed system rises by:
\( T = \frac{\epsilon}{\Delta S} = \frac{2 \times 10^{-23}\,\mathrm{J}}{70 \times 10^{-23}\,\mathrm{J}\,\mathrm{K}^{-1}} = 0.02857\,\mathrm{K} \)
This can be interpreted as the average temperature of the system over the entropy range \( 0 < S < 70 \times 10^{-23}\,\mathrm{J}\,\mathrm{K}^{-1} \).[7] A single atom was assumed to absorb the photon, but the temperature and entropy change characterize the entire system.
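These numbers can be reproduced with a short calculation (a minimal sketch of the same arithmetic; the constants are rounded as in the text, so the printed values differ slightly from the quoted ones):

import math

K_B = 1.38e-23        # Boltzmann constant, J/K (rounded as in the text)
H = 6.62e-34          # Planck constant, J s
C = 3e8               # speed of light, m/s

N = 3e22              # number of atoms in the lattice
WAVELENGTH = 0.01     # photon wavelength, m (1 cm)

delta_S = K_B * math.log(N)      # entropy change: any one of the N atoms may be the excited one
delta_Q = H * C / WAVELENGTH     # energy of the absorbed photon

print(delta_S)             # ~7.1e-22 J/K, quoted as 70e-23 J/K after rounding
print(delta_Q)             # ~2.0e-23 J
print(delta_Q / delta_S)   # ~0.028 K; the text quotes 0.02857 K using the rounded values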
Systems with non-zero entropy at absolute zero
An example of a system which does not have a unique ground state is one whose net spin is a half-integer, for which time-reversal symmetry gives two degenerate ground states. For such systems, the entropy at zero temperature is at least kB ln(2) (which is negligible on a macroscopic scale). Some crystalline systems exhibit geometrical frustration, where the structure of the crystal lattice prevents the emergence of a unique ground state. Ground-state helium (unless under pressure) remains liquid.
In addition, glasses and solid solutions retain large entropy at 0 K, because they are large collections of nearly degenerate states, in which they become trapped out of equilibrium. Another example of a solid with many nearly-degenerate ground states, trapped out of equilibrium, is ice Ih, which has "proton disorder".
For the entropy at absolute zero to be zero, the magnetic moments of a perfectly ordered crystal must themselves be perfectly ordered; from an entropic perspective, this can be considered to be part of the definition of a "perfect crystal". Only ferromagnetic, antiferromagnetic, and diamagnetic materials can satisfy this condition. However, ferromagnetic materials do not, in fact, have zero entropy at zero temperature, because the spins of the unpaired electrons are all aligned and this gives a ground-state spin degeneracy. Materials that remain paramagnetic at 0 K, by contrast, may have many nearly-degenerate ground states (for example, in a spin glass), or may retain dynamic disorder (a quantum spin liquid).
Consequences
Fig. 1 Left: Absolute zero can be reached in a finite number of steps if S(0, X1) ≠ S(0, X2). Right: An infinite number of steps is needed, since S(0, X1) = S(0, X2).
Absolute zero
The third law is equivalent to the statement that
It is impossible by any procedure, no matter how idealized, to reduce the temperature of any closed system to zero temperature in a finite number of finite operations.[8]
The reason that T = 0 cannot be reached according to the third law is explained as follows: Suppose that the temperature of a substance can be reduced in an isentropic process by changing the parameter X from X2 to X1. One can think of a multistage nuclear demagnetization setup where a magnetic field is switched on and off in a controlled way.[9] If there were an entropy difference at absolute zero, T = 0 could be reached in a finite number of steps. However, at T = 0 there is no entropy difference so an infinite number of steps would be needed. The process is illustrated in Fig. 1.
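A toy model makes the step-counting argument concrete. It is a sketch under an assumed linear entropy curve S(T, X) = a(X)·T, which already satisfies S(0, X1) = S(0, X2) = 0; the slopes below are arbitrary:

# Alternating isothermal (X1 -> X2) and isentropic (X2 -> X1) steps in a toy model
# with assumed entropy curves S(T, X) = a(X) * T, so both curves meet at S = 0 when T = 0.
a1, a2 = 2.0, 1.0   # arbitrary slopes with a1 > a2, in J/K^2

T = 1.0             # starting temperature, K
for step in range(1, 11):
    S = a2 * T      # isothermal step at temperature T: switch from X1 to X2, expelling entropy
    T = S / a1      # isentropic step back to X1: a1 * T_new = S fixes the new temperature
    print(step, T)
# T falls geometrically (here it halves each cycle) but never reaches 0 in a finite
# number of steps, exactly because S(0, X1) = S(0, X2).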
Specific heat
A non-quantitative description of his third law that Nernst gave at the very beginning was simply that the specific heat can always be made zero by cooling the material down far enough.[10] A modern, quantitative analysis follows.
Suppose that the heat capacity of a sample in the low-temperature region has the form of a power law \( C(T,X) = C_0 T^{\alpha} \) asymptotically as T → 0, and we wish to find which values of α are compatible with the third law. We have
\( \int_{T_0}^T \frac {C(T^\prime,X)}{T^\prime}dT^\prime = \frac {C_0}{ \alpha}(T^{ \alpha}-T_0^{ \alpha}). \) (11)
By the discussion of the third law above, this integral must be bounded as T0 → 0, which is only possible if α > 0. So the heat capacity must go to zero at absolute zero
\( \lim_{T \rightarrow 0}C(T,X)=0. \) (12)
if it has the form of a power law. The same argument shows that it cannot be bounded below by a positive constant, even if we drop the power-law assumption.
On the other hand, the molar specific heat at constant volume of a monatomic classical ideal gas, such as helium at room temperature, is given by CV = (3/2)R, with R the molar ideal gas constant. But clearly a constant heat capacity does not satisfy Eq. (12). That is, a gas with a constant heat capacity all the way to absolute zero violates the third law of thermodynamics. We can verify this more fundamentally by substituting the constant CV into the entropy integral above, \( S(T,V) = S(T_0,V) + \int_{T_0}^{T} \frac{C_V}{T^\prime}dT^\prime \), which yields
\( S(T,V) = S(T_0,V) + \frac{3}{2}R \ln \frac{T}{T_0}. \) (13)
In the limit T0 → 0 this expression diverges, again contradicting the third law of thermodynamics.
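The contrast between the two cases can be checked numerically (a minimal sketch; the reference temperatures and the power-law prefactor are arbitrary):

import math

R = 8.314               # molar gas constant, J/(mol K)
T = 1.0                 # final temperature, K
C0, ALPHA = 1.0, 1.0    # arbitrary power-law heat capacity C = C0 * T**ALPHA (ALPHA = 1 as for a Fermi gas)

for T0 in (1e-2, 1e-4, 1e-6, 1e-8):
    dS_classical = 1.5 * R * math.log(T / T0)           # constant C_V = (3/2) R, as in Eq. (13)
    dS_power = (C0 / ALPHA) * (T**ALPHA - T0**ALPHA)    # power-law C, as in Eq. (11)
    print(T0, dS_classical, dS_power)
# The classical entropy difference grows without bound as T0 -> 0,
# while the power-law result approaches the finite limit C0 / ALPHA.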
The conflict is resolved as follows: At a certain temperature the quantum nature of matter starts to dominate the behavior. Fermi particles follow Fermi–Dirac statistics and Bose particles follow Bose–Einstein statistics. In both cases the heat capacity at low temperatures is no longer temperature independent, even for ideal gases. For Fermi gases
\( C_V = \frac{\pi^2}{2} R \frac{T}{T_\text{F}} \) (14)
with the Fermi temperature TF given by
\( T_\text{F} = \frac{1}{8\pi^2} \frac{N_\text{A}^2 h^2}{MR} \left( \frac{3\pi^2 N_\text{A}}{V_\text{m}} \right)^{2/3} . \) (15)
Here NA is Avogadro's number, Vm the molar volume, and M the molar mass.
For Bose gases
\( C_V = 1.93\ldots R \left( \frac{T}{T_\text{B}} \right)^{3/2} \) (16)
with TB given by
\( T_\text{B} = \frac{1}{11.9\ldots} \frac{N_\text{A}^2 h^2}{MR} \left( \frac{N_\text{A}}{V_\text{m}} \right)^{2/3} . \) (17)
The specific heats given by Eq. (14) and (16) both satisfy Eq. (12). Indeed, they are power laws with α=1 and α=3/2 respectively.
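As an order-of-magnitude check of Eqs. (14) and (15), the sketch below evaluates them for helium-3 treated as an ideal Fermi gas. The molar volume of the liquid is an assumed round value of about 37 cm³/mol, and the ideal-gas formula only roughly describes the strongly interacting liquid:

import math

N_A = 6.022e23      # Avogadro's number, 1/mol
H = 6.626e-34       # Planck constant, J s
R = 8.314           # molar gas constant, J/(mol K)

M = 3.016e-3        # molar mass of helium-3, kg/mol
V_M = 37e-6         # assumed molar volume of liquid helium-3, m^3/mol

# Fermi temperature from Eq. (15): a few kelvin for these numbers.
T_F = (1 / (8 * math.pi**2)) * (N_A**2 * H**2 / (M * R)) * (3 * math.pi**2 * N_A / V_M)**(2 / 3)
print(T_F)          # ~5 K

# Heat capacity from Eq. (14) at T = 10 mK; it is proportional to T and vanishes as T -> 0.
T = 0.010
C_V = (math.pi**2 / 2) * R * T / T_F
print(C_V)          # ~0.08 J/(mol K)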
Even within a purely classical setting, the density of a classical ideal gas held at fixed particle number and pressure becomes arbitrarily high as T goes to zero, so the interparticle spacing goes to zero. The assumption of non-interacting particles presumably breaks down when they are sufficiently close together, so the value of \( C_{V}\) gets modified away from its ideal constant value.
Vapor pressure
The only liquids near absolute zero are ³He and ⁴He. Their heat of evaporation has a limiting value given by
\( L=L_0+C_pT \) (18)
with L0 and Cp constant. If we consider a container, partly filled with liquid and partly gas, the entropy of the liquid–gas mixture is
\( S(T,x) = S_l(T) + x\left(\frac{L_0}{T} + C_p\right) \) (19)
where Sl(T) is the entropy of the liquid and x is the gas fraction. Clearly the entropy change during the liquid–gas transition (x from 0 to 1) diverges in the limit of T → 0. This would violate the third-law requirement that the entropy remain bounded as T → 0. Nature resolves this paradox as follows: at temperatures below about 50 mK the vapor pressure is so low that the gas density is lower than the best vacuum in the universe. In other words, below 50 mK there is simply no gas above the liquid.
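The divergence in Eq. (19) is easy to see numerically (a minimal sketch; the values of L0 and Cp are arbitrary placeholders):

L0, C_P = 20.0, 5.0    # arbitrary placeholder constants, J/mol and J/(mol K)
for T in (1.0, 0.1, 0.01, 0.001):
    # Entropy difference between pure gas (x = 1) and pure liquid (x = 0):
    print(T, L0 / T + C_P)
# The liquid-gas entropy difference grows as L0 / T, which would conflict with the
# third law if a gas phase actually persisted all the way down to T = 0.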
Latent heat of melting
The melting curves of ³He and ⁴He both extend down to absolute zero at finite pressure. At the melting pressure, liquid and solid are in equilibrium. The third law demands that the entropies of the solid and liquid are equal at T=0. As a result, the latent heat of melting is zero and the slope of the melting curve extrapolates to zero as a result of the Clausius–Clapeyron equation.
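Explicitly, the Clausius–Clapeyron relation gives the slope of the melting curve as
\( \frac{dp_\text{melt}}{dT} = \frac{S_\text{liquid} - S_\text{solid}}{V_\text{liquid} - V_\text{solid}}, \)
and since the third law forces \( S_\text{liquid} - S_\text{solid} \to 0 \) as T → 0 while the difference of molar volumes remains finite, both the latent heat \( L = T(S_\text{liquid} - S_\text{solid}) \) and the slope of the melting curve vanish at absolute zero.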
Thermal expansion coefficient
The thermal expansion coefficient is defined as
\( \alpha_V = \frac{1}{V_\text{m}} \left( \frac{\partial V_\text{m}}{\partial T} \right)_{p} . \) (20)
With the Maxwell relation
\( \left( \frac{\partial V_\text{m}}{\partial T} \right)_{p} = -\left( \frac{\partial S_\text{m}}{\partial p} \right)_{T} \) (21)
and the third-law requirement that the entropy become independent of pressure as T → 0 (the case X = p above), it is shown that
\( \lim_{T \rightarrow 0}\alpha_V=0. \) (22)
So the thermal expansion coefficient of all materials must go to zero at zero kelvin.
See also
Adiabatic process
Ground state
Laws of thermodynamics
Quantum thermodynamics
Residual entropy
Thermodynamic entropy
Timeline of thermodynamics, statistical mechanics, and random processes
Quantum heat engines and refrigerators
References
J. Wilks The Third Law of Thermodynamics Oxford University Press (1961).
Kittel and Kroemer, Thermal Physics (2nd ed.), page 49.
Wilks, J. (1971). The Third Law of Thermodynamics, Chapter 6 in Thermodynamics, volume 1, ed. W. Jost, of H. Eyring, D. Henderson, W. Jost, Physical Chemistry. An Advanced Treatise, Academic Press, New York, page 477.
Heidrich, M. (2016). "Bounded energy exchange as an alternative to the third law of thermodynamics". Annals of Physics. 373: 665–681. Bibcode:2016AnPhy.373..665H. doi:10.1016/j.aop.2016.07.031.
Bailyn, M. (1994). A Survey of Thermodynamics, American Institute of Physics, New York, ISBN 0-88318-797-3, page 342.
Kozliak, Evguenii; Lambert, Frank L. (2008). "Residual Entropy, the Third Law and Latent Heat". Entropy. 10 (3): 274–84. Bibcode:2008Entrp..10..274K. doi:10.3390/e10030274.
Reynolds and Perkins (1977). Engineering Thermodynamics. McGraw Hill. pp. 438. ISBN 978-0-07-052046-2.
Guggenheim, E.A. (1967). Thermodynamics. An Advanced Treatment for Chemists and Physicists, fifth revised edition, North-Holland Publishing Company, Amsterdam, page 157.
F. Pobell, Matter and Methods at Low Temperatures, (Springer-Verlag, Berlin, 2007)
Einstein and the Quantum, A. Douglas Stone, Princeton University Press, 2013.
Further reading
Goldstein, Martin & Inge F. (1993) The Refrigerator and the Universe. Cambridge MA: Harvard University Press. ISBN 0-674-75324-0. Chpt. 14 is a nontechnical discussion of the Third Law, one including the requisite elementary quantum mechanics.
Braun, S.; Ronzheimer, J. P.; Schreiber, M.; Hodgman, S. S.; Rom, T.; Bloch, I.; Schneider, U. (2013). "Negative Absolute Temperature for Motional Degrees of Freedom". Science. 339 (6115): 52–5. arXiv:1211.0545. Bibcode:2013Sci...339...52B. doi:10.1126/science.1227831. PMID 23288533. S2CID 8207974. Lay summary – New Scientist (3 January 2013).
Levy, A.; Alicki, R.; Kosloff, R. (2012). "Quantum refrigerators and the third law of thermodynamics". Physical Review E. 85 (6): 061126. arXiv:1205.1347. Bibcode:2012PhRvE..85f1126L. doi:10.1103/PhysRevE.85.061126. PMID 23005070. S2CID 24251763.