Chapter 6: Heat Capacity, Enthalpy, Entropy & Third Law
ⓘ This audio and summary are simplified educational interpretations and are not a substitute for the original text.
The sixth chapter establishes a crucial thermodynamic framework by defining heat capacity, enthalpy, and entropy, and by introducing the fundamental Third Law of Thermodynamics. Heat capacity is defined as the ratio of thermal energy added to the resulting rise in temperature, specified either at constant volume (linked to the change in internal energy with temperature) or at constant pressure (linked to the change in enthalpy with temperature). These heat capacities are in turn related to the second derivatives, or rates of change, of the fundamental Gibbs and Helmholtz free energies.

Historically, the empirical law of Dulong and Petit approximated the constant-volume molar heat capacity of all solid elements as three times the gas constant R at high temperatures, a result extended to solid chemical compounds by Kopp's rule. All real substances, however, show decreased heat capacities at low temperatures, consistent with the principles of quantum theory.

Changes in enthalpy (ΔH) are calculated by integrating the constant-pressure molar heat capacity over the temperature range of interest. By standard convention, the enthalpy of an element in its stable state is assigned the value zero at 298 K and 1 atm. Reactions in which the system absorbs thermal energy (positive ΔH) are termed endothermic, while those that evolve thermal energy (negative ΔH) are termed exothermic. The temperature dependence of the enthalpy change of a phase transformation or reaction is expressed mathematically by Kirchhoff's law.

The most significant concept introduced is the Third Law of Thermodynamics, which arose from experimental data collected by Richards showing that the Gibbs free energy change and the enthalpy change of a reaction converge, and their rates of change with temperature approach zero, as the temperature approaches 0 K.
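As a sketch of how the enthalpy integration might be carried out in practice, the snippet below integrates an empirical heat-capacity expression of the common fitted form c_p(T) = a + bT + cT⁻² analytically. The coefficients and temperatures are illustrative placeholders, not data from the text; with difference coefficients Δa, Δb, Δc for a reaction, the same integral gives the Kirchhoff correction to ΔH.

```python
# Sketch: enthalpy change from an empirical heat-capacity expression,
# c_p(T) = a + b*T + c*T**-2 in J/(mol*K). Coefficients below are
# illustrative placeholders, not values from the chapter.

def delta_h(a, b, c, t1, t2):
    """Integrate c_p dT analytically from t1 to t2; returns J/mol."""
    def h(t):
        # Antiderivative of a + b*t + c*t**-2
        return a * t + 0.5 * b * t**2 - c / t
    return h(t2) - h(t1)

# Heating 1 mol from 298 K to 1000 K with placeholder coefficients:
dh = delta_h(a=22.6, b=6.3e-3, c=0.0, t1=298.0, t2=1000.0)
print(round(dh, 1))  # 18735.5 J/mol absorbed (positive, i.e. endothermic)
```

Passing the differences Δa, Δb, Δc between products and reactants instead of single-phase coefficients turns the same function into an application of Kirchhoff's law.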
This led to the Nernst heat theorem, which postulated that the entropy change of a reaction involving only liquids and solids (the integration constant in the free-energy expression) is zero at 0 K. The modern Nernst–Planck–Simon statement generalizes this by asserting that the entropy S of any homogeneous substance in complete internal equilibrium is zero at 0 K. This imposes constraints on stable ground-state phases, which must have zero configurational entropy: elements must be crystalline, and alloys must form fully ordered, stoichiometric compounds or decompose.

Numerous physical consequences follow: the constant-pressure heat capacity, the thermal expansion coefficient, and the thermal coefficient of pressure must all approach zero as the temperature approaches 0 K. An alternative phrasing is the unattainability statement, which holds that 0 K cannot be reached in a finite number of operations. Apparent failures of the Third Law, such as the non-zero entropy of glasses or of random solid solutions, are attributed to the material not being in internal equilibrium.

The absolute entropy of a material at any temperature is obtained by integrating the ratio of the constant-pressure heat capacity to temperature, starting from 0 K. Finally, two empirical rules estimate the entropy changes of transitions: Richards' rule holds that the molar entropy of fusion is approximately constant for similar metals (about 9.6 J/(mol·K) for FCC metals), and Trouton's rule holds that the molar entropy of boiling is approximately constant (about 87 J/(mol·K) for many metals). The influence of pressure on the enthalpy and entropy of condensed phases is typically negligible over the common range of 0 to 1 atm.
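The two empirical rules can be turned into quick estimates of transition enthalpies, since ΔH_transition = ΔS_transition · T_transition for an equilibrium transformation. The sketch below uses the rule constants from the chapter; the melting and boiling temperatures are illustrative placeholders, not data from the text.

```python
# Sketch: Richards'-rule and Trouton's-rule estimates of transition
# enthalpies. Rule constants are the approximate values quoted in the
# chapter; the temperatures used below are hypothetical placeholders.

RICHARDS_S_FUSION = 9.6   # J/(mol*K), approx. molar entropy of fusion (FCC metals)
TROUTON_S_BOILING = 87.0  # J/(mol*K), approx. molar entropy of boiling

def enthalpy_of_fusion(t_melt_kelvin):
    """Delta H_m ~= Delta S_m * T_m, returned in J/mol."""
    return RICHARDS_S_FUSION * t_melt_kelvin

def enthalpy_of_boiling(t_boil_kelvin):
    """Delta H_b ~= Delta S_b * T_b, returned in J/mol."""
    return TROUTON_S_BOILING * t_boil_kelvin

# Hypothetical metal melting at 1400 K and boiling at 2500 K:
print(round(enthalpy_of_fusion(1400.0), 1))   # 13440.0 J/mol
print(round(enthalpy_of_boiling(2500.0), 1))  # 217500.0 J/mol
```

Because both rules fix only the entropy of the transition, the estimated enthalpy scales linearly with the transition temperature, which is why high-melting metals have correspondingly large heats of fusion.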