In preparation for our study of statistical thermodynamics, we first review some fundamental notions of probability theory, with a special focus on those statistical concepts relevant to atomic and molecular systems. Depending on your background, you might be able to quickly scan Sections 2.1–2.3, but you should pay careful attention to Sections 2.4–2.7.
Probability: Definitions and Basic Concepts
Probability theory is concerned with predicting statistical outcomes. Simple examples of such outcomes include observing a head or tail when tossing a coin, or obtaining the numbers 1, 2, 3, 4, 5, or 6 when throwing a die. For a fairly-weighted coin, we would, of course, expect to see a head for 1∕2 of a large number of tosses; similarly, using a fairly-weighted die, we would expect to get a four for 1∕6 of all throws. We can then say that the probability of observing a head on one toss of a fairly-weighted coin is 1∕2 and that for obtaining a four on one throw of a fairly-weighted die is 1∕6. This heuristic notion of probability can be given mathematical formality via the following definition:
Given Ns mutually exclusive, equally likely points in sample space, with Ne of these points corresponding to the random event A, the probability of event A is P(A) = Ne/Ns.
Here, sample space designates the available Ns occurrences while random event A denotes the subset of sample space given by Ne ≤ Ns.
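As a concrete illustration of this definition, the following minimal Python sketch (not part of the original text; the event and trial count are arbitrary choices) computes P(A) = Ne/Ns for the fair-die example and checks it against the observed frequency in a large number of simulated throws.

```python
# A minimal sketch illustrating P(A) = Ne/Ns for a fairly-weighted die:
# the event A is "throw a four", so Ne = 1 of the Ns = 6 equally likely
# sample-space points.
import random

Ns = 6                      # equally likely points in sample space (faces 1..6)
Ne = 1                      # points corresponding to event A (the face "4")
P_A = Ne / Ns               # definition: P(A) = Ne / Ns
print(f"P(four) by definition = {P_A:.4f}")

# A frequency check: the fraction of fours in many simulated throws
# should approach P(A) for a fair die.
trials = 100_000
fours = sum(1 for _ in range(trials) if random.randint(1, 6) == 4)
print(f"P(four) by simulation = {fours / trials:.4f}")
```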
We have previously shown that the translational energy mode for an ideal gas, even through a shock wave, invariably displays classical equilibrium behavior. In contrast, the rotational, vibrational, and electronic modes generally require significant time for re-equilibration upon disturbances in their equilibrium particle distributions. On this basis, we may expand our statistical discourse to nonequilibrium topics by grounding any dynamic redistribution on the presumption of translational equilibrium. For this reason, we now shift to elementary kinetic theory, which focuses solely on the translational motion of a gaseous assembly. Specifically, in this chapter, we consider equilibrium kinetic theory and its applications to velocity distributions, surface collisions, and pressure calculations. We then proceed to nonequilibrium kinetic theory with particular emphasis on calculations of transport properties and chemical reaction rates, as pursued in Chapters 16 and 17, respectively.
The Maxwell–Boltzmann Velocity Distribution
In Section 9.1, we showed that the translational energy mode for a dilute assembly displays classical behavior because of the inherently minute spacing between its discrete energy levels (Δε ≪ kT).
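The distribution itself is developed later in the chapter; purely as a preview, the sketch below is an assumption based on the standard Maxwell–Boltzmann speed distribution rather than a quotation from the text, and it evaluates f(v) and the most probable speed for a representative gas.

```python
# A minimal sketch (assumed, not the text's derivation) of the standard
# Maxwell-Boltzmann speed distribution for an ideal gas:
#   f(v) = 4*pi * (m / (2*pi*k*T))**1.5 * v**2 * exp(-m*v**2 / (2*k*T))
import math

k = 1.380649e-23            # Boltzmann constant, J/K

def mb_speed_pdf(v, m, T):
    """Probability density of molecular speed v for mass m (kg) at temperature T (K)."""
    a = m / (2.0 * math.pi * k * T)
    return 4.0 * math.pi * a**1.5 * v**2 * math.exp(-m * v**2 / (2.0 * k * T))

# Example: N2 at 300 K; the most probable speed is sqrt(2kT/m).
m_N2 = 28.0134e-3 / 6.02214076e23   # kg per molecule
T = 300.0
v_mp = math.sqrt(2.0 * k * T / m_N2)
print(f"most probable speed ~ {v_mp:.1f} m/s, f(v_mp) = {mb_speed_pdf(v_mp, m_N2, T):.3e} s/m")
```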
Parameters describing internal energy modes for molecular systems are required for statistical calculations of thermodynamic properties. This appendix includes such parameters for both diatomic and polyatomic molecules. Energy-mode parameters are tabulated for selected diatomic molecules in their ground electronic states. Term symbols for these electronic states are included, along with relevant bond lengths and dissociation energies. Similar parameters are also given for diatomic molecules in accessible upper electronic states. Finally, term symbols, rotational constants, and vibrational frequencies (cm−1) are tabulated for selected polyatomic molecules, with a particular focus on triatomic species. All diatomic data have been extracted from Huber and Herzberg (1979), while the polyatomic data have been taken from Herzberg (1991). Additional spectroscopic data are available electronically from NIST (http://webbook.nist.gov/chemistry/).
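In statistical calculations, constants tabulated in cm−1 are most often used after conversion to characteristic temperatures via the standard relation θ = hc(constant in cm−1)/k. The short Python sketch below is an assumed usage example; the N2 constants shown are representative round values, not entries quoted from the appendix.

```python
# A minimal sketch converting tabulated spectroscopic constants in cm^-1 to
# characteristic temperatures: theta = h*c*(wavenumber)/k, i.e. multiplying
# by hc/k ~ 1.4388 cm*K.
H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e10       # speed of light, cm/s
K = 1.380649e-23        # Boltzmann constant, J/K
HC_OVER_K = H * C / K   # ~1.4388 cm*K

def characteristic_temperature(wavenumber_cm1: float) -> float:
    """Characteristic temperature (K) for an energy spacing given in cm^-1."""
    return HC_OVER_K * wavenumber_cm1

# Representative N2 ground-state constants (approximate): Be ~ 2.0 cm^-1, we ~ 2359 cm^-1.
print(f"theta_rot ~ {characteristic_temperature(2.0):.2f} K")
print(f"theta_vib ~ {characteristic_temperature(2359.0):.0f} K")
```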
To this point, our study of statistical thermodynamics has provided a methodology for determining the most probable macrostate when considering an isolated system of independent particles. The most probable macrostate, in turn, has spawned mathematical definitions for both the internal energy and entropy in the dilute limit, thus producing general analytical expressions for all intensive thermodynamic properties of the ideal gas, as discussed in Chapter 4. These properties are inherently expressed in terms of the partition function, which mandates information on those energy levels and degeneracies associated with a particular atom or molecule. Obtaining such data has provided the rationale for our study of quantum mechanics and spectroscopy. Now that we have access to the necessary spectroscopic information, we are finally prepared to calculate the properties of the ideal gas. We begin, for simplicity, with the monatomic gas, which requires only knowledge connected with the translational and electronic energy modes. We then move on to the diatomic gas, which demands additional information based on the rotational and vibrational energy modes. Finally, we consider the polyatomic gas, which thus far has received little attention in our deliberations related to either statistical thermodynamics or quantum mechanics.
The Monatomic Gas
Typical monatomic gases include the noble gases, such as He and Ar, and elemental radicals, such as atomic oxygen and nitrogen. For such gases, rotation and vibration are irrelevant; thus, we need only consider the translational and electronic energy modes.
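As a computational companion to this outline, the following Python sketch is an assumed illustration: the formulas are the standard translational and electronic partition functions, and the helium values are representative rather than taken from the chapter.

```python
# A minimal sketch of the two partition functions needed for a monatomic ideal gas:
#   translational: q_tr = (2*pi*m*k*T / h**2)**1.5 * V
#   electronic:    q_el = sum_j g_j * exp(-eps_j / (k*T))
import math

H = 6.62607015e-34      # Planck constant, J*s
K = 1.380649e-23        # Boltzmann constant, J/K

def q_translational(m: float, T: float, V: float) -> float:
    """Translational partition function for particle mass m (kg) in volume V (m^3)."""
    return (2.0 * math.pi * m * K * T / H**2) ** 1.5 * V

def q_electronic(levels, T: float) -> float:
    """Electronic partition function from (degeneracy, energy in J) pairs."""
    return sum(g * math.exp(-eps / (K * T)) for g, eps in levels)

# Example: helium at 300 K in 1 m^3; He has a nondegenerate ground state (1S0),
# and its excited electronic states lie far above kT, so q_el ~ 1.
m_He = 4.0026e-3 / 6.02214076e23
print(f"q_tr = {q_translational(m_He, 300.0, 1.0):.3e}")
print(f"q_el = {q_electronic([(1, 0.0)], 300.0):.1f}")
```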
Results from a variety of spectroscopic measurements are necessary for computations in statistical thermodynamics. In particular, calculations of atomic and molecular properties usually require knowledge of electronic energy levels and their associated electronic degeneracies. This appendix provides the appropriate data in tabular form for selected atoms and molecules. The atomic tables include electron configurations, term symbols, and energies (cm−1) for the ground state and five additional upper energy levels. For most degenerate energy levels, mean energies are determined from relevant closely-lying values and reported with one less significant digit. In a similar fashion, the molecular tables provide term symbols and electronic energies (cm−1) for the ground state and three additional upper energy levels. The tabulated molecular energies (Te) represent energy gaps between the minima corresponding to internuclear potentials for the ground electronic and each upper electronic state. The atomic data have been taken from compilations made available electronically by the National Institute of Standards and Technology (NIST) (http://physics.nist.gov/PhysRefData), while the molecular data have been extracted from Huber and Herzberg (1979).
The basic concepts of classical thermodynamics can be summarized by invoking the following four postulates (Callen, 1985):
There exist particular states (called equilibrium states) of simple compressible systems that, macroscopically, are characterized completely by the internal energy, U, the volume, V, and the mole or particle numbers of the chemical components.
There exists a function called the entropy, S, of the extensive parameters of any composite system, defined for all equilibrium states and having the following property: The values assumed by the extensive parameters in the absence of an internal constraint are those which maximize the entropy for the composite isolated system.
The entropy of a composite system is additive over the constituent subsystems. Moreover, the entropy is a continuous, differentiable, and monotonically increasing function of the internal energy.
The entropy of any system vanishes in the state for which (∂U/∂S)V,N = 0 (i.e., at the zero of temperature).
Recall that a simple compressible system is defined as one that is macroscopically homogeneous, uncharged, and chemically inert, that is sufficiently large that surface effects can be neglected, and that is not acted on by electric, magnetic, or gravitational fields. Although these four basic postulates are restricted to simple compressible systems, they can readily be extended to more complex systems (Lewis and Randall, 1961).
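To make Postulate II more tangible, the numerical sketch below uses an assumed model of two monatomic-ideal-gas subsystems (not an example from the text): it scans the internal energy split of a composite isolated system and confirms that the entropy-maximizing split equalizes the subsystem temperatures.

```python
# A minimal numerical sketch of Postulate II: for a composite of two
# monatomic-ideal-gas subsystems at fixed total energy, the unconstrained
# energy split is the one that maximizes the total entropy, and it
# equalizes the temperatures T_i = 2*U_i / (3*N_i*k).
import math

K = 1.380649e-23   # Boltzmann constant, J/K

def entropy_energy_part(N: float, U: float) -> float:
    """Energy-dependent part of a monatomic ideal-gas entropy, S ~ (3/2)*N*k*ln U + const."""
    return 1.5 * N * K * math.log(U)

N1, N2 = 1.0e22, 3.0e22          # particle numbers of the two subsystems
U_total = 1.0                    # total internal energy, J

# Scan the internal constraint U1 and pick the split with maximum total entropy.
best_U1 = max((f * U_total for f in (i / 10_000 for i in range(1, 10_000))),
              key=lambda U1: entropy_energy_part(N1, U1) + entropy_energy_part(N2, U_total - U1))
T1 = 2.0 * best_U1 / (3.0 * N1 * K)
T2 = 2.0 * (U_total - best_U1) / (3.0 * N2 * K)
print(f"U1/U = {best_U1 / U_total:.3f}  (expect N1/(N1+N2) = {N1 / (N1 + N2):.3f})")
print(f"T1 = {T1:.2f} K, T2 = {T2:.2f} K (equal at the maximum)")
```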