Here we analyze another kind of memoryless discrete process: branching processes, also termed “chain reactions” under a more physical inspiration. Before that, we carefully deepen and generalize our knowledge of the very useful tool of generating functions. This is then applied to the study of the dynamics of a population, predicting whether it will certainly become extinct – and how fast – or whether it will be self-sustaining.
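A minimal sketch of the generating-function machinery at work (the offspring distribution below is invented for the example): the extinction probability of a branching process is the smallest fixed point q = G(q) of the offspring generating function, and simple iteration from q = 0 finds it.

```python
# Sketch: extinction probability of a Galton-Watson branching process.
# The offspring distribution p_k is a made-up example; iterating
# q <- G(q) from q = 0 converges to the smallest fixed point of G.

# Offspring distribution: P(0 children)=0.2, P(1)=0.3, P(2)=0.5
p = [0.2, 0.3, 0.5]

def G(s, p):
    """Probability generating function G(s) = sum_k p_k s^k."""
    return sum(pk * s**k for k, pk in enumerate(p))

q = 0.0
for _ in range(200):          # fixed-point iteration q_{n+1} = G(q_n)
    q = G(q, p)

mean_offspring = sum(k * pk for k, pk in enumerate(p))   # G'(1)
print(f"mean offspring m = {mean_offspring:.2f}")        # m > 1: supercritical
print(f"extinction probability q = {q:.4f}")             # here q = 0.4 < 1
```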
In this chapter we study the first example of a correlated memoryless phenomenon: the famous “drunkard’s walk”, formally termed the random walk. We begin with a very simple case, in a homogeneous and isotropic space on a discrete hypercubic lattice. Then we add traps here and there. Eventually we make a foray into the continuous regime, with the Fokker–Planck diffusion equation (which, as we shall see, is what physicists call a Schrödinger equation in imaginary time) and the stochastic differential Langevin equation.
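A minimal sketch of the discrete-to-continuous connection mentioned above (step size, time step, and diffusion constant are arbitrary illustrative choices): an unbiased 1D lattice walk and an Euler–Maruyama integration of the overdamped Langevin equation dx = √(2D) dW both display the same diffusive spreading ⟨x²⟩ ≈ 2Dt.

```python
# Sketch: 1D symmetric random walk vs. overdamped Langevin diffusion.
# Parameters are arbitrary; both processes should show <x^2> ~ 2 D t.
import random
import math

random.seed(0)
n_walkers, n_steps = 2000, 1000

# Discrete lattice walk: steps of +-1, so D = 1/2 per unit time.
msd_lattice = 0.0
for _ in range(n_walkers):
    x = 0
    for _ in range(n_steps):
        x += random.choice((-1, 1))
    msd_lattice += x * x
msd_lattice /= n_walkers

# Langevin / Euler-Maruyama: dx = sqrt(2 D) dW with D = 1/2.
D, dt = 0.5, 1.0
msd_langevin = 0.0
for _ in range(n_walkers):
    x = 0.0
    for _ in range(n_steps):
        x += math.sqrt(2 * D * dt) * random.gauss(0.0, 1.0)
    msd_langevin += x * x
msd_langevin /= n_walkers

print(f"lattice  <x^2> = {msd_lattice:.1f}  (theory 2Dt = {2*D*n_steps:.1f})")
print(f"langevin <x^2> = {msd_langevin:.1f}  (theory 2Dt = {2*D*n_steps:.1f})")
```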
This chapter builds upon the previous chapters, applying the method of combining probability theory with Hamiltonian mechanics. To do so, one needs to build a meaningful sample space over states, in this case quantum states. A substantial part of the chapter discusses how to construct these quantum states, out of which one can build a sample space on which to apply a probability measure. Vector states and density operators are introduced and various worked examples are proposed. Once the quantum sample space is identified, equilibrium quantum statistical mechanics is formulated. The ‘particle in a box’ problem turns out to be analytically intractable unless we take a certain limit, called the semi-classical limit. Heuristics as to what this limit means are proposed. Finally, the von Neumann (quantum) entropy is introduced and analogies with thermodynamics are made. An application to the heat capacity of solids is presented. As a complement, the chapter also introduces a classical ‘ring-polymer’ analogue of quantum statistical mechanics, stating the formal equivalence between a one-particle quantum canonical system and an N-particle classical canonical system.
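A small numerical sketch of the density-operator formalism (the two-level Hamiltonian and temperature are invented for the example, with units such that k_B = 1): build the canonical density operator ρ = e^{−βH}/Z, evaluate the von Neumann entropy S = −Tr(ρ ln ρ), and check the thermodynamic identity ⟨H⟩ − TS = −T ln Z.

```python
# Sketch: canonical density operator and von Neumann entropy for a
# two-level system. Energy splitting and temperature are illustrative.
import numpy as np

eps, T = 1.0, 0.75          # level splitting and temperature (k_B = 1)
beta = 1.0 / T
H = np.diag([0.0, eps])     # two-level Hamiltonian

# Canonical density operator rho = exp(-beta H) / Z.
w = np.exp(-beta * np.diag(H))
Z = w.sum()
rho = np.diag(w / Z)

# Von Neumann entropy S = -Tr(rho ln rho), via eigenvalues of rho.
p = np.linalg.eigvalsh(rho)
S = -np.sum(p * np.log(p))

E = np.trace(rho @ H)       # mean energy <H> = Tr(rho H)
F = -T * np.log(Z)          # Helmholtz free energy
print(f"Z = {Z:.4f},  <H> = {E:.4f},  S = {S:.4f}")
print(f"check: <H> - T S = {E - T*S:.4f}  vs  F = {F:.4f}")
```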
Here we return to Markov processes with discrete states, but this time in continuous time. We first consider, study, and solve specific examples such as Poisson processes, divergent birth processes, and birth-and-death processes. We derive the master equations for their probability distributions, and derive and discuss important solutions. In particular, we deepen Feller’s theory of divergent birth processes. Finally, we formally study general stationary Markov processes, writing down the forward and the backward Kolmogorov master equations.
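A minimal sketch for the simplest process of this family (rate, time horizon, truncation, and step size are illustrative choices): the Poisson-process master equation dP_n/dt = λ(P_{n−1} − P_n) can be integrated numerically and compared against the exact Poisson solution.

```python
# Sketch: integrate the Poisson-process master equation
#   dP_n/dt = lam * (P_{n-1} - P_n),   P_n(0) = delta_{n,0},
# by forward Euler, and compare with P_n(t) = (lam t)^n e^{-lam t}/n!.
import math

lam, t_final, n_max, dt = 1.0, 5.0, 60, 1e-3

P = [0.0] * (n_max + 1)
P[0] = 1.0

for _ in range(int(t_final / dt)):          # forward-Euler time stepping
    dP = [lam * ((P[n - 1] if n > 0 else 0.0) - P[n])
          for n in range(n_max + 1)]
    P = [P[n] + dt * dP[n] for n in range(n_max + 1)]

for n in (0, 3, 5, 8):
    exact = (lam * t_final)**n * math.exp(-lam * t_final) / math.factorial(n)
    print(f"P_{n}(t={t_final}) numeric {P[n]:.5f}  exact {exact:.5f}")
```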
The analysis of experimental scalar data is tackled here. Starting from the basic analysis of a large number of well-behaved data, which eventually display Gaussian distributions, we move on to Bayesian inference and face the cases of few (or even no) data, sometimes badly behaved. We first present methods to analyze data whose ideal distribution is known, and then we show methods to make predictions even when our ignorance about the data distribution is total. Eventually, various resampling methods are provided to deal with time-correlated measurements, biased estimators, anomalous data, and under- or over-estimation of statistical errors.
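As one concrete instance of the resampling methods mentioned (the synthetic data set is invented for the example), here is a minimal bootstrap sketch estimating the statistical error on a sample mean by drawing resamples with replacement.

```python
# Sketch: bootstrap estimate of the standard error of the sample mean.
# The synthetic data are generated for illustration only.
import random

random.seed(1)
data = [random.gauss(10.0, 2.0) for _ in range(100)]   # fake measurements

def mean(xs):
    return sum(xs) / len(xs)

n_boot = 2000
boot_means = []
for _ in range(n_boot):
    resample = [random.choice(data) for _ in data]     # draw with replacement
    boot_means.append(mean(resample))

m = mean(boot_means)
err = (sum((b - m)**2 for b in boot_means) / (n_boot - 1))**0.5
print(f"mean = {mean(data):.3f} +/- {err:.3f}  (bootstrap standard error)")
print(f"naive sigma/sqrt(N) = {2.0 / 100**0.5:.3f}")   # known sigma, for comparison
```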
This short chapter aims at motivating the interest in statistical mechanics. It starts with a brief description of the historical context within which the theory developed, and ponders its status, or lack thereof, in the public eye. A first, original parallel between the use of statistics and mechanics is drawn in the context of error propagation analysis, which can also be treated within statistical mechanics. With regard to the situations statistical mechanics can be applied to, two categories are distinguished: experimental/protocol error, and an observational state that underdetermines the mechanical state of the system. The rest of the chapter puts the emphasis on the latter category, and explains how statistical mechanics plays the role of a ‘Rosetta Stone’, translating between different modes of description of the same system, thereby giving tools to infer relations between observational variables, for which we usually do not have any fundamental theory, from the physics of the underlying constituents, which is presumed to be that of Hamiltonian classical or quantum mechanics.
As we realize that random walks, chain reactions, and recurrent events are all Markov chains, i.e., correlated processes without memory, in this chapter we derive a general theory, including the classification and properties of single states and of chains. In particular, we focus on the building blocks of the theory, i.e., irreducible chains, presenting and proving a number of fundamental and useful theorems. We end up deriving the balance equation for the limit probability and the approach to the limit at long times, developing and applying the Perron–Frobenius theory for non-negative matrices and the spectral decomposition for non-Hermitian matrices. Among the applications of the theory, we highlight the ranking of Web pages by search engines.
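A minimal sketch of the Perron–Frobenius ideas applied to page ranking (the tiny link graph and damping factor are invented for the example): power iteration on an irreducible, aperiodic stochastic matrix converges to its unique stationary distribution, which is the PageRank-style score.

```python
# Sketch: PageRank-style ranking via power iteration on a toy 4-page
# link graph. Graph and damping factor d = 0.85 are illustrative;
# Perron-Frobenius theory guarantees a unique stationary vector.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}   # page -> pages it links to
n, d = 4, 0.85

# Column-stochastic link matrix: M[j, i] = 1/outdeg(i) if i links to j.
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

G = d * M + (1 - d) / n * np.ones((n, n))        # damped "Google matrix"

r = np.full(n, 1.0 / n)
for _ in range(100):                             # power iteration
    r = G @ r
print("rank vector:", np.round(r, 4), f" sum = {r.sum():.4f}")
```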
This chapter is concerned with Gibbs’ statistical mechanics. It relies on developing the constraints imposed by Hamiltonian mechanics on the time evolution of a general probability density function in phase space. This is effectively done by using the notions of Hamiltonian flow and material derivative. Combining conservation of probability with Liouville’s theorem of Hamiltonian mechanics gives rise to Liouville’s equation, which is a cornerstone of both time-dependent and equilibrium statistical mechanics. From there on, the chapter focuses on equilibrium statistical mechanics and introduces the canonical and microcanonical Gibbs ensembles. The chapter takes a step-by-step approach where the main ideas are presented first for one particle in one dimension of space, and then reformulated in increasingly complex situations. Important properties, such as the partition function acting as a moment generating function, are derived and put into practice. A whole section is dedicated to little-known works by Gibbs on statistical mechanics for identical particles. Finally, the grand canonical ensemble is also introduced.
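A small numerical check of the generating-function property mentioned above (the three-level spectrum and inverse temperature are invented for the example): for the canonical ensemble, −∂ ln Z/∂β gives the mean energy and ∂² ln Z/∂β² gives the energy variance.

```python
# Sketch: ln Z as a cumulant generating function for a three-level toy
# system. We check <E> = -d lnZ/d beta and Var(E) = d^2 lnZ/d beta^2
# by central finite differences. Energies and beta are illustrative.
import math

E_levels = [0.0, 1.0, 2.5]     # made-up energy spectrum
beta = 0.8

def lnZ(b):
    return math.log(sum(math.exp(-b * E) for E in E_levels))

# Direct canonical averages.
w = [math.exp(-beta * E) for E in E_levels]
Z = sum(w)
p = [wi / Z for wi in w]
E_mean = sum(pi * E for pi, E in zip(p, E_levels))
E_var = sum(pi * E**2 for pi, E in zip(p, E_levels)) - E_mean**2

# Derivatives of lnZ by central differences.
h = 1e-5
dlnZ = (lnZ(beta + h) - lnZ(beta - h)) / (2 * h)
d2lnZ = (lnZ(beta + h) - 2 * lnZ(beta) + lnZ(beta - h)) / h**2

print(f"<E>    = {E_mean:.6f}   -dlnZ/dbeta  = {-dlnZ:.6f}")
print(f"Var(E) = {E_var:.6f}   d2lnZ/dbeta2 = {d2lnZ:.6f}")
```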
This chapter follows a logic of exposition initiated by Gibbs in 1902. On the one hand, some theoretical results in statistical mechanics were derived in Chapter 3; on the other hand, some theoretical/experimental results are expressed within thermodynamics, and parallels are drawn between the two approaches. To this end, the theory of thermodynamics and its laws are presented. The chapter takes an approach where each stated law is attached to a readable source and a specific author’s writing. The exposition of the second law follows the axiomatics of Carathéodory, for example. This has the advantage of decoupling the physics from the mathematics. The structure of thermodynamic theory, with the scaling behaviour of thermodynamic variables, Massieu potentials, and Legendre transformations, is also developed. Finally, correspondence relations are postulated between thermodynamics and statistical mechanics, allowing one to interpret thermodynamic variables as observational states associated with certain probability laws. Applications are given, including the Gibbs paradox. The equivalence between the canonical and the microcanonical ensembles is analysed in detail.
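A small sketch of the Legendre-transform structure at play here (the fundamental relation U(S) = S² is a made-up convex example; the construction itself is the standard one for thermodynamic potentials): a Helmholtz-type potential can be obtained numerically as F(T) = min_S [U(S) − TS].

```python
# Sketch: numerical Legendre transform F(T) = min_S [U(S) - T S] for a
# toy fundamental relation U(S) = S^2, whose exact transform is -T^2/4.
import numpy as np

S = np.linspace(0.0, 5.0, 5001)      # entropy grid
U = S**2                             # toy convex fundamental relation

for T in (1.0, 2.0, 3.0):
    F = np.min(U - T * S)            # Legendre transform at temperature T
    print(f"T = {T}:  F numeric = {F:.4f},  exact -T^2/4 = {-T*T/4:.4f}")
```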
Yet another memoryless correlated discrete process is considered: recurrent events. These are classified in different ways, and a whole theory is developed to describe the possible behaviors. Special attention is devoted to the proof of the limit probability theorem, whose lengthy details are reported in an appendix, so as not to scare readers. The theory of recurrent events is particularly useful because many of its properties and mathematical theorems can be straightforwardly translated into the more general theory of Markov chains.
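A minimal sketch of the limit probability theorem (the aperiodic recurrence-time distribution below is invented for the example): the probability u_n that the event occurs at step n, generated by the renewal equation, tends to 1/μ, where μ is the mean recurrence time.

```python
# Sketch: recurrent-event (renewal) limit theorem. With a made-up
# aperiodic recurrence-time distribution f_k, the probabilities from
#   u_n = sum_{k=1}^{n} f_k u_{n-k},   u_0 = 1,
# converge to 1/mu, mu being the mean recurrence time.
f = {1: 0.3, 2: 0.5, 3: 0.2}                 # P(first return at step k)
mu = sum(k * fk for k, fk in f.items())      # mean recurrence time

u = [1.0]                                    # u_0 = 1
for n in range(1, 51):
    u.append(sum(f.get(k, 0.0) * u[n - k] for k in range(1, n + 1)))

print(f"mu = {mu:.3f},  1/mu = {1/mu:.5f}")
for n in (5, 20, 50):
    print(f"u_{n} = {u[n]:.5f}")             # approaches 1/mu
```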
This chapter is devoted to correlations. We take up the central limit theorem once again, first with a couple of specific examples solved with considerable – but instructive – effort: Markov chains and recurrent events. Then, we generalize the machinery of generating functions to multivariate, correlated systems of stochastic variables, until we are able to prove the central limit theorem and the large deviations theorem for correlated events. We go back to the Markov chain central limit example to show how the theorem massively simplifies things. Eventually, we show how correlations and the lack of a Gaussian central limit are linked to phase transitions in statistical physics.
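A minimal simulation sketch of the correlated central limit behavior discussed here (chain parameters are invented for the example): sums along a symmetric two-state Markov chain remain asymptotically Gaussian, but the correlations enhance the variance of S_n from n (the i.i.d. value) to n(1−p)/p, where p is the flip probability.

```python
# Sketch: CLT with correlations. States x = +-1 follow a symmetric
# two-state Markov chain with flip probability p (illustrative value);
# autocorrelations (1-2p)^k enhance Var(S_n)/n to (1-p)/p.
import random

random.seed(2)
p, n, n_chains = 0.25, 2000, 1000

var_acc = 0.0
for _ in range(n_chains):
    x, S = random.choice((-1, 1)), 0         # start in the stationary law
    for _ in range(n):
        S += x
        if random.random() < p:              # flip with probability p
            x = -x
    var_acc += S * S                         # E[S_n] = 0 by symmetry
var_S = var_acc / n_chains

print(f"Var(S_n)/n simulated = {var_S / n:.3f}")
print(f"theory (1-p)/p       = {(1 - p) / p:.3f}")
```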