The Choquet–Deny theorem states that any random walk on a nilpotent group is Liouville. This theorem is presented and proved. We then present a recent result from 2018 by Frisch, Hartman, Tamuz, and Vahidi Ferdowsi, showing that these are essentially the only such examples.
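For orientation, one standard way to state the Liouville property in question (the chapter's precise formulation may differ): for a probability measure \(\mu\) on a nilpotent group \(G\) whose support generates \(G\), every bounded \(\mu\)-harmonic function is constant, i.e.
\[
h(g) \;=\; \sum_{x \in G} h(gx)\,\mu(x) \quad \text{for all } g \in G
\qquad \Longrightarrow \qquad h \text{ is constant}
\]
for every bounded \(h \colon G \to \mathbb{R}\).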
In this chapter we start applying the tools developed in Part I to study random walks. The notion of amenable groups is defined, and Kesten’s criterion for amenability is proved. We then move on to define the notion of isoperimetric dimension. Inequalities relating the volume growth of a group to its isoperimetric dimension and to the decay of the heat kernel are proved.
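As a point of reference, Kesten’s criterion is often stated as follows (exact hypotheses vary; here \(\mu\) is a symmetric probability measure whose support generates \(G\)):
\[
G \text{ is amenable} \quad \Longleftrightarrow \quad \limsup_{n \to \infty} \big(\mu^{*2n}(e)\big)^{1/2n} \;=\; 1 ,
\]
i.e. the return probabilities of the random walk decay subexponentially precisely when \(G\) is amenable.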
In this chapter the basic theory of Markov chains is developed, with a focus on irreducible chains. The transition matrix is introduced, as well as the notions of irreducibility, periodicity, recurrence (null and positive), and transience. The theory is applied to the relationship between a random walk on a group and the random walk on a finite-index subgroup induced by the "hitting measure."
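Concretely, for a random walk on a group \(G\) with step distribution \(\mu\) (generic notation, not necessarily the chapter's), the transition matrix and its powers are
\[
P(x,y) \;=\; \mu\big(x^{-1}y\big), \qquad P^{n}(x,y) \;=\; \Pr_x\!\left[X_n = y\right],
\]
and the chain is irreducible when for every pair \(x, y\) there is some \(n\) with \(P^{n}(x,y) > 0\); for the random walk this amounts to the support of \(\mu\) generating \(G\) as a semigroup.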
Research in recent years has highlighted the deep connections between the algebraic, geometric, and analytic structures of a discrete group. New methods and ideas have resulted in an exciting field, with many opportunities for new researchers. This book is an introduction to the area from a modern vantage point. It covers the main basics, such as Kesten's amenability criterion, the Coulhon–Saloff-Coste inequality, random walk entropy and bounded harmonic functions, the Choquet–Deny Theorem, the Milnor–Wolf Theorem, and a complete proof of Gromov's Theorem on groups of polynomial growth. The book is especially appropriate for young researchers and those new to the field, and is accessible even to graduate students. An abundance of examples, exercises, and solutions encourage self-reflection and the internalization of the concepts introduced. The author also points to open problems and possibilities for further research.
This collection of four short courses looks at group representations, graph spectra, statistical optimality, and symbolic dynamics, highlighting their common roots in linear algebra. It leads students from the very beginnings in linear algebra to high-level applications: representations of finite groups, leading to probability models and harmonic analysis; eigenvalues of growing graphs from quantum probability techniques; statistical optimality of designs from Laplacian eigenvalues of graphs; and symbolic dynamics, applying matrix stability and K-theory. An invaluable resource for researchers and beginning Ph.D. students, this book includes copious exercises, notes, and references.
In this chapter, we look at the moments of a random variable. Specifically, we demonstrate that moments capture useful information about the tail of a random variable while often being simpler to compute or at least bound. Several well-known inequalities quantify this intuition. Although they are straightforward to derive, such inequalities are surprisingly powerful. Through a range of applications, we illustrate the utility of controlling the tail of a random variable, typically by allowing one to dismiss certain “bad events” as rare. We begin by recalling the classical Markov and Chebyshev inequalities. Then we discuss three of the most fundamental tools in discrete probability and probabilistic combinatorics. First, we derive the complementary first and second moment methods, and give several standard applications, especially to threshold phenomena in random graphs and percolation. Then we develop the Chernoff–Cramér method, which relies on the “exponential moment” and is the building block for large deviations bounds. Two key applications in data science are briefly introduced: sparse recovery and empirical risk minimization.
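For reference, the two inequalities recalled at the start read, for \(a > 0\),
\[
\Pr[X \ge a] \;\le\; \frac{\mathbb{E}[X]}{a} \quad (X \ge 0),
\qquad
\Pr\big[\,|X - \mathbb{E}X| \ge a\,\big] \;\le\; \frac{\operatorname{Var}(X)}{a^{2}},
\]
and the exponential-moment bound underlying the Chernoff–Cramér method is
\[
\Pr[X \ge a] \;\le\; e^{-\lambda a}\, \mathbb{E}\big[e^{\lambda X}\big] \qquad (\lambda > 0),
\]
which one then optimizes over \(\lambda\).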
In this chapter, we move on to coupling, another probabilistic technique with a wide range of applications (far beyond discrete stochastic processes). The idea behind the coupling method is deceptively simple: to compare two probability measures, it is sometimes useful to construct a joint probability space with the corresponding marginals. We begin by defining coupling formally and deriving its connection to the total variation distance through the coupling inequality. We illustrate the basic idea on a classical Poisson approximation result, which we apply to the degree sequence of an Erdős–Rényi graph. Then we introduce the concept of stochastic domination and some related correlation inequalities. We develop a key application in percolation theory. Coupling of Markov chains is the next topic, where it serves as a powerful tool to derive mixing time bounds. Finally, we end with the Chen–Stein method for Poisson approximation, a technique that applies in particular in some natural settings with dependent variables.
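The coupling inequality mentioned above is usually stated as: for probability measures \(\mu, \nu\) and any coupling \((X, Y)\) with \(X \sim \mu\) and \(Y \sim \nu\),
\[
\|\mu - \nu\|_{\mathrm{TV}} \;=\; \sup_{A} \big|\mu(A) - \nu(A)\big| \;\le\; \Pr[X \ne Y],
\]
with equality attained by a suitably chosen (optimal) coupling.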
In this chapter, we develop spectral techniques. We highlight some applications to Markov chain mixing and network analysis. The main tools are the spectral theorem and the variational characterization of eigenvalues, which we review together with some related results. We also give a brief introduction to spectral graph theory and detail an application to community recovery. Then we apply the spectral theorem to reversible Markov chains. In particular we define the spectral gap and establish its close relationship to the mixing time. We also show that the spectral gap can be bounded using certain isoperimetric properties of the underlying network. We prove Cheeger’s inequality, which quantifies this relationship, and introduce expander graphs, an important family of graphs with good “expansion.” Applications to mixing times are also discussed. One specific technique is the “canonical paths method,” which bounds the spectral gap by formalizing a notion of congestion in the network.
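As a reference point, Cheeger’s inequality for a reversible chain is commonly written (constants and conventions vary) as
\[
\frac{\Phi_*^{2}}{2} \;\le\; \gamma \;\le\; 2\,\Phi_*,
\]
where \(\gamma = 1 - \lambda_{2}\) is the spectral gap and \(\Phi_*\) is the bottleneck (conductance) ratio of the underlying network.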