Based on the long-running Probability Theory course at the Sapienza University of Rome, this book offers a fresh and in-depth approach to probability and statistics, while remaining intuitive and accessible in style. The fundamentals of probability theory are elegantly presented, supported by numerous examples and illustrations, and modern applications are later introduced, giving readers an appreciation of current research topics. The text covers distribution functions, statistical inference and data analysis, and more advanced methods including Markov chains and Poisson processes, widely used in dynamical systems and data science research. The concluding section, 'Entropy, Probability and Statistical Mechanics', unites key concepts from the text with the authors' impressive research experience to provide a clear illustration of these powerful statistical tools in action. Ideal for students and researchers in the quantitative sciences, this book provides an authoritative account of probability theory, written by leading researchers in the field.
Play of Chance and Purpose emphasizes learning probability, statistics, and stochasticity by developing intuition and fostering imagination. The book is meant for undergraduate and graduate students of basic sciences, applied sciences, engineering, and social sciences as an introduction to fundamental as well as advanced topics. The text has evolved out of the author's experience of teaching courses on probability, statistics, and stochastic processes at both undergraduate and graduate levels in India and the United States. Readers will get an opportunity to work on several examples from real-life applications and to pursue projects and case-study analyses as capstone exercises in each chapter. Many projects involve the development of visual simulations of complex stochastic processes. This will augment learners' comprehension of the subject and train them to apply what they have learned to solve hitherto unseen problems in science and engineering.
This text examines Markov chains whose drift tends to zero at infinity, a topic sometimes labelled as 'Lamperti's problem'. It can be considered a subcategory of random walks, which are helpful in studying stochastic models like branching processes and queueing systems. Drawing on Doob's h-transform and other tools, the authors present novel results and techniques, including a change-of-measure technique for near-critical Markov chains. The final chapter presents a range of applications where these special types of Markov chains occur naturally, featuring a new risk process with surplus-dependent premium rate. This will be a valuable resource for researchers and graduate students working in probability theory and stochastic processes.
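As a rough illustration of the setting (a standard formulation, not quoted from the book): for a Markov chain (X_n) on the half-line, asymptotically zero drift means

\[ \mathbb{E}\left[\,X_{n+1}-X_n \mid X_n = x\,\right] \;\longrightarrow\; 0 \qquad \text{as } x \to \infty, \]

with the critical Lamperti regime corresponding to drifts of order c/x, where recurrence and transience are decided by the interplay between the drift and the second moment of the jumps.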
Bringing together years of research into one useful resource, this text empowers the reader to creatively construct their own dependence models. Intended for senior undergraduate and postgraduate students, it takes a step-by-step look at the construction of specific dependence models, including exchangeable, Markov, moving average and, in general, spatio-temporal models. All constructions maintain the desired property of pre-specifying the marginal distribution and keeping it invariant. They do not separate the dependence from the marginals, and the mechanisms used to induce dependence are so general that they can be applied to a very large class of parametric distributions. All constructions are based on appropriate definitions of three building blocks in a Bayesian analysis context: prior distribution, likelihood function and posterior distribution. All results are illustrated with examples and graphical representations. Applications with data and code are interspersed throughout the book, covering fields including insurance and epidemiology.
Brownian motion is an important topic in various applied fields where the analysis of random events is necessary. Introducing Brownian motion from a statistical viewpoint, this detailed text examines the distribution of quadratic plus linear or bilinear functionals of Brownian motion and demonstrates the utility of this approach for time series analysis. It also offers the first comprehensive guide on deriving the Fredholm determinant and the resolvent associated with such statistics. Presuming only a familiarity with standard statistical theory and the basics of stochastic processes, this book brings together a set of important statistical tools in one accessible resource for researchers and graduate students. Readers also benefit from online appendices, which provide probability density graphs and solutions to the chapter problems.
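To indicate the flavour of such results, here is a classical example (the Cameron–Martin formula, stated as background rather than quoted from the book): the Laplace transform of the simplest quadratic functional of Brownian motion W is

\[ \mathbb{E}\left[\exp\left(-\lambda \int_0^1 W(t)^2\,dt\right)\right] \;=\; \left(\cosh\sqrt{2\lambda}\right)^{-1/2}, \qquad \lambda \ge 0, \]

where \(\cosh\sqrt{2\lambda}\) is precisely the Fredholm determinant \(\det(I + 2\lambda K)\) of the covariance operator K with kernel min(s, t).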
The third edition of this highly regarded text provides a rigorous, yet entertaining, introduction to probability theory and the analytic ideas and tools on which the modern theory relies. The main changes are the inclusion of the Gaussian isoperimetric inequality plus many improvements and clarifications throughout the text. With more than 750 exercises, it is ideal for first-year graduate students with a good grasp of undergraduate probability theory and analysis. Starting with results about independent random variables, the author introduces weak convergence of measures and its application to the central limit theorem, and infinitely divisible laws and their associated stochastic processes. Conditional expectation and martingales follow before the context shifts to infinite dimensions, where Gaussian measures and weak convergence of measures are studied. The remainder is devoted to the mutually beneficial connection between probability theory and partial differential equations, culminating in an explanation of the relationship of Brownian motion to classical potential theory.
Research in recent years has highlighted the deep connections between the algebraic, geometric, and analytic structures of a discrete group. New methods and ideas have resulted in an exciting field, with many opportunities for new researchers. This book is an introduction to the area from a modern vantage point. It incorporates the main basics, such as Kesten's amenability criterion, the Coulhon–Saloff-Coste inequality, random walk entropy and bounded harmonic functions, the Choquet–Deny Theorem, the Milnor–Wolf Theorem, and a complete proof of Gromov's Theorem on groups of polynomial growth. The book is especially appropriate for young researchers and those new to the field, and is accessible even to graduate students. An abundance of examples, exercises, and solutions encourages self-reflection and the internalization of the concepts introduced. The author also points to open problems and possibilities for further research.
This collection of four short courses looks at group representations, graph spectra, statistical optimality, and symbolic dynamics, highlighting their common roots in linear algebra. It leads students from the very beginnings in linear algebra to high-level applications: representations of finite groups, leading to probability models and harmonic analysis; eigenvalues of growing graphs from quantum probability techniques; statistical optimality of designs from Laplacian eigenvalues of graphs; and symbolic dynamics, applying matrix stability and K-theory. An invaluable resource for researchers and beginning Ph.D. students, this book includes copious exercises, notes, and references.
Providing a graduate-level introduction to discrete probability and its applications, this book develops a toolkit of essential techniques for analysing stochastic processes on graphs, other random discrete structures, and algorithms. Topics covered include the first and second moment methods, concentration inequalities, coupling and stochastic domination, martingales and potential theory, spectral methods, and branching processes. Each chapter expands on a fundamental technique, outlining common uses and showing them in action on simple examples and more substantial classical results. The focus is predominantly on non-asymptotic methods and results. All chapters provide a detailed background review section, plus exercises and signposts to the wider literature. Readers are assumed to have undergraduate-level linear algebra and basic real analysis, while prior exposure to graduate-level probability is recommended. This much-needed broad overview of discrete probability could serve as a textbook or as a reference for researchers in mathematics, statistics, data science, computer science and engineering.
Written by Sheldon Ross and Erol Peköz, this text familiarises you with advanced topics in probability while keeping the mathematical prerequisites to a minimum. Topics covered include measure theory, limit theorems, bounding probabilities and expectations, coupling and Stein's method, martingales, Markov chains, renewal theory, and Brownian motion. No other text covers all these topics rigorously but at such an accessible level - all you need is an undergraduate-level understanding of calculus and probability. New to this edition are sections on the gambler's ruin problem, Stein's method as applied to exponential approximations, and applications of the martingale stopping theorem. Extra end-of-chapter exercises have also been added, with selected solutions available. This is an ideal textbook for students taking an advanced undergraduate or graduate course in probability. It also represents a useful resource for professionals in relevant application domains, from finance to machine learning.
Discrete quantum walks are quantum analogues of classical random walks. They are an important tool in quantum computing, and a number of algorithms can be viewed as discrete quantum walks, in particular Grover's search algorithm. These walks are constructed on an underlying graph, so there is a close relationship between the properties of a walk and the properties of its graph. This book studies the mathematical problems that arise from this connection, and the different classes of walks that arise. Written at a level suitable for graduate students in mathematics, the only prerequisites are linear algebra and basic graph theory; no prior knowledge of physics is required. The text serves as an introduction to this important and rapidly developing area for mathematicians, and as a detailed reference for computer scientists and physicists working on quantum information theory.
This book studies large deviations for empirical measures and vector-valued additive functionals of Markov chains with general state space. Under suitable recurrence conditions, the ergodic theorem for additive functionals of a Markov chain asserts the almost sure convergence of the averages of a real or vector-valued function of the chain to the mean of that function with respect to the invariant distribution. In the case of empirical measures, the ergodic theorem states almost sure convergence, in a suitable sense, to the invariant distribution. The large deviation theorems provide precise asymptotic estimates, at the logarithmic level, of the probabilities of deviating from the preponderant behavior asserted by the ergodic theorems.
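Schematically, and under suitable conditions (this is a standard formulation, not a quotation from the book), the two statements being compared read

\[ \frac{1}{n}\sum_{k=1}^{n} f(X_k) \;\xrightarrow[n\to\infty]{\text{a.s.}}\; \int f\,d\pi, \qquad \mathbb{P}\!\left(\frac{1}{n}\sum_{k=1}^{n} f(X_k) \in A\right) \;\approx\; e^{-n\,\inf_{x\in A} I(x)}, \]

where π is the invariant distribution and I is a rate function; the second display is shorthand for logarithmic upper and lower bounds over closed and open sets, respectively.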
Compound renewal processes (CRPs) are among the most ubiquitous models arising in applications of probability. At the same time, they are a natural generalization of random walks, among the best-studied classical objects in probability theory. This monograph, written for researchers and graduate students, presents the general asymptotic theory and generalizes many well-known results concerning random walks. The book contains the key limit theorems for CRPs: functional limit theorems; integro-local limit theorems; large and moderately large deviation principles for CRPs in the state space and in the space of trajectories, including large deviation principles in boundary crossing problems, with an explicit form of the rate functionals; and an extension of the invariance principle for CRPs to the domain of moderately large and small deviations. Applications establish the key limit laws for Markov additive processes, including limit theorems in the domains of normal and large deviations.
Stable Lévy processes lie at the intersection of Lévy processes and self-similar Markov processes. Processes in the latter class enjoy a Lamperti-type representation as the space-time path transformation of so-called Markov additive processes (MAPs). This completely new mathematical treatment takes advantage of the fact that the underlying MAP for stable processes can be described explicitly in one dimension and semi-explicitly in higher dimensions, and uses this approach to catalogue a large number of explicit results describing the path fluctuations of stable Lévy processes in one and higher dimensions. Written for graduate students and researchers in the field, this book systematically establishes many classical results and presents many recent results from the last decade, including previously unpublished material. Topics explored include first hitting laws for a variety of sets, path conditionings, law-preserving path transformations, the distribution of extremal points, growth envelopes and winding behaviour.
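In the simplest, one-dimensional positive case, the Lamperti representation referred to here takes the following standard form (the Lamperti–Kiu/MAP version needed for real-valued and higher-dimensional processes is more involved): a positive self-similar Markov process X of index α > 0 started from x > 0 can be written as

\[ X_t \;=\; x\,\exp\!\big(\xi_{\tau(t x^{-\alpha})}\big), \qquad \tau(s) \;=\; \inf\left\{ u \ge 0 : \int_0^u e^{\alpha \xi_r}\,dr > s \right\}, \]

where ξ is a Lévy process; replacing ξ by a Markov additive process yields the Lamperti-type representation via MAPs mentioned above.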
In pioneering work in the 1950s, S. Karlin and J. McGregor showed that probabilistic aspects of certain Markov processes can be studied by analyzing orthogonal eigenfunctions of associated operators. In the decades since, many authors have extended and deepened this surprising connection between orthogonal polynomials and stochastic processes. This book gives a comprehensive analysis of the spectral representation of the most important one-dimensional Markov processes, namely discrete-time birth-death chains, birth-death processes and diffusion processes. It brings together the main results from the extensive literature on the topic with detailed examples and applications. Also featuring an introduction to the basic theory of orthogonal polynomials and a selection of exercises at the end of each chapter, it is suitable for graduate students with a solid background in stochastic processes as well as researchers in orthogonal polynomials and special functions who want to learn about applications of their work to probability.
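The prototypical formula behind this connection is the Karlin–McGregor representation for a discrete-time birth–death chain, stated here in its standard form (under the usual assumptions): the n-step transition probabilities admit the integral representation

\[ P^{n}(i,j) \;=\; \pi_j \int_{-1}^{1} x^{n}\, Q_i(x)\, Q_j(x)\, \psi(dx), \]

where the polynomials Q_j are orthogonal with respect to the spectral measure ψ on [-1, 1] and the π_j are the associated potential coefficients.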
Over the past 25 years, there has been an explosion of interest in the area of random tilings. The first book devoted to the topic, this timely text describes the mathematical theory of random tilings. It starts from the most basic questions (which planar domains are tileable?) before discussing advanced topics about the local structure of very large random tessellations. The author explains each feature of random tilings of large domains, discussing several different points of view and leading on to open problems in the field. The book is based on upper-division courses taught to a variety of students, but it also serves as a self-contained introduction to the subject. Test your understanding with the exercises provided and discover connections to a wide variety of research areas in mathematics, theoretical physics, and computer science, such as conformal invariance, determinantal point processes, Gibbs measures, high-dimensional random sampling, symmetric functions, and variational problems.
Often it is more instructive to know 'what can go wrong' and to understand 'why a result fails' than to plod through yet another piece of theory. In this text, the authors gather more than 300 counterexamples - some of them both surprising and amusing - showing the limitations, hidden traps and pitfalls of measure and integration. Many examples are put into context, explaining relevant parts of the theory, and pointing out further reading. The text starts with a self-contained, non-technical overview on the fundamentals of measure and integration. A companion to the successful undergraduate textbook Measures, Integrals and Martingales, it is accessible to advanced undergraduate students, requiring only modest prerequisites. More specialized concepts are summarized at the beginning of each chapter, allowing for self-study as well as supplementary reading for any course covering measures and integrals. For researchers, it provides ample examples and warnings as to the limitations of general measure theory. This book forms a sister volume to René Schilling's other book Measures, Integrals and Martingales (www.cambridge.org/9781316620243).
Using Bishop's work on constructive analysis as a framework, this monograph gives a systematic, detailed and general constructive theory of probability and stochastic processes. It is the first extended account of this theory: almost all of the constructive existence and continuity theorems that permeate the book are original. It also contains results and methods hitherto unknown in the constructive and nonconstructive settings. The text uses logic only in the common sense and, beyond a certain mathematical maturity, requires no prior training in either constructive mathematics or probability theory. It will thus be accessible and of interest both to probabilists interested in the foundations of their speciality and to constructive mathematicians who wish to see Bishop's theory applied to a particular field.
The main subject of this introductory book is simple random walk on the integer lattice, with special attention to the two-dimensional case. This fascinating mathematical object is the point of departure for an intuitive and richly illustrated tour of related topics at the active edge of research. It starts with three different proofs of the recurrence of the two-dimensional walk, via direct combinatorial arguments, electrical networks, and Lyapunov functions. After reviewing some relevant potential-theoretic tools, the reader is guided toward the relatively new topic of random interlacements - which can be viewed as a 'canonical soup' of nearest-neighbour loops through infinity - again with emphasis on two dimensions. On the way, readers will visit conditioned simple random walks - which are the 'noodles' in the soup - and also discover how Poisson processes of infinite objects are constructed and review the recently introduced method of soft local times. Each chapter ends with many exercises, making it suitable for courses and independent study.
Elementary treatments of Markov chains, especially those devoted to discrete-time and finite state-space theory, leave the impression that everything is smooth and easy to understand. This exposition of the works of Kolmogorov, Feller, Chung, Kato, and other mathematical luminaries, which focuses on time-continuous chains but is not so far from being elementary itself, reminds us again that the impression is false: an infinite, but denumerable, state-space is where the fun begins. If you have not heard of Blackwell's example (in which all states are instantaneous), do not understand what the minimal process is, or do not know what happens after explosion, dive right in. But beware lest you are enchanted: 'There are more spells than your commonplace magicians ever dreamed of.'