In this chapter, we show how Malliavin calculus and Stein's method may be combined into a powerful and flexible tool for studying probabilistic approximations. In particular, our aim is to use these two techniques to assess the distance between the laws of regular functionals of an isonormal Gaussian process and a one-dimensional normal distribution.
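To give a flavour of the estimates developed below (we state the bound only as an illustration; the precise assumptions, and the proof, appear in Section 5.1), a typical result of this kind controls the total variation distance between a centred, Malliavin-differentiable functional F of X and a standard normal random variable N in terms of the operators D and L⁻¹ introduced in Chapter 2:
\[
d_{TV}(F,N) \;\le\; 2\,E\Bigl[\bigl|\,1-\langle DF,\,-DL^{-1}F\rangle_{\mathfrak{H}}\bigr|\Bigr].
\]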
The highlight of the chapter is arguably Section 5.2, where we deduce a complete characterization of Gaussian approximations inside a fixed Wiener chaos. As discussed below, the approach developed in this chapter yields results that are systematically stronger than the so-called ‘method of moments and cumulants’, which is the most popular tool used in the proof of central limit theorems for functionals of Gaussian fields.
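For the reader's orientation, the characterization alluded to above is of the following ‘fourth moment’ type (stated here schematically; the exact formulation and all assumptions are the object of Section 5.2): if F_n = I_q(f_n), n ≥ 1, is a sequence of multiple integrals in the qth Wiener chaos, q ≥ 2, with E[F_n²] → σ² > 0, then
\[
F_n \xrightarrow{\;\mathrm{law}\;} N(0,\sigma^2)
\quad\Longleftrightarrow\quad
E[F_n^4] \longrightarrow 3\sigma^4,
\]
so that, inside a fixed chaos, convergence of the fourth moment to that of the Gaussian limit is not only necessary but also sufficient for a CLT.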
Note that, in view of the chaos representation (2.7.8), any general result involving random variables in a fixed chaos is a key tool for studying probabilistic approximations of more general functionals of Gaussian fields. This last point is indeed one of the staples of the entire book, and will be abundantly illustrated in Section 5.3 as well as in Chapter 7.
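Schematically, and in the notation of Chapter 2, the representation in question expresses every square-integrable functional F of X as an orthogonal series of multiple Wiener–Itô integrals,
\[
F = E[F] + \sum_{q=1}^{\infty} I_q(f_q), \qquad f_q \in \mathfrak{H}^{\odot q},
\]
which explains why estimates established chaos by chaos can be transferred to general functionals.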
Throughout the following, we fix an isonormal Gaussian process X = {X(h): h ∈ ℌ}, defined on a suitable probability space (Ω, ℱ, P) such that ℱ = σ(X). We will also adopt the language and notation of Malliavin calculus introduced in Chapter 2.
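Recall, for ease of reference, that the isonormal property means that X is a centred Gaussian family indexed by the real separable Hilbert space ℌ and satisfying
\[
E\bigl[X(h)\,X(g)\bigr] = \langle h, g\rangle_{\mathfrak{H}}, \qquad h,g \in \mathfrak{H}.
\]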
This is a text about probabilistic approximations, which are mathematical statements providing estimates of the distance between the laws of two random objects. As the title suggests, we will be mainly interested in approximations involving one or more normal (equivalently called Gaussian) random elements. Normal approximations are naturally connected with central limit theorems (CLTs), i.e. convergence results displaying a Gaussian limit, and are one of the leading themes of the whole theory of probability.
The main thread of our text concerns the normal approximations, as well as the corresponding CLTs, associated with random variables that are functionals of a given Gaussian field, such as a (fractional) Brownian motion on the real line. In particular, a pivotal role will be played by the elements of the so-called Gaussian Wiener chaos. The concept of Wiener chaos generalizes to an infinite-dimensional setting the properties of the Hermite polynomials (which are the orthogonal polynomials associated with the one-dimensional Gaussian distribution), and is now a crucial object in several branches of theoretical and applied Gaussian analysis.
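To fix ideas (we use the probabilists' normalization; the precise conventions adopted in this book are those of Chapter 2), the Hermite polynomials can be defined by
\[
H_0(x)=1, \qquad H_q(x) = (-1)^q\, e^{x^2/2}\,\frac{d^q}{dx^q}\, e^{-x^2/2}, \quad q\ge 1,
\]
and the qth Wiener chaos of X is then the L²-closed linear span of the random variables H_q(X(h)) with ‖h‖_ℌ = 1.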
The cornerstone of our book is the combination of two probabilistic techniques, namely the Malliavin calculus of variations and Stein's method for probabilistic approximations.
The Malliavin calculus of variations is an infinite-dimensional differential calculus, whose operators act on functionals of general Gaussian processes. Initiated by Paul Malliavin (starting from the seminal paper [69], which focused on a probabilistic proof of Hörmander's ‘sum of squares’ theorem), this theory is based on a powerful use of infinite-dimensional integration by parts formulae.
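The prototype of such a formula is the duality relation between the Malliavin derivative D and the divergence (or Skorohod integral) δ which, informally and leaving aside the exact description of the domains given in Chapter 2, reads
\[
E\bigl[F\,\delta(u)\bigr] = E\bigl[\langle DF, u\rangle_{\mathfrak{H}}\bigr].
\]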