The topic of this chapter is the deterministic (worst-case) theory of quantization. The main object of interest is the metric entropy of a set, which allows us to answer two key questions:
(1) covering number: the minimum number of points to cover a set up to a given accuracy;
(2) packing number: the maximal number of elements of a given set with a prescribed minimum pairwise distance.
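To make definitions (1) and (2) concrete, here is a minimal Python sketch (an illustration of our own, not taken from the chapter): greedily extracting a maximal ε-separated subset of a finite point set yields, at the same time, an ε-packing and an ε-cover. This is also the standard argument that the covering number at scale ε never exceeds the packing number at the same scale.

```python
import math

def maximal_separated_subset(points, eps):
    """Greedily build a maximal eps-separated subset of `points`.

    Every pair of chosen points is at distance > eps (an eps-packing),
    and by maximality every point of the set lies within eps of some
    chosen point (an eps-cover).  Hence its size lies between the
    covering number N(eps) and the packing number M(eps)."""
    chosen = []
    for p in points:
        if all(math.dist(p, c) > eps for c in chosen):
            chosen.append(p)
    return chosen

# Toy example: an evenly spaced grid on [0, 1] with spacing 0.1,
# covered/packed at scale eps = 0.25.
grid = [(i / 10,) for i in range(11)]
centers = maximal_separated_subset(grid, 0.25)
```

The greedy pass is only a certificate, not an optimizer: it sandwiches the two quantities rather than computing either exactly.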
The foundational theory of metric entropy was put forth by Kolmogorov, who, together with his students, also determined the behavior of metric entropy in a variety of problems for both finite and infinite dimensions. Kolmogorov’s original interest in this subject stems from Hilbert’s thirteenth problem, which concerns the possibility or impossibility of representing multivariable functions as compositions of functions of fewer variables. Metric entropy has found numerous connections to and applications in other fields, such as approximation theory, empirical processes, small-ball probability, mathematical statistics, and machine learning.
So far our discussion on information-theoretic methods has been mostly focused on statistical lower bounds (impossibility results), with matching upper bounds obtained on a case-by-case basis. In Chapter 32 we will discuss three information-theoretic upper bounds for statistical estimation under KL divergence (Yang–Barron), Hellinger (Le Cam–Birgé), and total variation (Yatracos) loss metrics. These three results apply to different loss functions and are obtained using completely different means. However, they take on exactly the same form, involving the appropriate metric entropy of the model. In particular, we will see that these methods achieve minimax optimal rates for the classical problem of density estimation under smoothness constraints.
Katok [Lyapunov exponents, entropy and periodic points of diffeomorphisms. Publ. Math. Inst. Hautes Études Sci. 51 (1980), 137–173] conjectured that every $C^{2}$ diffeomorphism f on a Riemannian manifold has the intermediate entropy property, that is, for any constant $c \in [0, h_{\mathrm {top}}(f))$, there exists an ergodic measure $\mu $ of f satisfying $h_{\mu }(f)=c$. In this paper, we obtain a conditional intermediate metric entropy property and two conditional intermediate Birkhoff average properties for basic sets of flows that characterize the refined roles of ergodic measures in the invariant ones. In this process, we establish a ‘multi-horseshoe’ entropy-dense property and use it to get the goal combined with conditional variational principles. We also obtain the same result for singular hyperbolic attractors.
In this paper we investigate the Margulis–Ruelle inequality for general Riemannian manifolds (possibly non-compact and with a boundary) and show that it always holds under an integrability condition.
We revisit an old topic in algorithms, the deterministic walk on a finite graph which always moves toward the nearest unvisited vertex until every vertex is visited. There is an elementary connection between this cover time and ball-covering (metric entropy) measures. For some familiar models of random graphs, this connection allows the order of magnitude of the cover time to be deduced from first passage percolation estimates. Establishing sharper results seems a challenging problem.
We characterize the geometry of a path in a sub-Riemannian manifold
using two metric invariants, the entropy and the complexity.
The entropy of a subset A of a metric space is the minimum number of
balls of a given radius ε needed to cover A.
It allows one to compute the Hausdorff dimension in some cases and
to bound it from above in general.
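In symbols (a standard formulation of the fact just stated, not part of the original abstract): writing $N(A,\varepsilon)$ for the covering number of $A$ at scale $\varepsilon$,

```latex
N(A,\varepsilon) \;=\; \min\Bigl\{\, n : A \subseteq \bigcup_{i=1}^{n} B(x_i,\varepsilon) \,\Bigr\},
\qquad
\dim_H(A) \;\le\; \liminf_{\varepsilon \to 0} \frac{\log N(A,\varepsilon)}{\log(1/\varepsilon)},
```

the right-hand limit being the lower box-counting dimension, which always dominates the Hausdorff dimension and coincides with it in many regular cases.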
We define the complexity of a path in a sub-Riemannian manifold as the
infimum of the lengths of all trajectories contained in an
ε-neighborhood of the path, having the same extremities as the
path.
The concept of complexity for paths was first developed to model the
algorithmic complexity of the nonholonomic motion planning problem in
robotics. In this paper, our aim is to estimate the entropy, Hausdorff dimension and
complexity for a path in a general sub-Riemannian manifold.
We first construct a norm $\| \cdot \|_{\varepsilon}$ on the tangent space
that depends on a parameter ε > 0.
Our main result then states that the entropy of a path is equivalent to the
integral of this ε-norm along the path.
As a corollary we obtain upper and lower bounds for the Hausdorff
dimension of a path.
Our second main result is that complexity and entropy are equivalent
for generic paths.
We also give a computable sufficient condition on the path for this
equivalence to hold.
We establish new exponential inequalities for partial sums of random fields. Next, using classical chaining arguments, we give sufficient conditions for partial sum processes indexed by large classes of sets to converge to a set-indexed Brownian motion. For stationary fields of bounded random variables, the condition is expressed in terms of a series of conditional expectations. For non-uniform ϕ-mixing random fields, we require both finite fourth moments and an algebraic decay of the mixing coefficients.