The statistical distribution of the (global and local) extrema is described by an extreme value (Gumbel) distribution. Local maxima can serve as centers for Voronoi cells whose statistics are investigated in terms of density, area, and shape. Following the "musical score" analogy, those local maxima (either isolated or organized along ridges) can be interpreted as "loud" points from which signals embedded in noise can be detected and estimated.
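As a rough illustration of this pipeline, the sketch below fits a Gumbel law to the local maxima of a noise spectrogram and seeds a Voronoi tessellation with them, using standard SciPy tools. All parameter choices (window width, segment length, neighborhood size) are illustrative assumptions, not values from the text.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import maximum_filter
from scipy.stats import gumbel_r
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
x = rng.standard_normal(2**14)            # white Gaussian noise

# Gaussian-window spectrogram (window std chosen arbitrarily here)
f, t, S = spectrogram(x, fs=1.0, window=('gaussian', 32),
                      nperseg=256, noverlap=224)

# Local maxima: points equal to the max over a small neighborhood
peaks = (S == maximum_filter(S, size=5)) & (S > 0)
ti, fi = np.nonzero(peaks.T)              # time and frequency indices

# Gumbel fit of the log-values of the local maxima
loc, scale = gumbel_r.fit(np.log(S.T[ti, fi]))
print(f"Gumbel fit: loc={loc:.2f}, scale={scale:.2f}")

# Voronoi tessellation seeded by the maxima, in (time, frequency) coordinates
vor = Voronoi(np.column_stack([t[ti], f[fi]]))
print(f"{len(vor.points)} Voronoi cells")
```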
Gaussian functions are central throughout the book. They are introduced as basic elementary waveforms whose versatility allows a continuum of representations to be established between the "natural" (Shannon-Nyquist-like) representation in time and the dual (Fourier) representation in frequency. This allows for a first introduction of a joint (Gabor) time-frequency representation.
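A minimal numerical sketch of this continuum, assuming SciPy's `stft` with a Gaussian window whose width is swept (the test signals and grid sizes are illustrative): a narrow window resolves an impulse sharply but blurs a tone, a wide window does the opposite, and intermediate widths give the joint Gabor picture.

```python
import numpy as np
from scipy.signal import stft

n = 1024
impulse = np.zeros(n); impulse[n // 2] = 1.0      # perfectly localized in time
tone = np.cos(2 * np.pi * 0.25 * np.arange(n))    # perfectly localized in frequency

def spread(w, p):
    """Standard deviation of a (nonnegative) distribution p over the axis w."""
    p = p / p.sum()
    m = (w * p).sum()
    return np.sqrt(((w - m) ** 2 * p).sum())

for std in (4, 16, 64):
    f, t, Zi = stft(impulse, window=('gaussian', std), nperseg=256, noverlap=192)
    _, _, Zt = stft(tone,    window=('gaussian', std), nperseg=256, noverlap=192)
    dt = spread(t, (np.abs(Zi) ** 2).sum(axis=0))  # time spread of the impulse
    df = spread(f, (np.abs(Zt) ** 2).sum(axis=1))  # frequency spread of the tone
    print(f"window std={std:3d}: impulse time-spread={dt:7.1f}, "
          f"tone freq-spread={df:.4f}")
```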
As Fourier images of (modified) ambiguity functions, time-frequency energy distributions obey uncertainty relations that can take different forms, depending on the chosen measure of spread in the plane (e.g., norms, support, variance, or entropy). Complementing classical formulations of uncertainty, this reinforces the role played by Gaussians and Gaussian STFTs/spectrograms.
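One standard instance of those relations is the variance-based (Heisenberg-Gabor) form, recalled here for reference (unit-energy signal, frequency in Hz):

```latex
\[
  \Delta t^2 = \int (t - \bar{t})^2\,|x(t)|^2\,dt, \qquad
  \Delta f^2 = \int (f - \bar{f})^2\,|X(f)|^2\,df,
\]
\[
  \Delta t \,\Delta f \;\ge\; \frac{1}{4\pi},
\]
% with equality if and only if x(t) is a Gaussian.
```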
When applied to white Gaussian noise, Gaussian spectrograms are made of patches whose distribution is controlled by the correlation function of the STFT, considered as a 2D homogeneous field, and in turn by the underlying reproducing kernel of the analysis. This admits a simple model whose basic ingredients are a mean distribution of logons, resulting from a circle-packing argument rooted in uncertainty, together with an adequate degree of randomness in the fluctuations around this mean model and in the inter-logon phase relationships.
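A hedged numerical sketch of the mean model: estimate the empirical density of spectrogram local maxima for white noise, to be compared with the roughly one-maximum-per-logon picture suggested by circle packing. Window width, segment length, and neighborhood size below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import maximum_filter

rng = np.random.default_rng(42)
x = rng.standard_normal(2**15)

# Gaussian analysis window: time std = sigma samples,
# frequency std = 1/(2*pi*sigma) cycles/sample.
sigma = 24
f, t, S = spectrogram(x, fs=1.0, window=('gaussian', sigma),
                      nperseg=8 * sigma, noverlap=8 * sigma - 4)

peaks = S == maximum_filter(S, size=7)        # local maxima of the patchwork
area = (t[-1] - t[0]) * (f[-1] - f[0])        # analyzed time-frequency area
print(f"local-maxima density: {peaks.sum() / area:.3f} per unit TF area")
# The circle-packing picture predicts a density of the order of one maximum
# per "logon" cell, whose size is set by the reproducing kernel of the window.
```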
An important distinction in signal processing is often made between “signal” and “noise.” It is proposed here to approach this distinction via a contrast between “order” and “disorder,” in the sense of a “coherent” or “incoherent” structure in a time-frequency description. Basic classes of signals and noise models are discussed, focusing on what will be used most in the rest of the book.
Elaborating on spectrograms viewed as smoothed Wigner distributions, one can explain their fine geometrical structure in cases of increasing complexity. This is detailed by resorting to the interference geometry of Wigner distributions, which in particular allows a connection to be made between the zeros of the transforms and the Voronoi cells attached to local maxima.
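The starting point of that interference geometry is the quadratic superposition law, written here in a standard form:

```latex
% For x = x_1 + x_2, the Wigner distribution picks up a cross term
\[
  W_{x_1 + x_2}(t, f)
  = W_{x_1}(t, f) + W_{x_2}(t, f) + 2\,\Re\, W_{x_1, x_2}(t, f),
\]
% with the cross-Wigner distribution defined by
\[
  W_{x_1, x_2}(t, f)
  = \int x_1\!\left(t + \tfrac{\tau}{2}\right)
         x_2^{*}\!\left(t - \tfrac{\tau}{2}\right)
         e^{-i 2\pi f \tau}\, d\tau ,
\]
% an oscillatory contribution located midway between the two components.
```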
Gaussian functions also happen to be central with respect to the intrinsic conflict that exists between time and frequency, referred to as “uncertainty.” This can take various forms depending upon the chosen measure of spread or occupancy in both domains. The striking feature is that most of the intuitive measures, such as dispersion or entropy, end up with Gaussians as minimizers, making them “quanta” of information (“logons” in Gabor's terminology).
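A quick numerical check of this minimality, sketched with illustrative grid choices: the Gaussian sits at the bound Δt·Δf = 1/(4π) ≈ 0.0796, while, e.g., a rectangle exceeds it (the rectangle's discrete frequency spread is moreover grid-limited, since it diverges in the continuum).

```python
import numpy as np

def tf_spreads(x, dt):
    """Time and frequency standard deviations of a unit-energy signal."""
    x = x / np.sqrt(np.sum(np.abs(x) ** 2) * dt)
    t = (np.arange(len(x)) - len(x) / 2) * dt
    pt = np.abs(x) ** 2 * dt                        # time marginal
    X = np.fft.fftshift(np.fft.fft(x)) * dt         # approximate continuous FT
    f = np.fft.fftshift(np.fft.fftfreq(len(x), dt))
    pf = np.abs(X) ** 2 * (f[1] - f[0])             # frequency marginal
    mt, mf = np.sum(t * pt), np.sum(f * pf)
    return (np.sqrt(np.sum((t - mt) ** 2 * pt)),
            np.sqrt(np.sum((f - mf) ** 2 * pf)))

dt = 0.01
t = (np.arange(4096) - 2048) * dt
for name, sig in [("gaussian", np.exp(-t**2)),
                  ("rectangle", (np.abs(t) < 1.0).astype(float))]:
    st, sf = tf_spreads(sig, dt)
    print(f"{name:9s}: dt*df = {st * sf:.4f}  (bound = {1/(4*np.pi):.4f})")
```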
Time-frequency energy distributions face a trade-off between localization and interference. Different approaches exist for obtaining distributions that are sharply localized yet almost interference-free: they are based either on some form of post-processing (reassignment and synchrosqueezing, and variations thereof, both of which move computed values in the plane) or on invoking sparsity arguments and replacing a Fourier-based transform with a constrained optimization. When targeting a sharpened distribution, another trade-off exists between localization and reconstruction capabilities: some methods are presented to overcome this limitation.
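As an illustration of the post-processing route, here is a bare-bones reassignment sketch (not the book's implementation): it follows the classical centroid formulas based on two auxiliary STFTs with windows t·g(t) and g'(t). Units are samples and cycles/sample; the signs are tied to SciPy's STFT convention and may need flipping for others.

```python
import numpy as np
from scipy.signal import stft, chirp

def reassigned_spectrogram(x, nperseg=256, noverlap=192, sigma=32.0):
    n = np.arange(nperseg) - (nperseg - 1) / 2
    g = np.exp(-n**2 / (2 * sigma**2))     # Gaussian analysis window g(t)
    tg = n * g                             # time-weighted window t*g(t)
    dg = -n / sigma**2 * g                 # derivative window g'(t)

    kw = dict(fs=1.0, nperseg=nperseg, noverlap=noverlap)
    f, t, Z = stft(x, window=g, **kw)
    _, _, Zt = stft(x, window=tg, **kw)
    _, _, Zd = stft(x, window=dg, **kw)

    E = np.abs(Z)**2 + 1e-12
    # Reassignment operators: local centroids of the signal's energy
    t_hat = t[None, :] + np.real(Zt * np.conj(Z)) / E
    f_hat = f[:, None] - np.imag(Zd * np.conj(Z)) / (2 * np.pi * E)
    return t_hat, f_hat, np.abs(Z)**2      # where to move each energy value

# Example: energy of a linear chirp piles up along its instantaneous frequency
x = chirp(np.arange(4096), f0=0.05, t1=4096, f1=0.25)
t_hat, f_hat, S = reassigned_spectrogram(x)
```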
The different examples used as motivation in Section 1 are revisited in the light of what has been discussed since. The bat echolocation case is considered in greater generality, with considerations about sequences of calls and the “why and how” of their structure in terms of optimality. Time-frequency formulations of matched filtering are proposed and used, e.g., to support in a geometrical way the solution to the Doppler-tolerance problem. A similar analysis is provided for gravitational waves, with signal denoising complemented by parameter estimation and comparison with theoretical models. Finally, Riemann's zeta function, as well as variations thereof and Weierstrass's functions, are given a time-frequency interpretation based on their disentanglement into chirp components.
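The time-frequency formulation of matched filtering alluded to above rests on Moyal's formula, which recasts the correlation of a received signal with a template as an overlap of Wigner distributions in the plane:

```latex
\[
  \left| \int x(t)\, y^{*}(t)\, dt \right|^{2}
  = \iint W_{x}(t, f)\, W_{y}(t, f)\, dt\, df ,
\]
% which supports the geometrical reading of Doppler tolerance: a template
% whose Wigner distribution stays aligned with its Doppler-shifted
% versions keeps a large overlap.
```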
General considerations are given to possible extensions of the methods discussed in the book to nonstationary data beyond 1D signals (e.g., multivariate or graph-based). The analogy between the formalisms of signal theory and quantum mechanics is mentioned, and a general perspective on further opportunities is outlined as an open conclusion.
General considerations are given about signal processing and its place within data science. It is argued that its specificity is rooted in a balanced involvement of tools and concepts from physics, mathematics, and informatics. Examples (Fourier, wavelets) are given in support of this claim, and arguments are detailed to justify why time-frequency analysis, which is the topic of this book, can be viewed as a natural language for nonstationary signal processing. The Introduction also presents a roadmap for reading the book.
The simplest and most intuitive time-frequency transforms are the Short-Time Fourier Transform (STFT) and the associated spectrogram (squared STFT), which are in close connection with the Gabor transform when a Gaussian window is used. Given a 1D signal, its STFT is a 2D function whose redundancy is controlled by a reproducing kernel. This kernel can equivalently be seen as an ambiguity function, i.e., a 2D time-frequency correlation function that is constrained by uncertainty. Building upon the classical duality between “correlation functions” and “energy distributions,” we end up with the Wigner distribution as a central tool for time-frequency signal analysis. The Wigner distribution is, however, a member of larger classes (such as Cohen's class), to which the spectrogram also belongs.
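A minimal sketch of this first layer, assuming SciPy's `stft`/`istft` with a Gaussian window (all sizes illustrative): the near-perfect inversion reflects the redundancy of the 2D representation.

```python
import numpy as np
from scipy.signal import stft, istft, get_window

# Gaussian-window STFT of a chirp: a 1D signal mapped to a redundant 2D
# function (a discretized Gabor transform); istft inverts it.
fs = 1024
t = np.arange(fs) / fs
x = np.cos(2 * np.pi * (50 * t + 100 * t**2))   # linear chirp, 50 -> 250 Hz

g = get_window(('gaussian', 16), 128)
f, tau, Z = stft(x, fs=fs, window=g, nperseg=128, noverlap=96)
S = np.abs(Z) ** 2                               # spectrogram (squared STFT)

_, x_rec = istft(Z, fs=fs, window=g, nperseg=128, noverlap=96)
print("max reconstruction error:", np.max(np.abs(x - x_rec[:len(x)])))
```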
Stationarity is a central concept in time-frequency analysis, but its formal definition in terms of global time-shift invariance is demanding, calling for a more operational viewpoint that better matches intuition and common practice. A concept of relative stationarity is proposed, which not only brings the observation time-scale into the picture but also makes it possible to test for stationarity.
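One way such a test can be operationalized, sketched here under loose assumptions: phase-randomized surrogates keep the global spectrum but destroy temporal structure, providing a stationary null against which a dispersion statistic of local spectra is compared. The statistic, window length, and threshold below are illustrative choices, not the book's exact procedure.

```python
import numpy as np
from scipy.signal import spectrogram

def local_spectra_dispersion(x, nperseg=256):
    """Variance across time of the mean log-spectrum: a simple
    nonstationarity statistic at the scale set by nperseg."""
    _, _, S = spectrogram(x, nperseg=nperseg, noverlap=nperseg // 2)
    return np.var(np.log(S + 1e-12).mean(axis=0))

def phase_surrogate(x, rng):
    """Stationarized copy: keep the magnitude spectrum, randomize phases."""
    X = np.fft.rfft(x)
    phi = rng.uniform(0, 2 * np.pi, len(X))
    phi[0] = 0.0                          # keep the DC term real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phi), n=len(x))

rng = np.random.default_rng(7)
x = rng.standard_normal(8192) * np.linspace(0.5, 2.0, 8192)  # AM noise

stat = local_spectra_dispersion(x)
null = [local_spectra_dispersion(phase_surrogate(x, rng)) for _ in range(99)]
print("nonstationary at this scale:", stat > np.quantile(null, 0.95))
```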
Rather than considering the disentanglement of multicomponent nonstationary signals as a time-frequency post-processing step, a possibility is to first decompose the observation into modes that are amenable to further demodulation. In this spirit, this chapter reviews a technique that has recently gained popularity, namely “Empirical Mode Decomposition” and the associated “Hilbert-Huang Transform.” The rationale of these data-driven methods is presented, as well as their actual implementation, with a brief discussion of pros and cons with respect to more conventional time-frequency analysis.
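A bare-bones sketch of the sifting idea at the heart of EMD (with a simplified stopping rule and naive envelope handling; real implementations refine both): envelopes through the local extrema are averaged and subtracted, and iterating yields the first Intrinsic Mode Function (IMF).

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, n_iter=10):
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_iter):
        imax = argrelextrema(h, np.greater)[0]
        imin = argrelextrema(h, np.less)[0]
        if len(imax) < 4 or len(imin) < 4:
            break                               # not enough extrema to spline
        upper = CubicSpline(imax, h[imax])(t)   # upper envelope
        lower = CubicSpline(imin, h[imin])(t)   # lower envelope
        h = h - (upper + lower) / 2             # remove the local mean
    return h                                    # candidate first IMF

t = np.linspace(0, 1, 2048)
x = np.cos(2 * np.pi * 40 * t) + 0.5 * np.cos(2 * np.pi * 5 * t)
imf1 = sift(x)          # should be close to the fast 40 Hz component
residual = x - imf1     # the slow oscillation remains, to be sifted next
```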