In Chapter 21 we will consider an interesting variation of the channel coding problem. Instead of constraining the blocklength (i.e., the number of channel uses), we will constrain the total cost incurred by the codewords. The motivation is the following. Consider a deep-space probe that has a k-bit message that needs to be delivered to Earth (or a satellite orbiting it). The duration of transmission is of little worry for the probe, but what is really limited is the amount of energy it has stored in its battery. In this chapter we will learn how to study this question abstractly and how this fundamental limit is related to communication over continuous-time channels.
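A useful reference point for this question (a standard fact for the additive white Gaussian noise channel, stated here for orientation rather than quoted from the chapter) is that, when neither blocklength nor bandwidth is constrained, the minimum energy required per information bit satisfies

\[
  \left.\frac{E_b}{N_0}\right|_{\min} \;=\; \ln 2 \;\approx\; 0.693 \quad (\text{about } -1.59\ \mathrm{dB}),
\]

where N_0/2 is the noise power spectral density. This is the benchmark against which energy-limited transmitters, such as the deep-space probe above, are usually measured.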
Modeling relies on the presence of patterns in reference datasets, and some of those patterns might occur in text. For example, customer service operations at an application service provider might need to know if customer comments on social media indicate satisfaction or dissatisfaction with the company’s products. There could be many thousands of comments across many channels, so getting people to read all of them would be slow and costly. Operations would be glad of a predictive model that automatically discerns sentiment expressed in the comments.
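A minimal sketch of the kind of predictive model the passage has in mind, written in Python with scikit-learn; the comments, labels, and library choice below are illustrative assumptions rather than anything prescribed by the text:

# Hypothetical sketch: a tiny sentiment classifier for customer comments.
# All data below is made up; scikit-learn is one possible tool, not the text's.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "love the new release, works great",
    "app keeps crashing, very frustrating",
    "support resolved my issue quickly",
    "billing page is confusing and slow",
]
labels = ["satisfied", "dissatisfied", "satisfied", "dissatisfied"]

# Bag-of-words features feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(comments, labels)

print(model.predict(["the update is great", "checkout keeps failing"]))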
By means of an ion crystal model, we illustrate the concept of a particle both in the sense of quantum mechanics and in the sense of quantum field theory. The latter describes reality in particle physics, but to avoid confusion with the former we temporarily refer to it as a “wavicle”.
In Chapter 25 we present the hard direction of the rate-distortion theorem: the random coding construction of a quantizer. This method is extended to the development of a covering lemma and soft-covering lemma, which lead to the sharp result of Cuff showing that the fundamental limit of channel simulation is given by Wyner’s common information. We also derive (a strengthened form of) Han and Verdú’s results on approximating output distributions in Kullback–Leibler.
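For reference, Wyner's common information mentioned here admits the standard variational characterization (quoted in generic notation, which may differ from the chapter's):

\[
  C_{\mathrm{W}}(X;Y) \;=\; \inf_{P_{W \mid X,Y}\,:\; X \perp Y \,\mid\, W} I(X,Y;\, W),
\]

the least rate of a common random variable W that renders X and Y conditionally independent.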
This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study. The book introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite blocklength approach. With over 210 end-of-part exercises and numerous examples, students are introduced to contemporary applications in statistics, machine learning, and modern communication theory. This textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and variational principle, Kolmogorov’s metric entropy, strong data-processing inequalities, and entropic upper bounds for statistical estimation. Accompanied by additional stand-alone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.
Chiral symmetry of free fermions is studied in the continuum and on the lattice. In the latter case, we review the fermion doubling problem and the Nielsen–Ninomiya theorem, then construct Wilson fermions and finally several types of Ginsparg–Wilson fermions, which are endowed with an exact, lattice-modified chiral symmetry.
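For orientation, the exact lattice-modified chiral symmetry alluded to is tied to the Ginsparg–Wilson relation for the lattice Dirac operator D (one standard form, with lattice spacing a; the chapter's conventions may differ):

\[
  \{ D, \gamma_5 \} \;=\; a \, D \, \gamma_5 \, D ,
\]

so the anticommutator with \gamma_5 does not vanish exactly but is suppressed by one power of a.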
Blackjack is a game of chance providing a player with a 42.22% probability of winning a bet. That 42.22% summarizes a vast number of possible outcomes in just one useful number.
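A minimal Monte Carlo sketch, in Python, of how one such summary number emerges from a vast number of random outcomes. The rules are deliberately simplified (infinite deck; no splits, doubles, or naturals; the player simply mimics the dealer and hits below 17), so the estimate it prints is only illustrative and will not reproduce the 42.22% figure, which assumes basic strategy:

import random

# Simplified, hypothetical blackjack simulator: infinite deck, no splits,
# doubles, or naturals; the player mimics the dealer (hit below 17).
CARDS = [2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10, 11]  # 11 stands for an ace

def hand_value(cards):
    total, aces = sum(cards), cards.count(11)
    while total > 21 and aces:   # demote aces from 11 to 1 while over 21
        total -= 10
        aces -= 1
    return total

def draw_hand():
    cards = [random.choice(CARDS), random.choice(CARDS)]
    while hand_value(cards) < 17:
        cards.append(random.choice(CARDS))
    return hand_value(cards)

def play_round():
    player = draw_hand()
    if player > 21:
        return "loss"            # player busts and loses regardless of dealer
    dealer = draw_hand()
    if dealer > 21 or player > dealer:
        return "win"
    return "push" if player == dealer else "loss"

random.seed(0)
results = [play_round() for _ in range(200_000)]
print("estimated win probability:", results.count("win") / len(results))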
The topic of this chapter is the deterministic (worst-case) theory of quantization. The main object of interest is the metric entropy of a set, which allows us to answer two key questions (both are formalized in the sketch following this list):
(1) covering number: the minimum number of points to cover a set up to a given accuracy;
(2) packing number: the maximal number of elements of a given set with a prescribed minimum pairwise distance.
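In generic notation (a sketch of the usual formal definitions, which may differ in detail from the chapter's conventions), for a subset \Theta of a metric space (\mathcal{X}, d) and accuracy \epsilon > 0:

\[
  N(\Theta, d, \epsilon) = \min\Bigl\{ n : \exists\, x_1,\dots,x_n \in \mathcal{X} \text{ with } \Theta \subseteq \textstyle\bigcup_{i=1}^{n} B(x_i, \epsilon) \Bigr\},
\]
\[
  M(\Theta, d, \epsilon) = \max\Bigl\{ m : \exists\, \theta_1,\dots,\theta_m \in \Theta \text{ with } d(\theta_i, \theta_j) > \epsilon \text{ for all } i \neq j \Bigr\}.
\]

The metric entropy is \log N(\Theta, d, \epsilon), and the two quantities are tied together by the standard sandwich M(\Theta, d, 2\epsilon) \le N(\Theta, d, \epsilon) \le M(\Theta, d, \epsilon).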
The foundational theory of metric entropy was put forth by Kolmogorov, who, together with his students, also determined the behavior of metric entropy in a variety of problems for both finite and infinite dimensions. Kolmogorov’s original interest in this subject stems from Hilbert’s thirteenth problem, which concerns the possibility or impossibility of representing multivariable functions as compositions of functions of fewer variables. Metric entropy has found numerous connections to and applications in other fields, such as approximation theory, empirical processes, small-ball probability, mathematical statistics, and machine learning.
In Chapter 17 we introduce the concept of an error-correcting code (ECC). We will spend time discussing what it means for a code to have low probability of error, and what the optimum (ML or MAP) decoder is. For the special case of coding for the binary symmetric channel (BSC), we showcase the evolution of our understanding of fundamental limits, from pre-Shannon ideas to modern finite-blocklength bounds. We also briefly review the history of ECCs. We conclude with a conceptually important proof of a weak converse (impossibility) bound on the performance of ECCs.
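One standard form of such a weak converse, obtained from Fano's inequality and stated here only for orientation (the chapter's own statement may differ): any code with M messages, n channel uses, channel capacity C, and error probability \epsilon satisfies

\[
  (1-\epsilon)\,\log M \;\le\; n C + h_b(\epsilon),
  \qquad h_b(\epsilon) := -\epsilon \log \epsilon - (1-\epsilon)\log(1-\epsilon),
\]

so reliable codes cannot carry substantially more than nC bits.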
Chapter 33 introduces strong data-processing inequalities (SDPIs), which are quantitative strengthenings of the DPIs in Part I. As applications, we show how SDPIs can be used to deduce lower bounds for various estimation problems on graphs or in distributed settings. The purpose of this chapter is twofold. First, we want to introduce general properties of the SDPI coefficients. Second, we want to show how SDPIs help prove sharp lower (impossibility) bounds on statistical estimation questions. The flavor of the statistical problems in this chapter differs from the rest of the book in that here the information about the unknown parameter θ is either spread more “thinly” across a high-dimensional vector of observations than in classical X = θ + Z models (see the spiked Wigner and tree-coloring examples), or distributed across different terminals (as in the correlation and mean estimation examples).
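For orientation, the coefficient behind a KL-divergence SDPI can be sketched as follows (standard definition in generic notation; the chapter's conventions may differ). For a channel P_{Y|X},

\[
  \eta_{\mathrm{KL}}(P_{Y\mid X}) \;=\; \sup_{P_X \neq Q_X} \frac{D(P_Y \,\|\, Q_Y)}{D(P_X \,\|\, Q_X)} \;\le\; 1,
  \qquad
  D(P_Y \,\|\, Q_Y) \;\le\; \eta_{\mathrm{KL}}(P_{Y\mid X})\, D(P_X \,\|\, Q_X),
\]

where P_Y and Q_Y denote the channel outputs when the input is distributed as P_X and Q_X, respectively. The ordinary DPI only guarantees the factor 1; an SDPI is useful precisely when \eta_{\mathrm{KL}} < 1.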
For her New Year’s resolution, Gaowen has determined to finally get organized. She puts up shelves and racks, adds new filing cabinets, cupboards, a desk, and assorted bins, and sets about putting her electronics equipment, clothes, dishes, books, keys, and everything else all in their proper places. You can think of data objects as containers to help you get organized.
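A minimal Python illustration of the container analogy; the specific objects below are hypothetical, not taken from the text:

# Hypothetical sketch: built-in containers give related items one labeled home,
# much like shelves, bins, and filing cabinets do for physical belongings.
keys = ["home", "office", "bike lock"]              # a list keeps ordered items
shelf = {"books": 24, "dishes": 12, "cables": 7}    # a dict keeps labeled counts

shelf["cables"] += 1    # containers make items easy to find and update
print(keys[0], shelf["books"])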
This chapter deals with the renormalization group in Wilson’s spirit. General concepts, like fixed points, are illustrated with examples, such as block-variable transformations, perfect lattice actions, the Wilson–Fisher fixed points, the Callan–Symanzik equation, and various scenarios for running couplings.
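For reference, the Callan–Symanzik equation mentioned here is usually written for a renormalized n-point Green's function as (one standard form; conventions and signs vary and may differ from the chapter's):

\[
  \left[ \mu \frac{\partial}{\partial \mu} + \beta(g)\, \frac{\partial}{\partial g} + n\, \gamma(g) \right] G^{(n)}(x_1, \dots, x_n;\, \mu, g) \;=\; 0 ,
\]

with \beta(g) describing the running of the coupling and \gamma(g) the anomalous dimension of the field.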