In Chapter 30 we describe a strategy for proving statistical lower bounds that we call the mutual information method (MIM), which entails comparing the amount of information the data provide with the minimum amount of information needed to achieve a certain estimation accuracy. As in Section 29.2, the main information-theoretic ingredient is the data-processing inequality, this time for mutual information as opposed to f-divergences.
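The data-processing inequality for mutual information can be checked numerically on a toy Markov chain X → Y → Z. The sketch below (the binary symmetric channels and the uniform input distribution are illustrative choices, not from the text) verifies that I(X;Z) ≤ I(X;Y):

```python
import math

def mutual_information(p_x, channel):
    """I(X;Y) in bits for an input distribution p_x and a channel given
    as a row-stochastic matrix channel[x][y] = P(Y=y | X=x)."""
    p_y = [sum(p_x[x] * channel[x][y] for x in range(len(p_x)))
           for y in range(len(channel[0]))]
    mi = 0.0
    for x in range(len(p_x)):
        for y in range(len(channel[0])):
            p_xy = p_x[x] * channel[x][y]
            if p_xy > 0:
                mi += p_xy * math.log2(p_xy / (p_x[x] * p_y[y]))
    return mi

def bsc(delta):
    """Binary symmetric channel with crossover probability delta."""
    return [[1 - delta, delta], [delta, 1 - delta]]

def compose(k1, k2):
    """Overall channel X -> Z for the Markov chain X -> Y -> Z."""
    return [[sum(k1[x][y] * k2[y][z] for y in range(len(k2)))
             for z in range(len(k2[0]))] for x in range(len(k1))]

p_x = [0.5, 0.5]                       # uniform binary source (an assumption)
i_xy = mutual_information(p_x, bsc(0.1))
i_xz = mutual_information(p_x, compose(bsc(0.1), bsc(0.1)))
print(i_xy, i_xz)   # processing through a second channel cannot add information
```

Here I(X;Y) = 1 − h(0.1) ≈ 0.531 bits, while the cascade of two BSC(0.1) channels is a BSC(0.18), giving the smaller I(X;Z) = 1 − h(0.18) ≈ 0.32 bits.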
There is a criminal trial in progress and the prosecutor tells the jury that the defendant’s DNA was found at the crime scene. This, says the prosecutor, is enough to prove that the defendant committed the crime. In fact, someone’s DNA was indeed found at the crime scene, and it could be the defendant’s, according to a DNA expert who testifies that there is only a 1 in a million chance that the DNA could be from someone else. Even so, the prosecutor insists, that means there is a 99.9999% chance that the DNA is from the defendant, which is so overwhelmingly conclusive that any reasonable person would agree the DNA must be from the defendant. The prosecutor presses the jury for a conviction.
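The flaw in this argument, a version of the well-known prosecutor's fallacy, is that P(match | innocent) is confused with P(innocent | match). A short Bayes'-rule sketch makes it concrete, using hypothetical numbers (a city of 10 million residents with exactly one guilty party, and the expert's 1-in-a-million false-match rate):

```python
from fractions import Fraction

# Hypothetical setting: 10 million residents, exactly one is guilty,
# and the DNA test falsely matches an innocent person 1 time in a million.
population = 10_000_000
p_match_given_innocent = Fraction(1, 1_000_000)
p_match_given_guilty = Fraction(1, 1)   # assume the guilty party always matches

p_guilty = Fraction(1, population)      # prior: every resident equally likely
p_innocent = 1 - p_guilty

# Bayes' rule: P(guilty | match) = P(match | guilty) P(guilty) / P(match)
p_match = (p_match_given_guilty * p_guilty
           + p_match_given_innocent * p_innocent)
p_guilty_given_match = p_match_given_guilty * p_guilty / p_match

print(float(p_guilty_given_match))  # ~0.09, far from "overwhelmingly conclusive"
```

With these priors, about ten innocent residents are expected to match by chance, so a single match leaves the defendant's posterior probability of guilt near 9%, not 99.9999%.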
We stay in the framework of the low-energy effective theory of QCD in terms of Nambu–Goldstone boson fields and consider effects due to their topology. We distinguish the cases of Nf = 2 and Nf >= 3 light quark flavors and discuss in both cases how the gauge anomaly cancelation is manifest in the effective theory, the role of G-parity, and the neutral pion decay into two photons, which does not explicitly depend on the number of colors, Nc. For Nf >= 3 we introduce the Wess–Zumino–Novikov–Witten term in a 5th dimension, and we discuss the intrinsic parity of light meson fields and their electromagnetic interactions. In this context, we clarify the question whether there is low-energy evidence for Nc = 3, and we address again the role of technicolor.
What interest could rational agents have in acting lawfully if not for the order, stability, and other collective goods that law brings to society? Why should it otherwise matter to them that their actions are lawful? It would matter to them, of course, if acting unlawfully made them liable to punishment. But in that case their interest in acting lawfully would not come from seeing it as a good thing. It would come, rather, from seeing it as the surest way to avoid a bad thing, something they have an interest in escaping. Yet the challenge to an ethics like Kant’s that represents lawfulness as the essence of moral action is to explain what could interest rational agents in acting lawfully regardless of how the law is enforced, regardless, that is, of whether it is enforced by threats of punishment or incentives to obey. The question, then, that confronts a defender of Kant’s ethics is why a rational agent should regard an action’s being lawful as a condition of its being reasonable to do. If he cannot give an answer to this question, the charge of excessive formalism will stick.
The Higgs mechanism is introduced, first for scalar QED and then with the Higgs doublet, which takes us to the gauge bosons in the electroweak sector of the Standard Model. Next we discuss variants of “spontaneous symmetry breaking” patterns, which deviate from the Standard Model, in the continuum and on the lattice. Finally we consider a “small unification” of the electroweak gauge couplings, as a toy model for the concept of Grand Unified Theories (to be addressed in Chapter 26).
The topological charge of smooth Yang–Mills gauge fields is discussed, describing in particular the SU(2) instanton. This leads to the Adler–Bell–Jackiw anomaly and to θ-vacuum states, which are similar to energy bands in a crystal. We finally discuss the Atiyah–Singer index theorem in the continuum and more explicitly on the lattice.
So far our discussion of channel coding has mostly followed the same lines as M-ary hypothesis testing (HT) in statistics. In Chapter 18 we introduce a key departure from this: the principal and most interesting goal in information theory is the design of the encoder mapping an input message to the channel input. Once the codebook is chosen, the problem indeed becomes that of M-ary HT and can be tackled by standard statistical tools. However, the task of choosing the encoder has no exact analog in statistical theory (the closest being the design of experiments). It turns out that the problem of choosing a good encoder is much simplified if we adopt a suboptimal way of testing the M-ary HT, based on thresholding the information density.
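As a rough illustration of the thresholding idea (not the chapter's construction; the channel, blocklength, threshold, and random codebook below are illustrative assumptions), consider a binary symmetric channel with uniform inputs, where the information density of a block is the sum of per-symbol terms log2 P(y|x)/P(y):

```python
import math
import random

random.seed(1)
delta = 0.05                          # BSC crossover probability (illustrative)
n, M = 100, 8                         # blocklength and number of messages
codebook = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]

def i_density(c, y):
    """Block information density sum_k log2 P(y_k|c_k)/P(y_k) for a BSC
    with uniform inputs, so that P(y_k) = 1/2 for every symbol."""
    total = 0.0
    for ck, yk in zip(c, y):
        p = 1 - delta if ck == yk else delta
        total += math.log2(p / 0.5)
    return total

def threshold_decode(y, gamma):
    """Suboptimal decoder: return the first message whose information
    density exceeds log2(gamma); otherwise declare an error (None)."""
    for m, c in enumerate(codebook):
        if i_density(c, y) > math.log2(gamma):
            return m
    return None

# Send message 0 through the BSC and decode with threshold gamma = M.
sent = codebook[0]
received = [b ^ (1 if random.random() < delta else 0) for b in sent]
print(threshold_decode(received, gamma=M))
```

The true codeword's density concentrates around n(1 − h(δ)) ≈ 71 bits, while an unrelated codeword's density is sharply negative, so the modest threshold log2 M = 3 separates them with high probability.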
Free fermion fields are canonically quantized, proceeding from Weyl to Dirac and Majorana fermions, and from the massless to the massive case. We discuss properties like chirality, helicity, and the fermion number, as well as the behavior under parity and charge conjugation transformations. Fermionic statistics is applied to the cosmic neutrino background.
Scalar quantum field theory is introduced in the functional integral formulation, starting from classical field theory and quantum mechanics. We consider Euclidean time and relate the system in the lattice regularization to classical statistical mechanics.
Consider the following problem: Given a stream of independent Ber(p) bits, with unknown p, we want to turn them into pure random bits, that is, independent Ber(1/2) bits. Our goal is to find a universal way to extract the largest number of bits. In other words, we want to extract as many fair coin flips as possible from possibly biased coin flips, without knowing the actual bias. In 1951 von Neumann proposed the following scheme: Divide the stream into pairs of bits; output 0 if the pair is 10, output 1 if it is 01, and otherwise output nothing and move on to the next pair. Since both 01 and 10 occur with probability pq, regardless of the value of p, we obtain fair coin flips at the output. To measure the efficiency of von Neumann’s scheme, note that, on average, we have 2n bits in and 2pqn bits out. So the efficiency (rate) is pq. The question is: Can we do better? It turns out that the fundamental limit (maximal efficiency) is given by the binary entropy h(p). In this chapter we discuss optimal randomness extractors, due to Elias and to Peres, and several related problems.
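The pairing scheme just described can be sketched in a few lines of Python (the stream length and the bias p = 0.9 below are illustrative):

```python
import random

def von_neumann_extract(bits):
    """von Neumann's 1951 scheme: scan disjoint pairs of input bits,
    emit 0 for the pair (1, 0), emit 1 for (0, 1), and discard the
    pairs (0, 0) and (1, 1). Both 01 and 10 occur with probability
    p*q, so the output bits are fair regardless of the unknown bias p."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(0 if a == 1 else 1)
    return out

# Demo with a heavily biased coin (p = 0.9): the output is unbiased,
# but only about p*q = 0.09 output bits per input bit survive.
random.seed(0)
p = 0.9
stream = [1 if random.random() < p else 0 for _ in range(100_000)]
fair = von_neumann_extract(stream)
print(len(fair) / len(stream))   # close to p*(1 - p) = 0.09
print(sum(fair) / len(fair))     # close to 0.5
```

Note how far the achieved rate pq = 0.09 falls below the fundamental limit h(0.9) ≈ 0.469 bits per input bit, which is the gap that the Elias and Peres constructions close.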
In Chapter 6 we begin by explaining an important property of mutual information known as tensorization (or single-letterization), which allows one to maximize and minimize the mutual information between two high-dimensional vectors. Next, we extend the information measures, discussed in previous chapters for random variables, to random processes by introducing the concepts of entropy rate (for a stochastic process) and mutual information rate (for a pair of stochastic processes).
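As a concrete instance of an entropy rate, a stationary two-state Markov chain has entropy rate equal to the stationary average of its per-state transition entropies; the transition probabilities below are illustrative choices:

```python
import math

def binary_entropy(p):
    """Binary entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Two-state Markov chain with transition probabilities a (0 -> 1) and b (1 -> 0).
# Its stationary distribution is pi = (b/(a+b), a/(a+b)), and the entropy
# rate is the stationary average of the per-state transition entropies:
#   H_rate = pi_0 * h(a) + pi_1 * h(b).
a, b = 0.1, 0.3                       # illustrative transition probabilities
pi0, pi1 = b / (a + b), a / (a + b)
entropy_rate = pi0 * binary_entropy(a) + pi1 * binary_entropy(b)
print(entropy_rate)  # below h(pi1): memory reduces fresh information per step
```

For these numbers the rate is about 0.57 bits per step, strictly less than the marginal entropy h(1/4) ≈ 0.81 of a single sample, illustrating how dependence across time lowers the per-symbol information content.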