An isolated system is described by a classical Hamiltonian dynamics. In the long-time limit, the trajectory of such a system yields a histogram, i.e., a distribution for any observable. With one plausible assumption, introduced here as a fundamental principle, this histogram is shown to lead to the microcanonical distribution. Pressure, temperature, and chemical potential can then be identified microscopically. This dynamical approach thus recovers the results that are often obtained for equilibrium by minimizing a postulated entropy function.
For time-dependent driving, the key concepts of time-reversed and backward protocols are introduced. The reversibility of Hamiltonian dynamics is shown to imply that work is antisymmetric with respect to time reversal. Integral fluctuation relations are introduced as a general property of certain distributions. For the work distribution, this yields the Jarzynski relation, which expresses free-energy differences as a particular nonlinear average over nonequilibrium work. Various limiting cases, such as slow driving and the apparent counterexample of free expansion of a gas, are discussed. The Bochkov–Kuzovlev relation is shown to be another variant of such an integral fluctuation relation. The Crooks fluctuation relation yields a symmetry between the work distributions of a forward and a backward process. As an important application, the Hummer–Szabo relation is exploited to recover free-energy differences and a free-energy landscape, as illustrated with experimental data for the unfolding of biopolymers.
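As a minimal numerical illustration of the Jarzynski relation (a sketch of our own, not taken from the chapter), consider an overdamped particle in a harmonic trap whose center is dragged at constant speed. Dragging the trap leaves the free energy unchanged, so ΔF = 0 and the relation predicts ⟨e^{-βW}⟩ = 1 even though the mean work is positive. All parameter values below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdamped particle in a harmonic trap U(x, t) = (k/2) (x - v t)^2
# whose centre is dragged at speed v.  Since F is unchanged, the
# Jarzynski relation predicts <exp(-beta W)> = 1.
k, v, beta, D = 1.0, 1.0, 1.0, 1.0   # stiffness, drag speed, 1/k_B T, diffusion const.
dt, n_steps, n_traj = 1e-3, 1000, 20000

# Start from the equilibrium (Boltzmann) distribution in the trap.
x = rng.normal(0.0, np.sqrt(1.0 / (beta * k)), n_traj)
W = np.zeros(n_traj)
for i in range(n_steps):
    t = i * dt
    # Work increment: dW = (dU/dt) dt = -k v (x - v t) dt
    W += -k * v * (x - v * t) * dt
    # Euler-Maruyama step with mobility mu = beta * D
    x += -beta * D * k * (x - v * t) * dt + np.sqrt(2 * D * dt) * rng.normal(size=n_traj)

print(W.mean())                    # mean dissipated work, > 0
print(np.exp(-beta * W).mean())    # Jarzynski average, close to 1
```

The ordinary average of the work is strictly positive (dissipation), while the nonlinear exponential average recovers e^{-βΔF} = 1 within sampling error.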
The asymmetric random walk is introduced as a simple model for a molecular motor. Thermodynamic consistency imposes a condition on the ratio between the forward and the backward rate. Fluctuations in finite time can be derived analytically and are used to illustrate the thermodynamic uncertainty relation. In the long-time limit, concepts from large deviation theory, such as a rate function and a contraction, can be worked out explicitly.
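The finite-time statistics of the asymmetric random walk can be checked against the thermodynamic uncertainty relation in a few lines. The sketch below (with arbitrary rates of our choosing) uses the fact that the forward and backward step counts of a continuous-time walk are independent Poisson variables:

```python
import numpy as np

rng = np.random.default_rng(1)

# Continuous-time asymmetric random walk: forward rate kp, backward rate km.
# Thermodynamic consistency fixes kp/km = exp(A) for an affinity A (k_B T units).
kp, km, t = 2.0, 0.5, 10.0
n_traj = 100000

# Forward and backward step counts are independent Poisson variables.
n_fwd = rng.poisson(kp * t, n_traj)
n_bwd = rng.poisson(km * t, n_traj)
X = n_fwd - n_bwd                            # net displacement

mean_X = X.mean()                            # exact value: (kp - km) t = 15
var_X = X.var()                              # exact value: (kp + km) t = 25
sigma_t = (kp - km) * t * np.log(kp / km)    # mean total entropy production / k_B

# Thermodynamic uncertainty relation: Var(X)/<X>^2 * <Delta S_tot> >= 2
tur_product = var_X / mean_X**2 * sigma_t
print(mean_X, var_X, tur_product)
```

For these rates the product evaluates to (kp + km)/(kp - km) · ln(kp/km) ≈ 2.31, safely above the universal bound of 2.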
For many systems, the full information of an underlying Markovian description is not accessible due to limited spatial or temporal resolution. We first show that such an often inevitable coarse-graining implies that, rather than the full entropy production, only a lower bound can be retrieved from coarse-grained data. As a technical tool, it is derived that the Kullback–Leibler divergence decreases under coarse-graining. For a discrete time series obtained from an underlying time-continuous Markov dynamics, it is shown how the analysis of n-tuples leads to a better estimate with increasing length of the tuples. Finally, state lumping, as one strategy for coarse-graining an underlying Markov model, is shown explicitly to yield a lower bound on the entropy production. In general, however, it does not yield a consistent interpretation of the first law along coarse-grained trajectories, as exemplified with a simple model.
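The monotonicity of the Kullback–Leibler divergence under coarse-graining can be seen in a toy example (the distributions and the lumping below are arbitrary choices for illustration, not from the chapter):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p||q) in nats; assumes full support."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

# Two distributions over four microstates.
p = np.array([0.5, 0.1, 0.3, 0.1])
q = np.array([0.25, 0.25, 0.25, 0.25])

# Coarse-grain by lumping microstates {0,1} and {2,3} into two mesostates.
lump = lambda r: np.array([r[0] + r[1], r[2] + r[3]])

d_fine = kl(p, q)
d_coarse = kl(lump(p), lump(q))
print(d_fine, d_coarse)   # coarse-graining can only decrease the divergence
```

Since the entropy production estimated from data can be written in terms of such divergences, this data-processing inequality is exactly why coarse-grained observations yield only a lower bound.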
The Maxwell demon and the Szilard engine demonstrate that work can be extracted from a heat bath through measurement and feedback in apparent violation of the second law. A systematic analysis shows that, by including the measurement process and the subsequent erasure of a memory according to Landauer’s principle, the second law is indeed restored. For such feedback-driven processes, the Sagawa–Ueda relation provides a generalization of the Jarzynski relation. For the general class of bipartite systems, the concepts from stochastic thermodynamics are developed. This framework applies to systems where one component “learns” about the changing state of the other one, as in simple models for bacterial sensing. The chapter closes with a simple information machine that shows how the ordered sequence of bits in a tape can be used to transform heat into mechanical work. Likewise, mechanical work can be used to erase information, i.e., randomize such a tape. These processes are shown to obey a second law of information processing.
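A back-of-the-envelope sketch of this bookkeeping, with k_B T set to one and hypothetical tape statistics of our own choosing, illustrates Landauer's bound and the second law of information processing:

```python
import numpy as np

kT = 1.0  # measure energies in units of k_B T

def shannon_bits(p):
    """Shannon entropy (in bits) of a binary distribution with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

# Second law of information processing for a tape-driven machine:
# extracted work per bit obeys  W <= kT * ln2 * (H_out - H_in).
# Fully randomizing an ordered tape (H: 0 -> 1 bit) can deliver
# at most kT * ln2 of work per bit.
w_max = kT * np.log(2) * (shannon_bits(0.5) - shannon_bits(0.0))

# Conversely, erasing a tape costs at least kT * ln2 * H per bit
# (Landauer's principle); a biased tape (P(1) = 0.2) is cheaper to erase.
partial_cost = kT * np.log(2) * shannon_bits(0.2)
print(w_max, partial_cost)   # ~0.693 and ~0.500
```

The work gained by the Szilard engine per cycle is thus exactly cancelled, on average, by the cost of resetting its one-bit memory, which restores the second law.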
The overdamped Langevin equation for a particle in a potential and, possibly, subject to a nonconservative force is introduced. The corresponding Fokker–Planck equation, the Smoluchowski equation, is derived. In a time-independent potential, any initial distribution finally approaches the equilibrium one. For a constant external force and periodic boundary conditions, as for motion along a ring, a nonequilibrium steady state is established. As an application, the Kramers escape from a metastable well is discussed. The mean local velocity and the path-integral representation are introduced. Thermodynamic quantities such as work, heat, and entropy production are identified along individual trajectories, and their ensemble averages are determined. Their distributions are shown to obey detailed fluctuation relations. A master integral fluctuation relation can be specialized to yield, inter alia, the Jarzynski relation, the integral fluctuation relation for entropy production, and the Hatano–Sasa relation.
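A minimal Euler–Maruyama integration (an illustrative sketch with arbitrary parameters, not from the chapter) shows an ensemble relaxing to the Boltzmann distribution in a time-independent harmonic potential:

```python
import numpy as np

rng = np.random.default_rng(2)

# Overdamped Langevin dynamics in U(x) = k x^2 / 2:
#   dx = -mu k x dt + sqrt(2 D) dW,   with D = mu k_B T (Einstein relation).
mu, k, kT = 1.0, 1.0, 1.0
D = mu * kT
dt, n_steps, n_part = 1e-3, 5000, 50000

x = np.full(n_part, 2.0)   # far-from-equilibrium initial condition
for _ in range(n_steps):
    x += -mu * k * x * dt + np.sqrt(2 * D * dt) * rng.normal(size=n_part)

# The stationary distribution is Boltzmann: a Gaussian with variance k_B T / k.
print(x.mean(), x.var())
```

After a few relaxation times the sample mean has decayed to zero and the sample variance matches the equilibrium value k_B T / k, as the Smoluchowski equation predicts for any initial distribution.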
This chapter deals with advanced topics for a multivariate Langevin and Fokker–Planck dynamics. For systems with multiplicative noise, it is shown that neither the drift term in the Langevin equation nor the discretization parameter can be determined uniquely. If one of the two is fixed, the other one is determined. In contrast, the Fokker–Planck equation, which contains the physically observable distribution, is unique. Experimental data for a particle near a wall illustrate the relevance of space-dependent friction. Martingales are introduced for Langevin dynamics, with a nonlinear expression of the entropy production as a prominent example; combined with Doob's optional stopping theorem, this leads to universal results for its statistics. Finally, underdamped Langevin dynamics is described by the Klein–Kramers equation, for which entropy production is determined by the irreversible currents. A multi-time-scale analysis recovers the Smoluchowski equation in the overdamped limit, even in the presence of an inhomogeneous temperature, for which an anomalous contribution to the entropy production is found.
Optimal protocols transform a given initial distribution into a given final one in finite time with a minimal amount of work or entropy production. We first analyze this optimization paradigmatically for a driven harmonic oscillator, for which analytical results can be obtained. For a general Langevin dynamics, it is shown that the optimal protocol can be realized through a time-dependent potential, with no need for a nonconservative force. In contrast, for discrete systems, nonconservative driving decreases the thermodynamic costs. For a broader perspective, we introduce concepts from information geometry, which deals with the statistical manifold of distributions. The Fisher information provides a metric on this manifold, from which the distance between two distributions, as the minimal length of a path connecting them, can be derived. Speed limits relate these quantities, referring to an initial and a final distribution, to the entropy production associated with the transformation of the former into the latter. For slow processes, the cost along the optimal protocol or path is bounded in terms of the distance between these distributions and the inverse of the allocated time.
This chapter deals with processes both from a macroscopic, thermodynamic point of view and from a dynamical perspective. For the latter, a class of processes is introduced that can be described by a Hamiltonian with a time-dependent external control parameter. It is shown how the expressions for work and heat from classical thermodynamics can be obtained as an appropriate average over an initial distribution. The second law inequality relating work and free energy can then be proven as a consequence of a master inequality. With well-specified additional assumptions, second law inequalities for heat exchange and entropy production are derived.
Systems with a discrete set of mesostates and their canonical description in equilibrium are introduced. Observing trajectories in equilibrium yields the thermodynamic potentials of these mesostates. Time-scale separation allows one to describe the dynamics using a Markovian master equation. The ratio of transition rates is constrained by the free energy difference of the corresponding mesostates. First for relaxation and then for time-dependent driving, work, heat, and internal energy are identified along individual trajectories. Entropy production along such a trajectory is shown to contain three contributions given by the dissipated heat and the change in internal entropy and in stochastic entropy. The distributions of these thermodynamic quantities obey various exact fluctuation relations. For entropy production, the relation to the arrow of time and a putative identification within a Hamiltonian dynamics is discussed.
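The constraint on the rate ratio can be made concrete with a small master-equation sketch. The free energies and the particular rate parametrization below are arbitrary choices of our own that satisfy the detailed-balance condition w_ij / w_ji = exp(F_i - F_j) in units of k_B T:

```python
import numpy as np

# Three mesostates with free energies F_i (in units of k_B T).
F = np.array([0.0, 1.0, 2.5])
n = len(F)

# Rates constrained by w(i->j) / w(j->i) = exp(F_i - F_j).
# One symmetric parametrization satisfying this constraint:
W = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            W[j, i] = np.exp((F[i] - F[j]) / 2.0)   # rate i -> j
np.fill_diagonal(W, -W.sum(axis=0))                  # master-equation generator

# Relax an arbitrary initial distribution via dp/dt = W p (Euler steps).
p = np.array([1.0, 0.0, 0.0])
dt = 0.01
for _ in range(5000):
    p = p + dt * W @ p

boltzmann = np.exp(-F) / np.exp(-F).sum()
print(p, boltzmann)   # the stationary state is the Boltzmann distribution
```

Because the rate ratios encode the free-energy differences, the unique stationary state of the master equation is the canonical distribution over the mesostates, independently of the overall timescale of the rates.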
Stochastic thermodynamics has emerged as a comprehensive theoretical framework for a large class of non-equilibrium systems including molecular motors, biochemical reaction networks, colloidal particles in time-dependent laser traps, and biopolymers under external forces. This book introduces the topic in a systematic way, beginning with a dynamical perspective on equilibrium statistical physics. Key concepts like the identification of work, heat, and entropy production along individual stochastic trajectories are then developed and shown to obey various fluctuation relations beyond the well-established linear response regime. Representative applications are then discussed, including simple models of molecular motors, small chemical reaction networks, active particles, stochastic heat engines, and information machines involving Maxwell demons. This book is ideal for graduate students and researchers of physics, biophysics, and physical chemistry with an interest in non-equilibrium phenomena.
Networks can get big. Really big. Examples include web crawls, online social networks, and knowledge graphs. Networks from these domains can have billions of nodes and hundreds of billions of edges. Systems biology is yet another area where networks will continue to grow. As sequencing methods continue to advance, more networks and larger, denser networks will need to be analyzed. This chapter discusses some of the challenges you face and solutions you can try when scaling up to massive networks. These range from implementation details to new algorithms and strategies to reduce the burden of such big data. Various tools, such as graph databases, probabilistic data structures, and local algorithms, are at our disposal, especially if we can accept sampling effects and uncertainty.
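As one example of such a probabilistic data structure, a Bloom filter answers set-membership queries in fixed memory at the cost of rare false positives (and no false negatives). The minimal sketch below, whose class name and parameters are our own, could track visited nodes during a crawl of a graph too large to hold in memory:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: probabilistic set membership in fixed memory."""
    def __init__(self, m_bits=1 << 16, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)   # 8 KB for the defaults

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 hashes of the item.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] >> (pos % 8) & 1
                   for pos in self._positions(item))

# Track "visited" nodes of a huge graph without storing the node IDs.
seen = BloomFilter()
for node in ["u42", "u7", "u9000"]:
    seen.add(node)
print("u42" in seen, "u123456" in seen)   # True, almost certainly False
```

The trade-off is exactly the one named above: we accept a small, tunable probability of a false "already visited" answer in exchange for memory that does not grow with the number of node IDs.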
Every network has a corresponding matrix representation. This is powerful. We can leverage tools from linear algebra within network science, and doing so brings great insights. The branch of graph theory concerned with such connections is called spectral graph theory. This chapter will introduce some of its central principles as we explore tools and techniques that use matrices and spectral analysis to work with network data. Many matrices appear in different cases when studying networks, including the modularity matrix, nonbacktracking matrix, and the precision matrix. But one matrix stands out: the graph Laplacian. Not only does it capture dynamical processes unfolding over a network's structure, its spectral properties have deep connections to that structure. We show many relationships between the Laplacian's eigendecomposition and network problems, such as graph bisection and optimal partitioning tasks. Combining the dynamical information and the connections with partitioning also motivates spectral clustering, a powerful and successful way to find groups of data in general. This kind of technique is now at the heart of machine learning, which we'll explore soon.
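A small sketch (with an arbitrary toy graph of our own) shows the connection between the Laplacian's eigendecomposition and graph bisection: the sign pattern of the Fiedler vector, the eigenvector of the second-smallest eigenvalue, recovers the two natural halves of a barbell-like graph:

```python
import numpy as np

# Two 4-node cliques joined by a single bridge edge (3, 4): the Fiedler
# vector should split the graph along the bridge.
edges = [(i, j) for i in range(4) for j in range(i)] \
      + [(i, j) for i in range(4, 8) for j in range(4, i)] \
      + [(3, 4)]

n = 8
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A      # graph Laplacian L = D - A

eigval, eigvec = np.linalg.eigh(L)  # eigenvalues in ascending order
fiedler = eigvec[:, 1]              # eigenvector of the second-smallest eigenvalue
groups = fiedler > 0                # sign pattern gives the bisection
print(eigval[1], groups)
```

The second-smallest eigenvalue (the algebraic connectivity) is positive because the graph is connected, and thresholding the Fiedler vector at zero places nodes 0-3 in one group and 4-7 in the other, which is the idea behind spectral clustering.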
This book presents the fundamental practices and principles of network data, and the preface is the starting point for understanding its goals. It explains how the practical and fundamental aspects of network data are intertwined and how they can be used to solve real-world problems. It also gives advice on how to use the book, including the boxes featured throughout to highlight key concepts and provide hands-on examples of working with network data. Readers will find the preface a valuable resource as they begin their journey into network science.