In this chapter, we explore the concept of information in living organisms in its broadest sense. Biological organisms perceive the external environment, alter their own state, and take action (selecting among possibilities). To capture these properties intrinsic to organisms, we begin by discussing a measure of information that quantifies such situations. Starting from the definition of this measure, we introduce Shannon entropy and provide an overview of the framework of Shannon’s information theory, including the Kullback–Leibler divergence and mutual information. Next, turning to information in DNA sequences, we cover topics such as differences in the frequencies of AT and GC base pairs, the structure of the genetic code, long-range correlations in DNA sequences, and recent findings on intergenic sequences. We then explain kinetic proofreading as one candidate mechanism for achieving highly accurate molecular recognition from a combination of unreliable elements. Furthermore, we explore the relationship between entropy in statistical mechanics and information, using the Szilard engine to elucidate the connection between Maxwell’s demon and information. Finally, we introduce intriguing questions at the interface of dynamics and information, highlighting the dynamic interplay between the two.
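The three quantities named above (Shannon entropy, Kullback–Leibler divergence, and mutual information) can be illustrated with a minimal sketch in Python; the function names here are ours, chosen for illustration, and probabilities are given as plain lists.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) = sum_i p_i log2(p_i/q_i), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) = D( p(x,y) || p(x) p(y) ), in bits.
    `joint` is a nested list giving the joint distribution p(x, y)."""
    px = [sum(row) for row in joint]                  # marginal p(x)
    py = [sum(col) for col in zip(*joint)]            # marginal p(y)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin carries exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))                    # 1.0
# Independent variables share no information...
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# ...while perfectly correlated binary variables share 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```

Note that mutual information is itself a Kullback–Leibler divergence: the divergence of the joint distribution from the product of its marginals, which is why it vanishes exactly when the two variables are independent.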
Maxwell’s demon and the Szilard engine demonstrate that work can be extracted from a heat bath through measurement and feedback, in apparent violation of the second law. A systematic analysis shows that, once the measurement process and the subsequent erasure of memory are included according to Landauer’s principle, the second law is indeed restored. For such feedback-driven processes, the Sagawa–Ueda relation provides a generalization of the Jarzynski relation. The concepts of stochastic thermodynamics are then developed for the general class of bipartite systems. This framework applies to systems in which one component “learns” about the changing state of the other, as in simple models of bacterial sensing. The chapter closes with a simple information machine that shows how the ordered sequence of bits on a tape can be used to transform heat into mechanical work. Conversely, mechanical work can be used to erase information, i.e., to randomize such a tape. These processes are shown to obey a second law of information processing.
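In standard notation (with $\beta = 1/k_B T$, extracted work $W$, free-energy difference $\Delta F$, and $I$ the mutual information acquired by the measurement), the relations named above can be summarized as follows; this is a sketch of the well-known forms, not a reproduction of the chapter's own derivation.

```latex
% Jarzynski relation (no feedback):
\left\langle e^{-\beta (W - \Delta F)} \right\rangle = 1
  \quad\Longrightarrow\quad \langle W \rangle \ge \Delta F .

% Sagawa--Ueda generalization (measurement and feedback,
% mutual information I between system and memory):
\left\langle e^{-\beta (W - \Delta F) - I} \right\rangle = 1
  \quad\Longrightarrow\quad
  \langle W \rangle \ge \Delta F - k_B T \, \langle I \rangle .

% Landauer's principle: erasing one bit of memory dissipates at least
W_{\mathrm{erase}} \ge k_B T \ln 2 ,
% which balances the k_B T ln 2 of work the Szilard engine
% extracts per one-bit measurement, restoring the second law.
```

The inequality $\langle W \rangle \ge \Delta F - k_B T \langle I \rangle$ makes precise how measurement relaxes the ordinary second-law bound: the demon may extract up to $k_B T \langle I \rangle$ of additional work, and erasing the memory that stores $I$ costs at least that much.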