In this lecture, we prepare the way for the notion of local logic by studying the ways that classifications give rise to “regular theories.” These theories can be seen as an idealized version of the scientific laws supported by a given closed system. The adjective “regular” refers to the purely structural properties that any such theory must satisfy. Any theory with these properties can be obtained from a suitable classification. At the end of the lecture, we will return to the question of how different scientific theories, based on different models of the phenomena under study, can be seen as part of a common theory. We will see conditions under which this obtains.
Theories
One way to think about information flow in a distributed system is in terms of a “theory” of the system, that is, a set of known laws that describe the system. Usually, these laws are expressed in terms of a set of equations or sentences of some scientific language. In our framework, these expressions are modeled as the types of some classification. However, we will not model a theory by means of a set of types. Because we are not assuming that our types are closed under the Boolean operations, as they are not in many examples, we get a more adequate notion of theory by following Gentzen and using the notion of a sequent.
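The idea of a theory built from sequents over a classification's types can be made concrete. The following is a minimal sketch in Python; the class and method names (`Classification`, `entails`) are our own, not notation from the text. A sequent Γ ⊢ Δ holds in a classification when every token of all the types in Γ is of at least one type in Δ.

```python
# A sketch, assuming tokens and types are plain hashable values and the
# classification relation is a set of (token, type) pairs.

class Classification:
    def __init__(self, tokens, types, rel):
        self.tokens = set(tokens)   # the particulars being classified
        self.types = set(types)     # the types used to classify them
        self.rel = set(rel)         # pairs (a, t) meaning a is of type t

    def of_type(self, token, typ):
        return (token, typ) in self.rel

    def entails(self, gamma, delta):
        """Does the sequent Gamma |- Delta hold in this classification?

        Every token of all types in gamma must be of some type in delta.
        """
        for a in self.tokens:
            if all(self.of_type(a, t) for t in gamma):
                if not any(self.of_type(a, t) for t in delta):
                    return False
        return True

# Hypothetical example: light bulbs, where every lit bulb is on.
bulbs = Classification(
    tokens={"b1", "b2"},
    types={"on", "off", "lit"},
    rel={("b1", "on"), ("b1", "lit"), ("b2", "off")},
)
print(bulbs.entails({"lit"}, {"on"}))   # True: "lit |- on" is a constraint
print(bulbs.entails({"on"}, {"off"}))   # False: b1 is a counterexample
```

The sequents that hold in a classification under this definition are exactly the constraints of its theory, which is why no Boolean closure of the types is needed.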
With the groundwork laid in the preceding lectures, we come to the central material of the book, the idea of a local logic, which will take up the remainder of Part II. In this lecture we introduce local logics and proceed in the lectures that follow to show how local logics are related to channels and so to information flow.
If one is reasoning about a distributed system with components of very different kinds, the components will typically be classified in quite different ways, that is, with quite different types. Along with these different types, it is natural to think of each of the components as having its own logic, expressed in its own system of types. In this way, the distributed system gives rise to a distributed system of local logics. The interactions of the local logics reflect the behavior of the system as a whole.
In order to capture this idea, we introduce and study the notions of “local logic” and “local logic infomorphism” in this lecture. The main notions are introduced in the first two sections and studied throughout this lecture. The important idea of moving a logic along an infomorphism is studied in Lecture 13. In Lecture 14, we show that every local logic can be represented in terms of moving natural logics along binary channels. The idea of moving logics is put to another use in Lecture 15 to define the distributed logic of an information system.
In Lecture 7 we discussed the relationship between classifications and the Boolean operations. In this lecture, we study the corresponding relationship for theories. In particular, we discuss Boolean operations that take theories to theories, as well as what it would mean for operations to be Boolean operations in the context of a particular theory. In this way, we begin to see how the traditional rules of inference emerge from an informational perspective. The topic is a natural one but it is not central to the main development so this lecture could be skipped.
Boolean Operations on Theories
Given a regular theory T = (Σ, ⊢), one may define a consequence relation on the set pow(Σ) of subsets of Σ in one of two natural ways, depending on whether one thinks of the sets of types disjunctively or conjunctively. This produces two new theories, ∨T and ∧T, respectively.
These operations should fit with the corresponding power operations ∨A and ∧A on classifications A; we want ∨Th(A) to be the same as the theory Th(∨A), for example. Thus, to motivate our definitions, we begin by investigating the relationship of the theory Th(A) of a classification to the theories Th(∨A) and Th(∧A) of its two power classifications.
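The disjunctive power classification ∨A can be sketched directly: its tokens are those of A, its types are sets of types of A, and a token is of type Φ when it is of at least one type in Φ. The code below is an illustrative sketch under those assumptions; the function name `disj_power` is ours.

```python
from itertools import chain, combinations

def disj_power(rel, tokens, types):
    """Classification relation of the disjunctive power classification.

    A token a is of power-type Phi (a subset of the old types) when a is
    of at least one type in Phi.  Subsets are represented as frozensets.
    """
    subsets = chain.from_iterable(
        combinations(sorted(types), r) for r in range(len(types) + 1)
    )
    power_rel = set()
    for phi in subsets:
        for a in tokens:
            if any((a, t) in rel for t in phi):
                power_rel.add((a, frozenset(phi)))
    return power_rel

# Hypothetical two-token classification.
rel = {("a", "red"), ("b", "blue")}
prel = disj_power(rel, {"a", "b"}, {"red", "blue"})
print(("a", frozenset({"red", "blue"})) in prel)  # True: a is red
print(("a", frozenset({"blue"})) in prel)         # False: a is not blue
```

The conjunctive power classification ∧A is the same construction with `any` replaced by `all`, so that a token is of type Φ when it is of every type in Φ.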
Definition 11.1. Given a set Γ of subsets of Σ, a set Y is a choice set on Γ if X ∩ Y ≠ Ø for each X ∈ Γ.
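Definition 11.1 is simple enough to state in code. The following sketch (the function name `is_choice_set` is ours) checks that a candidate set meets every member of the family Γ.

```python
def is_choice_set(Y, Gamma):
    """Is Y a choice set on Gamma, i.e. does Y meet every X in Gamma?

    Note the vacuous case: every set is a choice set on an empty Gamma.
    """
    return all(X & Y for X in Gamma)

# Hypothetical family of subsets.
Gamma = [{1, 2}, {2, 3}, {3, 4}]
print(is_choice_set({2, 3}, Gamma))   # True: {2,3} meets all three sets
print(is_choice_set({1, 4}, Gamma))   # False: {1,4} misses {2,3}
```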
Information and talk of information is everywhere nowadays. Computers are thought of as information technology. Each living thing has a structure determined by information encoded in its DNA. Governments and companies spend vast fortunes to acquire information. People go to prison for the illicit use of information. In spite of all this, there is no accepted science of information. What is information? How is it possible for one thing to carry information about another? This book proposes answers to these questions.
But why does information matter, why is it so important? An obvious answer motivates the direction our theory takes. Living creatures rely on the regularity of their environment for almost everything they do. Successful perception, locomotion, reasoning, and planning all depend on the existence of a stable relationship between the agents and the world around them, near and far. The importance of regularity underlies the view of agents as information processors. The ability to gather information about parts of the world, often remote in time and space, and to use that information to plan and act successfully, depends on the existence of regularities. If the world were a completely chaotic, unpredictable affair, there would be no information to process.
Still, the place of information in the natural world of biological and physical systems is far from clear. A major problem is the lack of a general theory of regularity.
The view of information put forward here associates information flow with distributed systems. Such a system A, we recall, consists of an indexed family cla(A) = {Ai}i∈I of classifications together with a set inf(A) of infomorphisms, all of which have both a domain and a codomain in cla(A). With any such system we want to associate a systemwide logic Log(A) on the sum ∑i∈I Ai of the classifications in the system. The constraints of Log(A) should capture the lawlike regularities represented by the system as a whole. The normal tokens of Log(A) model those indexed families of tokens to which the constraints are guaranteed to apply, by virtue of the structure of the system.
If we consider a given component classification Ai of A, there are at least two sensible logics on Ai that we might want to incorporate into Log(A), the a priori logic AP(Ai) and the natural logic Log(Ai). The former assumes we are given no information about the constraints of Ai except for the trivial constraints. The latter assumes perfect information about the constraints of Ai. There is typically quite a difference. But really these are just two extremes in our ordering of sound local logics on Ai. After all, in dealing with a distributed system, we may have not just the component classifications and their infomorphisms, but also local logics on the component classifications. We want the systemwide logic to incorporate these local logics.
State-space models are one of the most prevalent tools in science and applied mathematics. In this lecture, we show how state spaces are related to classifications and how systems of state spaces are related to information channels. As a result, we will discover that state spaces provide a rich source of information channels. In later lectures, we will exploit the relationship between state spaces and classifications in our study of local logics.
State Spaces and Projections
Definition 8.1. A state space is a classification S for which each token is of exactly one type. The types of a state space are called states, and we say that a is in state σ if a ⊨_S σ. The state space S is complete if every state is the state of some token.
Example 8.2. In Example 4.5 we pointed out that for any function f : A → B, there is a classification whose types are elements of B and whose tokens are elements of A and such that a ⊨ b if and only if b = f(a). This classification is a state space, and every state space arises in this way, so another way to put the definition is to say that a state space is a classification S in which the classification relation ⊨_S is a total function. For this reason, we write state_S(a) for the state σ of a in S.
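The correspondence between state spaces and total functions can be illustrated directly. In this sketch (function names are ours) a state space is built from a function on tokens, and the two defining conditions, each token has exactly one state and (for completeness) every state occurs, are checked explicitly.

```python
def state_space_from_function(f, tokens):
    """Classification relation {(a, f(a))}: each token gets one state."""
    return {(a, f(a)) for a in tokens}

def is_state_space(tokens, rel):
    """Each token must be of exactly one type (i.e. rel is a total function)."""
    return all(sum(1 for (a, _) in rel if a == tok) == 1 for tok in tokens)

def is_complete(states, rel):
    """Every state must be the state of some token."""
    return all(any(s == st for (_, s) in rel) for st in states)

# Hypothetical example: coins classified by which face is up.
tokens = {"coin1", "coin2", "coin3"}
states = {"heads", "tails"}
face_up = {"coin1": "heads", "coin2": "heads", "coin3": "tails"}

rel = state_space_from_function(lambda a: face_up[a], tokens)
print(is_state_space(tokens, rel))   # True: each coin has one state
print(is_complete(states, rel))      # True: both states occur
```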
To understand the account presented here, it is useful to distinguish two questions about information flow in a given system. What information flows through the system? Why does it flow? This book characterizes the first question in terms of a “local logic” and answers the second with the related notion of an “information channel.” Within the resulting framework one can understand the basic structure of information flow. The local logic of a system is a model of the regularities that support information flow within the system, as well as the exceptions to these regularities. The information channel is a model of the system of connections within the system that underwrite this information flow.
The model of information flow developed here draws on ideas from the approaches to information discussed in Lecture 1 and, in the end, can be seen as a theory that unifies these various apparently competing theories. The model also draws on ideas from classical logic and from recent work in computer science. The present lecture gives an informal overview of this framework.
Classifications and Infomorphisms
Fundamental to the notions of information channel and local logic are the notions of “classification” and “infomorphism.” These terms may be unfamiliar, but the notions have been around in the literature for a long time.
Paying Attention to Particulars
We begin by introducing one of the distinctive features of the present approach, namely its “two-tier” nature, paying attention to both types and particulars.
The concepts of information and representation are, of course, closely related. Indeed, Jerry Fodor feels that they are so closely related as to justify the slogan “No information without representation.” Though we do not go that far, we do think of the two as intimately connected, as should be clear from our account. In this lecture, we sketch the beginnings of a theory of representation within the framework presented in Part II. We have three motives for doing so. One is to suggest what we think such a theory might look like. The second is to explain some interesting recent work on inference by Shimojima. The third is to show that Shimojima's work has a natural setting in the theory presented here.
Modeling Representational Systems
When we think of information flow involving humans, some sort of representational system is typically, if not always, involved: Spoken or written language, pictures, maps, diagrams, and the like are all examples of representations. So representations should fit into our general picture of information flow.
A theory of representation must be compatible with the fact that representation is not always veridical. People often misrepresent things, inadvertently or otherwise. For this reason, we model representation systems as certain special kinds of information systems where unsound logics can appear. We begin with our model of a representational system.
A standard objection to classical logic has been its failure to come to grips with vague predicates and their associated problems and paradoxes. An analysis of the vague predicates “low,” “medium,” and “high” (as applied to brightness of light bulbs) was implicit in Lecture 3. In this lecture we want to make the idea behind this treatment more explicit, thereby suggesting an information-theoretic line of research into vagueness. At best, this line of development would allow the information-flow perspective to contribute to the study of vagueness. At the very least, it should show that vagueness is not an insurmountable problem to the perspective offered in this book.
In this lecture we explore a different family of related vague predicates, “short,” “medium,” “tall,” “taller,” and “same height as.” This family is simple enough to treat in some detail but complicated enough to exhibit three problems that are typical of vague predicates.
Information Flow Between Perspectives
The first problem is that different people, with differing circumstances, often have different standards in regard to what counts as being short or tall. In spite of the lack of any absolute standard, though, information flow is possible between people using these predicates. If Jane informs me that Mary is of medium height while she, Jane, is short, and if I consider Jane to be tall, then I know that I would consider Mary to be tall as well. How is such reliable information flow possible between people with quite different standards of what counts as being tall?
The notion of a classification does not build in any assumptions about closure under the usual Boolean operations. It is natural to ask: What role do the usual Boolean connectives play in information and its flow? This lecture takes an initial step toward answering this question. We will return to it in later lectures as we develop more tools. It is not a central topic of the book, but it is one that needs to be addressed in a book devoted to the logic of information flow.
Actually, there are two ways of understanding Boolean operations on classifications. There are Boolean operations mapping classifications to classifications, and there are Boolean operations internal to (many) classifications. Because there is a way to explain the latter in terms of the former, we first discuss the Boolean operations on classifications.
Boolean Operations on Classifications
Given a set Φ of types in a classification, it is often useful to group together the class of tokens that are of every type in Φ. In general, there is no type in the classification with this extension. As a remedy, we can always construct a classification in which such a type exists. Likewise, we can construct a classification in which there is a type whose extension consists of all those tokens that are of at least one of the types in Φ.
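The two extensions just described are easy to compute. The sketch below (function names are ours) forms, for a set Φ of types, the tokens of every type in Φ and the tokens of at least one type in Φ; these are the extensions the conjunction type ∧Φ and disjunction type ∨Φ would have in the constructed classifications.

```python
def extension(rel, typ):
    """Tokens of a single type in the classification relation rel."""
    return {a for (a, t) in rel if t == typ}

def conj_extension(rel, phi, tokens):
    """Tokens of every type in phi (all tokens when phi is empty)."""
    result = set(tokens)
    for t in phi:
        result &= extension(rel, t)
    return result

def disj_extension(rel, phi):
    """Tokens of at least one type in phi (empty when phi is empty)."""
    result = set()
    for t in phi:
        result |= extension(rel, t)
    return result

# Hypothetical classification of three objects.
rel = {("a", "red"), ("a", "round"), ("b", "red"), ("c", "round")}
tokens = {"a", "b", "c"}
print(conj_extension(rel, {"red", "round"}, tokens))  # {'a'}
print(disj_extension(rel, {"red", "round"}))          # {'a', 'b', 'c'}
```

In general no single type of the original classification has either extension, which is exactly why the enlarged classifications are worth constructing.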