Denotational semantics is a formal method for defining the semantics of programming languages. It is of interest to the language designer, the compiler writer and the programmer, and these individuals have different criteria for judging such a method: depending on the point of view, it should be concise, unambiguous, open to mathematical analysis, mechanically checkable, executable and readable. Denotational semantics cannot be all things to all people, but it is one attempt to satisfy these various aims. It is a formal method because it is based on well-understood mathematical foundations and uses a rigorously defined notation or meta-language.
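A flavour of the approach can be given by a small illustration (not taken from the text): a valuation function for a tiny expression language, written here in Haskell as a stand-in for the meta-language. The names Exp, Env and eval are purely illustrative assumptions.

```haskell
-- A tiny denotational definition, using Haskell as a stand-in meta-language.
-- The names Exp, Env and eval are illustrative only.
import qualified Data.Map as Map

-- Abstract syntax of a small expression language.
data Exp = Num Int        -- numeral
         | Var String     -- identifier
         | Add Exp Exp    -- e1 + e2
         | Mul Exp Exp    -- e1 * e2

-- Semantic domain: environments mapping identifiers to integers.
type Env = Map.Map String Int

-- The valuation function E[[.]] : Exp -> (Env -> Int).
-- Each construct denotes a mathematical function of the environment,
-- defined by structural induction on the syntax.
eval :: Exp -> Env -> Int
eval (Num n)     _   = n
eval (Var x)     env = Map.findWithDefault 0 x env   -- unbound identifiers denote 0 here
eval (Add e1 e2) env = eval e1 env + eval e2 env
eval (Mul e1 e2) env = eval e1 env * eval e2 env

main :: IO ()
main = print (eval (Add (Var "x") (Num 1)) (Map.fromList [("x", 41)]))   -- prints 42
```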
The complete definition of a programming language is divided into syntax, semantics and sometimes also pragmatics. Syntax defines the structure of legal sentences in the language. Semantics gives the meaning of these sentences. Pragmatics covers the use of an implementation of a language and will not be mentioned further.
In the case of syntax, context-free grammars expressed in Backus–Naur form (BNF) or in syntax diagrams have been of great benefit to computer scientists since Backus and Naur [44] formally specified the syntax of Algol-60. Now all programming languages have their syntax given in this way. The result has been ‘cleaner’ syntax, improved parsing methods, parser-generators and better language manuals. As yet no semantic formalism has achieved such popularity and the semantics of a new language is almost invariably given in natural language.
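As an illustration of the notation (the grammar and names below are invented for this sketch and are not the book's definitions), a fragment of a BNF grammar for a Prolog-like subset, together with a corresponding abstract-syntax type, might look as follows.

```haskell
-- A BNF fragment for a Prolog-like subset (invented for this sketch):
--
--   <clause> ::= <term> '.'  |  <term> ':-' <terms> '.'
--   <terms>  ::= <term>  |  <term> ',' <terms>
--   <term>   ::= <variable>  |  <atom>  |  <atom> '(' <terms> ')'
--
-- and a corresponding abstract syntax as Haskell data types:
data Term = Var String            -- <variable>
          | Struct String [Term]  -- <atom> or <atom>(<terms>)
          deriving Show

data Clause = Clause Term [Term]  -- head :- body; a fact has an empty body
            deriving Show

main :: IO ()
main = print (Clause (Struct "likes" [Struct "mary" [], Var "X"]) [Struct "wine" [Var "X"]])
```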
The typical problem facing a programmer is to write a program which will transform data satisfying some properties or assertions ‘P’ into results satisfying ‘Q’.
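In the usual pre- and postcondition notation such a specification is written as a triple; a small illustrative instance (not from the text) is:

```latex
% A program S transforms states satisfying P into states satisfying Q, written {P} S {Q}.
% An illustrative instance:
\[
  \{\, x = X \,\} \quad x := x + 1 \quad \{\, x = X + 1 \,\}
\]
```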
Ashcroft and Wadge [4] have criticized the effort spent on describing existing programming languages and have suggested a more active, prescriptive role for denotational semantics in designing the languages of the future. Accepting that there is some truth in this, this chapter gives a semantics for Prolog. Although several Prolog implementations already exist, logic programming is still a research area, and a denotational semantics is one way to investigate variations on it.
Prolog [9] is a programming language based on first-order predicate logic. A Prolog program can be thought of in two ways. It can be taken to be a set of logical assertions or facts about a world or some part of a world. This is the declarative semantics of the program. It can also be taken as a set of procedure definitions which gives its procedural semantics.
The declarative semantics are very elegant in that the program stands for some basic facts and certain other facts that logically follow from them. No side-effects or considerations of the order of evaluation are involved. Unfortunately, to make Prolog run and to make it run efficiently, some programs require side-effects such as input–output and the order of evaluation to be taken into account. This can only be understood procedurally.
Here a denotational semantics of a subset of Prolog is given. This defines the backtracking search and unification processes of Prolog. Later the definition is translated into Algol-68 to form an interpreter. Prolog is still a research language and giving a denotational semantics enables it to be compared with other languages in a uniform framework.
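The flavour of those two processes can be conveyed by a small sketch. The following Haskell fragment is a stand-in for both the meta-language and the Algol-68 interpreter; all names (Term, Subst, unify, solve, and the rest) are illustrative assumptions. It implements structural unification and a depth-first, 'list of successes' backtracking search over a simple term representation.

```haskell
-- A sketch of unification and backtracking search (illustrative only).
data Term   = Var String | Struct String [Term] deriving (Eq, Show)
data Clause = Clause Term [Term]        -- head :- body; a fact has an empty body
type Subst  = [(String, Term)]          -- triangular substitution: variable -> term

nil :: Term
nil = Struct "nil" []

cons :: Term -> Term -> Term
cons h t = Struct "cons" [h, t]

-- Dereference a variable through the substitution.
walk :: Subst -> Term -> Term
walk s t@(Var x) = maybe t (walk s) (lookup x s)
walk _ t         = t

-- Unification: extend the substitution so that the two terms become equal, if possible.
unify :: Term -> Term -> Subst -> Maybe Subst
unify t1 t2 s = case (walk s t1, walk s t2) of
  (Var x, Var y) | x == y -> Just s
  (Var x, t)              -> Just ((x, t) : s)      -- no occurs check, as in most Prologs
  (t, Var y)              -> Just ((y, t) : s)
  (Struct f ts, Struct g us)
    | f == g && length ts == length us
      -> foldl (\ms (a, b) -> ms >>= unify a b) (Just s) (zip ts us)
  _ -> Nothing

-- Give each use of a clause fresh variables by tagging them with a step number.
rename :: Int -> Clause -> Clause
rename n (Clause h b) = Clause (go h) (map go b)
  where go (Var x)       = Var (x ++ "_" ++ show n)
        go (Struct f ts) = Struct f (map go ts)

-- Backtracking search: the lazy list of substitutions solving all the goals,
-- explored depth-first and left-to-right as in Prolog ('list of successes').
solve :: [Clause] -> [Term] -> Subst -> Int -> [Subst]
solve _    []       s _ = [s]                       -- no goals left: s is an answer
solve prog (g : gs) s n =
  concat [ case unify g h s of
             Nothing -> []                          -- this clause fails: backtrack
             Just s' -> solve prog (b ++ gs) s' (n + 1)
         | Clause h b <- map (rename n) prog ]

-- Fully apply the substitution to a term, for printing answers.
resolve :: Subst -> Term -> Term
resolve s t = case walk s t of
  Var x       -> Var x
  Struct f ts -> Struct f (map (resolve s) ts)

-- Example program:  append(nil, Ys, Ys).
--                   append(cons(X, Xs), Ys, cons(X, Zs)) :- append(Xs, Ys, Zs).
appendProg :: [Clause]
appendProg =
  [ Clause (Struct "append" [nil, Var "Ys", Var "Ys"]) []
  , Clause (Struct "append" [cons (Var "X") (Var "Xs"), Var "Ys", cons (Var "X") (Var "Zs")])
           [Struct "append" [Var "Xs", Var "Ys", Var "Zs"]]
  ]

-- Query:  ?- append(As, Bs, cons(a, cons(b, nil))).   (three splittings expected)
main :: IO ()
main = mapM_ print [ (resolve s (Var "As"), resolve s (Var "Bs"))
                   | s <- solve appendProg [goal] [] 0 ]
  where goal = Struct "append"
                 [Var "As", Var "Bs", cons (Struct "a" []) (cons (Struct "b" []) nil)]
```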
There is great variety amongst programming languages in the area of data structures and type checking. It is only possible to deal with some of the more straightforward issues in this chapter.
Some languages, such as BCPL [52], are typeless, or have only one type: all BCPL variables have the type 'word'. This enables BCPL to rival assembly code in its range of application while being much more readable and concise. There are dangers, however: the compiler cannot detect type errors because, with only one type, there are formally none, and misuses of a value show up only at run time.
Languages that do provide types are characterized by the kinds of data structure provided, the time at which types are checked and how much the programmer can define. Simple types, such as integer, stand for basic domains like Int. Structured types, such as arrays and records, stand for derived domains. There are hard problems, however, in deciding what a programmer-defined type, particularly one defined by possibly recursive equations, stands for; see recent conference proceedings [1, 2, 31]. This is obviously connected with recursive domains (§4.3).
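For instance, a programmer-defined recursive type can be read as standing for a solution of a recursive domain equation. The following sketch (illustrative names, not the book's notation) makes the correspondence explicit for a list of integers.

```haskell
-- A programmer-defined recursive type and the recursive domain equation it may be
-- read as standing for (an illustration; the names are not the book's):
--
--   IntList  ~=  {nil}  +  (Int x IntList)
--
data IntList = Nil | Cons Int IntList deriving Show

-- Functions over the type are defined by structural recursion, mirroring the way
-- meanings are given over the corresponding recursive domain (cf. §4.3).
len :: IntList -> Int
len Nil         = 0
len (Cons _ xs) = 1 + len xs

main :: IO ()
main = print (len (Cons 1 (Cons 2 Nil)))   -- prints 2
```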
APL [27] is a dynamically typed language. Each constant has a particular type – integer, character or vector or array of one of these. The value currently assigned to a variable therefore has some type, but both the value and the type may change as the program runs. Each APL operator is only applicable to certain types, so it is possible to add 1 to an integer or to a vector of integers but not to a character.
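The effect can be modelled with run-time type tags, as in the following sketch (Haskell rather than APL, with invented names): the operand types of an addition are checked only when the operator is applied.

```haskell
-- A sketch of dynamic typing: values carry their type at run time, and each
-- operator checks its operands when applied. Names are illustrative only.
data Value = IntVal Int
           | CharVal Char
           | VecVal [Value]
           deriving Show

-- 'add' accepts an integer or a vector of integers as its second operand, but not
-- a character; the check happens at run time, as in APL, not at compile time.
add :: Value -> Value -> Either String Value
add (IntVal a) (IntVal b)  = Right (IntVal (a + b))
add (IntVal a) (VecVal vs) = VecVal <$> mapM (add (IntVal a)) vs
add v          (CharVal _) = Left ("type error: cannot add " ++ show v ++ " to a character")
add v1         v2          = Left ("type error: " ++ show v1 ++ " + " ++ show v2)

main :: IO ()
main = do
  print (add (IntVal 1) (IntVal 41))                    -- Right (IntVal 42)
  print (add (IntVal 1) (VecVal [IntVal 1, IntVal 2]))  -- Right (VecVal [IntVal 2, IntVal 3])
  print (add (IntVal 1) (CharVal 'a'))                  -- Left "type error: ..."
```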
Philosophy of science and schema theory have provided us with two approaches to a theory of knowledge as construction. In this theory, a knowing subject bases a dialogue with external reality on schemas or theoretical languages embodying a “construction of reality.” Schema theory builds on elementary processes of assimilation and accommodation whereby the individual's sensorimotor schemas come to provide anticipations of the effects of action in the world. We have seen similar processes, operating at the level of groups rather than individuals, in the network view of philosophy of science. Both schema theory and the network view of science have led to a theory of language in which metaphor is normative, with literal meaning as the limiting case.
The resulting epistemology combines coherence and correspondence criteria of truth and dissolves the barriers between “objective” science and nonscience. In this chapter, we look at the alternatives to “objectification” proposed as modes of knowledge in hermeneutic and critical philosophy and suggest that our viewpoint reveals unities in these diversities. Just as we see a continuum between literal and metaphorical meanings, so we do not posit a sharp dichotomy between the natural sciences on the one hand and the social or literary hermeneutic sciences on the other. Special pragmatic or “objective” aims are predominant in the natural sciences, though also to some extent applicable elsewhere; on the other hand, hermeneutic considerations apply in the natural sciences, particularly in connection with theoretical interpretations and “world models.”
Our task now is to develop a theory of human knowledge that makes contact with the AI concepts and the notion of the embodied subject of the previous chapter. We call this approach schema theory, and have already outlined its features in Section 1.3. We provide our view of the current shape of schema theory as a scientific discipline within cognitive science and also point to ways the theory must develop if we are to use it in addressing such issues as freedom, the person in society, and the possibilities of religious knowledge. We stress that schema theory is not a closed subject, nor is there any consensus as to what constitutes its current status. Even the notion of a “schema” as “intermediate functional entity” in cognitive processes is not fully delimited but will evolve with developments in cognitive science.
Our approach to schema theory denies language the primary role in cognition. True, with language “in place,” we seek to understand its substrates, both within the human brain and in the social nexus. But when we take an evolutionary or developmental view, language is no longer primary. Even though as adults we are immersed in language, we seek to burst the bounds of language to construct a richer epistemology. Schema theory seeks to mediate between the billions-fold complexity of neurons and the thousands-fold complexity of words.