In this chapter we begin to lay out a theory of syntactic change. In order to develop such a theory it is necessary to address what entity change operates upon, what the forces of change are, how those forces act upon that entity, and what – if anything – constrains those forces. This chapter introduces our theory of how syntax changes over time, defines some basic terms, and presents some claims about syntactic change that are treated in detail in subsequent chapters.
Basic assumptions about language change
Views of language change have assumed that:
(a) human beings are genetically endowed with aspects of universal grammar which regulate how infants acquire languages and hence determine what constitutes a possible language;
(b) child language acquisition is (in large measure) responsible for linguistic change;
(c) change is abrupt, since change occurs when child language learners construct a new grammar which may differ in its internal structure from the grammar of their adult models (whose own grammars may no longer be optimal, owing to additions or modifications made later in life).
We evaluate each of these in order to make clear the relationship of these assumptions to our own theory. Let us take up the claimed abruptness of syntactic change ((c) above) first.
Modularity is certainly a term whose time has come … What is considerably less clear is whether the term retains a constant meaning when it passes from one author to another.
John C. Marshall
In this chapter I will be taking up the specific issue of the role of semantic and pragmatic factors in syntactic processing. I will not, however, be developing a theory of semantic and discourse processing. Rather, the focus here will be on incremental, deterministic, structure building and how this process is affected by nonsyntactic information. In order to draw out the main issues, I will contrast two views: (i) the modular theory (MT) of Rayner et al. (1983) and Clifton and Ferreira (1989), and (ii) the interactive theory (IT) of Crain and Steedman (1985) and Steedman and Altmann (1989). One reason to contrast these particular proposals is that they have each focused on the use of discourse information in the processing of a particular ambiguity – the reduced-relative ambiguity which has been a focus of research since Bever (1970). Also, this ambiguity has been extensively studied with sensitive experimental techniques for the last few years. Thus, there is perhaps more data available for assessing the time course of information availability and use in ambiguous reduced-relative constructions than for any other type of processing phenomenon. Given this, it will serve as a useful probe into how Structural Determinism might be embedded in a more general language comprehension mechanism.
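The reduced-relative ambiguity at issue can be illustrated with Bever's classic example. The following sketch is my own illustration in Python, not part of the original discussion; the sentence, the bracketings, and the `label` helper are all assumptions introduced only to show the two analyses a deterministic, incremental parser must choose between before disambiguating material arrives:

```python
# Illustrative sketch: the reduced-relative ambiguity of Bever (1970),
# "The horse raced past the barn fell." Up to the word "fell", "raced"
# is ambiguous between a main verb and a passive participle heading a
# reduced relative clause.

# Two candidate bracketings for the prefix "The horse raced past the barn":
main_verb_parse = ("S",
    ("NP", "the horse"),
    ("VP", ("V", "raced"), ("PP", "past the barn")))

reduced_relative_parse = ("NP",
    ("NP", "the horse"),
    ("RC", ("V", "raced"), ("PP", "past the barn")))  # "(that was) raced ..."

def label(parse):
    """Top node of a parse, standing in here for the parser's commitment."""
    return parse[0]

# A deterministic, incremental parser must commit to one analysis; only the
# reduced-relative parse leaves an NP free to serve as subject of "fell".
print(label(main_verb_parse))         # S
print(label(reduced_relative_parse))  # NP
```

The point of the toy encoding is only that the two analyses diverge structurally well before the disambiguating verb, which is why this construction probes the time course of information use so effectively.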
There should be clear linkages between linguistic descriptions and cognitive/perceptual requirements.
William Marslen-Wilson
Since the mid-1970s specific proposals for the form of syntactic knowledge have had difficulty finding their way into theories of language comprehension. This is not to say that syntax does not play a role in such theories, but it is usually limited to a reference to a fairly imprecise phrase structure. Much of the work that is of interest to syntacticians provokes (at best) scant interest from those working in experimental psycholinguistics. To a certain extent this is justified; investigations into syntactic knowledge and into sentence processing are related, but clearly distinct, research programs.
It is a central thesis of this book that recent work within Government-Binding (GB) theory (Chomsky 1981, and subsequent work) raises questions about the nature of syntactic knowledge that have long concerned researchers into syntactic processing (parsing). Consider the term minimal. Since the important work of Frazier and Fodor (1978), which introduced the concept of Minimal Attachment to the psycholinguistics literature, the concept of minimal structure building has played a significant role in studies of properties of the parser. More recently, work in syntactic theory has become concerned both with ensuring minimal structure generation (e.g. Speas 1990) and with establishing minimal connections between related elements within a structure (e.g. Rizzi 1990). As will become clear below, the form of the grammar within much current work in GB requires principles of minimal structure generation in much the same way that properties of the parser require a principle of minimal structure computation.
Be on the watch to take the best parts of many beautiful faces.
Leonardo da Vinci
Structural ambiguity
The parsing model to be described in detail in Chapter 4 addresses particular questions that have arisen from two related, but distinct, research programs investigating natural language processing. The first seeks to investigate hypotheses concerning human sentence comprehension within the context of experimental psycholinguistics; the second is concerned with the computational implications of such hypotheses. In recent years there has been a considerable amount of convergence concerning basic principles of language comprehension. For example, the important role of phrase structure in language processing has been convincingly demonstrated, and it is a rare processing model which does not make some reference to syntactic constituents and structural relations. There is a growing consensus that, given ambiguous input, the perceptual mechanism has an initial bias toward the reading consistent with the minimal structural representation.
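The initial bias toward the minimal structural representation can be sketched as a node-count comparison in the spirit of Minimal Attachment. The sketch below is my own illustration, not the model of Chapter 4; the example sentence, the tuple encoding of trees, and the `count_nodes` helper are assumptions introduced purely for exposition:

```python
# A minimal sketch of the bias toward the minimal structural representation:
# given two candidate phrase markers for the same input, prefer the one
# built with fewer syntactic nodes.

def count_nodes(tree):
    """Count phrase/lexical nodes in a tree given as (label, child, ...)."""
    if isinstance(tree, str):   # a bare string is a terminal, not a node
        return 0
    return 1 + sum(count_nodes(child) for child in tree[1:])

# "saw the man with the telescope": attaching the PP to the VP adds no
# extra NP layer; attaching it inside the object NP requires one.
vp_attach = ("VP", ("V", "saw"), ("NP", "the man"), ("PP", "with the telescope"))
np_attach = ("VP", ("V", "saw"),
             ("NP", ("NP", "the man"), ("PP", "with the telescope")))

preferred = min([vp_attach, np_attach], key=count_nodes)
print(count_nodes(vp_attach), count_nodes(np_attach))  # 4 5
```

Nothing in this toy hangs on the particular counts; it only makes concrete the idea that, of two analyses of the same ambiguous input, one can be strictly "smaller" than the other, and that the perceptual mechanism is initially biased toward it.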
Further, owing to the important work of Marslen-Wilson (1973, 1975) there is general agreement that models of syntactic processing must be responsive to the experimental demonstrations of the speed and efficiency of human language processing.
As this chapter will make clear, important, and interesting, differences remain, but real progress has been made since the mid-1970s. The purpose of this chapter is to outline some of the main avenues of research responsible for this progress. This will give a context for the parsing model described in Chapter 4. This model, as will become clear, incorporates significant aspects of the parsers described in the following sections.
Dick Feynman told me about his “sum over histories” version of quantum mechanics. “The electron does anything it likes,” he said. “It goes in any direction at any speed, forward or backward in time, however it likes, and then you add up the amplitudes and it gives you the wave function.” I said to him, “You're crazy.” But he wasn't.
Freeman Dyson
Grammatical assumptions: Government-Binding theory
In order to properly investigate the nature of the relation between syntax and perception, it is necessary to begin with a discussion of the syntactic framework being assumed. I will outline two aspects of syntactic phenomena as they are treated within the GB framework: the generation of structure and the relations which exist between elements in a syntactic representation. The particular type of structure I will consider is a phrase-structure tree. Structural relations involving discontinuous dependencies will first be discussed from the derivational perspective of standard GB (Chomsky 1981). I will then turn to the representational approach of Koster (1986). As noted in Chapter 1, it is this representational form of GB which I will assume in subsequent discussion of the parser's properties (Chapter 4).
As noted above, there are many aspects of syntactic theory in general, and GB in particular, which I will gloss over or ignore. The intent of this chapter is to describe the form of syntactic knowledge in sufficient detail so that we can, in a meaningful way, address the issue of the manner in which syntactic knowledge is put to use in the perceptual process.
Again, one thinks of Don Quixote. He may see a windmill as a giant, but he doesn't see a giant unless there is a windmill there.
W. H. Auden
In this book I have outlined a theory of syntactic knowledge and examined its role in language comprehension. Although I have argued that syntax plays a significant role in perception, it is clearly the case that numerous other factors also play a role. For example, there are currently underway a number of important studies within the general framework of constraint-based comprehension systems. As briefly discussed in the last chapter, it is important that theories of syntactic processing incorporate, in some way, the effects both of the frequency of individual lexical items and of co-occurrence probabilities (MacDonald 1994, Trueswell et al. 1993). I have not attempted to do that here. Rather the focus has been to establish the role of structural variables in sentence processing.
One of the most important aspects of the parsing model proposed here is the grammatical distinction between primary and secondary structural relations in a phrase-structure tree. This distinction is reflected in the design of the processing model, given in (1). The types of syntactic relations which a GB-based parser must encode in the structural representation it builds are listed in (2).
Unfortunately, there is as yet no standard terminology in this field, so the author has followed the usual practice … namely to use words that are similar but not identical to the terms used in other books.
Donald E. Knuth
Introduction
The properties of the parser described in this chapter are motivated by the form of the grammar and the speed and efficiency with which interpretive processes are able to make use of the structural representation constructed by the parser. Before detailing the specific properties of the parser I will first consider some general issues. One important question concerns the role of primary structural relations in a principle-based parser and its grammatical database.
Following the work of Pritchett (1987, 1992), a number of proposals of the last few years have sought to move from form-based parsing strategies such as Minimal Attachment (Frazier 1978) to content-based strategies (e.g. Pritchett's 1992 Generalized Theta Attachment or Crocker's 1992 Argument Attachment). There appear to be two motivations for this. The first is based on the intuition that the form of a phrase marker (however it is instantiated: as a tree, a reduced phrase marker, etc.) is, in some sense, derivative or secondary, whereas the licensing relations (theta, case, etc.) are more central in current linguistic theory. That is, the role of a particular structural form is to allow certain licensing relations to hold between elements in the representation. Thus, content-based approaches are taken to be more “grammatically responsible” than form-based models.
This book discusses the syntax of sentential negation against the background of generative syntax; more specifically, the Principles and Parameters approach (cf. Chomsky 1981, 1986a, 1986b etc.). The conceptual framework adopted for the discussion is that usually referred to as Government and Binding Theory, GB theory for short (cf. Haegeman 1991, 1994a). I have also incorporated occasional references to more recent developments of the Principles and Parameters framework, such as Chomsky's Minimalist Program (1993) and Brody's Radical Minimalism (1993b). Some of Brody's proposals will be used extensively.
The first part of this chapter consists of an introduction to the main theoretical concepts used in the book. For reasons of space I cannot provide an exhaustive introduction to the theory. I have selected those modules of the grammar which will have primary importance for the discussion. I refer the reader to the literature for detailed discussion and motivation. The following areas will be discussed:
1.1 Syntactic structure is endocentric
1.2 Levels of representation
1.3 Word order variation
1.4 Perfect projections and Extended projections
1.5 Movement
1.6 Relativized Minimality
1.7 Movement at S-structure or at LF
Some of the concepts introduced in this chapter will be treated in more detail in later chapters.
Syntactic structure is endocentric
X-bar theory
One of the core principles of generative syntax is the idea that syntax is structure-determined. Clauses are hierarchically organized into constituents of various types, the phrases. At each level of the hierarchy the same principles determine the structure of a constituent.
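The endocentricity claim can be made concrete with the X-bar schema: every phrase XP is a projection of a head X, via an intermediate X′ level, and the same schema applies at every level of the hierarchy. The toy encoding below is my own illustration, not the book's formalism; the `xp` helper and the example phrases are assumptions introduced only to show that one head-driven schema builds every category of phrase:

```python
# A toy encoding of the endocentric X-bar schema:
#   XP -> specifier X'
#   X' -> X complement
# One function builds every phrase, because structure is head-determined.

def xp(head_category, head, specifier=None, complement=None):
    """Project a phrase XP from its head X, via the intermediate X' level."""
    x_bar = (head_category + "'", (head_category, head), complement)
    return (head_category + "P", specifier, x_bar)

# The same schema yields NP, VP, and so on:
np_tree = xp("N", "picture", specifier="the", complement="of Mary")
vp_tree = xp("V", "saw", complement=np_tree)

print(np_tree[0], vp_tree[0])  # NP VP
```

The design point is simply that the category of the whole phrase is read off its head; no phrase exists without a head of the matching category, which is the content of endocentricity.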
In this book I develop an analysis of the syntax of negation against the background of the generative tradition, more specifically the Principles and Parameters framework initiated by Noam Chomsky.
The linguistics literature is full of discussions of negation and has been so for a long time. Discussions have ranged from the morphological aspects of negation to the syntax, the semantics and the pragmatics. In this book I do not intend to provide an exhaustive discussion of all the aspects of negation which were, at one moment or another, prominent issues in the linguistics literature. This could not be the topic of one book, but would be the topic of a series. I concentrate on the syntactic aspects of negation, focusing almost exclusively on what is usually referred to as sentence negation, i.e. those examples where the negation marker has scope over, and thus gives negative value to, a whole sentence, as in the following English sentences: (i) I won't go there any more, (ii) No one said nothing, and (iii) He gave nothing to Mary.
I will not restrict the discussion to an analysis of aspects of the syntax of negation; rather I will try to bring out those aspects of the syntax of negation which are not specific to negative sentences as such, but which belong to the larger domain of the syntax of operators, with special attention to the parallelism between negative sentences and interrogative sentences.