We have argued throughout this book that the cognitive system underlying a person's language capacity has intrinsic properties which are there by biological endowment. Those properties interact with contingencies resulting from exposure to a particular linguistic environment, and the interaction yields a final state in which the person may communicate, perhaps some form of French. In that case, the person, Brigitte, will have incorporated from her environment the contingent lexical properties that livre is a word referring to the novel she is reading and co-occurs with forms like le and bon (being “masculine”), and that père may refer to her father. She has also incorporated contingent structural properties: interrogative phrases like quel livre may be displaced to utterance-initial position, verbs raise to a higher functional position, and so on. We have described ways in which linguists have teased apart the intrinsic properties common to the species and the contingent properties resulting from individual experience. That work has been guided by the kind of poverty-of-stimulus arguments that we have discussed, by theoretical notions of economy and elegance, and by the specific phenomena manifested by the mature grammar under investigation.
Viewing a person's language capacity in this way and focusing on what we have called I-language leads one to ask novel questions about children and their linguistic development. The perspective we have sketched has already led to productive research and we have learned a great deal about the linguistic minds of young children.
In this chapter and the two following ones, we turn from issues of syntactic organization in natural language to the systematicities of sound structure. There is a conventional division between phonetics, or the study of sounds in speech, and phonology, the study of sound patterns within particular languages. As we will see, there is a reasonably clear conceptual distinction here, and we will follow it in devoting most of this chapter and the next to the more obviously linguistic domain of phonology while postponing substantive discussion of the nature of phonetics until chapter 6, after some necessary preliminaries in section 4.1. We will attempt to tease apart these notions, but that process will reveal that questions of sound structure, seemingly concrete and physical in their nature, are actually abstract matters of cognitive organization – aspects of I-language and not measurable external events.
Ideally, we should broaden our scope a bit: signed languages also have a “phonology” (and a “phonetics”) despite the fact that this is not based on sound, although we cannot go into the implications of that within the scope of this book. In recent years, the study of signed languages has revealed that their systems of expression are governed by principles essentially homologous with those relevant to spoken language phonology and phonetics. This close parallelism reinforces the conclusion that we are dealing here with the structure of the mind, and not simply sound, the vocal tract, and the ears (or the hands and the eyes).
Our ability to speak and understand a natural language results from – and is made possible by – a richly structured and biologically determined capacity specific both to our species and to this domain. In this chapter we review arguments that show that the language faculty is a part of human biology, tied up with the architecture of the human brain, and distinct at least in significant part from other cognitive faculties. We also discuss some of the work that has tried to link the language organ with specific brain tissue and its activity.
Previous chapters have explored the structure of various components of our language organ, and some aspects of the course by which that structure arises. Some component of the mind must be devoted to language, and in its original state (determined by Universal Grammar (UG)), prior to any actual linguistic experience, it seems predisposed to infer certain quite specific sorts of system on the basis of limited and somewhat degenerate data. This is what we mean when we say that our language organ can be described by a grammar, and the shape of particular grammars is determined by the system of UG as this interprets the primary linguistic data available during the period of growth of the language organ.
Thus far, our description is largely an abstract or functional one: that is, it does not depend on the specific properties of the physical system that realizes it. For a parallel, consider the nature of multiplication.
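The parallel can be sketched in code (an illustration of ours, not the book's): multiplication is characterized by the abstract mapping it computes, not by any particular procedure that realizes it, and two physically quite different procedures can realize the same function.

```python
# Two different "realizations" of one abstract function: multiplication is
# defined by what it computes, not by the mechanism that computes it.

def mult_by_addition(a: int, b: int) -> int:
    """Multiply non-negative b by repeated addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

def mult_by_shifting(a: int, b: int) -> int:
    """Multiply non-negative b by binary shift-and-add,
    the way digital hardware typically does it."""
    total = 0
    while b:
        if b & 1:
            total += a
        a <<= 1
        b >>= 1
    return total

# Internally dissimilar procedures, one and the same abstract function.
assert mult_by_addition(6, 7) == mult_by_shifting(6, 7) == 42
```

In the same way, a functional description of the language organ does not depend on the specific neural machinery that implements it.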
If you ask a naive person-in-the-street – the kind of person the British call “the man on the Clapham omnibus” – what the central thing is that has to be learned in order to “know” a language, the chances are that a major part of the answer will be “the words of the language.” This notion that the words of a language are the essence of its identity is reinforced by standard language courses, which devote great attention to the systematic presentation of vocabulary. Indeed, much of what passes for “grammar” in many language courses is actually a subpart of the theory of words: what has to be learned about things like conjugation and inflection is first and foremost how to form inflected words. Compared with the effort usually devoted to drilling vocabulary and word formation, the amount of attention devoted to exemplifying the uses of the various forms and providing usage notes is usually quite limited, and the space given to fundamental matters of syntactic structure virtually none at all.
So if the set of words is such an important property of, say, English, how do we determine what that set is? A standard answer is provided by a dictionary (though that, of course, simply puts the problem off by one step: how did the dictionary makers know what to include?). Most speakers behave as if the question “Is [such and such] a word of English?” had a determinate answer.
In chapter 1, we saw how nineteenth-century linguists promoted the rise of linguistics as a distinct discipline, thinking of texts as the essential reality and taking languages to be entities “out there,” existing in their own right, waiting to be acquired by speakers. For them, languages were external objects and changed in systematic ways according to “laws” and general notions of directionality. They focused on the products of human behavior rather than on the internal processes that underlie the behavior, dealing with E-language rather than I-language. By the end of the nineteenth century, the data of linguistics consisted of an inventory of sound changes, but there were no general principles: the changes occurred for no good reason and tended in no particular direction. The historical approach had not brought a scientific, Newtonian-style analysis of language, of the kind that had been hoped for, and there was no predictability to the changes (see section 1.2). The historicist paradigm (the notion that there are principles of history to be discovered) was largely abandoned in the 1920s, because it was not getting anywhere.
In sections 8.3 and 8.4 we shall ask what kinds of accounts of language history we can give if we take a more contingent, I-language-based approach. Following our general theme, we shall shift away from a study of the products of behavior toward a study of the states and properties of the mind/brain that give rise to those products.
If you meet someone at a cocktail party and tell them you are a carpenter, or a veterinarian, or an astronomer, they are likely to be quite satisfied with that, and the subsequent evolution of the conversation will depend, at least in part, on the depth of their interest in woodworking, animals, or the universe. But if you tell them you are a linguist, this is unlikely to satisfy whatever curiosity they may have about you: “Oh, so how many languages can you speak?” is the most common reply at this point. But in fact, many – probably even most – linguists actually speak few if any languages in addition to their native tongue, in any practical sense. A “linguist,” at least in academic disciplinary terms, is not a person who speaks many languages, but rather someone concerned with the scientific study of language more generally.
That still doesn't settle matters, though. As we will discuss below, different generations of scholars have had rather different notions of what was important enough about language to warrant study. Languages have histories, and relationships with one another that at least superficially parallel genetic connections, and one can study those things. Most often, languages are spoken, and it is possible to study the anatomical, acoustic, and perceptual aspects of speech. Different spoken forms can mean different things, and we might study the kinds of things we can “mean” and the ways differences in the forms of words are related to differences in their meanings.
Our knowledge of a language is determined by the language organ we develop as a child on the basis of exposure to utterances in that language, and includes what we know about contrasts, relations, and regularities within the set of linguistic objects. Obviously, though, it also includes what we know about the objects themselves. The structure of that knowledge is described by a theory of representations of the various sorts of object that form parts of our language. Seeing the foundation of these representations as an aspect of our knowledge (an I-language point of view) has somewhat different consequences from seeing them as based purely on externally determined properties, part of E-language. There may be much formal similarity between the actual representations that result from these two differing perspectives, but the conceptual content is still quite distinct.
In this chapter, we address the nature of the representations that seem to be most obviously and irreducibly based on observable, physically measurable properties: phonetic representations. We argue that when phonetics is seen as genuinely part of language, rather than a subpart of physics or physiology, the resulting conception of “phonetic representation” (while still recognizable) differs in a number of important ways from what is often taught (or more accurately, assumed) in standard textbooks.
Representations and the study of sound structure
Most linguists assume, as we argued in chapter 4, that the principles of sound structure in a given language mediate between a phonological representation that indicates all and only the properties of an utterance in terms of which it contrasts with other utterances in that language, and a phonetic representation that provides a language-independent characterization of its pronunciation.
One of the great success stories of post-Second-World-War intellectual inquiry has been the extent to which linguists have been able to make the syntactic and phonological structure of natural language into a serious object of explicit formal study. This work has uncovered principles of surprising subtlety, abstractness, and deductive richness; it has also raised fundamental questions concerning the ontogenetic and phylogenetic developments by which knowledge of this kind could develop in the organism. Much of this progress results fairly directly from the adoption of an explicitly biological perspective on the subject: instead of seeing language as an external phenomenon, as a collection of sounds, words, texts, etc. that exists apart from any particular individual, contemporary linguistics increasingly concerns itself with the internal organization and ontogeny of a special kind of knowledge. The specific form that this aspect of human cognition takes appears, we will argue, to be a species-specific property of human beings, and thus rooted in our biological nature.
As our subtitle promises, we will describe linguistics, the scientific study of (human natural) language, as cognitive physiology. An individual's use of language involves that person's brain: the way this brain works depends at least in part on childhood influences and whether the person was raised in New Haven, New Delhi, or New Guinea. The relevant aspect of the brain's structure that responds to these differences in experience is the person's language organ, but to characterize it we need to take seriously the notion of physiology as the study of functions.
Before the development of generative grammar in the late 1950s, linguists focused almost entirely on the smallest units of language: sounds, minimal meaningful elements (“morphemes” like ed, ful, con – see chapter 7 below for more on this notion), and words, where the model of the Saussurian sign has most plausibility. “Syntax” was largely a promissory note to the effect that such sign-based analysis would eventually encompass the larger units of phrases, sentences, etc. Meanwhile, what went by that name was largely a kind of applied morphology: some instructions for what to do with the various kinds of words (inflected and otherwise).
For example, drawing from our bookshelves more or less at random, we find that Morris Jones' (1913) comprehensive grammar of Welsh is divided into two sections, phonology and accidence (inflectional properties), and has nothing under the rubric of syntax. Arthur MacDonnell's (1916) grammar of Vedic Sanskrit has two chapters on sounds, four chapters on inflections, and a final chapter entitled “Syntax”. There he has some observations about word order and agreement phenomena, and then a discussion of the uses of cases, tenses, and moods. He notes that the subjunctive mood has a fundamental sense of “will” and lists the uses of the subjunctive mood in main clauses, relative clauses, and with “relative conjunctions.”
In this chapter, we pursue an important source of evidence for the claim that human language has a specialized basis in human biology: the relation between what a speaker of a language can be said to “know” and the evidence that is available to serve as the basis of this knowledge. The apparently common-sense notion that an adult speaker's knowledge of his/her language arises by simple “learning,” that is, as a direct generalization of experience, turns out to pose a logical paradox. We begin with two brief examples that illustrate this point, and then explore the consequences of this for the mechanisms that must in fact underlie the development of language organs in normal human speakers.
We know more than we learn
A striking property of language acquisition is that children attain knowledge which, quite literally, infinitely surpasses their actual experience. On the basis of very limited experience, a productive system, a grammar, arises in each speaker which not only encompasses the actual facts to which they have been exposed, but also permits the production and comprehension of an unlimited range of novel utterances in the language. There must, therefore, be much more to language acquisition than mimicking what is heard in childhood; and there is more to it than the simple transmission of a set of words and sentences from one generation of speakers to the next.
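The point that a finite system can license an unbounded range of novel utterances can be made concrete with a toy sketch (the grammar rules are invented for illustration; nothing here is a claim about any actual language). Three rules suffice for an infinite language, because the rule for S can re-invoke itself through an embedded complement clause.

```python
import itertools

# A toy recursive grammar (hypothetical rules, for illustration only).
# "S" can recur inside "VP", so finitely many rules yield unboundedly
# many sentences.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Alice"], ["Bob"]],
    "VP": [["sleeps"], ["thinks", "that", "S"]],
}

def expand(symbol, depth):
    """Yield every word string derivable from `symbol` in at most
    `depth` nested rule applications."""
    if symbol not in GRAMMAR:
        yield [symbol]                      # a terminal word
        return
    if depth == 0:
        return                              # cut off the recursion
    for rule in GRAMMAR[symbol]:
        parts = [list(expand(sym, depth - 1)) for sym in rule]
        for combo in itertools.product(*parts):
            yield [word for part in combo for word in part]

shallow = list(expand("S", 4))
deeper = list(expand("S", 8))
# The same finite rules produce strictly more (novel) sentences as the
# depth bound grows; with no bound, the set is infinite.
assert len(deeper) > len(shallow)
```

A child's grammar is of course vastly richer than this sketch, but the structural moral is the same: what is acquired is a productive system, not a list of encountered sentences.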
In the previous chapter, we traced the path of linguists' interests in sound structure as these evolved from an E-language-based focus on representations alone to an I-language approach. Over time, it has come to be appreciated that knowledge of language includes not only (representational) questions of what speakers of a language know about the sound properties of its words, etc., but also the characterization of what they know about overall regularities that transcend particular items (see Anderson 1985). In the domain of sound structure, the description of these regularities originated in important respects from the study of what had been previously thought of as “morphophonemics” (see section 4.2.3 above). It inherited from that work a descriptive framework going back to one of the oldest grammatical traditions about which we have evidence, that of ancient Indian grammarians such as Pāṇini (c. 500 BC). In those terms, regularities are formulated as a system of rules, each of which performs some limited, local modification of a representation. Collectively, and in the context of a theory of the way they interact with one another, these rules describe a mapping between phonological representation and overt phonetic form.
Until relatively recently, linguists assumed that the description of a speaker's knowledge of overall regularities, of the general principles that are not part of any individual word or other linguistic form, was essentially equivalent to such a system of rules.
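The character of such a rule system can be sketched in a few lines of code (a minimal illustration of ours, with invented toy rules, not an analysis of any real language): each rule performs a limited, local modification, and their ordered composition maps an underlying phonological form onto a surface phonetic form.

```python
import re

# An ordered list of toy rewrite rules in the Paninian/SPE style.
# Rule 1: word-final obstruent devoicing (b,d,g -> p,t,k).
# Rule 2: a nasal assimilates to a following velar (n -> ŋ before k,g).
RULES = [
    (r"([bdg])$", lambda m: {"b": "p", "d": "t", "g": "k"}[m.group(1)]),
    (r"n(?=[kg])", "ŋ"),
]

def derive(underlying: str) -> str:
    """Apply each rule in order, deriving a surface form from an
    underlying form."""
    form = underlying
    for pattern, replacement in RULES:
        form = re.sub(pattern, replacement, form)
    return form

# e.g. derive("hund") -> "hunt"; derive("bank") -> "baŋk"
```

The substance of a real phonology lies in the content of the rules and in the theory of how they interact; the sketch shows only the overall architecture of a mapping built from ordered local changes.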
If a VP merges with an infinitival suffix instead of a finite Tense, the resulting infinitival phrase can assume the role of a complement or adjunct in a matrix clause. If the subject of the infinitive is controlled by an argument of the matrix predicate, it is represented by a caseless PRO. If, on the other hand, the matrix predicate has no argument, and therefore cannot provide an adequate controller, the infinitive has a case-marked subject represented by a lexical noun phrase or a pro(noun), and it also bears an agreement marker. Infinitival phrases – whether agreeing or non-agreeing – can merge with the same types of operators that can extend a finite VP into a predicate phrase, and they can also combine with topic phrases into a TopP. Or, alternatively, both types of infinitives can be unified with their matrix V into a complex predicate.
Subject and object control constructions
Subject and object control verbs
A number of Hungarian verbs, among them those listed under (1a, b), are marked in the lexicon as selecting an infinitival phrase with a phonologically empty PRO subject controlled by the matrix subject.
In describing the complexity of creating a corpus, Leech (1998: xvii) remarks that “a great deal of spadework has to be done before the research results [of a corpus analysis] can be harvested.” Creating a corpus, he comments, “always takes twice as much time, and sometimes ten times as much effort” because of all the work that is involved in designing a corpus, collecting texts, and annotating them. And then, after a given period of time, Leech (1998: xviii) continues, the corpus becomes “out of date,” requiring the corpus creator “to discard the concept of a static corpus of a given length, and to continue to collect and store corpus data indefinitely into the future …” Analyzing a corpus may be easier than creating one, as Leech (1998) describes above, but still, many analyses have to be done manually, simply because we do not have the technology that can extract complex linguistic structures from corpora, no matter how extensively they are annotated. The challenge in corpus linguistics, then, is to make it easier both to create and to analyze a corpus. What is the likelihood that this will happen?
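The division of labor between automatic and manual analysis can be illustrated with a small sketch (the tagged mini-corpus and tag labels below are invented for illustration): given part-of-speech annotation, shallow patterns such as adjective–noun sequences can be extracted mechanically, whereas more complex structures still resist automatic extraction.

```python
# A toy part-of-speech-tagged "corpus" (invented data, not from any
# real corpus).
tagged = [
    ("the", "DET"), ("old", "ADJ"), ("corpus", "NOUN"),
    ("needs", "VERB"), ("new", "ADJ"), ("texts", "NOUN"),
]

def adj_noun_pairs(tokens):
    """Return every adjective immediately followed by a noun."""
    return [
        (w1, w2)
        for (w1, t1), (w2, t2) in zip(tokens, tokens[1:])
        if t1 == "ADJ" and t2 == "NOUN"
    ]

pairs = adj_noun_pairs(tagged)
assert pairs == [("old", "corpus"), ("new", "texts")]
```

Queries of this local, surface-oriented kind are what current annotation makes cheap; identifying, say, all control constructions or long-distance dependencies is a different matter, which is why so much corpus analysis remains manual.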
Planning a corpus. As more and more corpora have been created, we have gained considerable knowledge of how to construct a corpus that is balanced and representative and that will yield reliable grammatical information.