From Hand to Mouth: The Origins of Language
by Michael C. Corballis

Paperback (Revised ed.)

$39.00 

Overview

A groundbreaking theory of how language arose from primate gestures

It is often said that speech is what distinguishes us from other animals. But are we all talk? What if language was bequeathed to us not by word of mouth, but as a hand-me-down?

The notion that language evolved not from animal cries but from manual and facial gestures—that, for most of human history, actions have spoken louder than words—has been around since Condillac. But never before has anyone developed a full-fledged theory of how, why, and with what effects language evolved from a gestural system to the spoken word. Marshaling far-flung evidence from anthropology, animal behavior, neurology, molecular biology, anatomy, linguistics, and evolutionary psychology, Michael Corballis makes the case that language developed, with the emergence of Homo sapiens, from primate gestures to a true signed language, complete with grammar and syntax and at best punctuated with grunts and other vocalizations. While vocal utterance played an increasingly important complementary role, autonomous speech did not appear until about 50,000 years ago—much later than generally believed.

Bringing in significant new evidence to bolster what has been a minority view, Corballis goes beyond earlier supporters of a gestural theory by suggesting why speech eventually (but not completely!) supplanted gesture. He then uses this milestone to account for the artistic explosion and demographic triumph of the particular group of Homo sapiens from whom we are descended. And he asserts that speech, like written language, was a cultural invention and not a biological fait accompli.

Writing with wit and eloquence, Corballis makes nimble reference to literature, mythology, natural history, sports, and contemporary politics as he explains in fascinating detail what we now know about such varied subjects as early hominid evolution, modern signed languages, and the causes of left-handedness. From Hand to Mouth will have scholars and laymen alike talking—and sometimes gesturing—for years to come.


Product Details

ISBN-13: 9780691116730
Publisher: Princeton University Press
Publication date: 10/05/2003
Edition description: Revised ed.
Pages: 272
Product dimensions: 6.12(w) x 9.25(h) x (d)

About the Author

Michael C. Corballis (1936–2021) was professor emeritus of psychology at the University of Auckland. His books include The Recursive Mind: The Origins of Human Language, Thought, and Civilization (Princeton) and A Very Short Tour of the Mind: 21 Short Walks around the Human Brain.

Read an Excerpt

COPYRIGHT NOTICE: Published by Princeton University Press and copyrighted, © 2002, by Princeton University Press. All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher, except for reading and browsing via the World Wide Web. Users are not permitted to mount this file on any network servers.

Chapter 1

WHAT IS LANGUAGE?

I am beguiled by the frivolous thought that we are descended, not from apes, but from birds. We humans have long sought features that are unique to our own species, with an especially keen eye for those that show us to be superior to others. Many special qualities differentiating us from our ape cousins have been proposed, but often, disconcertingly, these are found in our feathered friends as well. Like us, birds get around on two legs rather than four, at least when they're not flying (and some of them can't). Parrots, at least, have a consistent preference for picking things up with one foot, although in a mocking reversal of human handedness most of them prefer to use the left foot (most humans are right-handed and right-footed). Some birds prudently store food for the winter, and there is evidence that some of them can remember not only where they store food but also when they stored it, suggesting a kind of memory—known as episodic memory—that has been claimed as unique to our own species.1 Birds make tools. They fly, albeit without purchasing airline tickets. They sing. And some of them talk.

Perhaps it is the last point that is the most interesting. Most birds far outperform mammals, including our immediate primate ancestors, in the variety and flexibility of the vocal sounds they make, and one can see (or hear) some striking parallels with human speech. The vocalizations of songbirds are complex and, like human speech, are controlled primarily by the left side of the brain.2 Although birdsong is largely instinctive, birds can learn different dialects, and some of them can even learn arbitrary sequences of notes. In order to learn a particular song, the bird must hear it early on, while it is still in the nest, even though it does not produce the song until later. This crucial window of time is known as a critical period. The way people learn to speak also seems to depend on a critical period; that is, it seems to be impossible to learn to speak properly if we are not exposed to speech during childhood, and a second language learned after puberty is almost inevitably afflicted with a telltale accent. Some birds, like the parrot, can outdo humans in their ability to adapt their vocalizations, and not just by imitating human speech. The Australian lyre bird is said to be able to produce a near perfect imitation of the sound of a beer can being opened—which is perhaps the most commonly heard sound where humans congregate in that country.3

But of course birdsong differs in lots of ways from human speech. The ability of birds to imitate sounds probably has to do with the recognition of kin and the establishment and maintenance of territory but has nothing to do with conversation. Birds sing characteristic songs for much the same reason that nations of people fly characteristic flags or play national anthems. The remarkable ability of species like the mockingbird to imitate the songs of other birds has no doubt evolved also as a deceptive device to give the illusion of a territory filled with other birds, so that they may occupy that territory for themselves.4

Among most species of songbirds, it is only the males that vocalize, whereas women are said to be the more verbal members of our own species; we strong, silent chaps don't seem to have much to say. The vocalizations of birds, and indeed of other species, are mostly emotional, serving to signal aggression, to warn of danger, to advertise their sexual prowess, or to establish and maintain hierarchical social structures. Some of our own vocalizations serve similar, largely emotional ends. We laugh, grunt, weep, shriek with fear, howl with rage, cry out in warning. But these noises, although important means of communication, are not language, as I explain below.

In any event, it would of course be irresponsible of me to claim any real kinship between humans and birds. There is a remote sense in which we are related to them, but to find the common ancestor of birds and humans we would have to go back some 250 million years (and it couldn't fly), while the common ancestor of ourselves and the chimpanzees existed a mere 5 or 6 million years ago. I am therefore compelled to adopt the more conventional, down-to-earth view that our descent was not from the creatures of the sky but from the more restricted arboreal heights of our primate forebears. Those seductive parallels between characteristics we fondly imagine to be unique to ourselves and their taunting counterparts in birds are most likely the results of what is known as convergent evolution—independent adaptations to common environmental challenges—rather than features that were handed down from that 250-million-year-old common ancestor. But if there is any one characteristic that distinguishes us from birds, and probably from any other nonhuman creature, it is indeed that extraordinary accomplishment that we call language.

The specialness of language

Unlike birds, people use language, not just to signal emotional states or territorial claims, but to shape each other's minds. Language is an exquisitely engineered device for describing places, people, other objects, events, and even thoughts and emotions. We use it to give directions, to recount the past and anticipate the future, to tell imaginary stories, to flatter and deceive. We gossip, which is a useful way to convey information about other people. We use language to create vicarious experiences in others. By sharing our experiences, we can make learning more efficient, and often less dangerous. It is better to tell your children not to play in traffic than to let them discover for themselves what can happen if they do.

Even birdsong, for all its complexity, is largely stereotyped, more like human laughter than human discourse. Give or take a few notes, the song of any individual bird is repetitive to the point of monotony. Human talk, by contrast, is possessed of a virtually infinite variety, except perhaps in the case of politicians. The sheer inventiveness of human language is well illustrated in an anecdote involving the behavioral psychologist B. F. Skinner and the eminent philosopher A. N. Whitehead. On an occasion in 1934, Skinner found himself seated at dinner next to Whitehead and proceeded to explain to him the behaviorist approach to psychology. Feeling obliged to offer a challenge, Whitehead uttered the following sentence: "No black scorpion is falling upon this table," and then asked Skinner to explain why he might have said that. It was more than twenty years before Skinner attempted a reply, in an appendix to his 1957 book Verbal Behavior. Skinner proposed that Whitehead was unconsciously expressing a fear of behaviorism, likening it to a black scorpion that he would not allow to intrude into his philosophy. (The skeptical reader might be forgiven for concluding that this reply owed more to psychoanalysis than to behaviorism.)

Be that as it may, Whitehead had articulated one of the properties of language that seem to distinguish it from all other forms of communication: its generativity. While all other forms of communication among animals seem to be limited to a relatively small number of signals, restricted to limited contexts, there is essentially no limit to the number of ideas, or propositions, that we can convey using sentences. We can immediately understand sentences composed of words that we have never heard in combination before, as Whitehead's sentence illustrates.

Here is another example. A few years ago I visited a publishing house in England and was greeted at the door by the manager, whose first words were: "We have a bit of a crisis. Ribena is trickling down the chandelier." I had never heard this sentence before but knew at once what it meant, and was soon able to confirm that it was true. For those who don't know, ribena is a red fruit drink that some people inflict on their children, and my first sinister thought was that the substance dripping from the chandelier was blood. It turned out that the room above was a crèche, and one of the children had evidently decided that it would be more fun to pour her drink onto the floor than into her mouth.

This example illustrates that language is not just a matter of learning associations between words. I had never in my life encountered the words ribena and chandelier in the same sentence, or even in the remotest association with each other, yet I was immediately able to understand a sentence linking them. Rather than depending on previously learned associations, language allows us to connect concepts that are already established in the mind. It operates through the use of rules, known collectively as grammar. I hasten to assure the nervous reader that grammar does not refer to the prescriptive rules that some of us struggled with in school but rather to a set of largely unconscious rules that govern all natural forms of human speech, including street slang. In this sense, there ain't no such thing as bad grammar, and it don't really matter what your teacher tried to teach you. Even so, I need to torment you with a short grammar lesson.

A grammar lesson

Something of the way in which grammar operates to create an endless variety of possibilities is illustrated by a familiar childhood story, in which each sentence is built from the previous one:

This is the house that Jack built.
This is the malt that lay in the house that Jack built.
This is the rat that ate the malt that lay in the house that Jack built.
This is the cat that killed the rat that ate the malt that lay in the house that Jack built.

. . . and so on, potentially forever, although limited in practice by constraints on short-term memory. In these examples, the phrases qualifying each character in the story are simply added: the cat that killed the rat, the rat that ate the malt, the malt that lay in the house, the house that Jack built. But qualifying phrases can also be embedded, like this:

The malt, which was eaten by the rat that was killed by the cat, lay in the house that Jack built.

And phrases can be embedded in phrases that are themselves embedded, although too much embedding can create a kind of linguistic indigestion that makes a sentence hard to swallow, as in the following:

The malt that the rat that the cat killed ate lay in the house that Jack built.

This ability to tack clauses onto clauses, or embed clauses within clauses, is known as recursion. Mathematically, a recursion formula is a formula for calculating the next term of a sequence from one or more of the preceding terms. Clauses like that ate the malt and that killed the rat are relative clauses, and a simple formula dictates that a relative clause can be defined (or "rewritten") as a relative clause plus an (optional) relative clause! This formula allows relative clauses to be strung together indefinitely, as in "The House That Jack Built." Grammar is often expressed in terms of rewrite rules, in which phrases are "rewritten" as words and other phrases, and it is this rewriting of phrases as combinations involving phrases that gives grammar its recursive property (see figure 1.1). Perhaps the most minimal example of recursion in literature is that penned by the American writer Gertrude Stein in her poem Sacred Emily:

A rose is a rose is a rose is a rose, is a rose.

This isn't quite as simple, perhaps, as it looks at first glance—note that cunningly placed comma.

It is also clear that rules rule, and not just associations. We may learn poems or everyday expressions by heart, simply associating the words together, but when we generate new sentences we do not rely on past associations between words. In the last of the above sentences about the house that Jack built, the words malt and lay are associated in the meaning of the sentence but are separated by eight other words—and of course even more words could have been inserted, had we chosen, for example, to mention that the rat was fat and the cat lazy. Yet the speaker and the listener both understand that the malt did not kill or eat, but in fact lay in the house that Jack built, at least until greedily devoured by the rat. Our ability to construct and understand sentences depends on a remarkable skill in the use of rules. Even more remarkably, perhaps, we apply these rules without being aware of them, and even linguists are not agreed as to what all the rules are and precisely how they work.
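
For readers who like to see this machinery made explicit, the rewrite-rule idea can be sketched in a few lines of Python. The toy grammar below is only my illustration of the principle, not a formalism taken from this book: a noun phrase may be rewritten as a noun plus a relative clause, and the relative clause contains another noun phrase, so sentences in the style of "The House That Jack Built" can be spun out indefinitely.

import random

# A toy recursive rewrite grammar. NP may rewrite to a noun plus a relative
# clause (REL), and REL rewrites to a verb plus another NP, so expansion can
# continue without limit; the empty REL option is what lets it stop.
GRAMMAR = {
    "S":   [["this is", "NP"]],
    "NP":  [["the malt", "REL"], ["the rat", "REL"], ["the cat", "REL"],
            ["the house that Jack built"]],
    "REL": [[], ["that ate", "NP"], ["that killed", "NP"], ["that lay in", "NP"]],
}

def expand(symbol):
    """Recursively rewrite a symbol until only words remain."""
    if symbol not in GRAMMAR:                # a terminal: already plain words
        return symbol
    rule = random.choice(GRAMMAR[symbol])    # pick one rewrite option
    return " ".join(expand(s) for s in rule).strip()

for _ in range(3):
    print(expand("S") + ".")
# e.g. "this is the cat that killed the rat that ate the malt that lay in
# the house that Jack built."

Nothing in this little program stores whole sentences or associations between particular words; the endless variety comes from applying the same few rules over and over.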

Linguists also like to draw a clear distinction between grammar and meaning. We can understand sentences to be grammatically regular even if they have no meaning, as in the sentence Colorless green ideas sleep furiously, constructed by the most eminent linguist of our time, Noam Chomsky. Indeed, we can recognize a sentence as grammatical even if the words don't have any meaning at all, as in Lewis Carroll's "Jabberwocky":

'Twas brillig and the slithy toves
Did gyre and gimble in the wabe.
All mimsy were the borogoves
And the mome raths outgrabe.

But note that some of the words (was, and, the, etc.) are regular English words. These words are called function words, as distinct from the content words that refer to objects, actions, or qualities in the world. Suppose we insert nonsense words in the place of the function words:

G'wib brillig pog dup slithy toves
Kom gyre pog gimble ak dup wabe.
Utt mimsy toke dup borogoves
Pog dup mome raths outgrabe.

Now we have no idea whether this is grammatical or not. This illustrates that function words play a critical role in grammar, providing a kind of scaffold on which to build sentences. Function words include articles (a, the, this, etc.), conjunctions (and, but, while, etc.), prepositions (at, to, by, etc.), pronouns (I, you, they, it, etc.), and a few other little things. Content words, by contrast, are easily replaceable, and as speakers we are always receptive to new words that we can easily slot into sentences. We live in a world of rapid invention, and new words, like geek and dramedy (a drama that doesn't know whether it's funny or not), are coined every day. Another word I encountered recently is pracademic, referring to the rare academic who is blessed with practical skills.

Of course, different languages have somewhat different rules, and no one claims that each language has its own set of innate rules. Forming a question in Chinese is not the same as forming a question in English. One important way in which languages differ has to do with the relative importance of word order and what is known as inflection. If you studied Latin, you will know that there are many different forms of a noun or a verb, depending on its role in a sentence, and it is these different forms that are known as inflections. In English there are just two forms of the noun, one for the singular and one for the plural (e.g., table and tables). The Latin word mensa means table, but takes several different forms. If it is a direct object (as in I overturned the table), it must be rendered as mensam, and in the plural (tables) the equivalent forms are mensae and mensas. Again, the English phrase of tables is rendered in Latin as mensarum.

The contrast between English and Latin is much more extreme for verbs. There are just four forms of the regular verb in English (e.g., love, loves, loved, loving). In Latin there are dozens, as many a struggling schoolchild knows (or once knew). Just to take the present tense, we have

amo   I love
amas   you (singular) love
amat   he/she/it loves
amamus   we love
amatis   you (plural) love
amant   they love

And that's just the beginning of love. There are different forms for the future and past tenses, as well as more complex tenses, like the future perfect (she will have loved), the subjunctive, the conditional, and goodness knows what else. In Latin nearly all of this is accomplished by inflecting a basic stem, whereas in English we make much more use of function words (e.g., they might have loved, she would have been going to love). In some languages, there are even more variations. For example, Turkish is so highly inflected that there are said to be over two million forms of each verb! The different forms not only reflect the subject of the verb (I, you, she, etc.) but also the direct and indirect objects, and a lot else besides.

English is highly dependent on how we order the words. Man swallows whale has a rather different meaning from Whale swallows man, and is arguably more interesting.5 But in Latin the subject and object of a sentence are signaled by different inflections, and the words can be reordered without losing the meaning. The Australian aboriginal language Walpiri is a more extreme example of an inflected language in which word order makes essentially no difference; such languages are sometimes called scrambling languages. Chinese, by contrast, is an example of an isolating language, in which words are not inflected and different meanings are created by adding words or altering word order. English is closer to being an isolating language than a scrambling one.

Given the different ways in which different languages work, it might seem that no set of rules could apply to all of them. Chomsky has nevertheless argued that certain deeper rules are common to all languages. He refers to these rules as universal grammar. One way to conceptualize this is in terms of principles and parameters. In this view, the universal rules are the principles, and the particular forms they take are parameters that change from one language to another. Although some progress has been made toward identifying universal principles, linguists by no means agree as to what they are, or even as to whether language can be fully understood in this way.

This has not been a complete grammar lesson, but I hope to have illustrated the complexity of grammar and to have demonstrated that grammar operates according to rules rather than simple learned associations. It's true that we do learn some things, like poems, songs, prayers, or clichés, by rote, but this does not explain our extraordinary capacity to generate new sentences to express new thoughts, or to understand sentences—like Ribena is trickling down the chandelier—that we have never heard before. It is grammar, then, that gives language its property of generativity and distinguishes it from all other forms of animal communication. So far as we know, there is nothing remotely resembling grammar in any of the communication systems of other species: no function words, no recursion, no tenses; indeed, no sentences. This is not to say that nothing in the communication or actions of other animals bears on human language, but it is clear that the gap between human and animal communication is very wide indeed, and is one of the greatest challenges confronting psychological science.

How is language learned?

According to Chomsky, language is too complex to be learned by observation of its regularities. That is, no purely inductive device could possibly extract the rules of language simply by examining or analyzing examples of sentences. Therefore, children must possess some innate knowledge of language that enables them to acquire it, or what Steven Pinker called "the language instinct."6 In other words, they are born with a knowledge of universal grammar and simply adapt—or "parameterize"—this innate knowledge to conform to the specific language or languages they acquire.

This rather controversial notion at least captures one important truth about language: children of any race and culture can learn any language, which implies that language does have a universal property. Eskimo children brought up in France will speak French, and visitors to London are often surprised to hear people of African descent speaking English with Cockney accents. Under normal circumstances, all human children learn language, and the languages they learn are the languages they are exposed to in childhood. Of course we can learn languages as adults, but only with considerable effort, and it is probably impossible to do so if we have not learned another language in childhood. Another argument in favor of some kind of universal foundation for language is that all languages have the same kind of units, such as nouns, verbs, adjectives, function words, phrases, and sentences. It is also important to understand that the different languages of the world differ very little, if at all, in grammatical complexity. Grammatically speaking, no language is more "primitive" than any other, unless we include languages that are not yet properly formed, like baby talk or pidgin languages improvised by adults to communicate across linguistic boundaries. The grammatical complexity that different languages share is at least consistent with the idea of a common, universal grammar.

But although the acquisition of language is universally human, the fact that languages differ, typically to the point of mutual incomprehension, means that there is of course a learned component. In speech, the actual words we use are arbitrary and must be learned by rote. As we have seen, the rules also vary and depend on experience with the language, although the learning of rules may be more a question of selecting among preexisting alternatives than of rote learning. And although all languages are about equally complex in grammatical terms, they do, of course, vary in terms of the number of words they employ. In this respect, English is easily the most bountiful language in the world, in part because it has borrowed vocabulary from a good many other languages, and in part because it has become the main language of science and technology and so must absorb large numbers of new words for different inventions and concepts. This is not to say that English has a monopoly on concepts: some words in other languages express ideas or concepts that have no exact equivalents in English. Not all cultures think alike.

But is it really true that we could not learn language unless it possessed some universal grammatical structure, innately known to us? Chomsky's argument is essentially based on the idea that language is impossible to learn from the body of evidence available, and that there must be some predetermined structure that guides the discovery of grammatical rules. Consider, for example, how we turn a declarative sentence into a question:

The brigadier and his wife are coming to dinner tonight.

becomes

Are the brigadier and his wife coming to dinner tonight?

Here, the rule seems simple: you simply scan the sentence for the word are and move it to the beginning. But suppose we apply this rule to a slightly more complex sentence, like

The brigadier and his wife who are visiting the city are coming to dinner tonight.

This produces the anomalous sentence

*Are the brigadier and his wife who visiting the city are coming to dinner tonight?7

Children virtually never make mistakes like this8 but seem to understand that it is the second are, not the first, that must be moved to create the question:

Are the brigadier and his wife who are visiting the city coming to dinner tonight?

That is, children seem to instinctively understand the phrase structure of the sentence and so skip over the embedded phrase who are visiting the city when making the transformation.
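
The pull of the purely linear rule is easy to reproduce. Here is a small Python sketch, my own illustration rather than anything from the book, that mechanically moves the first are to the front of the sentence; it handles the simple case but duly produces the anomalous question as soon as a relative clause intervenes.

def naive_question(sentence):
    """Apply the linear rule: find the first 'are' and move it to the front.
    Capitalization is ignored for simplicity."""
    words = sentence.rstrip(".").split()
    i = words.index("are")                   # the first 'are', wherever it sits
    rest = words[:i] + words[i + 1:]
    return "Are " + " ".join(rest) + "?"

print(naive_question("The brigadier and his wife are coming to dinner tonight."))
# Are The brigadier and his wife coming to dinner tonight?   (acceptable)

print(naive_question("The brigadier and his wife who are visiting the city "
                     "are coming to dinner tonight."))
# Are The brigadier and his wife who visiting the city are coming to dinner tonight?   (anomalous)

A rule that respects phrase structure has to skip over the embedded clause who are visiting the city, which is just what children appear to do without ever being taught.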

On the face of it, this argument seems compelling, but to conclude that it is impossible, without some built-in structure, to learn the rules and phrase structure of grammar may be premature. They once said it was impossible to climb Mount Everest.9 And it has recently been suggested that language learning may not be so special after all. Since about the mid-1980s researchers have increasingly challenged the idea that the mind is a computational device, operating according to rules, and have suggested instead that it is after all merely a sophisticated associative device. It is the brain that creates the mind, and the brain does seem to work by means of elements, called neurons, that connect in associative fashion. It is neurons that convey information from the sense organs to the brain, and from the brain to various output devices. The traffic is not all one-way, since there are feedback processes, and circuits in which the firing of neurons is arranged in recurrent loops. Further, we have good evidence that the connections between neurons, known as synapses, can be modified by experience, and it is this modification that forms the basis of learning and memory.

Many investigators have tried to create artificial networks that mimic the properties of the human mind, and one of the challenges has been to create networks that demonstrate some of the apparently rule-governed properties of language. For example, Jeff Elman has devised a network with recurrent loops that can apparently learn something resembling grammar. Given a partial sequence of symbols, analogous to a partial sentence, the network can learn to predict events that would follow according to rules of grammar. In a very limited way, then, the network "learns" the rules of grammar. An important aspect of Elman's work is that he makes no attempt to teach the network the rules of language themselves. During training, when the network predicts the next word in a sequence, this is compared to the actual next word, and the network is then modified so as to reduce the discrepancy between them. That is, the network apparently learns to obey the rules without "knowing" what they are: no one programs it to obey them, nor has it been hard-wired to do so.
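
To make the idea concrete, here is a minimal sketch in Python of an Elman-style simple recurrent network. The tiny corpus, the layer sizes, and the training details are my own stand-ins rather than Elman's actual simulations; the point is only the architecture, in which the hidden layer receives the current word together with a copy of its own previous state, and the only training signal is the error in predicting the next word.

import numpy as np

rng = np.random.default_rng(0)
corpus = "the cat that the dog chased ran away . the dog ran away .".split()
vocab = sorted(set(corpus))
V, H = len(vocab), 20
idx = {w: i for i, w in enumerate(vocab)}

Wxh = rng.normal(0, 0.1, (H, V))    # current word -> hidden layer
Whh = rng.normal(0, 0.1, (H, H))    # copied context -> hidden layer
Why = rng.normal(0, 0.1, (V, H))    # hidden layer -> next-word scores

def one_hot(word):
    v = np.zeros(V)
    v[idx[word]] = 1.0
    return v

lr = 0.1
for _ in range(300):
    h = np.zeros(H)                                  # context starts out empty
    for cur, nxt in zip(corpus[:-1], corpus[1:]):
        x, target = one_hot(cur), one_hot(nxt)
        h_new = np.tanh(Wxh @ x + Whh @ h)           # context treated as plain input
        scores = Why @ h_new
        p = np.exp(scores - scores.max())
        p /= p.sum()                                 # predicted next-word probabilities
        err = p - target                             # softmax cross-entropy gradient
        dh = (Why.T @ err) * (1 - h_new ** 2)
        Why -= lr * np.outer(err, h_new)
        Wxh -= lr * np.outer(dh, x)
        Whh -= lr * np.outer(dh, h)
        h = h_new                                    # copy hidden state into context

h = np.tanh(Wxh @ one_hot("the") + Whh @ np.zeros(H))
scores = Why @ h
p = np.exp(scores - scores.max())
p /= p.sum()
print("after 'the', the network's likeliest next word is:", vocab[int(p.argmax())])

Nothing here is told what a noun or a verb is, and no rule is programmed in; whatever regularities the network picks up come entirely from its repeated attempts to predict the next word.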

At first, as one might expect from Chomsky's arguments, the network was not able to handle the recursive aspects of grammar, in which phrases are embedded in other phrases, so that words that go together may be separated by several other words. However, this problem was at least partially surmounted when Elman introduced a "growth" factor. Early on, the system was degraded so that only global aspects of the input were processed, but the "noise" in the system was gradually decreased so that it was able to process more and more detail. When this was done, the system was able to pick up some of the recursive quality of grammar, and so begin to approximate the processing of true language. Again, no rules were explicitly taught or built into the system.

Part of the problem in learning grammar has to do with its hierarchical structure. Some of the rules involve the embedding and movement of entire phrases; others, the placement and inflection of individual words; and still others, the component parts of words. The suggestion that arises from Elman's work is that this problem is solved by introducing a growth factor into the network itself, so that it initially processes only global properties of the input but gradually focuses more and more on the details. The developmental psychologist Elissa Newport has characterized this as a "less is more" principle; the reason that children learn language so easily is that they process the information crudely at first, and then gradually acquire detail. Far from being linguistic geniuses, as Steven Pinker has claimed, young children succeed precisely because their learning is diffuse and ill-formed. It is a bit like gradually focusing a telescope; only blurred outlines are visible at first, and then the details gradually emerge.

These ideas, elaborated in the book Rethinking Innateness by Elman, Elizabeth Bates, and their colleagues, constitute a significant challenge to the idea that humans possess a specific grammar gene, or that language requires some special "language acquisition device."10 Instead, our unique capacity for language may depend simply on evolutionary alterations to the growth pattern, in which the period of postnatal growth became relatively longer than it is in other primates, the brain grew to a larger size relative to the body, and the relative sizes of different parts of the brain shifted—but more of this later. To be sure, the specific pattern of changes is uniquely human and involves genetic modifications, but they are the sorts of modifications that have altered the basic body plan of animals throughout biological evolution.

If Elman and his colleagues are correct in assuming that grammar can be acquired by means of an associative device that includes a growth component, this does not mean that language does not follow rules. As we saw earlier, language is exquisitely rule-governed, and linguists like Chomsky have done a great deal to show us the nature of the rules. The point is that rule-governed behavior need not require that the rules be pre-programmed into the system, or even represented explicitly in the network. We do not know most of the rules that govern our language in any sense other than that we follow them when we speak. The rules themselves are not associative, but it is possible that they can be learned by an associative device.11

That said, it must be recognized that human language is highly complex, and Elman's relatively simple demonstrations do not really come close to capturing many of the niceties of grammar and meaning. Predicting the next word in a sentence is a long way off from actually comprehending a sentence, or producing one. And to be sure, there is something a bit zombielike about a network that responds in a languagelike way but has no built-in rules and no apparent understanding of what it is "saying." But Elman's work does go some distance toward demystifying language and bringing it into the realm of biology, from which it is always in danger of escaping. Clearly it will take a lot more research to convince most linguists that the secret of language learning lies in patterns of growth rather than in special-purpose grammar genes, but that's what the new millennium is for.

Language cannot be entirely dependent on genes, though, because it is heavily influenced by culture. Indeed, we are virtually helpless in a culture where a different language is spoken—unless we resort to gesture, but that's a story for later. One might almost be tempted to believe that language is a mechanism for preserving cultural integrity and keeping foreigners out! Many human characteristics clearly depend not on the genetic code but on the culture we happen to be part of. Richard Dawkins has dubbed these culturally determined characteristics "memes."12 They include stories, songs, beliefs, inventions, political systems, cuisine—indeed, virtually all of the things we think of as part of culture.

But could language itself be a meme? In some respects, it is. The actual words we use are passed on by the culture we live in, as are accents, catch phrases, and other superficial aspects of language. But language cannot be purely cultural. Whether or not there are "grammar genes," as Pinker maintains, there is no evidence that other species can learn anything resembling true grammatical language, as we shall see in chapter 2. Moreover, memes depend fundamentally on our capacity to imitate, which itself is something that humans excel at. As we shall see in the following chapters, even our closest relatives, chimpanzees and bonobos, are relatively poor at imitating. And if that were not enough, true language goes beyond imitation. As I have tried to explain, language is relentlessly generative, allowing us to convey novel thoughts, as I hope I am managing to do in this book.

Another argument for an innate component underlying grammatical language comes from a phenomenon called creolization. In the days of colonial expansion, the European traders and colonizers communicated with indigenous peoples in a makeshift form of language called pidgin. Pidgin has virtually no grammar—no tenses, no articles like a or the—but is adequate for the exchange of simple information, as in trading or bartering. Pidgins can become quite complex, but complexity is achieved by stringing words together in associative fashion rather than by the more economical use of syntax. In Solomon Islands pidgin, Prince Charles is known as pikinini belong Missus Kwin, and Princess Diana was known as Meri belong pikinini belong Missus Kwin, until her divorce, when her title was upgraded to this fella Meri he Meri belong pikinini belong Missus Kwin him go finish.13

Research in Hawaii has shown that in the course of a generation a pidgin language can be converted into a more sophisticated language, known as a creole. Unlike pidgin, a creole does have fully fledged grammar. And it came out of the brains of babes and sucklings, as it were: all it took was for the children of the next generation to be exposed to pidgin at an early age. With no parental help, the children constructed the grammar, presumably because of the intricate grammatical machinery that was already wired into their brains!14

Language, speech, and thought

Language is not just speech. We can, of course, read silently and think in silent words. More critically, the signed languages invented by deaf people all over the world have all the generativity of language and are governed by grammar, yet have no basis in sound. They consist entirely of bodily gestures, mostly of the hands, arms, and face. Signed language has all the essential properties of spoken language, including grammar. I shall examine signed language in much more detail in chapter 6, since it provides one of the foundations for the main theme of this book, which is that even spoken language may have its origins in the silent gestures of our distant forebears.

Language, therefore, runs deeper than speech. Is it the same as thought? It is sometimes suggested that thinking is simply internal speech, and sometimes it is, but not always. There are ways of thinking that owe little to language. For example, we can imagine objects or scenes and manipulate them in our minds. A much studied example is mental rotation, which involves imagining how objects look if rotated into different orientations. Look at this picture of an upside-down man, holding out an arm (figure 1.2). Which arm is he holding out—the left or the right? To answer this question, you may find that you have to mentally rotate the man to the upright position, and perhaps turn him around—processes that have nothing to do with words.

Nonverbal thinking depends on our ability to represent objects, sounds, and actions in our minds and to manipulate them mentally. Besides rotating things, we can replay tunes in our minds or replay a passing shot in tennis or a goal scored in soccer and imagine how we might do these things on some future occasion. Such is the stuff of imagination and fantasy, and words or signs need be no part of it. We use nonverbal thinking to solve problems, and it is likely that our most creative thoughts are nonverbal, and often spatial, rather than linguistic. Albert Einstein, for example, is said to have worked out the theory of relativity by imagining himself traveling on a beam of light. There is no reason to doubt that even the great apes have the capability of forming mental representations of objects and manipulating them. For example, Wolfgang Köhler, in a classic series of experiments, showed that chimpanzees could solve mechanical problems in their minds before demonstrating the solutions in practice, a process he called insight.15

Language is nevertheless intimately connected with thought, since we use it to convey our thoughts to others. This requires that symbols, whether words or signs, be associated with the objects, actions, qualities, etcetera, that we store in our minds. By manipulating those symbols, we can transmit thoughts from our own minds to the minds of others. This can be effectively accomplished by writing, and I hope these very words are making some sort of impression on your thoughts. Novels and stories are a powerful and compelling means of creating images and fantasies in the minds of others. Television and film, of course, provide direct access to our internal representations, without the need for intervening symbols, except in the case of dialogue.

The language of thought is known as mentalese. Not surprisingly, it has much in common with communicative language. Our thoughts are generative, and we can imagine novel scenes, such as a cow jumping over the moon, as readily as we can construct novel sentences to describe them. Our thoughts can also be recursive. For example, one of the characteristics of human thought is what has been called theory of mind. This refers to the ability to understand the minds of others and to know what others see, or feel, or know. This can be recursive; for example, I might not only know that you can see me, but I might know that you know that I know that you can see me. The generativity and recursiveness of human language no doubt reflect the generativity and recursiveness of human thought.

But communicative language must be different from mentalese. For one thing, it must make use of symbols to stand for the things we want to talk about, since we cannot directly convey our internal representations. The use of symbols requires shared convention; that is, if I am to converse with you I must assume that your understanding of my words is the same as my own. (This in itself implies theory of mind.) Spoken language also differs from mentalese in that it is confined to a single dimension, time. Our thoughts, by contrast, can make use of all four physical dimensions, three dimensions of space and one of time. For example, I can form a three-dimensional spatial image in my mind of the inside of my house from a particular location,16 but in order to describe it to you I must take a mental walk through the house—a four-dimensional activity—and describe the various features one by one—a one-dimensional activity. This is known as linearization, and at least some of the properties of spoken language reflect this requirement. The embedding of phrases may relate to the sort of embedding that can occur as I imagine myself walking through the house; I may stop at a china cabinet, for example, and describe its contents, before continuing to the next item of furniture. That is, the thought processes themselves are hierarchical, ranging from the gross layout of the house, to the items of furniture within the rooms, to the smaller items contained within those items, and so on. One is reminded of Jonathan Swift's comment on fleas:

So, naturalists observe, a flea
Hath smaller fleas that on him prey;
And these have smaller fleas to bite 'em
And so proceed ad infinitum.17

Some features of language, therefore, such as its generativity and recursiveness, derive from features of thought itself. The special qualities of speech, at least, derive from the necessity to transform the intended message so that it is transmitted as a signal varying in time. The same sort of transformation occurs in transmitting a TV signal. The spatial pattern is scanned sequentially, so that the pixels on the screen that make up an image are transmitted one at a time and are then recomposed into a spatial pattern at the receiving end. Similarly, we turn our thoughts into a stream of sounds, and the listener then converts these sounds back into the thoughts, or their representations in the brain, that we hope to convey. Although we can never be quite sure that the listener gets precisely the message we want, the speech system is remarkably powerful, accurate, and flexible.

The linearization problem is not quite so acute in the case of signed language, since the hands and arms can convey something of the spatial quality of the thoughts we might wish to convey, as we shall see in chapter 6. Moreover, where spoken words are arbitrary and depend on convention to convey their meaning, manual signs can in some instances represent shapes and actions more or less directly. The sign for a tree, for example, might depict the actual shape of a tree. Whereas nearly all words represent their meanings symbolically, signs have an imitative or iconic component that may make them easier to learn. It seems reasonable to suppose, then, that there is a more direct relation between signs and the thoughts they express than there is between words and the underlying thoughts. This is but one reason why I shall suggest that language may have originated in manual signs rather than in vocal sounds.

Summary

In summary, language is an extraordinary accomplishment, and almost certainly a uniquely human one, an idea I hope to amplify in the next chapter. It involves a complex system of rules, and our system of learning those rules is probably innately determined, even if the particular languages we speak have a strong cultural component, to the point of mutual incomprehensibility between cultures. Arguably, it is language that makes us human. Yet such a complex ability cannot have evolved entirely de novo in our species. In the following chapters I shall look closely at the roots of language in our primate ancestry and try to trace how it emerged in the evolution of our own species.

But even at this stage, I hope we can be fairly sure of one thing. Language is not, after all, for the birds.

Table of Contents

Preface vii
Acknowledgments xi
Chapter 1. What Is Language? 1
Chapter 2. Do Animals Have Language? 21
Chapter 3. In the Beginning Was the Gesture 41
Chapter 4. On Our Own Two Feet 66
Chapter 5. Becoming Human 82
Chapter 6. Signed Language 102
Chapter 7. It's All Talk 126
Chapter 8. Why Are We Lopsided? 159
Chapter 9. From Hand to Mouth 184
Chapter 10. Synopsis 213
References 221
Index 247

What People are Saying About This

From the Publisher

"A lively and well constructed read that bravely tackles head-on the tough question of where language came from. Corballis intriguingly concludes that this unique human property has gestural rather than vocal origins; and along the way he explores numerous fascinating byways that make this a must read for everyone interested in how humans became the extraordinary creatures they are."—Ian Tattersall, American Museum of Natural History, author of Extinct Humans and The Fossil Trail

"Michael Corballis has accomplished a Herculean task. Reviewing and synthesizing data from a range of disciplines, he has woven it all into a book that is at once enjoyable and easy to read and yet faithful to the complexity of the subject matter. While this is admittedly a provocative work, the author has marshaled considerable evidence in support of his thesis. Indeed, he has done all of us a great service by raising the level of discussion surrounding this controversial topic. This is no small accomplishment."—Sherman Wilcox, University of New Mexico, General Editor, Evolution of Communication

"A fascinating journey along the evolutionary path that 'converted us from wild gesticulators to smooth talkers.' On the path we pass our ape-like ancestors, the change to bipedalism, increase in brain size, gestures, the anatomical requirements for vocalization, and finally the spoken language."—Lewis Wolpert, University College London
