U is for Universal, G is for Grammar
"'Universal Grammar' is taken to be the set of properties, conditions, or whatever that constitute the 'initial state' of the language learner, hence the basis on which knowledge of a language develops. It by no means follows from such an account that there must be specific elements or rules . . . or . . . 'features' common to all languages, unless we take these features in a suitably abstract manner."
— Noam Chomsky, Rules and Representations —
"It's perfectly obvious that there is some genetic factor that distinguishes humans from other animals and that it is language-specific. The theory of that genetic component, whatever it turns out to be, is what is called universal grammar."
— Noam Chomsky —
"And once that trick has been implemented, hierarchically embedded syntax immediately becomes possible [...] The macro-mutation seems complex and ‘747-ish’ but it really isn’t. It’s a simple addition – a ‘stretched DC-8 mutation’ [...] Chomsky’s hereditarian position in this instance makes sense and, more to the point, interesting sense. The origin of Language may represent a rare example of the ‘hopeful monster’ theory of evolution."
— Richard Dawkins (Brief Candle in the Dark: My Life in Science; pgs. 383-4) —
During the first half of the 20th century, linguists who theorized about the human ability to speak did so from the behaviourist perspective that prevailed at that time. They therefore held that language learning, like any other kind of learning, could be explained by a succession of trials, errors, and rewards for success. In other words, children learned their mother tongue by simple imitation, listening to and repeating what adults said.
This view was radically called into question, however, by the American linguist Noam Chomsky. For Chomsky, acquiring language cannot be reduced to simply developing an inventory of responses to stimuli, because every sentence that anyone produces can be a totally new combination of words. When we speak, we combine a finite number of elements—the words of our language—to create an infinite number of larger structures—sentences.
Moreover, language is governed by a large number of rules and principles, particularly those of syntax, which determine the order of words in sentences. The term “generative grammar” refers to the set of rules that enables us to understand sentences but of which we are usually totally unaware. It is because of generative grammar that everyone says “that’s how you say it” rather than “how that’s you it say”, and that the words “Bob” and “him” cannot refer to the same person in the sentence “Bob loves him” but can do so in “Bob knows that his father loves him.” (Note in passing that generative grammar has nothing to do with grammar textbooks, whose purpose is simply to explain what is grammatically correct and incorrect in a given language.)
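This generative idea, a finite set of rewrite rules producing an unbounded set of sentences, can be sketched as a toy context-free grammar. The rules and vocabulary below are invented for illustration only; they are not Chomsky's formalism or a real fragment of English grammar:

```python
# A toy generative grammar: eight rewrite rules, a handful of words,
# yet no upper bound on sentence length (the NP -> NP PP rule recurses).
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"], ["NP", "PP"]],   # the second rule embeds an NP in an NP
    "VP":  [["V", "NP"]],
    "PP":  [["P", "NP"]],
    "Det": [["the"]],
    "N":   [["dog"], ["cat"], ["park"]],
    "V":   [["sees"]],
    "P":   [["near"]],
}

def generate(symbol="S", depth=0, max_depth=4):
    """Expand `symbol` into a list of words, deterministically:
    while under the depth limit take the last rule (for NP, the
    recursive one); past the limit take the first, which always
    terminates. The limit only guarantees halting; the grammar
    itself puts no bound on sentence length."""
    rules = GRAMMAR.get(symbol)
    if rules is None:                  # a terminal word, not a category
        return [symbol]
    rule = rules[0] if depth >= max_depth else rules[-1]
    words = []
    for sym in rule:
        words.extend(generate(sym, depth + 1, max_depth))
    return words

print(" ".join(generate(max_depth=2)))
```

Raising `max_depth` lets the recursive NP rule apply more often, so the same eight rules yield ever longer sentences; there is no longest sentence the grammar can produce.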
Even before the age of 5, children can, without having had any formal instruction, consistently produce and interpret sentences that they have never encountered before. It is this extraordinary ability to use language despite having had only very partial exposure to the allowable syntactic variants that led Chomsky to formulate his “poverty of the stimulus” argument: simply put, if hearing the people around her speak amounted to a child learning, or being indirectly taught, language, then the child would seem to be learning far more than she was ever taught. This argument was the foundation for the new approach that Chomsky proposed in the early 1960s.
Universal grammar is regarded by biologists like Richard Dawkins as part of the genetic endowment of the human species. It is something that the child is born with, like its thumbs: children do not learn how to grow a thumb, and no more do they learn Language.
Universal grammar, then, is the theoretical or hypothetical system of categories, operations, and principles shared by all human languages and considered to be innate to human children, a part of our genome. Since the 1980s, the term has often been capitalized.
The concept of a universal grammar (UG) has been traced to the observation of Roger Bacon, a 13th-century Franciscan friar and philosopher, that all languages are built upon a common grammar.
The expression was popularized in the 1950s and 1960s by the legendary polymath Noam Chomsky and his students. Chomsky has always stressed that it is of primary importance to understand that Universal Grammar is not really a grammar per se. It is not some kind of magical, auto-translating language that everyone can speak and understand. Nor does it imply that one language, say English, must be similar in any particular respects to another language, say Telugu. Chomsky observes:
"[U]niversal grammar is not a grammar, but rather a theory of grammars, a kind of metatheory or schematism for grammar" (Language and Responsibility, 1979)."
In Chomsky’s view, the reason that children so easily master the complex operations of language is that they have innate knowledge of certain principles that guide them in developing the grammar of their language. In other words, Chomsky’s theory is that language learning is facilitated by a predisposition that our brains have for certain structures of language.
But what language? For Chomsky’s theory to hold true, all of the languages in the world must share certain structural properties. And indeed, Chomsky and other generative linguists like him have shown that the 5,000 to 6,000 languages in the world, despite their very different grammars, do share a set of syntactic rules and principles. These linguists believe that this “universal grammar” is innate and is embedded somewhere in the neuronal circuitry of the human brain. That would explain why children can select, from all the sentences that come to their minds, only those that conform to a “deep structure” encoded in the brain’s circuits.
Universal grammar, then, consists of a set of unconscious constraints that let us decide whether a sentence is correctly formed. This mental grammar is not necessarily the same for all languages. But according to Chomskyian theorists, the process by which, in any given language, certain sentences are perceived as correct while others are not, is universal and independent of meaning.
Thus, we immediately perceive that the sentence “Robert book reads the” is not correct English, even though we have a pretty good idea of what it means. Conversely, we recognize that a sentence such as “Colorless green ideas sleep furiously.” is grammatically correct English, even though it is nonsense.
A pair of dice offers a useful metaphor to explain what Chomsky means when he refers to universal grammar as a “set of constraints”. Before we throw the pair of dice, we know that the result will be a number from 2 to 12, but nobody would take a bet on its being 3.143. Similarly, a newborn baby has the potential to speak any of a number of languages, depending on what country it is born in, but it will not just speak them any way it likes: it will adopt certain preferred, innate structures. One way to describe these structures would be that they are not things that babies and children learn, but rather things that happen to them. Just as babies naturally develop arms and not wings while they are still in the womb, once they are born they naturally learn to speak, and not to chirp or neigh.
Observations that support Chomsky's view of Language
Until Chomsky propounded his theory of universal grammar in the 1960s, the empiricist school that had dominated thinking about language since the Enlightenment held that when children came into the world, their minds were like a blank slate. Chomsky’s theory had the impact of a large rock thrown into this previously tranquil, undisturbed pond of empiricism.
Subsequent research in the cognitive sciences, which combined the tools of psychology, linguistics, computer science, and philosophy, soon lent further support to the theory of universal grammar. For example, researchers found that babies only a few days old could distinguish the phonemes of any language and seemed to have an innate mechanism for processing the sounds of the human voice.
Thus, from birth, children would appear to have certain linguistic abilities that predispose them not only to acquire a complex language, but even to create one from whole cloth if the situation requires. One example of such a situation dates back to the time of plantations and slavery. On many plantations, the slaves came from many different places and so had different mother tongues. They therefore developed what are known as pidgin languages to communicate with one another. Pidgin languages are not languages in the true sense, because they employ words chaotically: word order varies tremendously, and there is very little grammar. But these slaves’ children, though exposed to these pidgins at the age when children normally acquire their first language, were not content merely to imitate them. Instead, they spontaneously introduced grammatical complexity into their speech, creating new languages, known as creoles, in the space of a single generation.
Chomsky and the evolution of language
Many authors, adopting the approach of evolutionary psychology, believe that language has been shaped by natural selection. In their view, certain random genetic mutations were thus selected over many thousands of years to provide certain individuals with a decisive adaptive advantage. Whether the advantage that language provided was in co-ordinating hunting parties, warning of danger, or communicating with sexual partners remains uncertain, however.
Chomsky, for his part, does not see our linguistic faculties as having originated from any particular selective pressure, but rather as a sort of fortuitous accident. He bases this view, among other things, on studies which found that recursivity—the ability to embed one clause inside another, as in “the person who was singing yesterday had a lovely voice”—might be the only specifically human component of language. According to the authors of these studies, recursivity originally developed not to help us communicate, but rather to help us solve other problems connected, for example, with numerical quantification or social relations, and humans did not become capable of complex language until recursivity was linked with the other motor and perceptual abilities needed for this purpose. (Thus recursivity would meet the definition of a spandrel offered by Stephen Jay Gould.) According to Chomsky and his colleagues, there is nothing to indicate that this linkage was achieved through natural selection. They believe that it might simply be the result of some other kind of neuronal reorganization.
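The recursive embedding described above, where a clause contains another clause of the same type, is easy to mimic with a recursive function. The sentence frame below borrows the article's example ("the person who was singing yesterday had a lovely voice"); the connecting verb "met" is an invented filler:

```python
def embed(levels):
    """Build a noun phrase containing `levels` nested relative clauses,
    modelled on "the person who was singing yesterday"."""
    if levels == 0:
        return "the person who was singing yesterday"
    # each extra level wraps the previous phrase in one more relative clause
    return f"the person who met {embed(levels - 1)}"

for n in range(3):
    print(embed(n) + " had a lovely voice")
```

Because `embed` calls itself, there is no grammatical bound on the depth of embedding it can produce, which mirrors the claim that recursivity alone yields unboundedly deep hierarchical structure.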
The minimalist program
In the 1990s, Chomsky’s research focused on what he called the “minimalist program”, which attempted to demonstrate that the brain’s language faculties are the minimum faculties that could be expected, given certain external conditions that are imposed on us independently. In other words, Chomsky began to place less emphasis on something such as a universal grammar embedded in the human brain, and more emphasis on a large number of plastic cerebral circuits. And along with this plasticity would come an infinite number of concepts. The brain would then proceed to associate sounds and concepts, and the rules of grammar that we observe would in fact be only the consequences, or side effects, of the way that language works. Analogously, we can, for example, use rules to describe the way a muscle operates, but these rules do nothing but explain what happens in the muscle; they do not explain the mechanisms that the brain uses to generate these rules.