grammar a system of rules specifying a language. The term has often been used synonymously with ‘syntax’, the principles governing the construction of sentences from words (perhaps also including the systems of word derivation and inflection – case markings, verbal tense markers, and the like). In modern linguistic usage the term more often encompasses other components of the language system such as phonology and semantics as well as syntax. Traditional grammars that we may have encountered in our school days, e.g., the grammars of Latin or English, were typically fragmentary and often prescriptive – basically a selective catalog of forms and sentence patterns, together with constructions to be avoided. Contemporary linguistic grammars, on the other hand, aim to be descriptive, and even explanatory, i.e., embedded within a general theory that offers principled reasons for why natural languages are the way they are. This is in accord with the generally accepted view of linguistics as a science that regards human language as a natural phenomenon to be understood, just as physicists attempt to make sense of the world of physical objects.
Since the publication of Syntactic Structures (1957) and Aspects of the Theory of Syntax (1965) by Noam Chomsky, grammars have been almost universally conceived of as generative devices, i.e., precisely formulated deductive systems – commonly called generative grammars – specifying all and only the well-formed sentences of a language together with a specification of their relevant structural properties. On this view, a grammar of English has the character of a theory of the English language, with the grammatical sentences (and their structures) as its theorems and the grammar rules playing the role of the rules of inference. Like any empirical theory, it is subject to disconfirmation if its predictions do not agree with the facts – if, e.g., the grammar implies that ‘white or snow the is’ is a well-formed sentence or that ‘The snow is white’ is not.
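The idea of a grammar as a deductive device can be made concrete with a small sketch (not part of the original entry). The phrase structure rules and the tiny lexicon below are invented purely for illustration – no linguistic analysis of English is claimed – but they show how a finite rule system "proves" exactly the strings it generates and no others:

```python
import itertools

# A toy phrase structure grammar. The rules and lexicon are invented
# for this example, not drawn from any actual grammar of English.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["snow"], ["ball"]],
    "V":   [["is"], ["hit"]],
}

def generate(symbol):
    """Yield every terminal string derivable from `symbol`."""
    if symbol not in RULES:          # a terminal word
        yield [symbol]
        return
    for expansion in RULES[symbol]:
        # expand each daughter, then combine the alternatives
        for parts in itertools.product(*(generate(s) for s in expansion)):
            yield [w for part in parts for w in part]

sentences = {" ".join(s) for s in generate("S")}
print("the ball hit the snow" in sentences)   # a "theorem" of the grammar
print("snow the is white or" in sentences)    # not derivable: False
```

The set `sentences` plays the role of the theorems of the theory: a string the grammar fails to derive (or a derivable string speakers reject) is a point of disconfirmation, just as the entry describes.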
The object of this theory construction is to model the system of knowledge possessed by those who are able to speak and understand an unlimited number of novel sentences of the language specified. Thus, a grammar in this sense is a psychological entity – a component of the human mind – and the task of linguistics (avowedly a mentalistic discipline) is to determine exactly of what this knowledge consists. Like other mental phenomena, it is not observable directly but only through its effects. Thus, underlying linguistic competence is to be distinguished from actual linguistic performance, which forms part of the evidence for the former but is not necessarily an accurate reflection of it, containing, as it does, errors, false starts, etc. A central problem is how this competence arises in the individual, i.e., how a grammar is inferred by a child on the basis of a finite, variable, and imperfect sample of utterances encountered in the course of normal development. Many sorts of observations strongly suggest that grammars are not constructed de novo entirely on the basis of experience, and the view is widely held that the child brings to the task a significant, genetically determined predisposition to construct grammars according to a well-defined pattern. If this is so, and since apparently no one language has an advantage over any other in the learning process, this inborn component of linguistic competence can be correctly termed a universal grammar. It represents whatever the grammars of all natural languages, actual or potential, necessarily have in common because of the innate linguistic competence of human beings. The apparent diversity of natural languages has often led to a serious underestimation of the scope of universal grammar. One of the most influential proposals concerning the nature of universal grammar was Chomsky’s theory of transformational grammar. 
In this framework the syntactic structure of a sentence is given not by a single object (e.g., a parse tree, as in phrase structure grammar), but rather by a sequence of trees connected by operations called transformations. The initial tree in such a sequence is specified (generated) by a phrase structure grammar, together with a lexicon, and is known as the deep structure. The final tree in the sequence, the surface structure, contains the morphemes (meaningful units) of the sentence in the order in which they are written or pronounced. For example, the English sentences ‘John hit the ball’ and its passive counterpart ‘The ball was hit by John’ might be derived from the same deep structure (in this case a tree looking very much like the surface structure for the active sentence) except that the optional transformational rule of passivization has been applied in the derivation of the latter sentence. This rule rearranges the constituents of the tree in such a way that, among other changes, the direct object (‘the ball’) in deep structure becomes the surface-structure subject of the passive sentence. It is thus an important feature of this theory that grammatical relations such as subject, object, etc., of a sentence are not absolute but are relative to the level of structure. This accounts for the fact that many sentences that appear superficially similar in structure (e.g., ‘John is easy to please’, ‘John is eager to please’) are nonetheless perceived as having different underlying (deep-structure) grammatical relations. Indeed, it was argued that any theory of grammar that failed to make a deep-structure/surface-structure distinction could not be adequate.
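The active/passive relation described above can be caricatured in code (again, not from the original entry). The flat slot representation and the auxiliary and participle forms below are drastic simplifications invented for the sketch – real transformational analyses operate on full phrase structure trees – but they show the key point: the transformation rearranges constituents so that the deep-structure object surfaces as subject:

```python
# A schematic passivization transformation. The dict-of-slots
# representation is an invented simplification of a deep-structure tree.

def passivize(deep):
    """Map an active deep structure to a passive surface structure."""
    return {
        "subject": deep["object"],           # deep object -> surface subject
        "verb": "was " + deep["participle"],
        "agent": "by " + deep["subject"],    # deep subject demoted to a by-phrase
    }

deep = {"subject": "John", "verb": "hit", "participle": "hit",
        "object": "the ball"}

active  = f'{deep["subject"]} {deep["verb"]} {deep["object"]}'
surface = passivize(deep)
passive = f'{surface["subject"]} {surface["verb"]} {surface["agent"]}'

print(active)    # John hit the ball
print(passive)   # the ball was hit by John
```

Note that ‘the ball’ is the object of `deep` but the subject of `surface`: grammatical relations are relative to the level of structure, exactly as the entry emphasizes.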
Contemporary linguistic theories have, nonetheless, tended toward minimizing the importance of the transformational rules with corresponding elaboration of the role of the lexicon and the principles that govern the operation of grammars generally. Theories such as generalized phrase-structure grammar and lexical function grammar postulate no transformational rules at all and capture the relatedness of pairs such as active and passive sentences in other ways. Chomsky’s principles and parameters approach (1981) reduces the transformational component to a single general movement operation that is controlled by the simultaneous interaction of a number of principles or subtheories: binding, government, control, etc. The universal component of the grammar is thus enlarged and the contribution of language-specific rules is correspondingly diminished. Proponents point to the advantages this affords in language acquisition. Presumably a considerable portion of the task of grammar construction would consist merely in setting the values of a small number of parameters that could be readily determined on the basis of a small number of instances of grammatical sentences.
A rather different approach that has been influential has arisen from the work of Richard Montague, who applied to natural languages the same techniques of model theory developed for logical languages such as the predicate calculus. This so-called Montague grammar uses a categorial grammar as its syntactic component. In this form of grammar, complex lexical and phrasal categories can be of the form A/B. Typically such categories combine by a kind of ‘cancellation’ rule: A/B + B ⇒ A (something of category A/B combines with something of category B to yield something of category A). In addition, there is a close correspondence between the syntactic category of an expression and its semantic type; e.g., common nouns such as ‘book’ and ‘girl’ are of type ⟨e,t⟩, and their semantic values are functions from individuals (entities, or e-type things) to truth-values (t-type things), or equivalently, sets of individuals. The result is an explicit, interlocking syntax and semantics specifying not only the syntactic structure of grammatical sentences but also their truth conditions. Montague’s work was embedded in his own view of universal grammar, which has not, by and large, proven persuasive to linguists. A great deal of attention has been given in recent years to merging the undoubted virtues of Montague grammar with a linguistically more palatable view of universal grammar. See also CHOMSKY, LOGICAL FORM, PARSING, PHILOSOPHY OF LANGUAGE. R.E.W.
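The cancellation rule and its semantic counterpart (function application) can be sketched as follows. This is an illustration added to the entry, not Montague’s own formalization: the category names, the tiny model, and the choice of the rightward slash S/NP for ‘walks’ (a real categorial grammar would use the leftward NP\S for an English intransitive verb) are all simplifying assumptions made for the example:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cat:
    name: str                # a basic category, e.g. "S", "NP"

@dataclass(frozen=True)
class Slash:
    result: object           # the A in A/B
    arg: object              # the B in A/B

def combine(left, right):
    """Cancellation: an expression of category A/B combines with one of
    category B to yield one of category A; semantics is applied in step."""
    cat_l, sem_l = left
    cat_r, sem_r = right
    if isinstance(cat_l, Slash) and cat_l.arg == cat_r:
        return (cat_l.result, sem_l(sem_r))
    raise ValueError("categories do not combine")

S, NP = Cat("S"), Cat("NP")

# 'girl' as a common noun of type <e,t>: semantically a function from
# individuals to truth-values, i.e. the characteristic function of a set.
girls = {"Mary", "Sue"}
girl_sem = lambda x: x in girls

# 'walks' treated as category S/NP for this sketch (see caveat above).
walkers = {"Mary"}
walks = (Slash(S, NP), lambda x: x in walkers)
mary = (NP, "Mary")

cat, truth = combine(walks, mary)
print(cat, truth)    # Cat(name='S') True
```

Syntax and semantics run in lockstep: the same step that cancels NP against S/NP applies the verb’s denotation to the subject’s, yielding a sentence category paired with a truth-value – the ‘interlocking syntax and semantics’ the entry attributes to Montague grammar.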
