Arc Pair Grammar

Paperback

$125.00 


Overview

Arc pair grammar is a new, extensively formalized theory of the grammatical structure of natural languages. As an outgrowth of relational grammar, it constitutes a theoretical alternative to the long-dominant generative transformational approach to linguistics. In this work, David Johnson and Paul Postal offer the first comprehensive presentation of this theoretical framework, which provides entirely new notions of all the basic concepts of grammatical theory: sentence, language, rule, and grammar.

Originally published in 1981.

The Princeton Legacy Library uses the latest print-on-demand technology to again make available previously out-of-print books from the distinguished backlist of Princeton University Press. These editions preserve the original texts of these important books while presenting them in durable paperback and hardcover editions. The goal of the Princeton Legacy Library is to vastly increase access to the rich scholarly heritage found in the thousands of books published by Princeton University Press since its founding in 1905.


Product Details

ISBN-13: 9780691615585
Publisher: Princeton University Press
Publication date: 07/14/2014
Series: Princeton Legacy Library, #643
Pages: 754
Product dimensions: 9.10(w) x 6.10(h) x 1.50(d) inches

Read an Excerpt

Arc Pair Grammar


By David E. Johnson, Paul M. Postal

PRINCETON UNIVERSITY PRESS

Copyright © 1980 Princeton University Press
All rights reserved.
ISBN: 978-0-691-08270-7



CHAPTER 1

INTRODUCTION


1.1. Sentences, rules, and grammars

Any linguistic theory must involve two basic interrelated conceptions. First, there must be a set of ideas about the nature of the basic elements of language, sentences. Any such theory must have a basic conception of what kind of formal objects sentences are. For example, in all current variants of transformational theory (henceforth TG), sentences are regarded as quite complicated formal objects involving logical representations, phonological and phonetic representations, and a central core of structure called a derivation, which consists of a sequence of graph-theoretic objects called (not too happily) constituent structure trees. Secondly, a linguistic theory must have a conception of what a possible sentence-specifying system or grammar is. This necessarily involves views about what a possible grammatical rule is, what a possible combination of grammatical rules (possible grammar) is, and a specification of how grammars characterize classes of sentences. Inevitably, the conception of the nature of a sentence will greatly determine what conception of grammatical rule is required, although the notion of rule adopted may (and normally will) feed back and determine in part the conception of linguistic object. Thus, it is the idea that grammatical structure involves a sequence of constituent structures (rather than the single structure of structuralist theories) which leads transformational theorists to a conception of grammatical rule countenancing grammatical transformations, etc. But the presumption that transformations exist has, in turn, led to many assumptions about sentence structure, e.g., that there are symbols which trigger transformations, elements like doom, traces, etc. As will become clear, the current work makes no use of the fundamental TG construct Derivation and a fortiori no use of those concepts dependent on or related to this central notion.

This work develops a new conceptual framework, which we call Arc Pair Grammar (henceforth: APG). APG is radically different from any other extant conception of linguistic theory along many but not all parameters. Although its concept of sentence structure is an outgrowth of largely unpublished work in relational grammar (henceforth: RG; see section 1.3 for discussion of the recent historical antecedents of APG), APG will be about 95 percent new even to those familiar with work in RG. To those unfamiliar with the RG framework, APG will be essentially 100 percent novel.

Let us briefly contrast one current TG framework with that of APG. Within the current version of the so-called extended standard theory (of TG), a grammar is characterized in terms of various types of object, inter alia (see Chomsky and Lasnik [1977]): (i) a base containing a categorial component (context-free grammar generating an infinite set of phrase markers) and a lexicon (containing word formation and lexical redundancy rules); (ii) lexical insertion rules; (iii) a transformational component; (iv) a semantic interpretive component; (v) a deletion component; (vi) a surface filter component; (vii) a phonological interpretive component, and (viii) a stylistic component. Corresponding to this TG view of a grammar, sentence structure involves various levels of representation, roughly: (a) an initial phrase marker generated by (i); (b) a base phrase marker, generated by (ii); (c) a sequence of phrase markers or a derivation, generated by (iii), the last phrase marker being termed a surface structure, determined in part to be well-formed by (vi); (d) a logical form, generated by (iv); (e) a level determined by the output of (v); (f) a phonological form, generated by (vii), and (g) a final output generated by (viii).

In contrast, APG represents each natural language sentence in terms of a formal object called a Pair Network (PN). A PN is, from one viewpoint, a system involving four components: (i) an overall graph-theoretic object called a Relational Graph (R-graph), discussed in detail in Chapter 2; (ii) a graph-theoretic object which is a subpart of the overall R-graph in (i), called a Logical Graph (L-graph), discussed in detail in Chapter 4; (iii) a graph-theoretic object which is a subpart of the overall R-graph in (i) and distinct from the L-graph, called a Surface Graph (S-graph), also discussed in detail in Chapter 4; and (iv) two primitive relations called Sponsor and Erase, ranging over pairs of elements (called arcs) from R-graphs. These are discussed throughout this work from Chapter 5 on.

The APG construct L-graph is a formal characterization of the notion (linguistically motivated) logical form. The construct S-graph is roughly the formal characterization within APG of that aspect of sentence structure characterized in terms of Surface Structure in TG. The relations Sponsor and Erase and the laws and grammatical rules which refer to them are the APG explication not only of that aspect of sentence structure characterized in TG in terms of derivations and conditions on derivations, but also that aspect putatively characterized by interpretive components, stylistic components, etc. In short, the APG conception of grammar is far more homogeneous and monolithic than that of TG. It claims that there is a single web of notions relevant for describing all aspects of language, a claim in no way inconsistent with the existence of differentiating features of various subaspects. Thus, within APG there is no notion of, e.g., base component, lexical component, lexical insertion rules, transformational component, semantic interpretive component, deletion component, surface filter component, stylistic component, nor any of the other or associated constructs (cycles, rule ordering, traces, etc.) commonly assumed to be characterized in terms of, or held to apply to, the preceding aspects of TG.

A PN is simply a pairing of a Sponsor relation and an Erase relation such that (i) each relation is a set of ordered pairs of arcs, and (ii) the set of arcs formed by the domains and ranges of Sponsor and Erase (a) forms an R-graph and (b) has two subsets, one forming an L-graph and one forming an S-graph. Thus, R-graph, L-graph, and S-graph are derivative concepts, part of the defining criteria of the single formal objects called PNs.
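
For concreteness, the following Python sketch renders the structure just described: a PN is nothing more than a pairing of a Sponsor relation and an Erase relation over arcs, and its arc set (from which the R-graph, L-graph, and S-graph are carved out) is simply whatever occurs in the domains and ranges of those two relations. The field names, class names, and omitted well-formedness checks are our own illustrative scaffolding, not the authors' formalization; the actual conditions defining R-graphs, L-graphs, and S-graphs occupy Chapters 2 through 4.

    from dataclasses import dataclass

    # Illustrative stand-in for an APG arc: a grammatical relation holding
    # between two elements at some level(s). The real construct (with R-signs
    # and coordinates) is defined in Chapter 2; this is only a sketch.
    @dataclass(frozen=True)
    class Arc:
        relation: str      # e.g. "subject", "direct object", "predicate"
        head: str          # the element to which the relation is borne (e.g. a clause)
        dependent: str     # the element bearing the relation (e.g. a nominal)
        levels: tuple      # level labels at which the relation holds

    @dataclass
    class PairNetwork:
        sponsor: set       # ordered pairs (a, b) of Arcs: a sponsors b
        erase: set         # ordered pairs (a, b) of Arcs: a erases b

        def arcs(self):
            """The PN's arc set: the union of the domains and ranges of
            Sponsor and Erase. That set itself must form an R-graph, and two
            of its subsets an L-graph and an S-graph; those conditions are
            not modeled here."""
            pairs = self.sponsor | self.erase
            return {arc for pair in pairs for arc in pair}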

Corresponding to the fundamental difference between the TG and APG notions of sentence structure are differences in the associated notions of grammatical rule and grammar. Unlike, e.g., transformations and other TG rules, APG grammatical rules do not map formal objects into formal objects via a specified set of operations. Rather, APG rules, like the laws of APG universal grammar (PN laws), check for the cooccurrence or noncooccurrence of specified properties of given PNs. More specifically, both PN laws and all APG grammatical rules are interpreted as material implications in the standard logical sense. Thus, APG draws its rules from this intensively studied and antecedently (to linguistic work) formally characterized class of objects. The essential difference between APG grammatical rules and PN laws is one of scope. The former are language-particular and thus not necessarily an aspect of every language. Thus PN laws are material implications which determine well-formedness in all languages; grammatical rules are material implications which determine well-formedness only for individual languages. Because of this interpretation of rules adopted in APG, notions like cycles, rule ordering, etc., so much discussed in the TG literature, are necessarily excluded from consideration.

More precisely, a grammar in APG terms for some language La is the union of the set of APG PN laws with some finite set of language-particular material implications specific to La. The set formed by this union is unstructured in the sense of containing no components. Roughly (see Chapter 14), a given PN is well formed with respect to La just in case it model-theoretically satisfies the grammar of La, which is the union of the set of PN laws with those material implications particular to La. Hence the APG conception of grammar differs from, e.g., the TG conception, as radically as do the APG conceptions of sentence structure and individual grammatical rule.
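
This conception can be pictured very simply, though what follows is only a rough illustration and not the model-theoretic treatment actually developed in Chapter 14. Both PN laws and language-particular rules are read as material implications over PNs; a grammar for a language La is just the union of the two sets, and a PN is well formed for La exactly when every implication in that union holds of it. The function names and the empty rule sets below are hypothetical placeholders.

    from typing import Callable

    # A PN law or grammatical rule, read as a material implication over PNs:
    # the check succeeds unless the antecedent holds and the consequent fails.
    def implication(antecedent: Callable, consequent: Callable) -> Callable:
        def holds(pn) -> bool:
            return (not antecedent(pn)) or consequent(pn)
        return holds

    # Hypothetical usage of the format (not an actual APG law):
    # SOME_LAW = implication(has_some_property, has_some_other_property)

    # Universal PN laws and the rules of a particular language La are the
    # same kind of object; they differ only in scope.
    PN_LAWS = set()        # implications intended to constrain every language
    RULES_OF_LA = set()    # implications specific to La

    def grammar_of_la():
        # A grammar is the unstructured union of the two sets:
        # no components, no rule ordering, no cycles.
        return PN_LAWS | RULES_OF_LA

    def well_formed_in_la(pn) -> bool:
        # Rough analogue of "pn satisfies the grammar of La".
        return all(rule(pn) for rule in grammar_of_la())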

The negative idea of representing sentence structure in terms of objects not having derivations as subcomponents is not unique to APG. For example, RG as conceived most recently by Perlmutter and Postal (see Perlmutter and Postal [1977], Postal [1977], and below), Lakoff's (1977) conception of linguistic gestalts, and Hudson's (1976) idea of daughter dependency grammar, among others, all share the vague, unformalized and negative assumption that no aspect of sentence structure is properly characterized in terms of derivations. Since the above approaches to sentence structure all reject derivations, they could, in contrast to all variants of TG, be referred to as "uninetwork theories." However, the formal objects positively assumed in these views are considerably different in each case.

The rest of this work is devoted to elaborating the concept PN, and, to a much lesser degree, the notions of grammatical rule and grammar which go with it. In this volume, we proceed inductively, specifying the primitive concepts we assume and then building up to formal definitions of "R-graph," "S-graph," "L-graph," and "PN" (Chapters 2-4). Subsequently, we motivate and formalize a large set of PN laws which are designed to characterize (at the level of APG universal grammar) the notion Possible PN (see Chapters 5-13). These laws, plus the definition of "PN," characterize the concept Genetic PN (see Chapter 14, section 1). These are PNs which satisfy all PN laws and thus are well formed at the level of universal grammar, though not necessarily in any particular language. Finally, we discuss briefly the APG notions of Grammatical Rule, Grammar, and Well-Formed PN (for language La).

It is our working hypothesis that ultimately the entire syntactic, semantic, pragmatic, and phonological organization of a sentence is correctly formalized in terms of PNs. Our studies so far have concentrated on areas traditionally called syntactic. Nonetheless, one or two basic insights have emerged which have consequences for the representation of semantic or logical form. These mostly involve the area now usually discussed under the rubric "coreference" (see Chapter 11), but also other matters (see the discussion of The L-graph No Circuit Condition in Chapter 4). In addition, we believe the notion PN provides the correct mechanisms for relating logical form and superficial syntactic structure, embedding this in the same conception that relates different aspects of syntactic structure to each other.


1.2. Informal view of sentences

Our basic conception of human language at the most abstract level does not, naturally, differ from that of other generative views. We take a language to be simply an infinite, presumed recursively enumerable, set of elements of some type, called sentences. From another point of view, we regard a language as essentially characterizable by a finite formal object, a grammar, which specifies the membership of the set of sentences. As noted earlier, "sentence" in this sense is not to be identified with a string of words or morphemes, but rather refers to the entire grammatical organization in the widest sense. Thus sentences are highly complex formal objects of some sort.

Our informal conception of sentence is this. A sentence involves a set of primitive linguistic elements; a set, PGR, of primitive grammatical relations holding between linguistic elements; a set of linguistic levels, which stratify the grammatical relations into distinct linguistic states; and, most characteristically, two primitive, binary relations called Sponsor and Erase, which hold between (in some cases, unary) sequences of linguistic states. The relation Sponsor organizes subsets of linguistic states into chains, defining sequences of distinct statuses for the primitive linguistic elements bearing the grammatical relations holding at various linguistic levels. In terms of Sponsor, it is possible to define, inter alia, two relations, Successor and Replace, which play distinct, fundamental roles in APG. A simple example should help clarify the foregoing abstract description of our view. Consider:

(1) Max was jostled by Naomi.

Oversimplifying considerably for expository purposes, (1), under our informal view, has the following partial structure. The primitive elements of (1) include a nominal, Max, a nominal, Naomi, and a clause corresponding to the entire sentence. The structure of (1) involves two linguistic levels. At the first level (L1), Max bears the direct object relation, and Naomi bears the subject relation, to the clause. At the second level (L2), Max bears the subject relation, and Naomi bears the chomeur relation, to the clause. With respect to clause structure, (1) involves four linguistic states: (S1) Max bearing the direct object relation to the clause at L1, (S2) Naomi bearing the subject relation to the clause at L1, (S3) Max bearing the subject relation to the clause at L2, and (S4) Naomi bearing the chomeur relation to the clause at L2. In terms of the Sponsor relations of relevance here, (S1) sponsors (S3) and (S2) sponsors (S4). More specifically, these two sponsor relation pairs are of the Successor type (for further discussion, see Chapters 3 and 5). (S3) is the successor of (S1), and (S4) is the successor of (S2). Further, ignoring passive verbal morphology and the auxiliary, L1 and L2 "contain" a verb, jostled, which bears the predicate relation to the clause, defining two more states. Diagrammatically:

(2) [ILLUSTRATION OMITTED]


Notice that there is no sponsor relation between the two states: (S5) jostled bearing the predicate relation to the clause in L1 and (S6) jostled bearing the predicate relation to the clause in L2.
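
Because the diagram in (2) is omitted from this excerpt, the following plain-data Python sketch restates its content; the state labels, the (element, relation, level) triples, and the variable names are ours, chosen purely for exposition.

    # The six clause-level states of (1) "Max was jostled by Naomi",
    # written as (element, relation, level) triples.
    states_1 = {
        "S1": ("Max",     "direct object", "L1"),
        "S2": ("Naomi",   "subject",       "L1"),
        "S3": ("Max",     "subject",       "L2"),
        "S4": ("Naomi",   "chomeur",       "L2"),
        "S5": ("jostled", "predicate",     "L1"),
        "S6": ("jostled", "predicate",     "L2"),
    }

    # The two sponsor pairs discussed in the text, both of the Successor
    # type: S3 is the successor of S1, and S4 is the successor of S2.
    sponsor_pairs_1 = {("S1", "S3"), ("S2", "S4")}

    # Deliberately absent: ("S5", "S6"). The two predicate states are not
    # related by Sponsor.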

The defined relation Replace is relevant to our account of:

(3) Max knows himself.


Again oversimplifying greatly, (3), under our informal view, has the following partial structure. The primitive elements of (3) include two nominals, Max and himself, and a verb, knows, in addition to a clause. As in the case of (1), (3) involves two linguistic levels. Level 1 (L1) has the following makeup: Max bears both the subject and direct object relations, and knows bears the predicate relation, to the clause. Level 2 (L2), in contrast, has the following character: Max bears the subject relation, himself bears the direct object relation, and knows bears the predicate relation, to the clause. Thus the structure of (3) involves the following six linguistic states: (S1) Max bearing the subject relation to the clause at L1; (S2) Max bearing the direct object relation to the clause at L1; (S3) knows bearing the predicate relation to the clause at L1; (S4) Max bearing the subject relation to the clause at L2; (S5) himself bearing the direct object relation to the clause at L2; (S6) knows bearing the predicate relation to the clause at L2. In terms of the Sponsor relations of relevance here, (S2) sponsors (S5). This pair determines a defined relation called Replace, i.e., (S5) replaces (S2). Diagrammatically:

(4) [ILLUSTRATION OMITTED]


Note that there is no sponsor relation between (S1) and (S4) nor between (S3) and (S6).
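
As before, diagram (4) is omitted from this excerpt, so here is the same information in the plain-data style used for (1); Replace is shown only as the pair the text derives from the sponsor pair (S2, S5).

    # The six clause-level states of (3) "Max knows himself".
    states_3 = {
        "S1": ("Max",     "subject",       "L1"),
        "S2": ("Max",     "direct object", "L1"),
        "S3": ("knows",   "predicate",     "L1"),
        "S4": ("Max",     "subject",       "L2"),
        "S5": ("himself", "direct object", "L2"),
        "S6": ("knows",   "predicate",     "L2"),
    }

    sponsor_pairs_3 = {("S2", "S5")}   # S2 sponsors S5
    replace_pairs_3 = {("S5", "S2")}   # hence S5 replaces S2 (the defined Replace relation)

    # As noted in the text, there is no sponsor pair relating S1 and S4,
    # nor one relating S3 and S6.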

The role of linguistic levels in sentences is rather analogous to the role of time or temporal points in the description of real world events and states. Just as it is vague to say, in the domain of social relations, that two individuals, a, b, are married (in a state of marriage), it is vague to say that two linguistic elements bear some relation. Rather, one should say that a, b are married at point t in time or over some interval of points in time, and that two linguistic elements stand in some relation at some level (though they may not be in that relation or any other relation at others). A linguistic state is thus a specification that a pair of elements are in a fixed relation at a fixed level. The primitive Sponsor relation means that there are fixed transitions between nontrivially distinct linguistic states, i.e., states which do not differ just in terms of level specification. For example, if a particular element enters into three states at three different levels, it is, under fixed conditions, possible to specify a unique ordering (a sponsor chain) among these. For instance, one linguistic state might specify that a linguistic element, a, bears the indirect object relation to a second element, b; a successive state could specify that a and b are related by the direct object relation; and a third might specify that a and b are related by the subject relation. The three linguistic states alone indicate only which relations hold at which levels. It is the sponsor chain which indicates the order of transition from one state to a nontrivially distinct state, determining which is the first state, which the last, etc., among a group of related states. This is important, since first states, last states, etc., determine lawfully which information is relevant for the semantic structure of sentences, which for the phonetic structure, etc.
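
The three-state case sketched in this paragraph (an element advancing from indirect object to direct object to subject) can be made concrete as follows; the state labels and the small chain-walking helper are only an illustration of how an ordering of states falls out of the sponsor pairs, not a piece of the APG formalism.

    # Three nontrivially distinct states for the same pair of elements (a, b),
    # following the hypothetical case described in the text.
    chain_states = {
        "T1": ("a", "indirect object", "b"),
        "T2": ("a", "direct object",   "b"),
        "T3": ("a", "subject",         "b"),
    }

    # T1 sponsors T2, and T2 sponsors T3.
    sponsors = {"T1": "T2", "T2": "T3"}

    def sponsor_chain(first, sponsors):
        """Walk the sponsor pairs to recover the order of transitions:
        first state, intermediate states, last state."""
        chain = [first]
        while chain[-1] in sponsors:
            chain.append(sponsors[chain[-1]])
        return chain

    print(sponsor_chain("T1", sponsors))   # ['T1', 'T2', 'T3']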


(Continues...)

Excerpted from Arc Pair Grammar by David E. Johnson, Paul M. Postal. Copyright © 1980 Princeton University Press. Excerpted by permission of PRINCETON UNIVERSITY PRESS.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

  • FrontMatter, pg. i
  • CONTENTS, pg. v
  • Preface, pg. ix
  • Chapter 1. Introduction, pg. 1
  • Chapter 2. Graph-Theoretic Aspects of APG, pg. 29
  • Chapter 3. Arc Pair Relations, pg. 60
  • Chapter 4. Pair Networks, pg. 75
  • Chapter 5. Basic Sponsor and Erase Laws, pg. 105
  • Chapter 6. Coordinate Determination, pg. 149
  • Chapter 7. Focus on Clause Structure, pg. 189
  • Chapter 8. Cho Arcs, pg. 272
  • Chapter 9. Further Principles Governing the Distribution of Cho Arcs, pg. 359
  • Chapter 10. Ghost Arcs and Dummy Nominals, pg. 401
  • Chapter 11. Replacers and Anaphora, pg. 448
  • Chapter 12. Linear Precedence, pg. 547
  • Chapter 13. Grafts, Pioneers, and Closures, pg. 602
  • Chapter 14. APG Rules and Grammars, pg. 655
  • References, pg. 715
  • Index, pg. 724


