
Cognition, 10 (1981) 39-52

© Elsevier Sequoia S.A., Lausanne - Printed in The Netherlands

An approach to Universal Grammar and the mental representation of language

JOAN BRESNAN*
Massachusetts Institute of Technology

1. The need for cognitive constraints on linguistic representation

Despite the exotic variety of the world’s languages, any normal child is
capable of mastering any language. This fact suggests that there is some
universal system for mentally representing natural language. The aim of
Universal Grammar is to discover this system.
One approach to Universal Grammar is to define formal symbolic systems
that describe properties of various natural languages and then attempt to
abstract a common formal structure from these systems. This is the formal-
descriptive approach of generative linguistics, and it has fundamentally
advanced our understanding of the structure of natural language. Nevertheless, this approach by itself can yield only a limited understanding of how languages are mentally represented. The reason is that, apart from the requirement that any proposed formal system (or generative grammar) bear a descriptive relation to language users' knowledge of the language, there are virtually no cognitive constraints imposed on these systems. But the same knowledge can be represented in descriptively equivalent ways by symbolic systems whose design constraints would serve very different cognitive processes.
To give a simple example, a symbolic system which is designed to minimize the number of primitive operations may be equivalent to one which employs more primitives but minimizes the length of derivations of well-formed formulae. Thus, the propositional calculus which utilizes the sole connective ‘↓’, which means ‘neither nor’, is equivalent to the calculus that is based on the more familiar logical connectives ‘¬’, ‘∨’, ‘→’, etc., but

*This article is based in part upon work supported by the National Science Foundation under Grant No. BNS 80-14730, and summarizes ideas that have developed in the close collaboration of members of our research group: Ron Kaplan, Marilyn Ford, Jane Grimshaw, Kris Halvorsen, Steve Pinker. Reprint requests should be sent to Joan Bresnan, Department of Linguistics and Philosophy, MIT, Cambridge, Mass. 02139, U.S.A.

in the former ‘p→q’ is expressed as ((p↓p)↓q)↓((p↓p)↓q). Which design constraint (reduction of primitives or compression of formulae) is the right one? The answer of course depends upon the purposes for which the symbolic system is used. In the same way, descriptively equivalent linguistic systems may satisfy design constraints which would be optimally suited to very different cognitive processes (see Bresnan, in press).
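The equivalence between the two calculi can be checked mechanically by truth table. A minimal sketch (the function names are mine, not part of the text):

```python
# Truth-table check that 'p implies q' is definable from NOR alone.
def nor(p, q):
    """The 'neither nor' connective: true only when both inputs are false."""
    return not (p or q)

def implies_nor(p, q):
    """p -> q expressed as ((p NOR p) NOR q) NOR ((p NOR p) NOR q)."""
    half = nor(nor(p, p), q)
    return nor(half, half)

# Compare against the familiar definition of implication, not-p or q.
for p in (False, True):
    for q in (False, True):
        assert implies_nor(p, q) == ((not p) or q)
```

The two systems describe exactly the same truth functions; they differ only in which design constraint (fewer primitives versus shorter formulae) they satisfy.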

2. The link between linguistic representations and cognitive processes

To take a deeper approach to Universal Grammar as a theory of mental representation, we must recognize that the fundamental problems of Universal Grammar are problems of understanding the natural information-processing mechanisms of humans. The solution of a problem of natural information processing requires the integration of the levels of computational theory, algorithm, and process, as the work of Marr and his coworkers shows* (Marr and Nishihara, 1978). The first step is to identify the information-processing problem to be solved. In general, this takes the form of
specifying the desired (goal) representation and the given data (or input representation) from which the goal is derived by some unknown process. The second step is to show theoretically how a reliable representation can be derived from the available data. Here the aim is to determine the natural constraints that guarantee the existence of a unique solution to the problem. The computational theory developed at this step goes beyond the formal-descriptive approach by showing how constraints on the goal representation are related to the nature of the computation that theoretically derives the representation. The computational theory explains why the representation has the form that it has in terms of the solution to an information-processing problem. The third step is to design a particular algorithm that correctly interprets the available input information. It is this step which connects the level of theoretical representation with that of process, making it possible to study experimentally how properties of the representations of knowledge are related to properties of the mental processes that construct, maintain, and interpret those representations. The fourth step is to test whether the natural information-processing system uses the particular algorithm by comparing the behavior of the implemented algorithm with the observable processes of

*The recent death of David Marr is a great loss to all workers in cognitive science, but his ideas and his memory will continue to inspire us.

the natural system. If the comparison fails, a new algorithm must be designed and the testing continued.
All of the major problems in the mental representation of language can be
approached in this way, each requiring a computational theory of some
natural human capability for information-processing. Thus, broadly speaking
(for each problem decomposes into a set of more tractable subproblems), the
problem of language acquisition is to specify a learning function that, given a
universal store of initial (perhaps innate) knowledge, maps the primary data
available to the child language learner onto a representation G of mature
knowledge of the language. Similarly, the problem of language comprehen-
sion is to specify a decoding function that, using the language-user’s stored
knowledge of a language, call it G again, maps auditory or visual data from
the language onto mental representations of meanings. Likewise, the produc-
tion problem is to find the inverse mapping, which, given linguistic knowl-
edge G, encodes meanings into visual or auditory representations. Even the
descriptive problem of generative linguistics is in principle no different: the
problem is to specify a function that, given the grammar G of a language L,
can associate all strings over the vocabulary of L with the judgments of well-
formedness, ambiguity, systematic paraphrase, and the like, of those who
know the language.
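Each of these problem statements is, in effect, a functional specification. As a rough sketch, the four problems can be written as function signatures; every type name and the toy grammar below are illustrative stand-ins, not the paper's formal definitions:

```python
# Hypothetical sketch of the four information-processing problems as
# function signatures. Types here are crude stand-ins: a 'Grammar' is
# just a dict, a 'Signal' just a transcribed string.
from typing import Callable

Grammar = dict        # the stored knowledge G
Meaning = dict        # a mental representation of meaning
Signal = str          # auditory/visual data

Acquisition = Callable[[list[str]], Grammar]          # primary data -> G
Comprehension = Callable[[Grammar, Signal], Meaning]  # decoding function
Production = Callable[[Grammar, Meaning], Signal]     # encoding (the inverse)

# The descriptive problem: given G for a language L, associate strings
# over the vocabulary of L with judgments such as well-formedness.
def describe(g: Grammar, sentence: str) -> bool:
    return all(word in g["vocabulary"] for word in sentence.split())

toy_g: Grammar = {"vocabulary": {"fred", "handed", "a", "toy"}}
```

The point of the sketch is only that acquisition, comprehension, production, and description all share the same representation G, which is what the competence hypothesis of the next section asserts.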

3. The competence hypothesis as a source of constraint

It is evident that the natural constraints on the mental representation of linguistic knowledge must derive from all of the interacting information-processing systems that support language use. The so-called ‘competence hypothesis’ expresses this fact by asserting that the same representation of grammatical knowledge G is included in all of these systems (Bresnan, 1978; Kaplan, 1975; Chomsky, 1965). The competence hypothesis means no more
than that the stored knowledge of a language acquired by the language learner
is the same knowledge unconsciously utilized by the adult in comprehending
and producing language, and that this linguistic knowledge which permits people to speak, recognize, and understand their language is the same knowledge
that enables them to tell the linguist whether any given utterance belongs to
their language.
The competence hypothesis is a powerful unifying assumption for theoretical linguistics and psycholinguistics. It implies that linguistic theory and psycholinguistic theory are interdependent, and that the problem of how knowledge of language is mentally represented cannot be understood in the absence of any theory of the mental processes required to construct, maintain, and interpret proposed representations of linguistic knowledge.

After Miller's (1962, 1964) pioneering work in support of the competence hypothesis, experimental psycholinguistics began to accumulate substantial evidence showing that the form of linguistic knowledge representation given by standard transformational generative grammars is not employed in any straightforward fashion in sentence perception, production, and language acquisition (Fodor, Bever and Garrett, 1974; Levelt, 1974). Since more successful ways of employing transformational grammars in psycholinguistic theories have not been forthcoming, a number of researchers have concluded from these results that the competence hypothesis cannot be maintained. This conclusion, however, is unwarranted: from such results we can equally well conclude that the form in which linguistic knowledge is mentally represented (G) cannot be identified with transformational grammars (Bresnan, 1978). To conclude that the competence hypothesis itself is incorrect would be to relinquish the major source of cognitive constraint on representations of linguistic knowledge, thus losing any real possibility of explaining the interactions among the various linguistic information-processing systems, and multiplying theories of the mental representation of language without constraint. To the extent that these psycholinguistic considerations are well-founded, they indicate that a new form of linguistic knowledge representation, a new theory of Universal Grammar, is needed. The same conclusion can be drawn from theoretical consideration of a problem central to Universal Grammar: the syntactic mapping problem.

4. The syntactic mapping problem


The syntactic mapping problem is one of the major information-processing problems of Universal Grammar. It is the problem of finding for every natural language a mapping which will associate each sentence of the language with a representation of its grammatical relations. (The term ‘grammatical relations’ is used neutrally to refer to the associations between the surface word and phrase configurations and the semantic predicate-argument structures of a sentence.) The syntactic mapping problem is extremely difficult: first, because of the complex, many-to-many relations between the sentences of any natural language and their grammatical relations, and second, because of the radical variations in surface form across languages. However, the competence hypothesis imposes important natural constraints on the solution to the problem, and these constraints limit the forms in which knowledge of language can be represented.
Five constraints on the mapping can be identified, the first two of which are presupposed by all theories of generative grammar. The five are creativity (the domain and range of the mapping are theoretically infinite), finite capacity (there is a finite capacity for the knowledge representations), reliability (the mapping is effectively computable), order-free composition (the subrepresentations of grammatical relations that the mapping derives from fragments of a sentence must be directly included in the representation that the mapping derives from the entire sentence, independently of prior or subsequent segments), and universality (the mapping is specifiable as a universal procedure for constructing representations of grammatical relations). Each of these constraints has a natural psychological motivation: for example, order-free composition is motivated by our fluent and relatively effortless ability to interpret sentence fragments; universality is motivated by the interaction of language acquisition and language comprehension. These constraints on solutions to the syntactic mapping problem impose important limitations on the possible forms of syntactic knowledge representation, ruling out many possible systems of grammar, even apparently descriptively adequate ones such as transformational grammar, as systems of the mental representation of language (see Bresnan, in press).
A solution to the syntactic mapping problem does exist. For any language L, let us take G to be a lexical functional grammar for the language, and R (the representation of grammatical relations) to be the set of functional structures of the language as defined by G. Then a map m: (L, G) → R exists which satisfies all of the given constraints. This result is based on the mathematical characterization of lexical functional grammars by Kaplan and Bresnan (1980). It provides the foundations of a computational theory (in the sense given above) for investigating the mental processes that construct representations of syntactic structure.

5. Lexical functional grammar


A lexical functional grammar provides each sentence of a language with two structures: one representing its surface form (the constituent structure), the other representing its grammatical relations (the functional structure). The functional structure represents grammatical relations in a universal format, abstracting away from language-particular and typological characteristics of the surface forms of sentences. The universal format is achieved by using grammatical functions (SUBJECT, OBJECT, etc.) rather than language-particular phrase structure configurations or morphological case to specify the associations between semantic predicate argument structures and the surface forms of sentences. The universality of grammatical functions is a fundamentally important contribution of research in relational grammar (Perlmutter and Postal, 1977), which has been adopted even in some current versions of transformational grammar (Chomsky, 1980). However, the role of grammatical functions in lexical functional grammar differs from both of these in essential ways.
In the theory of lexical functional grammar, the predicate argument structures of lexical items are represented independently of their syntactic contexts, as functions of a fixed number of grammatically interpretable arguments. A mapping between predicate argument structure and syntactic constituent structure is specified by means of grammatical functions. These are assigned to surface phrase structure positions by syntactic encoding functions and to predicate argument structure positions by lexical encoding functions. A predicate argument structure with grammatical functions specified is called a lexical form. How these terms apply to a simple English sentence is illustrated in Figure 1.

Figure 1. (Diagram garbled in the scan: the surface phrase structure and surface grammatical functions of Fred handed a toy to the baby above a dotted line; below it, the lexical form for hand with its predicate argument structure, as described in the text.)

In Figure 1, the information above the dotted line is provided by the syntactic component of a generative grammar for English, and that below the dotted line is provided by the lexical component. Thus the lexical form for the verb hand has a triadic predicate argument structure whose three arguments may be identified with the thematic roles SOURCE, THEME, and GOAL (Gruber, 1965; Jackendoff, 1976). Arguments 1, 2, and 3 of this predicate argument structure have been assigned the respective grammatical functions SUBJ(ect), OBJ(ect), and OBL(ique)_GOAL by lexical rules. The syntactic component generates the surface phrase structure tree for Fred handed a toy to the baby and identifies the NPs dominating Fred, a toy, and the baby as SUBJ, OBJ, and OBL_GOAL, respectively.
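A lexical form of this kind can be modeled as a small record pairing each argument position with a grammatical function. The class and field names in this sketch are mine, not the paper's notation:

```python
# A lexical form pairs each argument position of a predicate with a
# grammatical function. Illustrative sketch only.
from dataclasses import dataclass

@dataclass(frozen=True)
class LexicalForm:
    predicate: str
    roles: tuple      # thematic roles of arguments 1..n
    functions: tuple  # grammatical function assigned to each argument

# Figure 1's lexical form for hand: <(SUBJ), (OBJ), (OBL_GOAL)>
hand_to = LexicalForm(
    predicate="hand",
    roles=("SOURCE", "THEME", "GOAL"),
    functions=("SUBJ", "OBJ", "OBL_GOAL"),
)

# The lexical assignment: argument 2 (the THEME) is realized as OBJ.
assignment = dict(zip(hand_to.roles, hand_to.functions))
```

Representing the assignment as a pairing, rather than as positions in a tree, is what lets the same predicate argument structure carry different function assignments, as the next figure shows.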
In Figure 2, the verb hand has a different lexical form, produced by assigning a different set of grammatical functions to the same predicate argument structure as before.

Figure 2. (Diagram partially recoverable from the scan: the surface phrase structure and surface grammatical functions of Fred handed the baby a toy; the lexical form for hand, hand ⟨(SUBJ), (OBJ2), (OBJ)⟩; and the lexical assignment of the grammatical functions (SUBJ) (OBJ2) (OBJ) to the predicate argument structure (SOURCE) (THEME) (GOAL).)

As shown in Figure 2, OBJ is now assigned to argument 3 and OBJ2 (second object) is assigned to argument 2. Note that the position of grammatical function symbols in a lexical form is completely independent of the left-to-right order of constituents in phrase structure. When the structural grammatical functions are matched with the lexical grammatical functions in Figures 1 and 2, it is evident that Fred, a toy, and the baby correspond to the same ‘logical’ arguments in the predicate argument structure of hand. Both syntactic structures can be base-generated, and the seeming transformational relationship between them can be expressed instead as a relationship between the lexical forms for hand. (Details of the dative alternation and its interaction with passivization and other rules are discussed in work by Bresnan [1980].) Thus the verb hand has several lexical forms, and general conditions on functional structures ensure that the correct lexical form is paired with the appropriate syntactic structures (Kaplan and Bresnan, 1980).
The functional structure of a sentence assembles the relations between the syntactically encoded functions and the lexical forms; thus it is the functional structure that represents the associations between surface form and semantic predicate argument structure, and these associations constitute the grammatical relations of the sentence. Figure 3 shows the functional structures for the examples of Figures 1 and 2. Note the very direct relation between the surface forms of these examples and their respective functional structures.
Figure 3.

Functional structure of Fred handed a toy to the baby:

  SUBJ      [PRED 'Fred']
  TENSE     PAST
  PRED      'hand<(SUBJ), (OBJ), (OBL_GOAL)>'
  OBJ       [SPEC 'a', PRED 'toy']
  OBL_GOAL  [SPEC 'the', PRED 'baby']

Functional structure of Fred handed the baby a toy:

  SUBJ   [PRED 'Fred']
  TENSE  PAST
  PRED   'hand<(SUBJ), (OBJ2), (OBJ)>'
  OBJ    [SPEC 'the', PRED 'baby']
  OBJ2   [SPEC 'a', PRED 'toy']
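Functional structures are attribute-value matrices, and so map naturally onto nested dictionaries. Here is the pair from Figure 3 in that form (an informal stand-in for LFG f-structures, not an implementation of the formalism):

```python
# The two functional structures of Figure 3 as nested attribute-value
# dictionaries. Key spellings follow the figure; the representation as
# Python dicts is an informal stand-in only.
f_prep = {  # Fred handed a toy to the baby
    "SUBJ": {"PRED": "Fred"},
    "TENSE": "PAST",
    "PRED": "hand<(SUBJ), (OBJ), (OBL_GOAL)>",
    "OBJ": {"SPEC": "a", "PRED": "toy"},
    "OBL_GOAL": {"SPEC": "the", "PRED": "baby"},
}
f_double = {  # Fred handed the baby a toy
    "SUBJ": {"PRED": "Fred"},
    "TENSE": "PAST",
    "PRED": "hand<(SUBJ), (OBJ2), (OBJ)>",
    "OBJ": {"SPEC": "the", "PRED": "baby"},
    "OBJ2": {"SPEC": "a", "PRED": "toy"},
}

# Same 'logical' arguments, different grammatical functions:
assert f_prep["OBJ"]["PRED"] == f_double["OBJ2"]["PRED"] == "toy"
```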

The directness of this relation reflects a fundamental property of the lexical-functional theory of grammatical representation: only lexical rules can alter grammatical relations; all syntactic rules must preserve function-assignments (Jackendoff, 1976; Kaplan and Bresnan, 1980). It is this property, the principle of direct syntactic encoding, which ensures that lexical functional grammars satisfy the order-free composition constraint; for now surface syntactic forms can be specified entirely by context-free grammars (or recursive transition networks (Woods, 1970)) and the functional structures can be built up by a simple ‘additive’ operation that applies directly to surface forms (Kaplan and Bresnan, 1980). The functional construction process is effectively computable, satisfying the reliability constraint. Similarly, it is the universal format of functional structures that ensures the universality constraint.
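The flavor of the ‘additive’ operation can be conveyed by a toy merge over partial f-structures: because each fragment only adds consistent information, the result is the same whatever order the fragments are consumed in, which is the content of order-free composition. This is a rough sketch; the real construction in Kaplan and Bresnan (1980) is a unification procedure with full consistency, completeness, and coherence conditions:

```python
# Toy additive merge over partial f-structures (nested dicts).
# Monotonic: merging never destroys information, so the order of
# composition does not matter. Illustrative sketch only.
def merge(a, b):
    out = dict(a)
    for key, val in b.items():
        if key in out and isinstance(out[key], dict) and isinstance(val, dict):
            out[key] = merge(out[key], val)          # combine sub-structures
        elif key in out and out[key] != val:
            raise ValueError(f"inconsistent value for {key}")
        else:
            out[key] = val                           # add new information
    return out

# Partial structures contributed by fragments of a sentence:
frag1 = {"SUBJ": {"PRED": "Fred"}}
frag2 = {"TENSE": "PAST", "OBJ": {"SPEC": "a"}}
frag3 = {"OBJ": {"PRED": "toy"}}

left_to_right = merge(merge(frag1, frag2), frag3)
right_to_left = merge(merge(frag3, frag2), frag1)
assert left_to_right == right_to_left  # order-free
```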
The dative alternation shown in Figures 1 and 2 clearly illustrates these principles. Since in Figure 1 the OBJ(ect) NP is associated with the THEME argument, while in Figure 2 the OBJ(ect) NP is associated with the GOAL argument, the grammatical relations of the sentences differ. Hence the dative alternation must be effected by a lexical, rather than a syntactic, rule. In contrast, the examples Fred gave a toy to the baby and Fred gave to the baby a toy do not differ in their grammatical relations: in both examples, the OBJ(ect) NP is associated with the THEME argument and the prepositional phrase is associated with the GOAL argument. Hence the different orders of the phrases must be effected by a syntactic rule (in this case, one which simply specifies alternative sequences of constituents of the verb phrase).
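On this view the dative alternation is a rule over function assignments, leaving the predicate argument structure untouched. A rough sketch of that idea (my formulation, not Bresnan's formalism):

```python
# The dative alternation sketched as a lexical rule: it rewrites the
# grammatical functions assigned to a predicate's arguments, never the
# arguments themselves. Illustrative only.
def dative_rule(functions):
    """Map OBJ -> OBJ2 and OBL_GOAL -> OBJ; leave others unchanged."""
    swap = {"OBJ": "OBJ2", "OBL_GOAL": "OBJ"}
    return tuple(swap.get(f, f) for f in functions)

# hand <SOURCE, THEME, GOAL> with its prepositional-dative assignment
# (Figure 1) yields the double-object assignment (Figure 2):
base = ("SUBJ", "OBJ", "OBL_GOAL")
assert dative_rule(base) == ("SUBJ", "OBJ2", "OBJ")
```

Because the rule operates on functions rather than on tree positions, its statement is independent of any language-particular phrase structure, which is exactly the property the next section exploits.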

6. Alternative theories of Universal Grammar

It is evident from the principle of direct syntactic encoding that any rule of grammar which changes the grammatical functions of constituents must be a lexical rule. Moreover, any such rule will have a universal characterization which reveals its invariant form across languages. This follows because grammatical functions are independent of language-particular realizations in terms of syntactic structure or morphological case.
The lexical functional theory thus differs in essential ways from transformational theories (Chomsky, 1965, 1980), from other structuralist theories (Gazdar, in press; Peters, in press), and from Relational Grammar (Perlmutter and Postal, 1977).
All versions of transformational grammar share the fundamental representational principle that at some (‘deep’) level of representation, there is a one-to-one correspondence between the predicate argument structure (or thematic role structure) of a sentence and its phrasal structure. For example, Chomsky (1980) refers to this as the ‘theta criterion’. The associations between predicate argument structure and surface form must then be effected by operations on phrase structure representations (such as syntactic transformations or equivalent structure-dependent rules). But it is a fact that surface phrasal structures of natural language sentences do not bear an isomorphic relation to their predicate argument structures: for example, hand has the same predicate argument structure in Figures 1 and 2, surprised has the same predicate argument structure in John was surprised at Mary's idea and Mary's idea surprised John, and prevent has the same predicate argument structure in You can't prevent it from raining and You can't prevent its raining; yet there is no isomorphy between the surface phrasal structures in each pair of examples. Consequently, the syntactic mapping between predicate argument structure and surface forms must fail to preserve grammatical function assignments, violating the principle of direct syntactic encoding. This result can only be avoided by giving up the basic representational principle of transformational grammar and permitting multiple lexical correspondences between predicate argument structure (or thematic role structure) and phrasal structure, as in the lexical functional theory of grammar.
The idea that grammatical functions are defined and derived on phrase-structural configurations such as deep structure is a fundamentally structuralist conception of grammatical relations which has been preserved in transformational grammar (Chomsky, 1965, 1980). While the structuralist conception seems descriptively adequate for languages like English, it clearly fails to provide a unified theory of grammatical relations for languages having radically different surface forms. For example, it has been shown that the structuralist representations of transformational theory are insufficiently abstract to characterize the universal attributes of rules like passivization (Bresnan, 1980; Mohanan, 1980; Perlmutter and Postal, 1977). Recent work within the lexical functional theory of grammar shows that at a suitably abstract level of grammatical representation, two languages that differ radically in their phrase structures exhibit identical lexical encoding, and that the strikingly different syntactic manifestations of these rules follow from a simple difference in the syntactic encoding function for each language. Moreover, this work shows that the appearance of ‘NP movement’ in constructions like the passive is an illusion created by the particular properties of the syntactic encoding functions for ‘configurational’ languages like English. Far from being the reflection of a universal transformational component of grammar, as many linguists currently maintain, the phenomenal characteristics of ‘NP movement’ are derivative of more abstract principles of grammatical representation.
Relational Grammar, like lexical functional grammar, rejects the structuralist conception of grammatical relations and provides a universal theory of function-dependent processes such as the dative alternation, passivization, and causativization. But unlike the lexical-functional theory, Relational Grammar has assumed that there is a one-to-one correspondence between predicate arguments and an initial assignment of grammatical functions. These predicate arguments are mapped onto surface constituents by a sequence of ‘strata’ which syntactically modify the initial assignment of grammatical functions until a final assignment of grammatical functions is produced. The final grammatical function assignment is then mapped onto surface strings by linearization rules. Because of the function-changing character of the syntactic mapping, Relational Grammar violates the principle of direct syntactic encoding, and so differs from the lexical-functional theory in the decomposition of grammars into lexical and syntactic rules. But there is evidence from the interactions of passivized verbs with rules of word formation that passivization is indeed a lexical, not a syntactic, process (Bresnan, 1980).
Generalized phrase structure grammars have been developed by Gazdar, Peters, and others. In their use of a single level of constituent-structure representation, these theories of grammar resemble the lexical-functional theory and differ from both transformational theories and Relational Grammar. But lexical functional theory contrasts with generalized phrase structure theories in its claims that functional structure, not constituent structure, is semantically interpreted (Halvorsen, in preparation) and that there are universal and language-particular generalizations that are captured only in function-dependent terms (Bresnan, in press). The fact that there are natural languages in which sentences lack virtually all constituency relations poses a severe problem for generalized phrase structure grammars as universal theories of grammar. For example, the Australian language Ngarluma (Simpson, 1980) exhibits to an extreme degree the lack of phrase structure that Hale has identified as a typological property of natural language (Hale, 1979). In Ngarluma, the sentence consists of a string of words without phrasal structure; nevertheless, the perceived grammatical relations of the sentence in Ngarluma have the same kind of functional structure as those in English. In particular, the Ngarluma sentence has grammatical subjects and objects and a rich system of predicative and adjunct modifiers, all of which may be composed of information derived from several words which are discontinuous in the string; it differs from a language like English in that these functional units do not correspond to phrase structure constituents; they are syntactically encoded in terms of case inflections rather than phrasal structure.

7. Toward a unified theory of the psychology of language

The design constraints embodied in theories of Universal Grammar have profound consequences for theories of the mental representation of language. Research in the theory of lexical functional grammar has begun explicitly to investigate these consequences.
Pinker (1980), arguing that lexical accounts of acquisition have certain
advantages over previous syntactic theories, has proved that lexical func-
tional grammars are learnable in principle, and has developed principles
underlying a new theory of language acquisition based on lexical functional
grammars.
Ford, Bresnan, and Kaplan (1981) have constructed and motivated a com-
petence-based theory of syntactic closure which directly incorporates lexical
functional grammars as representations of linguistic knowledge. A particularly
interesting and important research problem in the development of a theory
of sentence perception has been to explain the structural biases shown in
structurally ambiguous sentences, since the effects observed have been
assumed to reflect the operating principles of the human parsing mechanism
very directly. The study by Ford, Bresnan, and Kaplan examines syntactic
bias effects and shows that they are a joint function of (i) the linguistic rules
which define the structures of sentences, (ii) the predicate argument struc-
tures and grammatical functions of lexical items, and (iii) a well-defined
interaction between rule-driven and data-driven analysis procedures.
Although relatively little psycholinguistic research on adult language
behavior has been concerned with how speech is produced, Ford (1980) has
completed an experimenti study on this problem. Together with the results
obtained by Ford and Holmes (1978), the findings provide very strong
evidencle that sentence production proceeds by the successive planning and
uttering of segments of a sentence called ‘basic clauses”. Raising the question
of what representation of the meaningful relations underlying sentences
would permit sentences to be planned basic clause by basic clause, Ford
shows that the required type of representation is one in which for each basic
clause, there is a representational unit which encodes the meaningful
matical relations for the surface items in that clause, such as the surface
subject, object and verb. Ford gives evidence that these representations
cannot be analogous to the representations of underlying relations in various
versions of trzmsformational grammar, and shows that the lexical functional
theory of grammar does represent underlying relations in a way that would
permit sentence planning to proceed basic clause by basic clause.
Because of the computational theory provided by lexical functional grammars, it has become feasible to consider the design of explicit algorithms for acquiring and interpreting representations of syntactic knowledge, connecting the level of theoretical representation with that of process. These algorithms will make it possible to study experimentally how properties of the representation of syntactic knowledge are related to properties of the mental processes that construct and interpret those representations.

References

Bresnan, J. (1978) A Realistic Transformational Grammar. In M. Halle, J. Bresnan, and G. Miller (eds.), Linguistic Theory and Psychological Reality. Cambridge, Mass., The MIT Press.
Bresnan, J. (1980) The Passive in Lexical Theory. Occasional Paper #7, The Center for Cognitive Science, MIT; also to appear in J. Bresnan (ed.), The Mental Representation of Grammatical Relations. Cambridge, Mass., The MIT Press.
Bresnan, J. (In press) Universal Grammar and the Mental Representation of Language. Introduction in J. Bresnan (ed.), The Mental Representation of Grammatical Relations. Cambridge, Mass., The MIT Press.
Chomsky, N. (1965) Aspects of the Theory of Syntax. Cambridge, Mass., The MIT Press.
Chomsky, N. (1980) On the Representation of Form and Function. Presented at the C.N.R.S. Conference at Royaumont, France, June 1980.
Fodor, J. A., Bever, T. and Garrett, M. (1974) The Psychology of Language. New York, McGraw-Hill.
Ford, M., Bresnan, J. and Kaplan, R. (1981) A Competence-Based Theory of Syntactic Closure. Occasional Paper #11, The Center for Cognitive Science, MIT; also to appear in J. Bresnan (ed.), The Mental Representation of Grammatical Relations. Cambridge, Mass., The MIT Press.
Ford, M. (1980) Sentence Planning Units: Implications for the Speaker's Representation of Meaningful Relations Underlying Sentences. Occasional Paper #2, The Center for Cognitive Science, MIT; also to appear in J. Bresnan (ed.), The Mental Representation of Grammatical Relations. Cambridge, Mass., The MIT Press.
Ford, M. and Holmes, V. (1978) Planning Units in Syntax and Sentence Production. Cognition, 6, 35-53.
Gazdar, G. (In press) Phrase Structure Grammar. In P. Jacobson and G. Pullum (eds.), The Nature of Syntactic Representation. London, Croom Helm.
Gruber, J. (1965) Studies in Lexical Relations. MIT doctoral dissertation; reprinted by the Indiana University Linguistics Club, Bloomington, Indiana.
Hale, K. (1979) On the Position of Walbiri in a Typology of the Base. Department of Linguistics and Philosophy, MIT. Manuscript available from the Indiana University Linguistics Club, Bloomington, Indiana.
Halvorsen, P.-K. (In preparation) An Interpretive Procedure for Functional Structures. To appear as an Occasional Paper, The Center for Cognitive Science, MIT.
Jackendoff, R. (1976) Toward an Explanatory Semantic Representation. Linguistic Inquiry, 7, 89-150.
Kaplan, R. (1975) On Process Models for Sentence Analysis. In D. Norman and D. Rumelhart (eds.), Explorations in Cognition. San Francisco, W. H. Freeman and Co.
Kaplan, R. and Bresnan, J. (1980) Lexical-Functional Grammar: A Formal System for Grammatical Representation. Occasional Paper #13, The Center for Cognitive Science, MIT; also to appear in J. Bresnan (ed.), The Mental Representation of Grammatical Relations. Cambridge, Mass., The MIT Press.
Levelt, W. J. M. (1974) Formal Grammars in Linguistics and Psycholinguistics, 3 vols. The Hague, Mouton.
Marr, D. and Nishihara, K. (1978) Visual Information Processing: Artificial Intelligence and the Sensorium of Sight. Technol. Rev., 82 (1).
Miller, G. A. (1962) Some Psychological Studies of Grammar. American Psychologist, 17, 748-762.
Miller, G. A. and McKean, K. (1964) A Chronometric Study of Some Relations between Sentences. Quarterly Journal of Experimental Psychology, 16, 297-308.
Mohanan, K. P. (1980) Grammatical Relations and Clause Structure in Malayalam. Manuscript, Department of Linguistics and Philosophy, MIT; to appear in J. Bresnan (ed.), The Mental Representation of Grammatical Relations. Cambridge, Mass., The MIT Press.
Perlmutter, D. and Postal, P. (1977) Toward a Universal Characterization of Passivization. In Proceedings of the Third Annual Meeting of the Berkeley Linguistics Society, Berkeley, California.
Peters, S. (In press) Definitions of Linked-Tree Grammars. Technical Report, The Cognitive Science Center of the University of Texas at Austin, Texas.
Pinker, S. (1980) A Theory of the Acquisition of Lexical-Interpretive Grammars. Occasional Paper #6, The Center for Cognitive Science, MIT; also to appear in J. Bresnan (ed.), The Mental Representation of Grammatical Relations. Cambridge, Mass., The MIT Press.
Simpson, J. (1980) Ngarluma as a W* Language. Manuscript, Department of Linguistics and Philosophy, MIT.
Woods, W. (1970) Transition Network Grammars for Natural Language Analysis. Communications of the ACM, 13, 591-606.
