
The analysis of Stimulus-Response associationism and its general failure as a science of behavior.

Six sentences demonstrating syntactic phenomena: inducing abstract structure from surface comparisons. The nativist acquisition syllogism: children must create language from scanty data.

Noam Chomsky, the early years (1955-1965): the evidence of things unseen.

The circularity of behaviorism
Behaviorist associationism: dominant 1910-1960. "If it can't be learned, it isn't in the mind." Learning = Stimulus -> Response / reinforcement. B.F. Skinner's S-R learning theory of language. Chomsky's review of Skinner's book showed the circularity of the S -> R / reinforcement paradigm: it is impossible to determine any of the terms independently. Ergo, S-R behaviorism is not science.

Death by analysis

Linguistic discoveries by examples: science through the intuitive


What do speakers know in their language? Sentences, e.g.: Colorful green frogs snore noisily. But does a sentence need meaning to be recognized as a sentence? Colorless green ideas sleep furiously. No, meaning is not necessary, but structure is; contrast the above with: *Green sleep ideas colorless furiously

Two levels of sentence organization: Surface and Deep Structures


(Talking relatives) can be a nuisance = The relatives are talking. (Discussing relatives) can be a nuisance = Someone is discussing the relatives. So there are different relations underlying similar surface phrases! (Visiting relatives) can be a nuisance: a single surface sequence can have different underlying relations!

Deep Structures can span sentences. The ducks were anxious to eat = The ducks want to eat something. The ducks were easy to eat = Some(one) was easily eating the ducks. So specific deep-structure relations can be far apart in the surface (ducks ... eat)! The ducks were ready to eat: a single surface form can have different underlying relations at a distance!

Inner relations at a distance

The ducks were hungrier than the fish were active. The ducks were hungrier than the fish were *hungry. So an identical surface comparative adjective must be deleted by rule: a transformation! The ducks were hungrier for fresh food than the fish were hungry for fresh food. Transformations are STRUCTURE SENSITIVE: they apply to phrases, not just words.

Constraints at a distance: structure-sensitive transformations

Competence vs. Performance; Structure vs. Complexity

A duck swam away.
A duck some lobsters attacked swam away.
A duck some lobsters one octopus chased attacked swam away.

Is this ungrammatical? No, it is just hard to understand! So we study grammaticality (competence), not processing ease. Performance is a separate study: what makes sentences hard and easy.

Language Acquisition and Nativism: the Standard Chomsky Syllogism


Structure of many papers and talks: the child's data are too limited to explain what is learned, so there must be innate foundations for language. To be proven: language is innate. Language has property P. P is very, very complex and deep. P could not be learned from surface evidence alone. Ergo, P must be innate. Ergo, language is innate. Causa finita.


Colorless green ideas: The Chomskyan Revolution


Massimo Piattelli-Palmarini

A progressive climb!

Linguistics once was supposed to be: the study of a class of overt behaviors. In the best case it was: the study of the causes of those behaviors. Chomsky suggested that it must rather be: the study of an internally represented knowledge of language.

The Descartes-Chomsky paradox!


Linguistic expressions are neither caused by external circumstances nor independent of them. The crux of the matter is: they are appropriate to the circumstances without being determined by them. The shift is, therefore, away from stimulus-response to factors internal to the mind of the speaker/hearer.

The Chomsky hierarchy!

A hierarchy of computational power

The lowest in the hierarchy!


Finite State Languages, also called Regular Languages. Example: instructions in a software manual. Historically important example: statistical grammars. Shown by Chomsky to be radically insufficient to model human natural languages.

Chomsky's famous sentence!


Colorless green ideas sleep furiously. What about the following? Furiously sleep ideas colorless green. A statistical grammar treats both sentences exactly equally, contrary to our most basic intuitions. In fact: the probability, in any corpus, that "green" follows "colorless" is zero. Same for "furiously" following "sleep", or the converse.
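The point can be made concrete with a toy maximum-likelihood bigram model (a sketch for illustration only, not any model Chomsky discussed; the corpus and the helper `bigram_probability` are invented): any sentence containing a bigram unseen in training gets probability zero, so the model scores the grammatical sentence and its scrambled counterpart exactly alike.

```python
from collections import Counter

def bigram_probability(sentence, corpus):
    """Probability of a sentence under a maximum-likelihood bigram model.

    A single bigram unseen in the corpus drives the whole product to zero."""
    words = [w for line in corpus for w in line.lower().split()]
    bigrams = Counter(zip(words, words[1:]))
    unigrams = Counter(words)
    prob = 1.0
    tokens = sentence.lower().split()
    for w1, w2 in zip(tokens, tokens[1:]):
        if unigrams[w1] == 0:          # word never seen at all
            return 0.0
        prob *= bigrams[(w1, w2)] / unigrams[w1]
    return prob

# A toy corpus in which every individual word occurs, but none of the
# bigrams of either test sentence does.
corpus = [
    "green frogs sleep",
    "colorless liquids flow",
    "ideas spread furiously",
]
grammatical = "colorless green ideas sleep furiously"
scrambled = "furiously sleep ideas colorless green"
print(bigram_probability(grammatical, corpus))  # 0.0
print(bigram_probability(scrambled, corpus))    # 0.0 -- indistinguishable
```

Smoothing would remove the zeros, but the model would still encode no constituent structure, which is the deeper point of the slide.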

One step up!


Context-Free Languages: Pushdown Automata (Chomsky's discovery). A finite memory tape, last in, first out. Computation may stop and restart after a check. Basically: do such-and-such whenever you encounter symbol X. Also radically insufficient to model human natural languages.
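The last-in-first-out memory of a pushdown automaton can be sketched with an explicit stack. The textbook context-free language aⁿbⁿ (n a's followed by exactly n b's) is a standard example that no finite-state device can recognize; the function name is invented for illustration.

```python
def accepts_anbn(s):
    """Recognize the context-free language a^n b^n (n >= 1).

    Push one marker per 'a'; pop one per 'b'; accept iff the stack empties
    exactly at the end -- the 'last in, first out' memory of a PDA."""
    stack = []
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:           # an 'a' after a 'b' is out of order
                return False
            stack.append("A")
        elif ch == "b":
            seen_b = True
            if not stack:        # more b's than a's
                return False
            stack.pop()
        else:
            return False         # alphabet is just {a, b}
    return seen_b and not stack

print(accepts_anbn("aaabbb"))  # True
print(accepts_anbn("aabbb"))   # False
```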

One further step up: The right one!

Context-Sensitive Languages: a content-addressable memory tape. Basically: do such-and-such whenever you encounter symbol X, if and only if X is flanked by Z on its left and/or by Y on its right. This is the right class of grammars.
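For contrast, the classic language aⁿbⁿcⁿ lies beyond context-free power (a single LIFO stack cannot enforce three equal counts) but is comfortably context-sensitive. A minimal sketch, checking the block shape with a regex and then comparing counts (the function name is invented for illustration):

```python
import re

def accepts_anbncn(s):
    """Recognize a^n b^n c^n (n >= 1): context-sensitive, not context-free.

    Three blocks of equal length need more than one stack's worth of memory."""
    m = re.fullmatch(r"(a+)(b+)(c+)", s)
    return bool(m) and len(m.group(1)) == len(m.group(2)) == len(m.group(3))

print(accepts_anbncn("aabbcc"))   # True
print(accepts_anbncn("aabbbcc"))  # False
```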

BUT, BUT!

A purely mathematical analysis of language does not go very far. We must take into account the obvious intuitions shared by every native speaker/hearer.

Deep commonalities!

No statistical analysis can capture these commonalities. No classical rule of grammar can either. Phrase Structure Grammars are out.

The traps of surface structure!


They saw that he was trying to escape. They saw him trying to escape. (OK) They suspected that he was trying to escape. *They suspected him trying to escape. (BAD) One can't sit down and think about the concepts of "see" and "suspect". Don't even try to make movies or click snapshots of those kinds of actions.

The difference is syntactic, not conceptual!

"See" bands with "catch", "find", "spot", etc. "Suspect" bands with "know", "denounce", "dissuade", etc. In essence: the difference lies in the different potentials that those classes of verbs have to combine with other elements in the sentence. Lexical considerations enter into syntax.

Silent parts of language!

I saw the man standing at the bar. This sentence somehow contains the sentence "The man was standing at the bar". No traditional grammar ever contemplated such phenomena. Silent (physically empty) parts of a sentence are treated exactly like their manifest counterparts.

Further silent parts of language!


John wants to win = John wants that John (he himself) wins. Everyone wants to win does NOT mean everyone wants that everyone wins. The key: an unpronounced pronoun (called PRO), the silent subject of the infinitival "to win", with different powers of referring to what precedes, plus the peculiar properties of quantifiers like "everyone".

In essence:!

The theory of syntax is about the tacit knowledge of language possessed (largely innately) by the native speaker/hearer. It's essential to dig below the surface form of expressions. Unexpressed, silent elements are a crucial part of syntax. It's not really the business of syntax to prescribe word order in the sentence; word order is the automatic consequence of much deeper principles.

The Principles & Parameters Model:


Grammar in Ten Minutes

Andy Barss, Linguistics & Cognitive Science


The Principles & Parameters Model


late 1970s to today
late 1970s to mid-1990s, then adapted into the Minimalist Program

kinds of puzzles & argumentation attracting new researchers
huge shift in conception of what the learner's task is
pursuit of the weird: irreducibly grammatical phenomena
emphasis on (a) coverage of detailed data and (b) elegance of analysis
(but a lacuna)


The pursuit of the peculiar



(1) She's arriving at noon with them.


(uninteresting)
But
Setup: Mary is running for Mayor, as is John. Various reports:
(2) John believes [she won]
(3) John believes [her to have won]
(4) John believes [he won]
(5) John believes [him to have won]









(Interesting! Unexpected!)
But it left us in an odd position: why? Why these phenomena, and not others?



Energetic, growing community &
Huge growth in what was known about

widely differing languages: English vs. Warlpiri vs. Hungarian vs. Japanese


outlining the boundaries of the hypothesis space permitted by UG

closely related languages, e.g., English vs. Italian


brought into focus the quanta of grammar: the minimal shifts between grammars permitted by UG


The Boundaries and the Quanta of Grammar











The Principles & Parameters Model



Parameter: property of language pre-specified by UG, but
with at least TWO options available.

Real-world example: appetizer choices on a fixed menu.

Linguistic Example #1: pro-drop effects, aka null-subject languages

(1) ✓ John said that he bought a new bicycle

(2) * John said that bought a new bicycle

(3) ✓ Gianni ha detto che ha comprato una nuova bicicletta
    ("Gianni said that (he) bought a new bicycle")




The Principles & Parameters Model



General shape of system driven by three factors:

(a) experience

(with the filtration problem: how does the learner know what is the evidence/stimulus?)
(b) initial state (UG)
(c) general laws, including: properties of organisms; complex systems; physics

The lacuna was in (c), largely ignored by practicing linguists.


Massive Shift in the Learner's Problem
Earlier: child as little linguist, finding data, formulating grammars, etc.
Presently: pre-determined system, with a few limited parameters left open, triggered by simple data.


The Principles & Parameters Model



The pro-drop parameter: the target language
a) allows, or
b) does not allow,
null subject positions.








The Principles & Parameters Model



The pro-drop parameter: the target language
a) allows, or
b) does not allow,
null subject positions.

Learner's evidence: Italian


The Principles & Parameters Model



The pro-drop parameter: the target language
a) allows, or
b) does not allow,
null subject positions.

Learner's evidence (English): ??????????????


The Principles & Parameters Model



The problem:

Every relevant sentence in English is grammatical in Italian.
But not conversely.


(Diagram: the English sentences nested inside the Italian ones.)


The Principles & Parameters Model



So, a child can switch from [-prodrop] (e.g., English) to [+prodrop] (e.g., Italian)




(Diagram: the English sentences nested inside the Italian ones.)


The Principles & Parameters Model



But a child exposed to English, temporarily guessing "Italian!", would be driven into a deadly trap: no further evidence would ever show that she was wrong.


Escape would thus require negative evidence
AND
a learner capable of making use of that negative evidence








The Principles & Parameters Model



The line of argument just advanced leads to these results:

a) for parameter settings that create an abstract subset-superset relationship between languages, the narrower/more conservative parameter setting must be pre-specified in UG.

Constraint on learner:

a) no free guessing.
b) make conservative changes. All switching is forced.



input sentence



compatible? stay put



incompatible? change setting to next available option






The Principles & Parameters Model



The pro-drop parameter is really mentally represented as this:

Target language does not allow null subject positions (initial setting)
Target language does allow null subject positions (marked, derived setting)






The Principles & Parameters Model



These results lead to the switchbox model:

UG includes a (finite) set of parameters. Each parameter P comes with an unmarked initial setting. This remains fixed unless the learner encounters incompatible data.

Initial state (including unmarked settings)
input sentence
compatible? stay put
incompatible? change setting to next available option
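The switchbox procedure can be sketched as a small learner. Everything here is a toy built for illustration: the parameter table, the `generates` oracle, and the one-parameter pro-drop grammar are all invented. The only point is the control flow: start at the unmarked settings, stay put on compatible input, and make a forced, minimal change otherwise.

```python
def learn(parameters, generates, corpus):
    """Switchbox learner sketch.

    `parameters` maps each parameter name to its ordered option list
    (unmarked setting first); `generates(settings, sentence)` is an
    assumed oracle saying whether the current grammar licenses the
    sentence. Settings change only when forced by incompatible data."""
    settings = {p: options[0] for p, options in parameters.items()}
    for sentence in corpus:
        if generates(settings, sentence):
            continue                       # compatible? stay put
        for p, options in parameters.items():
            i = options.index(settings[p])
            if i + 1 < len(options):       # incompatible? try next option
                trial = dict(settings, **{p: options[i + 1]})
                if generates(trial, sentence):
                    settings = trial
                    break
    return settings

# Toy grammar: [-prodrop] licenses only overt subjects; [+prodrop] also null ones.
def generates(settings, sentence):
    if sentence.startswith("(null)"):
        return settings["prodrop"] == "+"
    return True

params = {"prodrop": ["-", "+"]}           # unmarked [-prodrop] listed first
print(learn(params, generates, ["John left", "(null) left"]))  # {'prodrop': '+'}
print(learn(params, generates, ["John left", "Mary slept"]))   # {'prodrop': '-'}
```

Note how the subset principle falls out of the option ordering: the learner never guesses the superset language [+prodrop] without hearing a null-subject sentence, so it never needs negative evidence to retreat.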


The Principles & Parameters Model



Left unresolved:

(a) reduction of the learner's problem by an increase in UG content
(b) where did THAT come from?

Why these options, and not others?



From Principles & Parameters to Minimalist Program


Simin Karimi, Department of Linguistics, February 2, 2012

How are grammars evaluated?


" Grammars, like other systems, are

evaluated along several dimensions.


" Naturalness, Parsimony, Simplicity,

and Explanatory adequacy.

How are grammars evaluated?


" Explanatory adequacy (how children acquire

language in the absence of adequate data) has carried the greatest light throughout the history of the generative grammar till Minimalism. established, there was an opening for other evaluating factors: simplicity, parsimony, naturalness.

" When the architecture of P&P was

Chomsky:
Well-designed systems should have simple, sensible properties.

The P&P Model


Lexicon -> PS Rules -> DS -> Transformations -> SS
SS feeds both PF (Phonological Form) and LF (Logical Form)

MP Model
" Language: Sound and Meaning. " DS and SS can be eliminated without empirical loss. " " "
"
PF Phonological Form
"

Lexicon Numeration Narrow Syntax

LF Logical Form

Locality
" Many locality constraints in P&P theory. " Subjacency: " (1)
"

a. What did John buy e?


b. *What did you say where John bought e ?

" Head Movement Constraint: " (2) a. " b. " c.

John will be working at home.


Will John e be working at home? *be John will e working at home.

Locality
" Superiority " (3) a. "

Who e bought what? * what did who buy e ?

b.

" Superraising " (4) a. It seems that John was believed e to be rich. "

b. *John seems that it was believed e to be rich.

Minimal Link Condition (MLC)


" All those constraints are subsumed under MLC " " " " "

Target * * X Y Z

Cross-linguistic Factor: Optionality


" (5) a. "

Who did John give the book to? *John gave the book to whom?

b.

" Korean (From Jaehoon Choi, Ph.D. Candidate) " " " "

(6) a. John-i

ku-chayk-ul

nwukwu-eykey cwu-ess-ni? give-Pst-Q cwu-ess-

John-Nom that-book-Acc who-to ni? b. Nwukwu-eykey John-i who-to ku-chayk-ul

John-Nom that-book-Acc give-Pst-Q

Optionality
" Turkish (From Deniz Tat, Ph.D. Candidate) " (7) " " "

a.

John kitab- kim-e ver-di John book-acc who-dat give-pst

b.

Kim-e John kitab- ver-di who-dat John book-acc give-pst

Optionality
" Persian " (8) a. " " "

Kimea ketb-ro

be ki

dd?

Kimea book-Obj to whom gave b. Be ki Kimea ketb-ro dd

to whom Kimea book-Obj gave

How to account for the Optionality?


"

a sentences:

" (9) Who did you give the book to? " "

b sentences: focused (cleft constructions in English) (10) Who was it that you gave the book to difference between a and b sentences in Korean, Turkish and Persian, on the one hand, and English, on the other.

" Presence or absence of features are responsible for the

Feature selection
" " " "

XP . YEF Z YP ZP WP

Major properties of MP
" Universal Grammar is reduced to the Principle of Economy

(Locality), the Principle of Full Interpretation, and The Linear Correspondence Axiom (LCA) .
" Narrow Syntax (the input to the Logical Form which is itself

the input to the semantic interpretation) is universal.


" Parametric differences are outside syntax. They are

idiosyncratic properties of Lexicon.


" MP accounts for language variations in a simple, efficient

and natural fashion.
