
Phonology

[These slides are missing most examples and discussion from class …]

600.465 - Intro to NLP - J. Eisner 1

What is Phonology?

             Pronunciation   Spelling
 cat + -s    “kats”          cats
 dog + -s    “dawgz”         dogs
 rose + -s   “roziz”         roses
 kiss + -s   “kisiz” (why?)  kisses

 How do you pronounce a sequence of morphemes?
 Especially, how & why do you fix up the pronunciation at the seams between morphemes?

(Phonology doesn’t care about the spelling; that’s just applied morphology.)
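The kats/dawgz/roziz pattern above can be sketched as a small rule: the plural suffix surfaces as “iz” after a sibilant, “s” after a voiceless sound, and “z” otherwise. This is a toy illustration with a made-up one-letter-per-sound encoding, not a real phoneme inventory.

```python
# Toy sketch of the English plural rule from the slide.
# The sound classes are crude stand-ins: uppercase letters here
# abbreviate sounds (T ~ "th"), and many phonemes are omitted.

SIBILANTS = set("szSZ")   # s, z, sh, zh (and affricates, crudely)
VOICELESS = set("ptkfT")  # voiceless non-sibilant consonants

def plural(stem_pron: str) -> str:
    """Attach the plural suffix to a rough pronunciation string."""
    last = stem_pron[-1]
    if last in SIBILANTS:
        return stem_pron + "iz"   # rose -> roziz, kiss -> kisiz
    if last in VOICELESS:
        return stem_pron + "s"    # cat -> kats
    return stem_pron + "z"        # dog -> dawgz

for stem in ["kat", "dawg", "roz", "kis"]:
    print(stem, "->", plural(stem))  # kats, dawgz, roziz, kisiz
```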

What is Phonology?

             Pronunciation   Spelling
 nap + -t    “næpt”          napped
 nab + -t    “næbd”          nabbed
 nod + -t    “nadid”         nodded
 knot + -t   “natid”         knotted

Actually, nodded and knotted are pronounced identically: na∂Id, thanks to the English “flapping” rule (similarly: ladder/latter, bedding/betting).
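The flapping rule can be sketched as a context-sensitive rewrite: /t/ or /d/ becomes a flap between a stressed and an unstressed vowel. The encoding below is an invented toy (uppercase vowels = stressed, lowercase = unstressed, ɾ = flap), just to show why nodded and knotted merge.

```python
import re

def flap(pron: str) -> str:
    """Rewrite t/d as a flap between a stressed and an unstressed vowel.

    Toy encoding: uppercase vowels are stressed, lowercase unstressed.
    """
    return re.sub(r"(?<=[AEIOU])[td](?=[aeiou])", "ɾ", pron)

print(flap("nAdid"))  # nodded  -> nAɾid
print(flap("nAtid"))  # knotted -> nAɾid  (identical!)
```

Because the rule neutralizes the t/d contrast in this environment, the mapping is not invertible: from nAɾid alone you cannot tell which underlying form you started with.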

What is Phonology?

“Trisyllabic Shortening” in English:
 divine → divinity
 serene → serenity
 futile → futility
 supreme → supremity
 senile → senility
 obscene → obscenity
 satire → satirical
 obese → *obesity
 decide → decision
 wild → wilderness
(and similarly for other vowels)

What is Phonology?  A function twixt head and lip Morphology underlying Phonological surface Articulation (head) phonemes mapping phones (mouth) resign ree-ZIYN resign + -ation reh-zihg-NAY-shun  What class of functions is allowed?  Differs from one language to next  Often complicated.465 .Intro to NLP 5 . but not arbitrary  Comp Sci: How to compute. invert. learn? 600.

Successive Fixups for Phonology

 Chomsky & Halle (1968)
 Stepwise refinement of a single form
 How to handle the “resignation” example?

  input (I) → Rule 1 → Rule 2 → Rule 3 → output (O)

 That is, O = f(I) = g3(g2(g1(I)))
 Function composition (e.g., transducer composition)
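The composition O = g3(g2(g1(I))) can be sketched directly: each rule is a string-rewriting function, and the whole phonology is their composition. The rules below are invented placeholders (only the g-deletion echoes the resign example).

```python
from functools import reduce

def compose(*rules):
    """Compose rules left-to-right: compose(g1, g2, g3)(I) == g3(g2(g1(I)))."""
    return lambda form: reduce(lambda f, g: g(f), rules, form)

# Placeholder rules, not real English phonology:
g1 = lambda s: s.replace("g", "")  # toy "g-deletion" (resign -> resin)
g2 = lambda s: s.upper()           # toy rule 2
g3 = lambda s: s + "!"             # toy rule 3

phonology = compose(g1, g2, g3)
print(phonology("resign"))  # -> "RESIN!"
```

With finite-state transducers the same idea applies, but the rules can be composed offline into a single FST before any input arrives.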

How to Give Orders   (example courtesy of K. Crosswhite)

 Directions version: successive fixup (derivation)
  Remove this tab first.
  Break two eggs into a medium mixing bowl.
  On the last day of each month, come to this office and pay your rent.

 Rules version: successive winnowing (optimization)
  No running in the house is allowed.
  All dogs must be on a leash.
  Rent must be paid by the first day of each month.

 In the rules version, describe what a good solution would look like (plus a search procedure for finding the best solution). Where else have we seen this?

Optimality Theory for Phonology

 Prince & Smolensky (1993)
 Alternative to successive fixups
 Successive winnowing of candidate set

  input → Gen → Constraint 1 → Constraint 2 → Constraint 3 → output
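Successive winnowing can be sketched as a loop: Gen proposes candidates, then each ranked constraint keeps only the candidates with the fewest violations, leaving any remaining ties for lower-ranked constraints. The candidates and constraints below are invented toys (rough stand-ins for the familiar ONSET and NOCODA constraints).

```python
def winnow(candidates, constraints):
    """Keep, at each ranked constraint, only the least-violating candidates."""
    for constraint in constraints:  # highest-ranked first
        best = min(constraint(c) for c in candidates)
        candidates = [c for c in candidates if constraint(c) == best]
    return candidates

candidates = ["ta", "tat", "at", "a"]   # toy output of Gen
constraints = [
    lambda c: 0 if c[0] not in "aeiou" else 1,   # toy ONSET: start with a consonant
    lambda c: 1 if c[-1] not in "aeiou" else 0,  # toy NOCODA: don't end in a consonant
]
print(winnow(candidates, constraints))  # -> ['ta']
```

ONSET eliminates "at" and "a"; NOCODA then breaks the remaining tie in favor of "ta".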

Optimality Theory “Tableau”

✶✶ = candidate violates constraint twice (weight 2)

[Tableau: candidates A–F, each marked with violation stars under Constraints 1–4.]

The constraint shown would prefer A, but is only allowed to break the tie among B, D, E.

Optimality Theory for Phonology

  input (I) → Gen → Constraint 1 → Constraint 2 → Constraint 3 → output (O)

[Diagram: Gen proposes candidates; each constraint in turn keeps only the candidates with the fewest violations (lowest weight) and passes them on.]

When do we prune back to best paths?

 Optimality Theory: At each intermediate stage
 Noisy channel: After adding up all weights
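The two pruning regimes can give different answers. A minimal sketch on a two-stage toy example, with invented costs: path x is cheapest at stage 1 but expensive overall, while path y loses stage 1 yet has the best total.

```python
# path -> (stage-1 cost, stage-2 cost); numbers are invented
paths = {"x": (0, 5), "y": (1, 1)}

# OT-style: prune to the best paths after EACH stage.
survivors = ["x", "y"]
for stage in range(2):
    best = min(paths[p][stage] for p in survivors)
    survivors = [p for p in survivors if paths[p][stage] == best]
print("prune each stage:", survivors)  # -> ['x']  (y is eliminated at stage 1)

# Noisy-channel style: sum all weights first, then take the best total.
best_total = min(paths, key=lambda p: sum(paths[p]))
print("prune at the end:", best_total)  # -> 'y'  (total 2 beats total 5)
```

Greedy per-stage pruning commits early and can discard the globally best path, which is exactly why the two architectures behave differently.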

Why does order matter?

 Optimality Theory: Each machine (FSA) can choose only among outputs that previous machines liked best
 Noisy channel: Each machine (FST) alters the output produced by previous machines

Final Remark on OT

 Repeated best-paths only works for a single input
 Better to build a full FST for I → O (invertible)
 Can do this, e.g., if every constraint is binary: assigns each candidate either 1 star (“bad”) or 0 stars (“good”)

  input (I) → Gen → Constraint 1 → Constraint 2 → Constraint 3 → output (O)

Optimality Theory “Tableau”

[Tableau: candidates A–F, each marked with violation stars under Constraints 1–4.]

All surviving candidates violate constraint 3, so we can’t eliminate any.