Theory of Computation Class Notes

(Based on the books by Sudkamp [1] and by Hopcroft, Motwani and Ullman [2].)


Contents

1 Introduction
  1.1 Sets
  1.2 Functions and Relations
  1.3 Countable and uncountable sets
  1.4 Proof Techniques

2 Languages and Grammars
  2.1 Languages
  2.2 Regular Expressions
  2.3 Grammars
  2.4 Classification of Grammars and Languages
  2.5 Normal Forms of Context-Free Grammars
    2.5.1 Chomsky Normal Form (CNF)
    2.5.2 Greibach Normal Form (GNF)

3 Finite State Automata


List of Figures

2.1 Derivation tree


Chapter 1

Introduction

1.1 Sets

A set is a collection of elements. To indicate that x is an element of the set S, we write x ∈ S. The statement that x is not in S is written as x ∉ S.

A set is specified by enclosing some description of its elements in curly braces; for example, the set of all natural numbers 0, 1, 2, ... is denoted by

N = {0, 1, 2, ...}

We use ellipses (i.e., ...) when the meaning is clear; thus Jn = {1, 2, ..., n} represents the set of all natural numbers from 1 to n. When the need arises, we use more explicit notation, in which we write

S = {i | i ≥ 0, i is even}

for the last example. We read this as "S is the set of all i, such that i is greater than or equal to zero, and i is even."

The usual set operations are union (∪), intersection (∩), and difference (−), defined as

S1 ∪ S2 = {x | x ∈ S1 ∨ x ∈ S2}
S1 ∩ S2 = {x | x ∈ S1 ∧ x ∈ S2}
S1 − S2 = {x | x ∈ S1 ∧ x ∉ S2}

Considering a "universal set" U, the complement of S, written S̄, is defined as

S̄ = {x | x ∈ U ∧ x ∉ S}

The set with no elements, called the empty set, is denoted by ∅. It is obvious that

S ∪ ∅ = S − ∅ = S
S ∩ ∅ = ∅
∅̄ = U

and that the complement of S̄ is S itself.

A set S1 is said to be a subset of S if every element of S1 is also an element of S. We write this as S1 ⊆ S. If S1 ⊆ S, but S contains an element not in S1, we say that S1 is a proper subset of S; we write this as S1 ⊂ S.
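The set operations above can be tried out directly. The following short Python sketch is an illustration added to these notes (not part of the original text); the small finite universal set U is an arbitrary choice made only for the demonstration.

    # Illustration of the basic set operations; U is an arbitrary finite
    # universal set chosen just for this demonstration.
    U = set(range(10))          # "universal set" for complements
    S1 = {0, 2, 4, 6, 8}        # even numbers below 10
    S2 = {0, 1, 2, 3, 4}

    print(S1 | S2)              # union:        {0, 1, 2, 3, 4, 6, 8}
    print(S1 & S2)              # intersection: {0, 2, 4}
    print(S1 - S2)              # difference:   {6, 8}
    print(U - S1)               # complement of S1 with respect to U
    print({0, 2} <= S1)         # subset test: True
    print({0, 2} < S1)          # proper subset test: True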

2. {2. otherwise it is infinite.1 If S is the set {1. then |2S | = 2|S| Proof: (By induction on the number of elements in S). 2. then the sets are said to be disjoint. . 3}. S1 ∪ S2 = S1 ∩ S2 .2 The following identities are known as the de Morgan’s laws. A set is said to be finite if it contains a finite number of elements. {1. Example 1. yk+1 } = Sk ∪ {yk+1 } . Denote Sk+1 = {y1 . {1. This is an instance of a general result. . 2}. S1 ∪ S2 = S1 ∩ S2 . The size of a finite set is the number of elements in it. 3}} Here |S| = 3 and |2S | = 8. S} ⇒ |2S | = 21 = 2 Induction Hypothesis: Assume the property holds for all sets S with k elements. 3}. Observe that 2S is a set of sets. A set may have many subsets. The set of all subsets of a set S is called the power set of S and is denoted by 2S or P (S). S1 ∩ S2 = S1 ∪ S2 . y2 . x ∈ S1 ∪ S2 ⇔ x ∈ U and x ∈ S1 ∪ S2 / CHAPTER 1. INTRODUCTION ⇔ x ∈ U and ¬(x ∈ S1 or x ∈ S2 ) (def. Basis: |S| = 1 ⇒ 2S = {∅. then its power set is 2S = {∅. . this is denoted by |S| (or #S). 3}. {2}.union) (negation of disjunction) ⇔ x ∈ S 1 ∩ S2 ⇔ (x ∈ S1 and x ∈ S2 ) ⇔ (x ∈ U and x ∈ S1 ) and (x ∈ U and x ∈ S2 ) / / ⇔ x ∈ U and (¬(x ∈ S1 ) and ¬(x ∈ S2 )) ⇔ x ∈ U and (x ∈ S1 and x ∈ S2 ) / / (def.complement) (def. 1.1. {3}. if S is finite. S1 ∩ S2 = ∅. Induction Step: Show that the property holds for (all sets with) k + 1 elements. {1}. 2. that is. {1. .intersection) If S1 and S2 have no common element. 1.

A set which has as its elements ordered sequences of elements from other sets is called the Cartesian product of the other sets. For the Cartesian product of two sets, which itself is a set of ordered pairs, we write

S = S1 × S2 = {(x, y) | x ∈ S1, y ∈ S2}

Example 1.1.2 Let S1 = {1, 2} and S2 = {1, 2, 3}. Then

S1 × S2 = {(1, 1), (1, 2), (1, 3), (2, 1), (2, 2), (2, 3)}

Note that the order in which the elements of a pair are written matters: for example, (2, 3) ∈ S1 × S2, but the pair (3, 2) is not in S1 × S2.

The notation is extended in an obvious fashion to the Cartesian product of more than two sets; generally

S1 × S2 × ··· × Sn = {(x1, x2, ..., xn) | xi ∈ Si}

Example 1.1.3 If A is the set of throws of a coin, A = {head, tail}, then

A × A = {(head, head), (head, tail), (tail, head), (tail, tail)},

the set of all possible throws of two coins.

1.2 Functions and Relations

A function is a rule that assigns to elements of one set (the function domain) a unique element of another set (the range). We write

f : S1 → S2

to indicate that the domain of the function f is a subset of S1 and that the range of f is a subset of S2. If the domain of f is all of S1, we say that f is a total function on S1; otherwise f is said to be a partial function on S1. More precisely, for a function f from S1 to S2:

1. The domain of f is Df = {x ∈ S1 | (x, y) ∈ f, for some y ∈ S2}.
2. The range of f is Rf = {y ∈ S2 | (x, y) ∈ f, for some x ∈ S1}.

3. f : S1 → S1 is called a function on S1.
4. The restriction of f to A ⊆ S1 is f|A = {(x, y) ∈ f | x ∈ A}.
5. The inverse f⁻¹ : S2 → S1 is {(y, x) | (x, y) ∈ f}.
6. If x ∈ Df then f is defined at x; otherwise f is undefined at x.
7. f is a total function if Df = S1.
8. f is a partial function if Df ⊆ S1.
9. If Rf ⊆ S2 then f is a function from S1 (Df) into S2.
10. f is an onto function or surjection if Rf = S2.
11. f is a one to one function or injection if (f(x) = z and f(y) = z) ⇒ x = y.

A total function f is a bijection if it is both an injection and a surjection.

A function can be represented by a set of pairs

{(x1, y1), (x2, y2), ...},

where each xi is an element in the domain of the function, and yi is the corresponding value in its range. For such a set to define a function, each xi can occur at most once as the first element of a pair. If this is not satisfied, such a set is called a relation.

A specific kind of relation is an equivalence relation. A relation r on X is an equivalence relation if it satisfies three rules:

the reflexivity rule: (x, x) ∈ r, ∀x ∈ X;
the symmetry rule: if (x, y) ∈ r then (y, x) ∈ r, ∀x, y ∈ X;
the transitivity rule: if (x, y) ∈ r and (y, z) ∈ r then (x, z) ∈ r, ∀x, y, z ∈ X.

An equivalence relation on X induces a partition of X into disjoint subsets called equivalence classes Xj, with ∪j Xj = X, such that any two elements from the same class are in the relation, and any two elements taken from different classes are not in the relation.

Example 1.2.1 The relation congruence mod m (modulo m) on the set of the integers Z: i ≡ j (mod m) if i − j is divisible by m. Z is partitioned into m equivalence classes:

{..., −2m, −m, 0, m, 2m, ...}
{..., −2m + 1, −m + 1, 1, m + 1, 2m + 1, ...}
{..., −2m + 2, −m + 2, 2, m + 2, 2m + 2, ...}
...
{..., −m − 1, −1, m − 1, 2m − 1, 3m − 1, ...}
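For a concrete m, the classes can be listed by grouping integers according to i mod m. The sketch below is an illustration added to these notes; the window [−10, 10] and the choice m = 3 are arbitrary.

    from collections import defaultdict

    def classes_mod(m, lo=-10, hi=10):
        """Group the integers in [lo, hi] into equivalence classes modulo m."""
        classes = defaultdict(list)
        for i in range(lo, hi + 1):
            classes[i % m].append(i)   # i % m equals j % m  iff  m divides i - j
        return dict(classes)

    for rep, members in sorted(classes_mod(3).items()):
        print(rep, members)
    # 0 [-9, -6, -3, 0, 3, 6, 9]
    # 1 [-8, -5, -2, 1, 4, 7, 10]
    # 2 [-10, -7, -4, -1, 2, 5, 8]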

1.3 Countable and uncountable sets

Cardinality is a measure that compares the size of sets. The cardinality of a finite set is the number of elements in it; it can thus be obtained by counting the elements of the set. We denote the cardinality of X by #X or |X|.

Two sets X and Y have the same cardinality if there is a total one to one function from X onto Y (i.e., a bijection from X to Y). The cardinality of a set X is less than or equal to the cardinality of a set Y if there is a total one to one function from X into Y.

A set is infinite if it has a proper subset of the same cardinality. It is this property that differentiates finite and infinite sets: clearly, there is no one to one mapping of a finite set onto a proper subset of itself.

A set that has the same cardinality as the set of natural numbers N is said to be countably infinite or denumerable. Sets that are either finite or denumerable are referred to as countable sets. Sets that are not countable are said to be uncountable. The elements of a countably infinite set can be indexed (or enumerated) using N as the index set; the index mapping yields an enumeration of the countably infinite set.

• The cardinality of denumerable sets is #N = ℵ0 ("aleph zero").
• The cardinality of the set of the real numbers is #R = ℵ1 ("aleph one").

Example 1.3.1 The set J = N − {0} is countably infinite: the function s(n) = n + 1 defines a one to one mapping from N onto J. The set J, obtained by removing an element from N, has the same cardinality as N.

Example 1.3.2 The set of odd natural numbers is denumerable. The function f(n) = 2n + 1 establishes the bijection between N and the set of the odd natural numbers.

The one to one correspondence between the natural numbers and the set of all integers exhibits the countability of the set of integers. Such a correspondence is given, for example, by

f(n) = (n + 1)/2 if n is odd,   f(n) = −n/2 if n is even.

Example 1.3.3 #Q+ = #J = #N, where Q+ is the set of the rational numbers p/q > 0, with p and q integers, q ≠ 0.
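The bijections in these examples are easy to tabulate. The sketch below is an illustration added to these notes; it lists the first values of s(n) = n + 1, of f(n) = 2n + 1, and of the natural-number-to-integer correspondence given above.

    def s(n):          # N -> J = N - {0}
        return n + 1

    def odd(n):        # N -> odd natural numbers
        return 2 * n + 1

    def nat_to_int(n): # N -> Z, one standard choice of correspondence
        return (n + 1) // 2 if n % 2 == 1 else -(n // 2)

    for n in range(7):
        print(n, s(n), odd(n), nat_to_int(n))
    # n:          0  1   2  3   4  5   6
    # s:          1  2   3  4   5  6   7
    # odd:        1  3   5  7   9 11  13
    # nat_to_int: 0  1  -1  2  -2  3  -3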

1.4 Proof Techniques

We will give examples of proof by induction, proof by contradiction, and proof by Cantor diagonalization.

In proof by induction, we have a sequence of statements P1, P2, ..., about which we want to make some claim. Suppose that we know that the claim holds for all statements P1, P2, ..., up to Pn. We then try to argue that this implies that the claim also holds for Pn+1. If we can carry out this inductive step for all positive n, and if we have some starting point for the induction, we can say that the claim holds for all statements in the sequence. The starting point for an induction is called the basis, the assumption that the claim holds for statements P1, P2, ..., Pn is the induction hypothesis, and the argument connecting the induction hypothesis to Pn+1 is the induction step. Inductive arguments become clearer if we explicitly show these three parts.

Example 1.4.1 Let us prove

∑_{i=0}^{n} i^2 = n(n + 1)(2n + 1)/6

by mathematical induction.

(a) We establish the basis by substituting 0 for n and observing that both sides are 0.

(b) For the induction hypothesis, we assume that the property holds for n = k, i.e.,

∑_{i=0}^{k} i^2 = k(k + 1)(2k + 1)/6

(c) In the induction step, we show that the property then holds for n = k + 1, i.e.,

∑_{i=0}^{k+1} i^2 = (k + 1)(k + 2)(2k + 3)/6

Since

∑_{i=0}^{k+1} i^2 = ∑_{i=0}^{k} i^2 + (k + 1)^2,

and in view of the induction hypothesis, we need only show that

k(k + 1)(2k + 1)/6 + (k + 1)^2 = (k + 1)(k + 2)(2k + 3)/6

The latter equality follows from simple algebraic manipulation.
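Before (or after) proving such an identity, it is often useful to sanity-check it numerically. The sketch below is an illustration added to these notes; of course, checking finitely many cases is not a substitute for the induction proof.

    def lhs(n):
        """Sum of i^2 for i = 0..n."""
        return sum(i * i for i in range(n + 1))

    def rhs(n):
        return n * (n + 1) * (2 * n + 1) // 6

    assert all(lhs(n) == rhs(n) for n in range(100))   # finitely many checks only
    print(lhs(10), rhs(10))   # 385 385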

In a proof by contradiction, we assume the opposite (the contrary) of the property to be proved; we then show that this assumption is invalid.

Example 1.4.2 Show that √2 is not a rational number.

As in all proofs by contradiction, we assume the contrary of what we want to show. Here we assume that √2 is a rational number, so that it can be written as

√2 = n/m,

where n and m are integers without a common factor. Rearranging, we have

2m^2 = n^2

Therefore n^2 must be even. This implies that n is even, so that we can write n = 2k, or

2m^2 = 4k^2,

and

m^2 = 2k^2.

Therefore m^2 is even, and so m is even. But this contradicts our assumption that n and m have no common factor. Thus, n and m as in √2 = n/m cannot exist, and √2 is not a rational number.

This example exhibits the essence of a proof by contradiction: by making a certain assumption we are led to a contradiction of the assumption or of some known fact. If all steps in our argument are logically sound, we must conclude that our initial assumption was false.

To illustrate Cantor's diagonalization method, we prove that the set

A = {f | f a total function, f : N → N}

is uncountable. This is essentially a proof by contradiction, so we assume that A is countable, i.e., that we can give an enumeration f0, f1, f2, ... of A. To come to a contradiction, we construct a new function f as

f(x) = fx(x) + 1,   x ∈ N.

The function f is constructed from the diagonal of the function values of the fi ∈ A, as represented in the table below:

           0        1        2      ···
    f0   f0(0)    f0(1)    f0(2)    ···
    f1   f1(0)    f1(1)    f1(2)    ···
    f2   f2(0)    f2(1)    f2(2)    ···
    f3   f3(0)    f3(1)    f3(2)    ···

For each x, f differs from fx on input x; hence f does not appear in the given enumeration. However, f is total and f : N → N, and such an f can be given for any chosen enumeration. This leads to a contradiction: A cannot be enumerated, hence A is uncountable.

Remarks: The set of all infinite sequences of 0's and 1's is uncountable. With each infinite sequence of 0's and 1's we can associate a real number in the range [0, 1); as a consequence, the set of real numbers in the range [0, 1) is uncountable. Note that the set of all real numbers is also uncountable.
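The diagonal construction can be imitated on any finite table of functions. The following sketch is an illustration added to these notes, with an arbitrarily chosen finite list standing in for the enumeration; it shows that the constructed f disagrees with every listed fi at argument i.

    # A finite "enumeration" of total functions N -> N, used only to
    # illustrate the diagonal argument (any list of functions would do).
    fs = [lambda x: 0,
          lambda x: x,
          lambda x: 2 * x + 1,
          lambda x: x * x]

    def diag(x):
        """f(x) = f_x(x) + 1, defined here only for x < len(fs)."""
        return fs[x](x) + 1

    for i, fi in enumerate(fs):
        assert diag(i) != fi(i)                  # f differs from f_i on input i
    print([diag(i) for i in range(len(fs))])     # [1, 2, 6, 10]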


Chapter 2

Languages and Grammars

2.1 Languages

We start with a finite, nonempty set Σ of symbols, called the alphabet. From the individual symbols we construct strings (over Σ or on Σ), which are finite sequences of symbols from the alphabet. The empty string ε is a string with no symbols at all. Any set of strings over/on Σ is a language over/on Σ.

Example 2.1.1 Σ = {c}

L1 = {cc}
L2 = {c, cc, ccc}
L3 = {w | w = c^k, k = 0, 1, 2, ...} = {ε, c, cc, ccc, ...}

Example 2.1.2 Σ = {a, b}

L1 = {ε, aa, ab, ba, bb}
L2 = {w | w = (ab)^k, k = 0, 1, 2, ...} = {ε, ab, abab, ababab, ...}

The concatenation of two strings w and v, denoted by wv, is the string obtained by appending the symbols of v to the right end of w; that is, if

w = a1 a2 ... an   and   v = b1 b2 ... bm,

then the concatenation of w and v is

wv = a1 a2 ... an b1 b2 ... bm

If w is a string, then w^n is the string obtained by concatenating w with itself n times. As a special case, we define w^0 = ε.
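In a programming language, strings over an alphabet behave exactly as described: concatenation appends one string to another, and w^n is repeated concatenation with w^0 = ε. A small illustrative sketch (added to these notes) in Python:

    w = "ab"
    v = "ba"
    print(w + v)        # concatenation wv: "abba"
    print(w * 3)        # w^3 = "ababab"
    print(w * 0 == "")  # w^0 is the empty string: True

    # L2 of Example 2.1.2, truncated to k <= 4:
    L2 = {"ab" * k for k in range(5)}
    print(sorted(L2, key=len))  # ['', 'ab', 'abab', 'ababab', 'abababab']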

The length of a string w, denoted by |w|, is the number of symbols in the string. The reverse of a string is obtained by writing the symbols in reverse order; if w is a string as shown above, then its reverse w^R is

w^R = an ... a2 a1

If w = uv, then u is said to be a prefix and v a suffix of w.

Note that εw = wε = w for all w, and |ε| = 0.

If u and v are strings, then the length of their concatenation is the sum of the individual lengths,

|uv| = |u| + |v|

Let us show that |uv| = |u| + |v|. To prove this by induction on the length of strings, let us define the length of a string recursively, by

|a| = 1
|wa| = |w| + 1

for all a ∈ Σ and w any string on Σ. This definition is a formal statement of our intuitive understanding of the length of a string: the length of a single symbol is one, and the length of any string is incremented by one if we add another symbol to it.

Basis: |uv| = |u| + |v| holds for all u of any length and all v of length 1 (by definition).

Induction Hypothesis: We assume that |uv| = |u| + |v| holds for all u of any length and all v of length n.

Induction Step: Take any v of length n + 1 and write it as v = wa. Then |v| = |w| + 1, and |uv| = |uwa| = |uw| + 1. By the induction hypothesis (which is applicable since w is of length n), |uw| = |u| + |w|, so that

|uv| = |u| + |w| + 1 = |u| + |v|,

which completes the induction step.

A string w in a language L is also called a word or a sentence of L.

If Σ is an alphabet, then we use Σ* to denote the set of strings obtained by concatenating zero or more symbols from Σ. We denote Σ+ = Σ* − {ε}. The sets Σ* and Σ+ are always infinite. A language can thus be defined as a subset of Σ*.

Example 2.1.3 Σ = {a, b}. Then

Σ* = {ε, a, b, aa, ab, ba, bb, aaa, aab, ...}
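The recursive definition of length translates directly into code. The sketch below is an illustration added to these notes; it implements |·| exactly as defined and checks |uv| = |u| + |v| on a sample string.

    def length(w):
        """|w| defined recursively: |epsilon| = 0, |wa| = |w| + 1."""
        if w == "":
            return 0
        return length(w[:-1]) + 1   # strip the last symbol a from w·a

    def reverse(w):
        return w[::-1]

    u, v = "aab", "ba"
    assert length(u + v) == length(u) + length(v)   # |uv| = |u| + |v|
    print(length(u + v), reverse(u + v))            # 5 abbaa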

The set

L = {a^n b^n | n ≥ 0}

is also a language on Σ. The strings aabb and aaaabbbb are words in the language L, but the string abb is not in L. This language is infinite. The set {a, aa, aab} is a language on Σ as well; because it has a finite number of words, we call it a finite language.

Since languages are sets, the union, intersection, and difference of two languages are immediately defined. The complement of a language is defined with respect to Σ*; that is, the complement of L is

L̄ = Σ* − L

The concatenation of two languages L1 and L2 is the set of all strings obtained by concatenating any element of L1 with any element of L2,

L1 L2 = {xy | x ∈ L1 and y ∈ L2}

We define L^n as L concatenated with itself n times, with the special case L^0 = {ε} for every language L.

Example 2.1.4

L1 = {a, aa, aaa}
L2 = {b, bb, bbb}
L1 L2 = {ab, abb, abbb, aab, aabb, aabbb, aaab, aaabb, aaabbb}

Example 2.1.5 For L = {a^n b^n | n ≥ 0},

L L = L^2 = {a^n b^n a^m b^m | n ≥ 0, m ≥ 0}

The string aabbaaabbb is in L^2.

The star-closure or Kleene closure of a language is defined as

L* = L^0 ∪ L^1 ∪ L^2 ∪ ··· = ∪_{i=0}^{∞} L^i

and the positive closure as

L+ = L^1 ∪ L^2 ∪ ··· = ∪_{i=1}^{∞} L^i
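Concatenation and (truncated) star-closure of finite languages can be computed directly. The following sketch is an illustration added to these notes; it reproduces Example 2.1.4 and approximates L* by allowing at most k factors.

    def concat(L1, L2):
        """Language concatenation L1 L2."""
        return {x + y for x in L1 for y in L2}

    def power(L, n):
        """L^n, with L^0 = {epsilon}."""
        result = {""}
        for _ in range(n):
            result = concat(result, L)
        return result

    def star_up_to(L, k):
        """The words of L* obtainable from at most k factors."""
        return set().union(*(power(L, i) for i in range(k + 1)))

    L1 = {"a", "aa", "aaa"}
    L2 = {"b", "bb", "bbb"}
    print(len(concat(L1, L2)))            # 9, as in Example 2.1.4
    print(sorted(star_up_to({"ab"}, 3)))  # ['', 'ab', 'abab', 'ababab']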

2.2 Regular Expressions

Definition 2.2.1 Let Σ be a given alphabet. Then

1. ∅, {ε}, and {a} ∀a ∈ Σ are regular sets. They are called primitive regular sets.
2. If X and Y are regular sets, so are X*, X ∪ Y and XY.
3. A set is a regular set if it is a primitive regular set or can be derived from the primitive regular sets by applying a finite number of the operations ∪, * and concatenation.

Definition 2.2.2 Let Σ be a given alphabet. Then

1. ∅, ε (representing {ε}), and a (representing {a}) ∀a ∈ Σ are regular expressions. They are called primitive regular expressions.
2. If r and r1 are regular expressions, so are (r), (r + r1), (r r1) and (r*).
3. A string is a regular expression if it is a primitive regular expression or can be derived from the primitive regular expressions by applying a finite number of the operations +, * and concatenation.

A regular expression denotes a regular set. Regarding the notation of regular expressions: ε is used to represent {ε} and a is used to represent {a}; texts will usually print them boldface, but in general we assume that it will be understood that, in the context of regular expressions, ε and a stand for these sets.

Example 2.2.1 b*(ab*ab*) is a regular expression.

Example 2.2.2 (c + da*bb)* = {c, dbb, dabb, daabb, ...}* = {ε, c, dbb, dabb, daabb, cc, cdbb, cdabb, dbbdbb, dabbdabb, ...}

Beyond the usual properties of + and concatenation, important equivalences involving regular expressions concern properties of the closure (Kleene star) operation. Some are given below, where α, β, γ stand for arbitrary regular expressions:

1. (α*)* = α*
2. αα* + ε = α*
3. αα* = α*α
4. α(β + γ) = αβ + αγ
5. α(βα)* = (αβ)*α
6. (α + β)* = (α*β*)*
7. (α + β)* = (α* + β*)*
8. (α + β)* = α*(βα*)*

In general, however, the distributive law does not hold for the closure operation: the statement

α* + β* = (α + β)*

is false, because the left hand side denotes no string in which both α and β appear, whereas the right hand side does.
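Programming languages use essentially the same operators for regular expressions (| for +, * for the star). The sketch below, added to these notes for illustration, uses Python's re module to test membership in the set denoted by b*(ab*ab*), i.e. the strings over {a, b} that contain exactly two a's.

    import re

    # b*(ab*ab*): strings over {a, b} with exactly two a's
    pattern = re.compile(r"b*(ab*ab*)")

    for w in ["aa", "babba", "abab", "bbb", "aaa"]:
        print(w, bool(pattern.fullmatch(w)))
    # aa True, babba True, abab True, bbb False, aaa False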

2.3 Grammars

Definition 2.3.1 A grammar G is defined as a quadruple

G = (V, Σ, S, P)

where

V is a finite set of symbols called variables or nonterminals,
Σ is a finite set of symbols called terminal symbols or terminals,
S ∈ V is a special symbol called the start symbol,
P is a finite set of productions or rules or production rules.

We assume V and Σ are non-empty and disjoint sets.

Production rules specify the transformation of one string into another. They are of the form

x → y

where x ∈ (V ∪ Σ)+ − Σ+ (that is, x is a nonempty string containing at least one variable) and y ∈ (V ∪ Σ)*.

Given a string w of the form

w = uxv

we say that the production x → y is applicable to this string, and we may use it to replace x with y, thereby obtaining a new string

z = uyv.

We write this as w ⇒ z, and we say that w derives z or that z is derived from w. A production can be used whenever it is applicable, and it can be applied as often as desired. Successive strings are derived by applying the productions of the grammar in arbitrary order. If

w1 ⇒ w2 ⇒ w3 ⇒ ··· ⇒ w

we say that w1 derives w, and write w1 ⇒* w. The * indicates that an unspecified number of steps (including zero) can be taken to derive w from w1; thus w ⇒* w is always the case. If we want to indicate that at least one production must be applied, we write w ⇒+ v.

Let G = (V, Σ, S, P) be a grammar. Then the set

L(G) = {w ∈ Σ* | S ⇒* w}

is the language generated by G. If w ∈ L(G), then the sequence

S ⇒ w1 ⇒ w2 ⇒ ··· ⇒ w

is a derivation of the sentence (or word) w. The strings S, w1, w2, ..., are called sentential forms of the derivation.
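The rewriting step w = uxv ⇒ uyv is purely mechanical and easy to simulate. The following sketch is an illustration added to these notes; it computes every string reachable from w by a single application of some production, using the grammar S → aSb, S → ε of the example that follows.

    def one_step(w, productions):
        """All strings derivable from w by one application of a production x -> y."""
        results = set()
        for x, y in productions:
            i = w.find(x)
            while i != -1:                                # every occurrence of x in w
                results.add(w[:i] + y + w[i + len(x):])   # u x v  =>  u y v
                i = w.find(x, i + 1)
            # left-hand sides are assumed non-empty, as required of productions
        return results

    P = [("S", "aSb"), ("S", "")]    # S -> aSb | epsilon
    print(one_step("S", P))          # {'aSb', ''}
    print(one_step("aSb", P))        # {'aaSbb', 'ab'}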

Example 2.3.1 Consider the grammar

G = ({S}, {a, b}, S, P)

with P given by

S → aSb
S → ε

Then

S ⇒ aSb ⇒ aaSbb ⇒ aabb,

so we can write S ⇒* aabb. The string aabb is a sentence in the language generated by G.

Example 2.3.2

P:
<sentence> → <Noun phrase><Verb phrase>
<Noun phrase> → <Determiner><Noun phrase> | <Adjective><Noun> | <Article><Noun>
<Verb phrase> → <Verb><Noun phrase>
<Determiner> → This
<Adjective> → Old
<Noun> → Man | Bus
<Verb> → Missed
<Article> → The

Example 2.3.3

<expression> → <variable> | <expression><operation><expression>
<variable> → A | B | C | ··· | Z
<operation> → + | − | * | /

Leftmost Derivation:

<expression> ⇒ <expression><operation><expression>
⇒ <variable><operation><expression>
⇒ A<operation><expression>
⇒ A + <expression>
⇒ A + <expression><operation><expression>
⇒ A + <variable><operation><expression>
⇒ A + B<operation><expression>
⇒ A + B * <expression>
⇒ A + B * <variable>
⇒ A + B * C

[Figure 2.1: Derivation tree]

This is a leftmost derivation of the string A + B * C in the grammar above (corresponding to A + (B * C)); its derivation tree is shown in Figure 2.1. Note that another leftmost derivation can be given for the above expression. A grammar G (such as the one above) is called ambiguous if some string in L(G) has more than one leftmost derivation. An unambiguous grammar for the language is the following:

<expr> → <multi-expr> | <multi-expr><add-op><expr>
<multi-expr> → <variable> | <variable><multi-op><variable>
<multi-op> → * | /
<add-op> → + | −
<variable> → A | B | C | ··· | Z

Note that, for an inherently ambiguous language L, every grammar that generates L is ambiguous.

Example 2.3.4

G: S → ε | aSb | bSa | SS
L = {w | na(w) = nb(w)}

Show that L(G) = L.

1. L(G) ⊆ L. (All strings derived by G are in L.) For w ∈ L(G), every production of G adds the same number of a's as b's, so na(w) = nb(w) and therefore w ∈ L.

2. L ⊆ L(G). Let w ∈ L. We show that w ∈ L(G) by induction on the length of w.

Basis: ε is in both L and L(G). The only two strings of length 2 in L are ab and ba, and both are in L(G):

S ⇒ aSb ⇒ ab
S ⇒ bSa ⇒ ba

Induction Hypothesis: For all w ∈ L with 2 ≤ |w| ≤ 2i, we assume that w ∈ L(G).

Induction Step: Let w1 ∈ L with |w1| = 2i + 2.

(a) If w1 is of the form w1 = awb (or bwa), where |w| = 2i, then w ∈ L (removing one a and one b keeps the counts equal), so w ∈ L(G) by the induction hypothesis. We derive w1 = awb using the rule S → aSb (and w1 = bwa using the rule S → bSa).

(b) If w1 = awa or w1 = bwb, let us assign a count of +1 to a and −1 to b, scanned from left to right. For w1 ∈ L the total count is 0. We now show that the running count reaches 0 at least once strictly inside w1. Consider w1 = a···a (the case w1 = b···b is similar): after the first symbol the count is +1, and just before the last symbol it is −1 (since the total is 0 and the last symbol contributes +1); as the count changes by ±1 at each symbol, it must pass through 0 somewhere in between. Splitting w1 at such a point gives

w1 = w′w″,   with w′ ∈ L and w″ ∈ L.

We also have |w′| ≥ 2 and |w″| ≥ 2, so that |w′| ≤ 2i and |w″| ≤ 2i; hence w′, w″ ∈ L(G) by the induction hypothesis, and w1 = w′w″ can be derived in G from w′ and w″ using the rule S → SS.
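The inclusion L(G) ⊆ L can also be checked empirically: every string produced by expanding S with these rules has equal numbers of a's and b's. The following sketch is an illustration added to these notes; the depth limit and the random choices are arbitrary.

    import random

    def generate(max_depth=8):
        """Randomly expand S using S -> epsilon | aSb | bSa | SS (depth-limited)."""
        def expand(depth):
            if depth >= max_depth:
                return ""                       # force termination: epsilon
            choice = random.choice(["", "aSb", "bSa", "SS"])
            return "".join(expand(depth + 1) if c == "S" else c for c in choice)
        return expand(0)

    random.seed(0)
    for _ in range(5):
        w = generate()
        assert w.count("a") == w.count("b")     # every generated word is in L
        print(repr(w))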

Example 2.3.5

L(G) = {a^(2^n) | n ≥ 0}

G = (V, T, S, P) where

V = {S, A, D, [, ]}
T = {a}
P: S → [A]
   A → a
   DA → AAD
   [ → [D | ε
   D] → ]
   ] → ε

For example, let us derive a^4:

S ⇒ [A] ⇒ [DA] ⇒ [AAD] ⇒ [AA] ⇒ [DAA] ⇒ [AADA] ⇒ [AAAAD] ⇒ [AAAA] ⇒ AAAA] ⇒ AAAA ⇒ aAAA ⇒ aaAA ⇒ aaaA ⇒ aaaa = a^4

Example 2.3.6

L(G) = {w ∈ {a, b, c}* | na(w) = nb(w) = nc(w)}

V = {A, B, C, S}
T = {a, b, c}
P: S → ε | ABCS
   AB → BA    BA → AB
   AC → CA    CA → AC
   BC → CB    CB → BC
   A → a
   B → b
   C → c

Derive ccbaba.

Solution (one possible derivation; each step applies a single rule):

S ⇒ ABCS ⇒ ABCABCS ⇒ ABCABC ⇒ ACBABC ⇒ CABABC ⇒ CABACB ⇒ CABCAB ⇒ CACBAB ⇒ CCABAB ⇒ CCBAAB ⇒ CCBABA ⇒ cCBABA ⇒ ccBABA ⇒ ccbABA ⇒ ccbaBA ⇒ ccbabA ⇒ ccbaba

Example 2.3.7

G: S → ε | aSb

L(G) = {ε, ab, aabb, aaabbb, ...}
L = {a^i b^i | i ≥ 0}

To prove that L = L(G), we show

1. L ⊆ L(G)
2. L(G) ⊆ L

1. L ⊆ L(G): Let w ∈ L, i.e., w = a^k b^k for some k ≥ 0. Applying S → aSb (k times) gives S ⇒* a^k S b^k; applying S → ε then gives S ⇒* a^k b^k = w.

2. L(G) ⊆ L: We need to show that, if w can be derived in G, then w ∈ L. We first show, by induction on the length of the sentential form, that every sentential form (before S → ε is applied) is of the form a^i S b^i.

Basis: (i = 1) aSb is a sentential form, since S ⇒ aSb.

Induction Hypothesis: The sentential form of length 2i + 1 is of the form a^i S b^i.

Induction Step: A sentential form of length 2(i + 1) + 1 = 2i + 3 is derived as a^i S b^i ⇒ a^i (aSb) b^i = a^{i+1} S b^{i+1}.

Hence S ⇒* a^i S b^i ⇒ a^i b^i represents all possible derivations: to get a sentence we must apply the production S → ε, so G derives only strings of the form a^i b^i (i ≥ 0). In particular, ε is in the language.

2.4 Classification of Grammars and Languages

A classification of grammars (and of the corresponding classes of languages) is given with respect to the form of the grammar rules x → y, into the Type 0, Type 1, Type 2 and Type 3 classes.

Type 0 Unrestricted grammars do not put restrictions on the production rules.

Type 1 If all the grammar rules x → y satisfy |x| ≤ |y|, then the grammar is context sensitive or Type 1. Such a grammar G generates a language L(G) which is called a context-sensitive language. Note that x has to be of length at least 1, and thereby y too; hence it is not possible to derive the empty string in such a grammar.

Type 2 If all production rules are of the form x → y with |x| = 1 (i.e., the left hand side of each rule is of length 1, a single variable), then the grammar is said to be context-free or Type 2.

Type 3 If the production rules are of the forms

A → xB
A → x

where x ∈ Σ* (a string of terminals or the empty string) and A, B ∈ V (variables), then the grammar is called right linear. Similarly, for a left linear grammar, the production rules are of the form

A → Bx
A → x

For a regular grammar, the production rules are of the form

A → aB
A → a
A → ε

with a ∈ Σ and A, B ∈ V. A language which can be generated by a regular grammar will (later) be shown to be regular. Note that a language can be derived by a regular grammar iff it can be derived by a right linear grammar iff it can be derived by a left linear grammar.

2.5 Normal Forms of Context-Free Grammars

2.5.1 Chomsky Normal Form (CNF)

Definition 2.5.1 A context-free grammar G = (V, Σ, P, S) is in Chomsky Normal Form if each rule is of the form

i) A → BC
ii) A → a
iii) S → ε

where B, C ∈ V − {S}.

Theorem 2.5.1 Let G = (V, Σ, P, S) be a context-free grammar. There is an algorithm to construct a grammar G′ = (V′, Σ, P′, S) in Chomsky normal form that is equivalent to G, i.e., L(G′) = L(G).

Example 2.5.1 Convert the given grammar G to CNF.

G:  S → aABC | a
    A → aA | a
    B → bcB | bc
    C → cC | c

Solution: A CNF grammar G′ equivalent to G can be given as

G′: S → A′T1 | a
    A′ → a
    T1 → AT2
    T2 → BC
    A → A′A | a
    B → B′T3 | B′C′
    B′ → b
    T3 → C′B
    C → C′C | c
    C′ → c

Here A′, B′, C′ derive the single terminals a, b, c, and T1, T2, T3 break the longer right hand sides into pairs of variables.
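Whether a given set of productions is in Chomsky Normal Form can be checked mechanically. The sketch below is an illustration added to these notes (the primed variables of G′ are written as plain strings such as "A'"); it tests every rule against the three allowed forms.

    def is_cnf(rules, variables, terminals, start):
        """rules: set of (lhs, rhs) pairs, with rhs a tuple of symbols."""
        for lhs, rhs in rules:
            if lhs == start and rhs == ():                      # S -> epsilon
                continue
            if len(rhs) == 1 and rhs[0] in terminals:           # A -> a
                continue
            if (len(rhs) == 2
                    and all(s in variables and s != start for s in rhs)):
                continue                                        # A -> B C, with B, C != S
            return False
        return True

    V = {"S", "A", "A'", "B", "B'", "C", "C'", "T1", "T2", "T3"}
    T = {"a", "b", "c"}
    R = {("S", ("A'", "T1")), ("S", ("a",)), ("A'", ("a",)),
         ("T1", ("A", "T2")), ("T2", ("B", "C")),
         ("A", ("A'", "A")), ("A", ("a",)),
         ("B", ("B'", "T3")), ("B", ("B'", "C'")), ("B'", ("b",)),
         ("T3", ("C'", "B")), ("C", ("C'", "C")), ("C", ("c",)), ("C'", ("c",))}
    print(is_cnf(R, V, T, "S"))   # True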

2.5.2 Greibach Normal Form (GNF)

If a grammar is in GNF, then the length of the terminal prefix of the sentential form increases at every rule application, which in particular prevents left recursion.

Definition 2.5.2 A context-free grammar G = (V, Σ, P, S) is in Greibach Normal Form if each rule is of the form

i) A → aA1A2···An
ii) A → a
iii) S → ε

with a ∈ Σ and Ai ∈ V.

Chapter 3

Finite State Automata


Bibliography

[1] Thomas A. Sudkamp, Languages and Machines: An Introduction to the Theory of Computer Science, 3rd edition, Addison-Wesley, 2005.

[2] John E. Hopcroft, Rajeev Motwani and Jeffrey D. Ullman, Introduction to Automata Theory, Languages and Computation, 2nd edition, Addison-Wesley, 2001.
