
Chapter 3.2



Turing Machine Variations

The "stay put" TM: δS : Q × Γ → Q × Γ × {L, R, S}

This machine can easily be converted to our standard TM. For each δS(qi, cj) = (qk, cl, S), add the following pair of transitions to δ:

    δ(qi, cj) = (q′k, cl, R)
    δ(q′k, c) = (qk, c, L)   for every c ∈ Γ

Note: Each variation on our basic TM does not add more power; the basic model is robust.
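The pair-of-transitions conversion above can be sketched in code. This is a minimal sketch, assuming transition tables are stored as Python dicts mapping (state, symbol) to (state, symbol, move); the dict encoding and the "prime" tag used to mint the intermediate state q′k are illustrative choices, not from the text.

```python
GAMMA = ["0", "1", "_"]  # assumed tape alphabet for illustration ("_" = blank)

def remove_stay_moves(delta_s, gamma=GAMMA):
    """Convert a stay-put transition table to a standard L/R-only table.

    Each S-move becomes an R-move into a fresh 'primed' state that
    immediately steps back left, leaving the head where it started."""
    delta = {}
    for (q, c), (r, d, move) in delta_s.items():
        if move != "S":
            delta[(q, c)] = (r, d, move)      # L/R moves copy over unchanged
        else:
            r_prime = (r, "prime")            # fresh intermediate state q'_k
            delta[(q, c)] = (r_prime, d, "R") # first transition: write, move right
            for a in gamma:                   # second: step back left on any symbol
                delta[(r_prime, a)] = (r, a, "L")
    return delta

# Example: a single stay-put transition delta_S(q0, 1) = (q1, 0, S)
print(remove_stay_moves({("q0", "1"): ("q1", "0", "S")}))
```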

Multi-tape TM: like a regular TM, but with k tapes, each with its own head for reading and writing. Initially, the input is on tape 1, and the other tapes start out blank.

    δM : Q × Γ^k → Q × Γ^k × {L, R}^k

Example: δM(qi, a1, ..., ak) = (qj, b1, ..., bk, L, R, ..., L)

Theorem 3.8: Every multi-tape TM has an equivalent single-tape TM.

Proof sketch: Convert the multi-tape TM M to an equivalent single-tape TM S. That is, show how to simulate M with S.

Example: M has three tapes, containing 0 1 0 1 0, a a a, and b a. S represents all three on one tape, using # as a delimiter and a dotted symbol to mark each virtual head position:

    # 0̇ 1 0 1 0 # ȧ a a # ḃ a # ...

    ΓM = {0, 1, a, b, ⊔}
    ΓS = {0, 1, a, b, #, ⊔, 0̇, 1̇, ȧ, ḃ, ⊔̇}

S = "On input string w = w1...wn:

1. First S puts its tape into the format that represents all k tapes of M. The formatted tape of S contains:
       # ẇ1 w2 ... wn # ⊔̇ # ⊔̇ ... #
2. To simulate a single move of M, S scans its tape from the first #, which marks the left-hand end, to the (k+1)st #, which marks the right-hand end, in order to determine the symbols under the virtual heads. Then S makes a second pass to update the tapes according to the way that M's transition function dictates.
3. If at any point S moves one of the virtual heads to the right onto a #, this action signifies that M has moved the corresponding head onto the previously unread blank portion of that tape. So S writes a blank symbol on this tape cell and shifts the tape contents, from this cell until the rightmost #, one unit to the right. Then S continues as before."
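The single-tape representation and the first scanning pass can be sketched as follows. This is an illustrative encoding, not the book's: each cell of S's tape is a one- or two-character string, and a "." prefix plays the role of the dot over a symbol marking a virtual head.

```python
def format_tapes(tapes, heads):
    """Lay out k tapes as one list of cells, separated by '#'.

    heads[i] is the head position on tape i; that cell gets a '.'
    prefix, standing in for the dotted symbol under a virtual head."""
    cells = ["#"]
    for tape, h in zip(tapes, heads):
        for j, c in enumerate(tape):
            cells.append("." + c if j == h else c)
        cells.append("#")
    return cells

def virtual_heads(cells):
    """First pass of the simulation: scan from the first # to the
    rightmost # and collect the symbol under each virtual head."""
    return [c[1] for c in cells if c.startswith(".")]

# M's three tapes from the example, every head at position 0
t = format_tapes(["01010", "aaa", "ba"], [0, 0, 0])
print(t)
print(virtual_heads(t))
```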

Figure: S's tape before and after a shift, which moves everything from the affected cell through the rightmost # one unit right to open a new blank cell (# 0 1 0 1 0 # a a a # b a # ...).

Recall Defn: A language is Turing-recognizable if some TM recognizes it.

Corollary 3.9: A language L is Turing-recognizable iff some multi-tape TM recognizes it.

Proof (⇒): If L is Turing-recognizable, then some multi-tape TM recognizes it. Assume L is Turing-recognizable. By definition, L is recognized by an ordinary, single-tape TM. This TM is just a multi-tape TM with one tape. So L is recognized by a multi-tape machine.

Proof (⇐): If some multi-tape TM recognizes L, then L is Turing-recognizable. Must show that any multi-tape TM can be simulated by a single-tape TM. Theorem 3.8 shows this.

Nondeterministic TMs

At any point in its computation, the TM may proceed in a number of ways:

    δN : Q × Γ → P(Q × Γ × {L, R})

Where there is a choice point, a new TM is spawned. The TMs form a tree. If one of the TMs succeeds (enters an accept state), then the nondeterministic machine accepts its input.

Theorem 3.10: Every nondeterministic TM has an equivalent deterministic TM.

Proof idea: Keep a tree of the nondeterministic TM's computations and search it breadth-first, adding one step at a time.

                 N
           ______|______
          /      |      \
        N1      N2       N3
       / | \     |      /  \
    N11 N12 N13 N21   N31  N32

Why not search depth-first? A depth-first search could follow one non-halting branch forever and never reach an accepting branch elsewhere. Instead, must execute each machine, adding one step at a time.

Theorem 3.10 (continued). Proof idea: For D, the deterministic version of N, use three tapes:

1. Tape 1 — the input string (e.g. 0 0 1 0 ...)
2. Tape 2 — simulation tape of a version of N run for k steps
3. Tape 3 — address tape (e.g. 1 3 ...), recording the sequence of choices taken on the current branch

Theorem 3.10 (continued). Description of D's operation:

1. Initially, tape 1 contains w and tapes 2 and 3 are empty.
2. Copy tape 1 to tape 2.
3. Use tape 2 to simulate N with input w on one branch of N's computation. Before each step on this branch, look at the next symbol on tape 3 to determine which choice to make among those that are possible. If no more symbols are on tape 3 (at a blank) or the choice is invalid, abort this computation by going to stage 4. Also go to stage 4 if a rejecting configuration is encountered. If an accepting configuration is encountered, accept the input.
4. Replace the string on tape 3 with the lexicographically next string, and go to stage 2.
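The four stages above can be sketched as a search over address strings. This is a toy sketch, not the formal construction: N's behavior is abstracted into a hypothetical ntm_step callback, b bounds the number of choices at any step, and max_len artificially bounds the search so the sketch terminates (a real D keeps trying longer addresses forever).

```python
from itertools import product

def simulate_branch(ntm_step, w, address):
    """Follow one branch of N on input w, one tape-3 choice per step.

    ntm_step(config, choice) returns 'accept', 'reject', a new
    configuration, or None when the choice is invalid there."""
    config = w                        # stage 2: fresh copy of the input
    for choice in address:            # stage 3: tape 3 dictates each move
        out = ntm_step(config, choice)
        if out in ("accept", "reject") or out is None:
            return out
        config = out
    return None                       # ran out of tape-3 symbols: abort

def deterministic_sim(ntm_step, w, b=2, max_len=10):
    """Stage 4: try address strings in lexicographic order of
    increasing length, restarting the simulation each time."""
    for n in range(1, max_len + 1):
        for address in product(range(1, b + 1), repeat=n):
            if simulate_branch(ntm_step, w, address) == "accept":
                return True           # some branch of N accepts w
    return False                      # no accepting branch found (bounded)

# Toy N: a branch accepts iff its choices spell out the bits of w.
def toy_step(config, choice):
    if not config:
        return None                   # no input left: choice is invalid
    bit, rest = config[0], config[1:]
    if str(choice - 1) != bit:
        return "reject"
    return "accept" if rest == "" else rest

print(deterministic_sim(toy_step, "101"))
```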

Corollary 3.11: A language is Turing-recognizable iff some nondeterministic TM recognizes it.

(⇒) If L is Turing-recognizable, then some nondeterministic TM recognizes it: all deterministic TMs are also nondeterministic.
(⇐) If some nondeterministic TM recognizes L, then L is Turing-recognizable: use Theorem 3.10.

Corollary 3.12: A language is decidable iff some nondeterministic TM decides it.

(⇒) If L is decidable, then some nondeterministic TM decides it: all deterministic TMs are also nondeterministic.
(⇐) If some nondeterministic TM decides L, then L is decidable: use Theorem 3.10.

Enumerators

An enumerator E is a TM with an attached printer. The language enumerated by E is the set of all strings printed to the printer. Strings can be repeated or printed in any order.

Figure: an enumerator with work tape 0 0 1 0 ... and printer control, printing the strings aa, baba, abba, ...

Theorem 3.13: A language is Turing-recognizable iff some enumerator enumerates it.

Proof (⇐): If an enumerator E enumerates a language A, then A is Turing-recognizable. We need to construct M, the TM that recognizes A. TM M works the following way:

M = "On input w:
1. Run E. Every time that E outputs a string, compare it with w.
2. If this output of E matches w, accept; else go to stage 1."

Note: We could use the TM of Example 3.5 to do the comparison of each string that E enumerates to w.

Question: Is this decidable?
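M's loop can be sketched by modeling E as a Python generator (a toy stand-in for the formal construction). Note that if w is not in A the loop runs forever, which is exactly why M recognizes A rather than decides it; the optional max_outputs cutoff is an artificial addition so the sketch can terminate, and the even-length-strings enumerator is an invented example.

```python
from itertools import product

def recognize(enumerator, w, max_outputs=None):
    """Accept iff w appears among E's outputs.

    max_outputs is a sketch-only cutoff; a real M has none and may
    loop forever when w is not in the enumerated language."""
    for i, s in enumerate(enumerator()):
        if s == w:
            return True               # stage 2: an output of E matches w
        if max_outputs is not None and i + 1 >= max_outputs:
            return False              # cutoff reached (sketch only)

def even_length_strings():
    """Example E: prints every string over {0,1} of even length."""
    n = 0
    while True:
        for bits in product("01", repeat=n):
            yield "".join(bits)
        n += 2

print(recognize(even_length_strings, "0110"))
print(recognize(even_length_strings, "011", max_outputs=100))
```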

Proof (⇒): If language A is Turing-recognizable, then some enumerator enumerates it. In this direction, we need to show that if some TM M recognizes A, then an enumerator E can be constructed to enumerate A. A must be over some alphabet Σ; that is, A ⊆ Σ∗. Let s1, s2, s3, ... be an infinite list of all the strings in Σ∗.

E = "Ignore the input. For i = 1, 2, 3, ...:
(a) Run M for i steps on each input s1, s2, ..., si.
(b) If M accepts any of these strings, print it."

Example: Run M for i steps.

    i    Strings tested
    1    s1
    2    s1, s2
    3    s1, s2, s3
    4    s1, s2, s3, s4
    5    s1, s2, s3, s4, s5
    6    s1, s2, s3, s4, s5, s6
    7    s1, s2, s3, s4, s5, s6, s7
    8    s1, s2, s3, s4, s5, s6, s7, s8
    ...

In each round the printer outputs every string that M has accepted within i steps, so acceptors such as s2 and s5 reappear in every later round. If M accepts some string, it will eventually get printed out. Why are we running M this way?
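The dovetailing schedule in the table can be sketched as follows, with a hypothetical recognizer(s, steps) callback standing in for "run M for i steps on s"; the toy M (accepts even-length strings, taking len(s) steps) and the string list are invented for illustration.

```python
def enumerate_language(recognizer, strings, rounds):
    """Round i runs M for i steps on s1..si and prints every acceptor.

    Strings get reprinted in later rounds, which is allowed: an
    enumerator may repeat strings and print them in any order."""
    printed = []
    for i in range(1, rounds + 1):
        for s in strings[:i]:
            if recognizer(s, i):      # M accepts s within i steps
                printed.append(s)     # stand-in for 'send s to printer'
    return printed

# Toy M: accepts strings of even length, taking len(s) steps to do so.
def toy_m(s, steps):
    return steps >= len(s) and len(s) % 2 == 0

strings = ["", "0", "1", "00", "01", "10"]   # s1, s2, ... over Sigma = {0,1}
print(enumerate_language(toy_m, strings, 4))
```

Running M in step-bounded rounds is what makes this work: no single non-halting computation can block the enumerator, yet every accepted string is eventually tested for long enough to be printed.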

Summary: There are many other models of general-purpose computation that have been proposed. Example: "How about a PDA with a deque?" (Peter Joachim, contributor)

All have the same features as a TM (or less), and that is:
1. Unrestricted access to a limitless memory (a FA has limited memory; a PDA has limitless memory, but restricted access)
2. All perform a finite amount of work in a given step

All machines with the above two features have the same power of computation as a TM.

What does this mean?

1. Any model of computation can be simulated by a TM and vice versa. Any algorithm that runs on any other machine can be run on a TM.
2. BIG IMPLICATION: Since all computational models can simulate each other and compute all the same algorithms, we can describe an algorithm as "being able to be run on a machine." This is a unique, precise description of the class "algorithm." A precise definition of algorithm was not worked out until Turing and Church came along.
