6.46 Recall from §2.13 that a Markov chain of order 2 can be made a chain of order 1 by suitably enlarging its state space. For an irreducible Markov chain of order 1 having transition matrix P = {p_ij, i, j ∈ S} and limiting distribution π = {π_i, i ∈ S}, let us enlarge its state space in a similar manner; that is, we want to study the vector chain {(X_{n−1}, X_n)}, n = 1, 2, ....
a. Find the transition matrix of the expanded chain in terms of the p_ij's.
b. Verify that lim_{n→∞} Pr{X_{n−1} = i, X_n = j} = π_i p_ij for all i, j ∈ S.
c. Explain using the regenerative property.
d. Illustrate with the taxicab example. What is the limiting probability that the taxi drives to city B immediately after visiting city A?

6.47 Consider an irreducible Markov chain.
a. Use first-step analysis to show that
E[(η_ij)²] = 1 + Σ_{k≠j} p_ik {2 E[η_kj] + E[(η_kj)²]}   for all i, j ∈ S,
where η_ij denotes the first passage time from state i to state j. Compare with Problem 6.46.
b. Hence, find E[(η_ij)²] for all i, j.

6.48 Consider an irreducible Markov chain.
a. Use Problem 6.47 to show that
π_j E[(η_jj)²] = 1 + 2 Σ_{i≠j} π_i E[η_ij]   for all j ∈ S.
b. Hence, find E[(η_ii)²] for all i = A, B, C in the taxicab example.

6.49 Refer to Problem 2.17, in which a sequential search method is used to find an item among three items A, B, and C.
a. Since it is now desirable to find the limiting probability for state ABC, the other states can be lumped together (§2.17).
b. Hence, find the limiting distribution for the chain {X_n}.
c. Show that, with this method of relocation, the most frequently searched item will likely be in location 1 and the least frequently searched item in location 3.

6.50 Let X_n be the state of machine A at time n, which can be either up or down. If it is up at time n, let the probability that it will be down at time n + 1 be d; if it is down at time n, let the probability that it will be up at time n + 1 be u.
a. Find the limiting distribution for {X_n}, n = 0, 1, 2, .... Hint: Use Problem 3.24.
b. Let Y_n denote the state of another machine B at time n, which can also be either up or down. It behaves independently and identically with machine A. Find the transition matrix for the vector chain {(X_n, Y_n)}, n = 0, 1, 2, .... Guess the limiting distribution and verify that it satisfies Equation (6.3.1).
c. Suppose we only want to know the number of machines that are up at time n. Can we lump the states? From this, obtain the limiting distribution of the number of machines that are up.

268   4 Markov Chains

30. Three out of every four trucks on the road are followed by a car, while only one out of every five cars is followed by a truck. What fraction of vehicles on the road are trucks?

31. A certain town never has two sunny days in a row. Each day is classified as being either sunny, cloudy (but dry), or rainy. If it is sunny one day, then it is equally likely to be either cloudy or rainy the next day; if it is rainy or cloudy one day, then there is one chance in two that it will be the same the next day, and if it changes then it is equally likely to be either of the other two possibilities. In the long run, what proportion of days are sunny? What proportion are cloudy?

*32. Each of two switches is either on or off during a day. On day n, each switch will independently be on with probability
[1 + number of on switches during day n − 1]/4
For instance, if both switches are on during day n − 1, then each will independently be on during day n with probability 3/4. What fraction of days are both switches on? What fraction are both off?

33. A professor continually gives exams to her students. She can give three possible types of exams, and her class is graded as either having done well or badly. Let p_i denote the probability that the class does well on a type i exam, and suppose that p_1 = 0.3, p_2 = 0.6, and p_3 = 0.9. If the class does well on an exam, then the next exam is equally likely to be any of the three types. If the class does badly, then the next exam is always type 1. What proportion of exams are type i, i = 1, 2, 3?
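Limiting-distribution exercises such as 30–33 above can be sanity-checked numerically. The sketch below is a minimal power-iteration check for Exercise 31, using plain Python with no external libraries; the state encoding (0 = sunny, 1 = cloudy, 2 = rainy) and the `stationary` helper are our own illustrative choices, not part of the text.

```python
# Numerical check for Exercise 31: limiting distribution of the
# sunny/cloudy/rainy chain.  States: 0 = sunny, 1 = cloudy, 2 = rainy.
P = [
    [0.0, 0.5, 0.5],    # sunny: never two sunny days in a row
    [0.25, 0.5, 0.25],  # cloudy: stays with prob 1/2, else either other state
    [0.25, 0.25, 0.5],  # rainy: same rule
]

def stationary(P, iters=500):
    """Approximate the limiting distribution by iterating pi <- pi P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)
print([round(x, 4) for x in pi])  # -> [0.2, 0.4, 0.4]
```

The same routine applies to Exercises 30, 32, and 33 once their transition matrices are written down.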
34. A flea moves around the vertices of a triangle in the following manner: whenever it is at vertex i it moves to its clockwise neighbor vertex with probability p_i, and to the counterclockwise neighbor with probability q_i = 1 − p_i, i = 1, 2, 3.
(a) Find the proportion of time that the flea is at each of the vertices.
(b) How often does the flea make a counterclockwise move which is then followed by five consecutive clockwise moves?

35. Consider a Markov chain with states 0, 1, 2, 3, 4. Suppose P_{0,4} = 1; and suppose that when the chain is in state i, i > 0, the next state is equally likely to be any of the states 0, 1, ..., i − 1. Find the limiting probabilities of this Markov chain.

36. The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability P_{i,j}, where
P_{0,0} = 0.4,  P_{0,1} = 0.6,  P_{1,0} = 0.2,  P_{1,1} = 0.8
Every day a message is sent. If the state of the Markov chain that day is i, then the message sent is "good" with probability p_i and is "bad" with probability q_i = 1 − p_i, i = 0, 1.
(a) If the process is in state 0 on Monday, what is the probability that a good message is sent on Tuesday?
(b) If the process is in state 0 on Monday, what is the probability that a good message is sent on Friday?
(c) In the long run, what proportion of messages are good?
(d) Let Y_n equal 1 if a good message is sent on day n and let it equal 2 otherwise. Is {Y_n, n ≥ 1} a Markov chain? If so, give its transition probability matrix. If not, briefly explain why not.

37. Show that the stationary probabilities for the Markov chain having transition probabilities P_{i,j} are also the stationary probabilities for the Markov chain whose transition probabilities Q_{i,j} are given by Q_{i,j} = P^k_{i,j}, for any specified positive integer k.

38. Recall that state i is said to be positive recurrent if m_i < ∞, where m_i is the expected number of transitions until the Markov chain, starting in state i, makes a transition back into that state. Because π_i, the long-run proportion of time the Markov chain, starting in state i, spends in state i, satisfies π_i = 1/m_i, it follows that state i is positive recurrent if and only if π_i > 0. Suppose that state i is positive recurrent and that state i communicates with state j. Show that state j is also positive recurrent by arguing that there is an integer n such that π_j ≥ π_i P^n_{i,j} > 0.

39. Recall that a recurrent state that is not positive recurrent is called null recurrent. Use the result of Exercise 38 to prove that null recurrence is a class property. That is, if state i is null recurrent and state i communicates with state j, show that state j is also null recurrent.

58. In the gambler's ruin problem of Section 4.5.1, suppose the gambler's fortune is presently i, and suppose that we know that the gambler's fortune will eventually reach N (before it goes to 0). Given this information, show that the probability he wins the next gamble is
p[1 − (q/p)^(i+1)] / [1 − (q/p)^i],   if p ≠ 1/2
(i + 1)/(2i),                         if p = 1/2
Hint: The probability we want is
P{X_{n+1} = i + 1 | X_n = i, lim_{m→∞} X_m = N}
  = P{X_{n+1} = i + 1, lim_m X_m = N | X_n = i} / P{lim_m X_m = N | X_n = i}.

59. For the gambler's ruin model of Section 4.5.1, let M_i denote the mean number of games that must be played until the gambler either goes broke or reaches a fortune of N, given that he starts with i, i = 0, 1, ..., N. Show that M_i satisfies
M_0 = M_N = 0;   M_i = 1 + p M_{i+1} + q M_{i−1},   i = 1, ..., N − 1.

60. Solve the equations given in Exercise 59 to obtain
M_i = i(N − i),   if p = 1/2
M_i = i/(q − p) − [N/(q − p)] · [1 − (q/p)^i]/[1 − (q/p)^N],   if p ≠ 1/2.
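Exercises 59 and 60 pair a first-step recurrence with its closed-form solution, which makes a numerical cross-check straightforward. The sketch below iterates the equations of Exercise 59 to a fixed point and compares the result against the formulas of Exercise 60; the helper names and the choice N = 10 are illustrative assumptions.

```python
# Numerical check for Exercises 59-60: mean duration of the gambler's ruin game.
def mean_durations(N, p, iters=20000):
    """Fixed-point (Jacobi) iteration on M_i = 1 + p*M_{i+1} + q*M_{i-1},
    with boundary conditions M_0 = M_N = 0."""
    q = 1.0 - p
    M = [0.0] * (N + 1)
    for _ in range(iters):
        M = [0.0] + [1.0 + p * M[i + 1] + q * M[i - 1] for i in range(1, N)] + [0.0]
    return M

def closed_form(N, p, i):
    """The closed forms claimed in Exercise 60."""
    q = 1.0 - p
    if p == 0.5:
        return i * (N - i)
    r = q / p
    return i / (q - p) - (N / (q - p)) * (1 - r ** i) / (1 - r ** N)

N = 10
for p in (0.5, 0.6):
    M = mean_durations(N, p)
    assert all(abs(M[i] - closed_form(N, p, i)) < 1e-6 for i in range(N + 1))
```

Starting the iteration from M ≡ 0 converges monotonically to the minimal nonnegative solution of the recurrence, which is the expected duration the exercise asks for.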
61. Suppose in the gambler's ruin problem that the probability of winning a bet depends on the gambler's present fortune. Specifically, suppose that α_i is the probability that the gambler wins a bet when his or her fortune is i. Given that the gambler's initial fortune is i, let P(i) denote the probability that the gambler's fortune reaches N before 0.
(a) Derive a formula that relates P(i) to P(i − 1) and P(i + 1).
(b) Using the same approach as in the gambler's ruin problem, solve the equation of part (a) for P(i).
(c) Suppose that i balls are initially in urn 1 and N − i are in urn 2, and suppose that at each stage one of the N balls is randomly chosen, taken from whichever urn it is in, and placed in the other urn. Find the probability that the first urn becomes empty before the second.

*62. In Exercise 21,
(a) what is the expected number of steps the particle takes to return to the starting position?
(b) what is the probability that all other positions are visited before the particle returns to its starting state?

63. For the Markov chain with states 1, 2, 3, 4 whose transition probability matrix P is as specified below, find f_{i3} and s_{i3} for i = 1, 2, 3.
P =
| 0.4  0.2  0.1  0.3 |
| 0.1  0.5  0.2  0.2 |
| 0.3  0.4  0.2  0.1 |
| 0    0    0    1   |

64. Consider a branching process having μ < 1. Show that if X_0 = 1, then the expected number of individuals that ever exist in this population is given by 1/(1 − μ). What if X_0 = n?

65. In a branching process having X_0 = 1 and μ > 1, prove that π_0 is the smallest positive number satisfying Equation (4.16).
Hint: Let π be any solution of π = Σ_{j=0}^∞ π^j P_j. Show by mathematical induction that π ≥ P{X_n = 0} for all n, and let n → ∞. In using the induction argue that
P{X_n = 0} = Σ_{j=0}^∞ (P{X_{n−1} = 0})^j P_j.

66. For a branching process, calculate π_0 when ...

67. At all times, an urn contains N balls, some white and some black. At each stage, a coin having probability p, 0 < p < 1, of landing heads is flipped. If heads appears, then a ball is chosen at random from the urn and is replaced by a white ball; if tails appears, then a ball is chosen from the urn and is replaced by a black ball.
Let X_n denote the number of white balls in the urn after the nth stage.
(a) Is {X_n, n ≥ 0} a Markov chain? If so, explain why.
(b) What are its classes? What are their periods? Are they transient or recurrent?
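The hint of Exercise 65 is itself an algorithm: iterating the offspring probability generating function starting from 0 produces P{X_n = 0}, which increases to the smallest positive root π_0. The sketch below carries this out for an invented offspring distribution with μ > 1 (our own example, not one of the cases in the text).

```python
# Numerical illustration of Exercise 65: P{X_n = 0} climbs to the smallest
# positive root of pi = sum_j pi**j * P_j.  The offspring distribution is an
# arbitrary example: P_0 = 1/4, P_1 = 1/4, P_2 = 1/2 (mean mu = 1.25 > 1).
def extinction_prob(P, iters=200):
    """Iterate pi <- phi(pi) from 0, where phi is the offspring pgf;
    after n steps pi equals P{X_n = 0}."""
    pi = 0.0
    for _ in range(iters):
        pi = sum(p * pi ** j for j, p in enumerate(P))
    return pi

P = [0.25, 0.25, 0.5]
pi0 = extinction_prob(P)
print(round(pi0, 6))  # -> 0.5
```

Here φ(s) = 1/4 + s/4 + s²/2, so the equation s = φ(s) has roots 1/2 and 1; the iteration picks out the smaller one, exactly as Exercise 65 asserts.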
