RANDOM WALKS AND MARTINGALES

Example 7.1(c). We will present a proof when the $X_i$ are 0–1 (that is, Bernoulli) random variables.

THEOREM 7.1.6 (De Finetti's Theorem). To every infinite sequence of exchangeable random variables $X_1, X_2, \ldots$ taking values either 0 or 1, there corresponds a probability distribution $G$ on $[0, 1]$ such that, for all $0 \le k \le n$,

$$P\{X_1 = \cdots = X_k = 1,\; X_{k+1} = \cdots = X_n = 0\} = \int_0^1 \alpha^k (1 - \alpha)^{n-k}\, dG(\alpha). \tag{7.1.6}$$

7.2 MARTINGALES

DEFINITION. The stochastic process $\{Z_n, n \ge 1\}$ is said to be a martingale process if $E[|Z_n|] < \infty$ for all $n$ and

$$E[Z_{n+1} \mid Z_1, \ldots, Z_n] = Z_n. \tag{7.2.1}$$

A martingale is a generalized version of a fair game. For if we interpret $Z_n$ as a gambler's fortune after the $n$th gamble, then (7.2.1) states that his expected fortune after the $(n+1)$st gamble is equal to his fortune after the $n$th gamble, no matter what may have previously occurred.

Taking expectations of (7.2.1) gives

$$E[Z_{n+1}] = E[Z_n], \quad \text{and so} \quad E[Z_n] = E[Z_1] \quad \text{for all } n.$$

Some Examples of Martingales.

(1) Let $X_1, X_2, \ldots$ be independent random variables with mean 0, and let $Z_n = \sum_{i=1}^n X_i$. Then $\{Z_n, n \ge 1\}$ is a martingale, since

$$E[Z_{n+1} \mid Z_1, \ldots, Z_n] = E[Z_n + X_{n+1} \mid Z_1, \ldots, Z_n] = Z_n + E[X_{n+1}] = Z_n.$$

(2) If $X_1, X_2, \ldots$ are independent random variables with $E[X_n] = 1$, then $\{Z_n, n \ge 1\}$ is a martingale when $Z_n = \prod_{i=1}^n X_i$. This follows since

$$E[Z_{n+1} \mid Z_1, \ldots, Z_n] = E[Z_n X_{n+1} \mid Z_1, \ldots, Z_n] = Z_n E[X_{n+1}] = Z_n.$$

(3) Consider a branching process (see Section 4.5 of Chapter 4) and let $X_n$ denote the size of the $n$th generation. If $m$ is the mean number of offspring per individual, then $\{Z_n, n \ge 1\}$ is a martingale when $Z_n = X_n/m^n$. We leave the verification as an exercise.

DEFINITION. The positive integer-valued, possibly infinite, random variable $N$ is said to be a random time for the process $\{Z_n, n \ge 1\}$ if the event $\{N = n\}$ is determined by the random variables $Z_1, \ldots, Z_n$. That is, knowing $Z_1, \ldots, Z_n$ tells us whether or not $N = n$. If $P\{N < \infty\} = 1$, then the random time $N$ is said to be a stopping time.
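The fair-game property of the first two examples is easy to check numerically. The sketch below (my own illustration, not part of the text) simulates the additive and multiplicative martingales and confirms that the sample mean of $Z_{10}$ stays at $E[Z_1]$; the step distributions ($\pm 1$ with equal probability, and $\{1/2, 3/2\}$ with equal probability) are arbitrary choices with the required means 0 and 1.

```python
import random

random.seed(1)
TRIALS, N = 100_000, 10

# Example (1): Z_n = X_1 + ... + X_n with E[X] = 0 (here X = +-1 equally likely).
sum_mean = sum(
    sum(random.choice((-1, 1)) for _ in range(N))
    for _ in range(TRIALS)
) / TRIALS

# Example (2): Z_n = X_1 * ... * X_n with E[X] = 1 (here X = 1/2 or 3/2 equally likely).
def prod_walk():
    z = 1.0
    for _ in range(N):
        z *= random.choice((0.5, 1.5))
    return z

prod_mean = sum(prod_walk() for _ in range(TRIALS)) / TRIALS

print(round(sum_mean, 3), round(prod_mean, 3))
# E[Z_n] = E[Z_1]: 0 for the sum, 1 for the product.
assert abs(sum_mean) < 0.05
assert abs(prod_mean - 1.0) < 0.05
```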
Let $N$ be a random time for the process $\{Z_n, n \ge 1\}$ and let

$$\bar{Z}_n = \begin{cases} Z_n & \text{if } n \le N \\ Z_N & \text{if } n > N. \end{cases}$$

$\{\bar{Z}_n, n \ge 1\}$ is called the stopped process.

PROPOSITION 7.2.1. If $N$ is a random time for the martingale $\{Z_n\}$, then the stopped process $\{\bar{Z}_n\}$ is also a martingale.

Proof. Let

$$I_n = \begin{cases} 1 & \text{if } N \ge n \\ 0 & \text{if } N < n. \end{cases}$$

That is, $I_n$ equals 1 if we haven't yet stopped after observing $Z_1, \ldots, Z_{n-1}$. We claim that

$$\bar{Z}_n = \bar{Z}_{n-1} + I_n (Z_n - Z_{n-1}). \tag{7.2.2}$$

To verify (7.2.2), consider two cases:

(i) $N \ge n$: In this case, $\bar{Z}_n = Z_n$, $\bar{Z}_{n-1} = Z_{n-1}$, and $I_n = 1$, and (7.2.2) follows.

(ii) $N < n$: In this case, $\bar{Z}_n = \bar{Z}_{n-1} = Z_N$, $I_n = 0$, and (7.2.2) follows.

Now

$$E[\bar{Z}_n \mid Z_1, \ldots, Z_{n-1}] = E[\bar{Z}_{n-1} + I_n (Z_n - Z_{n-1}) \mid Z_1, \ldots, Z_{n-1}] = \bar{Z}_{n-1} + I_n E[Z_n - Z_{n-1} \mid Z_1, \ldots, Z_{n-1}] = \bar{Z}_{n-1}, \tag{7.2.3}$$

where the next-to-last equality follows since both $\bar{Z}_{n-1}$ and $I_n$ are determined by $Z_1, \ldots, Z_{n-1}$, and the last since $\{Z_n\}$ is a martingale.

We must prove that $E[\bar{Z}_n \mid \bar{Z}_1, \ldots, \bar{Z}_{n-1}] = \bar{Z}_{n-1}$. However, (7.2.3) implies this result since, if we know the values of $Z_1, \ldots, Z_{n-1}$, then we also know the values of $\bar{Z}_1, \ldots, \bar{Z}_{n-1}$. More formally, we have

$$E[\bar{Z}_n \mid \bar{Z}_1, \ldots, \bar{Z}_{n-1}] = E\big[E[\bar{Z}_n \mid Z_1, \ldots, Z_{n-1}] \mid \bar{Z}_1, \ldots, \bar{Z}_{n-1}\big] = E[\bar{Z}_{n-1} \mid \bar{Z}_1, \ldots, \bar{Z}_{n-1}] \quad \text{(by (7.2.3))} = \bar{Z}_{n-1}.$$

Since the stopped process is also a martingale, and since $\bar{Z}_1 = Z_1$, we have

$$E[\bar{Z}_n] = E[Z_1] \quad \text{for all } n. \tag{7.2.4}$$

Now let us suppose that the random time $N$ is a stopping time, that is, $P\{N < \infty\} = 1$. Since

$$\bar{Z}_n = \begin{cases} Z_n & \text{if } n \le N \\ Z_N & \text{if } n > N, \end{cases}$$

it follows that $\bar{Z}_n$ equals $Z_N$ when $n$ is sufficiently large. Hence, $\bar{Z}_n \to Z_N$ as $n \to \infty$, with probability 1. Is it also true that

$$E[\bar{Z}_n] \to E[Z_N] \quad \text{as } n \to \infty? \tag{7.2.5}$$

Since $E[\bar{Z}_n] = E[Z_1]$ for all $n$, (7.2.5) states that

$$E[Z_N] = E[Z_1].$$

It turns out that, subject to some regularity conditions, (7.2.5) is indeed valid. We state the following theorem without proof.

THEOREM 7.2.2.
If either

(i) $\bar{Z}_n$ are uniformly bounded, or
(ii) $N$ is bounded, or
(iii) $E[N] < \infty$, and there is an $M < \infty$ such that $E[|Z_{n+1} - Z_n| \mid Z_1, \ldots, Z_n] < M$ for all $n$,

then $E[Z_N] = E[Z_1]$.

COROLLARY 7.2.3 (Wald's Equation). If $X_i$, $i \ge 1$, are independent and identically distributed with $E[|X|] < \infty$, and if $N$ is a stopping time for $X_1, X_2, \ldots$ with $E[N] < \infty$, then

$$E\left[\sum_{i=1}^N X_i\right] = E[N]\,E[X].$$

Proof. Let $\mu = E[X]$. Since $Z_n = \sum_{i=1}^n (X_i - \mu)$ is a martingale, it follows, if Theorem 7.2.2 is applicable, that

$$E[Z_N] = E\left[\sum_{i=1}^N (X_i - \mu)\right] = 0,$$

and hence, since $\sum_{i=1}^N (X_i - \mu) = \sum_{i=1}^N X_i - N\mu$,

$$E\left[\sum_{i=1}^N X_i\right] = E[N]\mu.$$

To show that Theorem 7.2.2 is applicable, we verify condition (iii). Now $Z_{n+1} - Z_n = X_{n+1} - \mu$, and thus

$$E[|Z_{n+1} - Z_n| \mid Z_1, \ldots, Z_n] = E[|X_{n+1} - \mu|] \le E[|X|] + |\mu|.$$

Example 7.2(A). Computing the Mean Time Until a Given Pattern Occurs. Suppose that a sequence of independent and identically distributed discrete random variables is observed sequentially, one each day. What is the expected number that must be observed until some given sequence appears? More specifically, suppose that each outcome is either 0, 1, or 2 with respective probabilities 1/2, 1/3, and 1/6, and we desire the expected time until the run 0, 2, 0 occurs. For instance, if the sequence of outcomes is 2, 1, 2, 0, 2, 1, 0, 1, 0, 0, 2, 0, then the required number $N$ would equal 12.

To compute $E[N]$, imagine a sequence of gamblers, each initially having 1 unit, playing at a fair gambling casino. Gambler $i$ begins betting at the beginning of day $i$ and bets his 1 unit that the value on that day will equal 0. If he wins (and thus has 2 units), he then bets the 2 units on the next outcome being 2, and if he wins this bet (and thus has 12 units), then all 12 units are bet on the next outcome being 0. Hence each gambler will lose 1 unit if any of his bets fail and will win 23 if all three of his bets succeed. At the beginning of each day another gambler starts playing. If we let $X_n$ denote the total winnings of the casino after the $n$th day, then, since all bets are fair, it follows that $\{X_n, n \ge 1\}$ is a martingale with mean 0. Let $N$ denote the time until the sequence 0, 2, 0 appears.
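Wald's equation is easy to test by simulation. In the sketch below (an illustration of mine, not from the text), the $X_i$ are uniform on $\{1, 2, 3\}$ and $N$ is the first index at which $X_i = 3$; then $E[X] = 2$ and $E[N] = 3$, so Wald's equation predicts $E[\sum_{i=1}^N X_i] = 6$.

```python
import random

random.seed(2)

def stopped_sum():
    # Sample X_1, X_2, ... until the stopping time N = min{i : X_i = 3}
    # and return X_1 + ... + X_N.
    s = 0
    while True:
        x = random.choice((1, 2, 3))
        s += x
        if x == 3:
            return s

TRIALS = 100_000
mean_sum = sum(stopped_sum() for _ in range(TRIALS)) / TRIALS
print(round(mean_sum, 3))
# Wald: E[N] * E[X] = 3 * 2 = 6.
assert abs(mean_sum - 6) < 0.1
```

Note that $N$ here is a legitimate stopping time: whether $N = i$ depends only on $X_1, \ldots, X_i$.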
Now at the end of day $N$, each of the gamblers $1, \ldots, N-3$ would have lost 1 unit; gambler $N-2$ would have won 23; gambler $N-1$ would have lost 1; and gambler $N$ would have won 1 (since the outcome on day $N$ is 0). Hence,

$$X_N = N - 3 - 23 + 1 - 1 = N - 26,$$

and, since $E[X_N] = 0$ (it is easy to verify condition (iii) of Theorem 7.2.2), we see that

$$E[N] = 26.$$

In the same manner as in the above, we can compute the expected time until any given pattern of outcomes appears. For instance, in coin tossing the mean time until H H T T H H occurs is $p^{-4}q^{-2} + p^{-2} + p^{-1}$, where $p = P\{H\} = 1 - q$.

Suppose now that we want to compute the probability that a given pattern, say pattern $A$, occurs earlier than a second pattern, say pattern $B$. To obtain this probability we will find it useful to first consider the additional time, after a given one of the patterns occurs, until the other one does. For instance, in the above example suppose $A = 0, 2, 0$ and $B = 1, 0, 0, 2$, and consider $N_{A|B}$, the additional number of trials needed for $A$ to occur after $B$ occurs. That is,

$$N_{A|B} = \min\{k : 0, 2, 0 \text{ is a consecutive subsequence of } 1, 0, 0, 2, X_1, X_2, \ldots, X_k\}.$$

To compute $E[N_{A|B}]$, again imagine that each day a gambler begins betting, as before, on the subsequence $A = 0, 2, 0$ appearing in the next three outcomes.
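The gambling argument generalizes into a simple recipe: the expected time until a pattern first appears is the sum, over every suffix of the pattern that is also a prefix of it (including the full pattern), of the reciprocal probability of that prefix. The following sketch of mine checks the two values above with exact rational arithmetic; the evaluation of the coin-tossing formula at $p = 1/2$ is an illustrative choice, not one made in the text.

```python
from fractions import Fraction as F

# Run 0,2,0 with P(0) = 1/2, P(2) = 1/6: the full pattern contributes
# 1/(p0*p2*p0) and the overlapping prefix/suffix "0" contributes 1/p0.
p0, p2 = F(1, 2), F(1, 6)
EN_020 = 1 / (p0 * p2 * p0) + 1 / p0
assert EN_020 == 26   # matches the martingale computation in the text

# Coin tossing, pattern H H T T H H: E[N] = p^-4 q^-2 + p^-2 + p^-1,
# since the overlapping prefixes are the full pattern, "H H", and "H".
p = F(1, 2)           # illustrative fair coin
q = 1 - p
EN_HHTTHH = p**-4 * q**-2 + p**-2 + p**-1
assert EN_HHTTHH == 70
print(EN_020, EN_HHTTHH)
```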
Now given that the initial four outcomes are 1, 0, 0, 2, it follows that gamblers 1 and 2 would have lost their 1 unit, gambler 3 would be winning 11 (since he won his first two bets), and gambler 4 would have lost his 1. Now, as before, at the time $4 + N_{A|B}$ the casino's winnings will equal

$$4 + N_{A|B} - 26 = N_{A|B} - 22.$$

Since the casino's winnings constitute a martingale, and since the casino is 8 units behind after the first four gambles, it follows that

$$E[N_{A|B} - 22] = -8,$$

and so $E[N_{A|B}] = 14$. (As a check, note that since $B = 1, 0, 0, 2$ ends with 0, 2, the first two symbols of $A = 0, 2, 0$, we must have $E[N_A] = E[N_{0,2}] + E[N_{A|B}] = 12 + 14 = 26$.)

Similarly, we can show that

$$E[N_B] = 72, \qquad E[N_{B|A}] = 72.$$

To compute $P_A$, the probability that $A$ occurs before $B$, let $M = \min(N_A, N_B)$. Then

$$E[N_A] = E[M] + E[N_A - M] = E[M] + E[N_A - M \mid B \text{ before } A](1 - P_A) = E[M] + (1 - P_A)E[N_{A|B}],$$

and, similarly,

$$E[N_B] = E[M] + P_A E[N_{B|A}].$$

Solving these equations yields

$$P_A = \frac{E[N_B] + E[N_{A|B}] - E[N_A]}{E[N_{A|B}] + E[N_{B|A}]}, \qquad E[M] = E[N_A] - (1 - P_A)E[N_{A|B}].$$

In the particular case under consideration,

$$P_A = \frac{72 + 14 - 26}{14 + 72} = \frac{60}{86} = \frac{30}{43}.$$

Example 7.2(B). Suppose we want to compute the expected time until the simple random walk, with $p > \frac{1}{2}$, hits $i$, $i > 0$. To do so, let $N$ denote this time. Then, since

$$E[X] = p - (1 - p) = 2p - 1,$$

we have, by Wald's equation,

$$i = E[S_N] = E[N](2p - 1), \qquad \text{or} \qquad E[N] = \frac{i}{2p - 1}.$$

7.3 BACK TO RANDOM WALKS

Let $S_n = \sum_{i=1}^n X_i$ denote a random walk. Our first result is to show that if the $X_i$ are finite integer-valued random variables, then $S_n$ is recurrent if $E[X] = 0$.

THEOREM 7.3.1. Suppose $X_i$ can only take on one of the values $0, \pm 1, \ldots, \pm M$ for some $M < \infty$. Then $\{S_n, n \ge 0\}$ is a recurrent Markov chain if and only if $E[X] = 0$.

Proof. It is clear that the random walk is transient when $E[X] \ne 0$, since it will either converge to $+\infty$ (if $E[X] > 0$) or to $-\infty$ (if $E[X] < 0$). So suppose $E[X] = 0$ and note that this implies that $\{S_n, n \ge 1\}$ is a martingale.

Let $A$ denote the set of states from $-M$ up to $-1$, that is, $A = \{-M, -(M-1), \ldots, -1\}$.
Suppose the process starts in state $i$, where $i > 0$. For $j > i$, let $A_j$ denote the set of states $A_j = \{j + 1, \ldots, j + M\}$, and let $N$ denote the first time that the process is in either $A$ or $A_j$. By Theorem 7.2.2,

$$E[S_N] = E[S_0] = i,$$

and so

$$i = E[S_N \mid S_N \in A]\,P\{S_N \in A\} + E[S_N \mid S_N \in A_j]\,P\{S_N \in A_j\} \ge -M\,P\{S_N \in A\} + j\big(1 - P\{S_N \in A\}\big).$$
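The optional-stopping step $E[S_N] = E[S_0] = i$ can be checked exactly in the simplest case $M = 1$, where the walk takes steps $\pm 1$ and the classical gambler's-ruin formula gives the absorption probabilities in closed form. The particular values $i = 1$ and $j = 5$ below are my own illustrative choices, not from the text.

```python
from fractions import Fraction as F

# Symmetric +-1 walk (M = 1) started at i, stopped on first hitting
# A = {a} or A_j = {b}.  Gambler's-ruin formula for a symmetric walk:
# P{hit b before a | start at i} = (i - a)/(b - a).
i, a, b = 1, -1, 6            # A = {-1}; with j = 5 and M = 1, A_j = {6}
p_b = F(i - a, b - a)         # probability of reaching b before a
ES_N = b * p_b + a * (1 - p_b)
print(p_b, ES_N)
assert ES_N == i              # E[S_N] = E[S_0] = i, as Theorem 7.2.2 guarantees
```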
