LECTURE 13  The Bernoulli process

* Readings: Section 6.1

Lecture outline
* Definition of Bernoulli process
* Random processes
* Basic properties of the Bernoulli process
* Distribution of interarrival times
* The time of the kth success
* Merging and splitting

The Bernoulli process
* A sequence of independent Bernoulli trials
* At each trial i:
  - P(success) = P(X_i = 1) = p
  - P(failure) = P(X_i = 0) = 1 - p
* Examples:
  - sequence of lottery wins/losses
  - sequence of ups and downs of the Dow Jones
  - arrivals (each second) at a bank
  - arrivals (at each time slot) to a server

Random processes
* First view: sequence of random variables X_1, X_2, ...
  - E[X_i] = p
  - var(X_i) = p(1 - p)
* Second view: what is the right sample space?
  - P(X_i = 1 for all i) = 0 (when p < 1)
* Random processes we will study:
  - Bernoulli process (memoryless, discrete time)
  - Poisson process (memoryless, continuous time)
  - Markov chains (with memory/dependence across time)

Number of successes S in n time slots
* P(S = k) = C(n, k) p^k (1 - p)^(n-k), k = 0, 1, ..., n  (binomial)
* E[S] = np
* var(S) = np(1 - p)

Interarrival times
* T_1: number of trials until the first success
  - P(T_1 = t) = (1 - p)^(t-1) p, t = 1, 2, ...  (geometric)
* Memoryless property
* E[T_1] = 1/p
* var(T_1) = (1 - p)/p^2
* If you buy a lottery ticket every day, what is the distribution of the length of the first string of losing days?

Time of the kth arrival
* Given that the first arrival was at time t, the additional time, T_2, until the next arrival:
  - has the same (geometric) distribution
  - is independent of T_1
* Y_k: number of trials to the kth success
  - E[Y_k] = k/p
  - var(Y_k) = k(1 - p)/p^2
  - P(Y_k = t) = C(t - 1, k - 1) p^k (1 - p)^(t-k), t = k, k + 1, ...  (Pascal)

Splitting of a Bernoulli process
* Route each arrival to one of two streams using independent coin flips
* [figure: original process split into two streams]
* Splitting yields (two) Bernoulli processes

Merging of independent Bernoulli processes
* [figure: two processes merged into one]
* Merging yields a Bernoulli process (collisions are counted as one arrival)

LECTURE 14  The Poisson process

* Readings: Start Section 6.2
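As a quick sanity check on the Lecture 13 formulas (E[S] = np and E[T_1] = 1/p), here is a minimal simulation sketch in Python; the function names are mine, not from the notes:

```python
import random

def bernoulli_process(p, n, rng):
    """One realization: n independent Bernoulli(p) trials (1 = arrival)."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def first_interarrival(p, rng):
    """Number of trials until the first success (geometric, mean 1/p)."""
    t = 1
    while rng.random() >= p:
        t += 1
    return t

rng = random.Random(0)
p, n, trials = 0.3, 20, 20000

# Number of successes S in n slots: E[S] = n*p
mean_S = sum(sum(bernoulli_process(p, n, rng)) for _ in range(trials)) / trials

# First interarrival time T_1: E[T_1] = 1/p
mean_T1 = sum(first_interarrival(p, rng) for _ in range(trials)) / trials

print(mean_S)   # close to n*p = 6.0
print(mean_T1)  # close to 1/p = 3.33...
```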
Lecture outline
* Review of Bernoulli process
* Definition of Poisson process
* Distribution of number of arrivals
* Distribution of interarrival times
* Other properties of the Poisson process

Bernoulli review
* Discrete time; success probability p
* Number of arrivals in n time slots: binomial pmf
* Interarrival times: geometric pmf
* Time to k arrivals: Pascal pmf
* Memorylessness

Definition of the Poisson process
* Time homogeneity: P(k, τ) = probability of k arrivals in an interval of duration τ
* Numbers of arrivals in disjoint time intervals are independent
* Small interval probabilities: for VERY small δ,
  P(k, δ) ≈ 1 - λδ, if k = 0;  λδ, if k = 1;  0, if k > 1
* λ: "arrival rate"

PMF of number of arrivals N
* Finely discretize [0, τ]: approximately Bernoulli
* N_τ (of discrete approximation): binomial
* Taking δ → 0 (or n → ∞) gives:
  P(k, τ) = (λτ)^k e^(-λτ) / k!,  k = 0, 1, ...
* E[N_τ] = λτ,  var(N_τ) = λτ

Example
* You get email according to a Poisson process at a rate of λ = 5 messages per hour. You check your email every thirty minutes.
* P(no new messages) = e^(-2.5) ≈ 0.08
* P(one new message) = 2.5 e^(-2.5) ≈ 0.20

Interarrival times
* Y_k: time of the kth arrival; Erlang distribution:
  f_{Y_k}(y) = λ^k y^(k-1) e^(-λy) / (k - 1)!,  y ≥ 0
* [figure: Erlang densities for several k. Image by MIT OpenCourseWare]
* Time of first arrival (k = 1): exponential:
  f_{Y_1}(y) = λ e^(-λy),  y ≥ 0
* Memoryless property: the time to the next arrival is independent of the past

Bernoulli/Poisson relation
* Discretized view: n = τ/δ slots, p = λδ, np = λτ

|                      | POISSON         | BERNOULLI   |
| Times of arrival     | Continuous      | Discrete    |
| Arrival rate         | λ per unit time | p per trial |
| PMF of # of arrivals | Poisson         | Binomial    |
| Time to kth arrival  | Erlang          | Pascal      |

Merging Poisson processes
* Sum of independent Poisson random variables is Poisson
* Merging of independent Poisson processes is Poisson
* [figure: red bulb flashes (Poisson), green bulb flashes (Poisson), all flashes (Poisson)]
* What is the probability that the next arrival comes from the first process?
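The closing question has a clean answer: by memorylessness, the waiting time to the next arrival of each process is exponential, so the next arrival comes from the first process with probability λ1/(λ1 + λ2). A small simulation sketch (the names are mine; `random.expovariate` draws exponential samples):

```python
import random

def prob_first_wins(lam1, lam2, trials, seed=0):
    """Estimate P(next arrival comes from process 1) after merging two
    independent Poisson processes: compare two exponential waiting times."""
    rng = random.Random(seed)
    wins = sum(rng.expovariate(lam1) < rng.expovariate(lam2)
               for _ in range(trials))
    return wins / trials

lam1, lam2 = 2.0, 3.0
est = prob_first_wins(lam1, lam2, 100000)
print(est)                   # close to the exact answer
print(lam1 / (lam1 + lam2))  # exact answer: 0.4
```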
LECTURE 15  Poisson process — II

* Readings: Finish Section 6.2

Lecture outline
* Review of Poisson process
* Merging and splitting
* Examples
* Random incidence

Review
* Defining characteristics:
  - time homogeneity: P(k, τ)
  - independence
  - small interval probabilities (small δ):
    P(k, δ) ≈ 1 - λδ, if k = 0;  λδ, if k = 1;  0, if k > 1
* N_τ is a Poisson r.v. with parameter λτ:
  P(k, τ) = (λτ)^k e^(-λτ) / k!,  E[N_τ] = var(N_τ) = λτ
* Interarrival times (k = 1): exponential:
  f_{T_1}(t) = λ e^(-λt), t ≥ 0,  E[T_1] = 1/λ
* Time Y_k to kth arrival: Erlang(k):
  f_{Y_k}(y) = λ^k y^(k-1) e^(-λy) / (k - 1)!,  y ≥ 0

Poisson fishing
* Assume: Poisson, λ = 0.6/hour. Fish for two hours; if no catch, continue until the first catch.
  a) P(fish for more than two hours) = e^(-1.2)
  b) P(fish for more than two and less than five hours) = e^(-1.2) - e^(-3)
  c) P(catch at least two fish) = 1 - e^(-1.2) - 1.2 e^(-1.2)
  d) E[number of fish] = 1.2 + e^(-1.2)
  e) E[future fishing time | fished for four hours] = 1/0.6 (memorylessness)
  f) E[total fishing time] = 2 + e^(-1.2)/0.6

Merging Poisson processes (again)
* Merging of independent Poisson processes is Poisson
* [figure: red bulb flashes (Poisson), green bulb flashes (Poisson), all flashes (Poisson)]
* What is the probability that the next arrival comes from the first process?

Light bulb example
* Each light bulb has an independent, exponential(λ) lifetime
* Install three light bulbs. Find the expected time until the last light bulb dies out.

Splitting of Poisson processes
* Assume that email traffic through a server is a Poisson process; destinations of different messages are independent
* [figure: email traffic leaving MIT, split into local and foreign streams]
* Each output stream is Poisson

Random incidence for Poisson
* Poisson process that has been running forever
* Show up at some "random time" (really means "arbitrary time")
* [figure: time axis with the chosen interarrival interval]
* What is the distribution of the length of the chosen interarrival interval?

Random incidence in "renewal processes"
* Series of successive arrivals
  - i.i.d. interarrival times (but not necessarily exponential)
* Example: bus interarrival times are equally likely to be 5 or 10 minutes
* If you arrive at a "random time":
  - what is the probability that you selected a 5-minute interarrival interval?
  - what is the expected time to the next arrival?

LECTURE 16  Markov processes — I

* Readings: Sections 7.1–7.2

Lecture outline
* Checkout counter example
* Markov process definition
* n-step transition probabilities
* Classification of states

Checkout counter model
* Discrete time n = 0, 1, ...
* Customer arrivals: Bernoulli(p) → geometric interarrival times
* Customer service times: geometric(q)
* "State" X_n: number of customers at time n

Finite state Markov chains
* X_n: state after n transitions
  - belongs to a finite set, e.g., {1, ..., m}
  - X_0 is either given or random
* Markov property/assumption: given the current state, the past does not matter:
  p_ij = P(X_{n+1} = j | X_n = i) = P(X_{n+1} = j | X_n = i, X_{n-1}, ..., X_0)
* Model specification:
  - identify the possible states
  - identify the possible transitions
  - identify the transition probabilities

n-step transition probabilities
* State occupancy probabilities, given initial state i:
  r_ij(n) = P(X_n = j | X_0 = i)
* Key recursion:
  r_ij(n) = Σ_k r_ik(n - 1) p_kj
* With random initial state:
  P(X_n = j) = Σ_i P(X_0 = i) r_ij(n)

Example
* Two-state chain with p_11 = 0.5, p_12 = 0.5, p_21 = 0.2, p_22 = 0.8
* [table: r_1j(n) and r_2j(n) for n = 0, 1, 2, 100, 101]

Generic convergence questions
* Does r_ij(n) converge to something?
  - not always: in a periodic chain, e.g., r_22(n) can equal 0 for n odd and 1 for n even
* Does the limit depend on the initial state?
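The key recursion is easy to run numerically. A minimal sketch for the two-state example (transition probabilities p_11 = 0.5, p_12 = 0.5, p_21 = 0.2, p_22 = 0.8, as read from the example), showing that r_ij(n) settles to the same limit from either initial state:

```python
def n_step(P, r0, n):
    """Iterate the key recursion r_j(n) = sum_k r_k(n-1) p_kj,
    with the occupancy probabilities r held as a row vector."""
    r = list(r0)
    m = len(P)
    for _ in range(n):
        r = [sum(r[k] * P[k][j] for k in range(m)) for j in range(m)]
    return r

P = [[0.5, 0.5],   # transitions out of state 1
     [0.2, 0.8]]   # transitions out of state 2

from_state1 = n_step(P, [1.0, 0.0], 100)  # start in state 1
from_state2 = n_step(P, [0.0, 1.0], 100)  # start in state 2
print(from_state1)  # ~ [2/7, 5/7] ≈ [0.2857, 0.7143]
print(from_state2)  # same limit: it does not depend on the initial state
```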
Recurrent and transient states
* State i is recurrent if: starting from i, and from wherever you can go, there is a way of returning to i
* If not recurrent, i is called transient
  - i transient: P(X_n = i) → 0, and i is visited a finite number of times
* Recurrent class: collection of recurrent states that "communicate" with each other and with no other state

LECTURE 17  Markov processes — II

* Readings: Section 7.3

Lecture outline
* Review
* Steady-state behavior
  - steady-state convergence theorem
  - balance equations
* Birth-death processes

Review
* Discrete state, discrete time, time-homogeneous
  - transition probabilities p_ij
  - Markov property
* r_ij(n) = P(X_n = j | X_0 = i)
* Key recursion: r_ij(n) = Σ_k r_ik(n - 1) p_kj

Warmup
* P(X_1 = 2, X_2 = 6, X_3 = 7 | X_0 = 1) =
* P(X_4 = 7 | X_0 = 2) =

Recurrent and transient states
* State i is recurrent if, starting from i, and from wherever you can go, there is a way of returning to i
* If not recurrent, called transient
* Recurrent class: collection of recurrent states that communicate with each other and with no other state

Periodic states
* The states in a recurrent class are periodic if they can be grouped into d > 1 groups so that all transitions from one group lead to the next group

Steady-state probabilities
* Do the r_ij(n) converge to some π_j (independent of the initial state i)?
* Yes, if:
  - the recurrent states are all in a single class, and
  - the single recurrent class is not periodic
* Assuming "yes," start from the key recursion
  r_ij(n) = Σ_k r_ik(n - 1) p_kj
  and take the limit as n → ∞:
  π_j = Σ_k π_k p_kj

Visit frequency interpretation
* (Long run) frequency of being in j: π_j
* Frequency of transitions k → j: π_k p_kj
* Frequency of transitions into j: Σ_k π_k p_kj

Birth-death processes
* [figure: chain on states 0, 1, ..., m with up-probabilities p_i and down-probabilities q_i]
* Local balance: π_i p_i = π_{i+1} q_{i+1}
* Special case: p_i = p and q_i = q for all i; ρ = p/q = load factor
  - π_{i+1} = π_i ρ, so π_i = π_0 ρ^i, i = 0, 1, ..., m
* Assume p < q and m ≈ ∞ (in steady state)

LECTURE 18  Markov processes — III

Lecture outline
* Review of steady-state behavior
* Probability of blocked phone calls
* Calculating absorption probabilities
* Calculating expected time to absorption

Review
* Assume a single
class of recurrent states, aperiodic, plus transient states. Then
  lim_{n→∞} r_ij(n) = π_j,
where π_j does not depend on the initial conditions:
  lim_{n→∞} P(X_n = j | X_0 = i) = π_j
* π_1, ..., π_m can be found as the unique solution to the balance equations
  π_j = Σ_k π_k p_kj,
  together with Σ_j π_j = 1

Example
* [figure: two-state chain; π_1 = 2/7, π_2 = 5/7]
* Assume the process starts at state 1.
* P(X_1 = 1, and X_{100} = 1) =
* P(X_{100} = 1 and X_{101} = 2) =

The phone company problem
* Calls originate as a Poisson process, rate λ
  - each call duration is exponentially distributed (parameter μ)
  - B lines available
* Discrete-time intervals of (small) length δ
* [figure: birth-death chain for the number of busy lines]

Calculating absorption probabilities
* What is the probability a_i that the process eventually settles in state 4, given that the initial state is i?
  - for i = 4: a_4 = 1
  - for i = 5: a_5 = 0
  - a_i = Σ_j p_ij a_j, for all other i
* Unique solution

Expected time to absorption
* Find the expected number of transitions μ_i until reaching the absorbing state, given that the initial state is i
  - μ_i = 0 for absorbing i
  - for all other i: μ_i = 1 + Σ_j p_ij μ_j
* Unique solution

Mean first passage and recurrence times
* Chain with one recurrent class; fix s recurrent
* Mean first passage time from i to s:
  t_i = E[min{n ≥ 0 such that X_n = s} | X_0 = i]
* t_1, ..., t_m are the unique solution to:
  t_s = 0,
  t_i = 1 + Σ_j p_ij t_j, for all i ≠ s
* Mean recurrence time of s:
  t*_s = E[min{n ≥ 1 such that X_n = s} | X_0 = s]
  t*_s = 1 + Σ_j p_sj t_j
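The absorption equations a_i = Σ_j p_ij a_j and μ_i = 1 + Σ_j p_ij μ_j are small linear systems; one simple way to solve them is fixed-point iteration. Below is a sketch on a hypothetical five-state chain (0-indexed: states 0-2 transient, states 3 and 4 absorbing); the transition matrix is made up for illustration only:

```python
def absorption_probs(P, target, absorbing, iters=2000):
    """Solve a_i = sum_j p_ij a_j, with a_target = 1 and a_s = 0 for the
    other absorbing states, by fixed-point iteration."""
    m = len(P)
    a = [1.0 if i == target else 0.0 for i in range(m)]
    for _ in range(iters):
        a = [a[i] if i in absorbing
             else sum(P[i][j] * a[j] for j in range(m))
             for i in range(m)]
    return a

def expected_absorption_time(P, absorbing, iters=2000):
    """Solve mu_i = 1 + sum_j p_ij mu_j, with mu_s = 0 for absorbing s."""
    m = len(P)
    mu = [0.0] * m
    for _ in range(iters):
        mu = [0.0 if i in absorbing
              else 1.0 + sum(P[i][j] * mu[j] for j in range(m))
              for i in range(m)]
    return mu

# Hypothetical chain: each row sums to 1; rows 3 and 4 are absorbing.
P = [[0.2, 0.4, 0.0, 0.4, 0.0],
     [0.3, 0.0, 0.3, 0.0, 0.4],
     [0.0, 0.5, 0.1, 0.2, 0.2],
     [0.0, 0.0, 0.0, 1.0, 0.0],
     [0.0, 0.0, 0.0, 0.0, 1.0]]

a = absorption_probs(P, target=3, absorbing={3, 4})
mu = expected_absorption_time(P, absorbing={3, 4})
print(a)   # probability of ending in state 3, from each starting state
print(mu)  # expected number of transitions until absorption
```

Since the transient part of the chain is substochastic, the iteration converges geometrically; Gaussian elimination on the same equations would give the exact answer in one pass.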
