Stochastic Processes 2

1 Theoretical background

1.1 The Poisson process

Consider a sequence of independent events, each of them indicating the time when it occurs. We assume for a time interval $J \subseteq [0,+\infty[$:

1. The probability that an event occurs in a time interval depends only on the length of the interval, and not on where the interval lies on the time axis.
2. The probability that we have at least one event in a time interval of length $t$ is $\lambda t + t\varepsilon(t)$, where $\lambda > 0$ is a given positive constant.
3. The probability that we have more than one event in a time interval of length $t$ is $t\varepsilon(t)$.

It follows that

4. The probability that there is no event in a time interval of length $t$ is $1 - \lambda t + t\varepsilon(t)$.
5. The probability that there is precisely one event in a time interval of length $t$ is $\lambda t + t\varepsilon(t)$.

Here $\varepsilon(t)$ denotes some unspecified function which tends towards $0$ for $t \to 0$.

Given the assumptions above, we let $X(t)$ denote the number of events in the interval $]0,t]$, and we put
$$P_k(t) := P\{X(t) = k\}, \qquad \text{for } k \in \mathbb{N}_0.$$
Then $X(t)$ is a Poisson distributed random variable of parameter $\lambda t$. The process $\{X(t) \mid t \in [0,+\infty[\}$ is called a Poisson process, and the parameter $\lambda$ is called the intensity of the Poisson process.

Concerning the Poisson process we have the following results:

1) If $t = 0$ (i.e. $X(0) = 0$), then
$$P_k(0) = \begin{cases} 1, & \text{for } k = 0, \\ 0, & \text{for } k \in \mathbb{N}. \end{cases}$$

2) If $t > 0$, then $P_k(t)$ is a differentiable function, and
$$P_k'(t) = \lambda\,\{P_{k-1}(t) - P_k(t)\}, \quad \text{for } k \in \mathbb{N} \text{ and } t > 0,$$
$$P_0'(t) = -\lambda\,P_0(t), \quad \text{for } k = 0 \text{ and } t > 0.$$

When we solve these differential equations, we get
$$P_k(t) = \frac{(\lambda t)^k}{k!}\,e^{-\lambda t}, \qquad \text{for } k \in \mathbb{N}_0,$$
proving that $X(t)$ is Poisson distributed with parameter $\lambda t$.

Remark 1.1 Even if Poisson processes are very common, they are mostly applied in the theory of teletraffic. ◊

If $X(t)$ is a Poisson process as described above, then $X(s+t) - X(s)$ has the same distribution as $X(t)$, thus
$$P\{X(s+t) - X(s) = k\} = \frac{(\lambda t)^k}{k!}\,e^{-\lambda t}, \qquad \text{for } k \in \mathbb{N}_0,$$
and the increments over disjoint intervals are mutually independent.

Let $T_1$ denote the time of the first event, and for $k \geq 2$ let $T_k$ denote the time between the $(k-1)$-th and the $k$-th event. Then
$$P\{T_1 > t\} = P\{X(t) = 0\} = e^{-\lambda t}, \qquad \text{for } t \geq 0.$$
All random variables $T_1, T_2, \ldots, T_n$ are mutually independent and exponentially distributed of parameter $\lambda$.
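The characterization above, independent exponential interarrival times of parameter $\lambda$, suggests a direct simulation check. The minimal sketch below (plain Python; the helper names `poisson_count` and `poisson_pmf` are mine, not from the text) draws interarrival times with `random.expovariate` and compares the empirical frequency of $\{X(t) = 2\}$ with $P_2(t) = (\lambda t)^2 e^{-\lambda t}/2!$.

```python
import math
import random

def poisson_count(lam, t, rng):
    """Count events in ]0, t] by summing exponential interarrival times."""
    n, clock = 0, rng.expovariate(lam)
    while clock <= t:
        n += 1
        clock += rng.expovariate(lam)
    return n

def poisson_pmf(k, mean):
    """P_k(t) = (lambda t)^k / k! * exp(-lambda t), with mean = lambda t."""
    return mean**k / math.factorial(k) * math.exp(-mean)

rng = random.Random(1)
lam, t, trials = 2.0, 3.0, 20000
counts = [poisson_count(lam, t, rng) for _ in range(trials)]
emp_p2 = sum(1 for c in counts if c == 2) / trials
print(emp_p2, poisson_pmf(2, lam * t))
```

With 20000 trials the empirical frequency and the theoretical $P_2(3)$ agree to within ordinary Monte Carlo error.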
Hence
$$Y_n = T_1 + T_2 + \cdots + T_n$$
is Gamma distributed, $Y_n \in \Gamma\!\left(n, \dfrac{1}{\lambda}\right)$.

Connection with Erlang's B-formula. Since $Y_{n+1} > t$ if and only if $X(t) \leq n$, we derive that
$$P\{X(t) \leq n\} = \sum_{k=0}^{n} \frac{(\lambda t)^k}{k!}\,e^{-\lambda t} = \frac{1}{n!}\int_{\lambda t}^{+\infty} y^n e^{-y}\,dy.$$
We have in particular for $\lambda = 1$,
$$\sum_{k=0}^{n} \frac{t^k}{k!}\,e^{-t} = \frac{1}{n!}\int_{t}^{+\infty} y^n e^{-y}\,dy, \qquad n \in \mathbb{N}_0.$$

1.2 Birth and death processes

Let $\{X(t) \mid t \in [0,+\infty[\}$ be a stochastic process which can be in the states $E_0, E_1, E_2, \ldots$. The process can only move from one state to a neighbouring state, in the following sense: if the process is in state $E_k$ and we receive a positive signal, then the process is transferred to $E_{k+1}$, and if instead we receive a negative signal (and $k \in \mathbb{N}$), then the process is transferred to $E_{k-1}$.

We assume that there are non-negative constants $\lambda_k$ and $\mu_k$, such that for $k \in \mathbb{N}_0$,

1) $P\{\text{one positive signal in } ]t, t+h] \mid X(t) = k\} = \lambda_k h + h\varepsilon(h)$,
2) $P\{\text{one negative signal in } ]t, t+h] \mid X(t) = k\} = \mu_k h + h\varepsilon(h)$,
3) $P\{\text{no signal in } ]t, t+h] \mid X(t) = k\} = 1 - (\lambda_k + \mu_k)\,h + h\varepsilon(h)$.

We call $\lambda_k$ the birth intensity at state $E_k$, and $\mu_k$ is called the death intensity at state $E_k$, and the process itself is called a birth and death process. If in particular all $\mu_k = 0$, we just call it a birth process, and analogously a death process, if all $\lambda_k = 0$.

A simple analysis shows for $k \in \mathbb{N}$ and $h > 0$ that the event $\{X(t+h) = k\}$ is realized in one of the following ways:

• $X(t) = k$, and no signal in $]t, t+h]$,
• $X(t) = k-1$, and one positive signal in $]t, t+h]$,
• $X(t) = k+1$, and one negative signal in $]t, t+h]$,
• more than one signal in $]t, t+h]$ (an event of probability $h\varepsilon(h)$).

We put $P_k(t) = P\{X(t) = k\}$. By a rearrangement and taking the limit $h \to 0$ we easily derive the differential equations of the process,
$$P_0'(t) = -\lambda_0\,P_0(t) + \mu_1\,P_1(t), \qquad \text{for } k = 0,$$
$$P_k'(t) = -(\lambda_k + \mu_k)\,P_k(t) + \lambda_{k-1}\,P_{k-1}(t) + \mu_{k+1}\,P_{k+1}(t), \qquad \text{for } k \in \mathbb{N}.$$

In the special case of a pure birth process, where all $\mu_k = 0$, this system is reduced to
$$P_0'(t) = -\lambda_0\,P_0(t), \qquad \text{for } k = 0,$$
$$P_k'(t) = -\lambda_k\,P_k(t) + \lambda_{k-1}\,P_{k-1}(t), \qquad \text{for } k \in \mathbb{N}.$$

If all $\lambda_k > 0$, we get the following iteration formula for the complete solution,
$$P_0(t) = c_0\,e^{-\lambda_0 t}, \qquad \text{for } k = 0,$$
$$P_k(t) = \lambda_{k-1}\,e^{-\lambda_k t}\int e^{\lambda_k \tau}\,P_{k-1}(\tau)\,d\tau + c_k\,e^{-\lambda_k t}, \qquad \text{for } k \in \mathbb{N}.$$
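The identity for $\lambda = 1$ can be checked numerically. The sketch below compares the Poisson sum on the left with a trapezoidal approximation of the Gamma-tail integral on the right, truncating the integral at $y = 80$ (my choice of cutoff; the neglected tail is of order $e^{-80}$ and far below the tolerance).

```python
import math

def poisson_cdf(n, t):
    """Left side: sum_{k=0}^{n} t^k/k! * e^{-t}."""
    return sum(t**k / math.factorial(k) for k in range(n + 1)) * math.exp(-t)

def gamma_tail(n, t, upper=80.0, steps=200000):
    """Right side: (1/n!) * integral_t^infinity y^n e^{-y} dy, trapezoidal rule."""
    h = (upper - t) / steps
    f = lambda y: y**n * math.exp(-y)
    s = 0.5 * (f(t) + f(upper)) + sum(f(t + i * h) for i in range(1, steps))
    return s * h / math.factorial(n)

lhs, rhs = poisson_cdf(4, 2.5), gamma_tail(4, 2.5)
print(lhs, rhs)
```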
From $P_0(t)$ we derive $P_1(t)$, etc. Finally, if we know the initial distribution, e.g. that we are at time $t = 0$ in state $E_m$, then we can find the values of the arbitrary constants $c_k$.

Let $\{X(t) \mid t \in [0,+\infty[\}$ be a birth and death process, where all $\lambda_k$ and $\mu_k$ are positive, with the exception of $\mu_0 = 0$, and $\lambda_N = 0$ if there is a final state $E_N$. The process can then reach any of the states; therefore, in analogy with the Markov chains, such a birth and death process is called irreducible. Processes like this often occur in queueing theory.

If there exists a state $E_k$ in which $\lambda_k = \mu_k = 0$, then $E_k$ is an absorbing state, because it is not possible to move away from $E_k$.

For the most common birth and death processes (including all irreducible processes) there exist non-negative constants $p_k$, such that
$$P_k(t) \to p_k \qquad \text{and} \qquad P_k'(t) \to 0 \qquad \text{for } t \to +\infty.$$
These constants fulfil the infinite system of equations
$$\mu_{k+1}\,p_{k+1} = \lambda_k\,p_k, \qquad \text{for } k \in \mathbb{N}_0,$$
which sometimes can be used to find the $p_k$.

If there is a solution $(p_k)$, which satisfies
$$p_k > 0 \ \text{ for all } k \in \mathbb{N}_0, \qquad \text{and} \qquad \sum_{k=0}^{\infty} p_k = 1,$$
we say that the solution $(p_k)$ is a stationary distribution, and the $p_k$ are called the stationary probabilities. In this case we have $P_k(t) \to p_k$ for $t \to +\infty$.

If $\{X(t) \mid t \in [0,+\infty[\}$ is an irreducible process, then
$$p_k = \frac{\lambda_{k-1}\,\lambda_{k-2}\cdots\lambda_0}{\mu_k\,\mu_{k-1}\cdots\mu_1}\,p_0 = a_k\,p_0, \qquad \text{for } k \in \mathbb{N},$$
where all $a_k > 0$. The condition for the existence of a stationary distribution is then reduced to the requirement that the series $\sum a_k$ is convergent or a finite sum. In this case we have
$$p_0 = \frac{1}{1 + \sum_{k=1}^{\infty} a_k}.$$

1.3 Queueing theory in general

Let $\{X(t) \mid t \in [0,+\infty[\}$ be a birth and death process as described in the previous section. We shall consider them as services in a service organization, where "birth" corresponds to the arrival of a new customer, and "death" corresponds to the ending of the service of a customer.
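The product formula $p_k = a_k\,p_0$ translates directly into code. Below is a minimal sketch (the function name `stationary` is mine) which truncates the state space at `n_max`, a reasonable approximation when $\sum a_k$ converges quickly; it is checked against the well-known geometric solution $p_k = (1-\varrho)\varrho^k$ of the single-server queue with $\lambda_k = \lambda$, $\mu_k = \mu$ and $\varrho = \lambda/\mu < 1$.

```python
def stationary(lambdas, mus, n_max):
    """p_k = a_k * p_0 with a_k = (lambda_{k-1}...lambda_0)/(mu_k...mu_1),
    truncated at state n_max and normalized to sum to 1."""
    a = [1.0]                                  # a_0 = 1
    for k in range(1, n_max + 1):
        a.append(a[-1] * lambdas(k - 1) / mus(k))
    total = sum(a)
    return [x / total for x in a]

# Check: constant rates lambda = 0.5, mu = 1.0 give p_k = (1 - rho) rho^k, rho = 0.5
p = stationary(lambda k: 0.5, lambda k: 1.0, 200)
print(p[0], p[1])
```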
We introduce the following notions.

1) By the arrival distribution (the arrival process) we shall understand the distribution of the arrivals of the customers to the service (the shop). This distribution is often of Poisson type.

2) If the arrivals follow a Poisson process of intensity $\lambda$, then the random variable which indicates the time difference between two succeeding arrivals is exponentially distributed of parameter $\lambda$. We say that the arrivals follow an exponential distribution, and $\lambda$ is called the arrival intensity.

3) The queueing system is described by the number of shop assistants or serving places, by whether there is the possibility of forming queues or not, and by the way a queue is handled. The serving places are also called channels.

4) Concerning the service times we assume that if a service starts at time $t$, then the probability that it is ended at some time in the interval $]t, t+h[$ is equal to
$$\mu h + h\varepsilon(h), \qquad \text{where } \mu > 0.$$
Then the service time is exponentially distributed of parameter $\mu$. If at time $t$ we are dealing with $k$ (mutually independent) services, then the probability that one of these is ended in the interval $]t, t+h[$ is equal to
$$k\mu h + h\varepsilon(h).$$

We shall in the following sections consider the three most common types of queueing systems. Concerning other types, cf. e.g. Villy Bæk Iversen: Teletraffic Engineering and Network Planning, Technical University of Denmark.

1.4 Queueing system of infinitely many shop assistants

The model is described in the following way: Customers arrive at the service according to a Poisson process of intensity $\lambda$, and they immediately go to a free shop assistant, where they are serviced according to an exponential distribution of parameter $\mu$.

The process is described by the following birth and death process $\{X(t) \mid t \in [0,+\infty[\}$ with $\lambda_k = \lambda$ and $\mu_k = k\mu$ for all $k$. The process is irreducible, and the differential equations of the system are given by
$$P_0'(t) = -\lambda\,P_0(t) + \mu\,P_1(t), \qquad \text{for } k = 0,$$
$$P_k'(t) = -(\lambda + k\mu)\,P_k(t) + \lambda\,P_{k-1}(t) + (k+1)\mu\,P_{k+1}(t), \qquad \text{for } k \in \mathbb{N}.$$
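One consequence of these equations, used below in the text, is that the expected number of customers $m(t) = \sum_k k\,P_k(t)$ satisfies the linear equation $m'(t) = \lambda - \mu\,m(t)$, whose solution with $m(0) = 0$ is $m(t) = \frac{\lambda}{\mu}(1 - e^{-\mu t})$. A quick sanity check of this closed form, a sketch using a simple explicit Euler integration (step count chosen by me):

```python
import math

lam, mu, T, steps = 3.0, 1.5, 4.0, 400000
h = T / steps
m = 0.0                      # m(0) = 0: no customers at time 0
for _ in range(steps):
    m += h * (lam - mu * m)  # Euler step for m'(t) = lam - mu * m(t)

closed = (lam / mu) * (1.0 - math.exp(-mu * T))
print(m, closed)
```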
The stationary probabilities exist and satisfy the equations
$$(k+1)\mu\,p_{k+1} = \lambda\,p_k, \qquad k \in \mathbb{N}_0,$$
of the solutions
$$p_k = \frac{1}{k!}\left(\frac{\lambda}{\mu}\right)^{k}\exp\left(-\frac{\lambda}{\mu}\right), \qquad k \in \mathbb{N}_0.$$
These are the probabilities that there are $k$ customers in the system, when we have obtained equilibrium.

The system of differential equations above is usually difficult to solve. One has, however, some partial results, e.g. the expected number of customers at time $t$,
$$m(t) = \sum_{k=0}^{\infty} k\,P_k(t),$$
satisfies the simpler differential equation
$$m'(t) + \mu\,m(t) = \lambda.$$
If at time $t = 0$ there is no customer at the service, then
$$m(t) = \frac{\lambda}{\mu}\left(1 - e^{-\mu t}\right) \qquad \text{for } t \geq 0.$$

1.5 Queueing system of a finite number of shop assistants, and with forming of queues

We consider the case where

1) the customers arrive according to a Poisson process of intensity $\lambda$,
2) the service times are exponentially distributed of parameter $\mu$,
3) there are $N$ shop assistants,
4) it is possible to form queues.

Spelled out, we have $N$ shop assistants and a customer who arrives at state $E_k$. If $k < N$, then the customer goes to a free shop assistant and is immediately serviced. If however $k \geq N$, thus all shop assistants are busy, then he joins a queue and waits until there is a free shop assistant. We assume here queueing culture.

With a slight change of the notation it follows that if there are $N$ shop assistants and $k$ customers (and not $k$ states as above), where $k > N$, then there is a common queue for all shop assistants consisting of $k - N$ customers.

This process is described by the following birth and death process $\{X(t) \mid t \in [0,+\infty[\}$ of the parameters
$$\lambda_k = \lambda, \qquad \text{and} \qquad \mu_k = \begin{cases} k\mu, & \text{for } k \leq N, \\ N\mu, & \text{for } k > N. \end{cases}$$
The process is irreducible. The equations of the stationary probabilities are
$$(k+1)\mu\,p_{k+1} = \lambda\,p_k \quad \text{for } k < N, \qquad N\mu\,p_{k+1} = \lambda\,p_k \quad \text{for } k \geq N.$$
We introduce the traffic intensity by
$$\varrho = \frac{\lambda}{N\mu}.$$
Then we get the stationary probabilities
$$p_k = \frac{1}{k!}\left(\frac{\lambda}{\mu}\right)^{k} p_0 \quad \text{for } k \leq N, \qquad p_k = \frac{1}{N!}\left(\frac{\lambda}{\mu}\right)^{N} \varrho^{\,k-N}\,p_0 \quad \text{for } k > N.$$

Remark 1.2 Together with the traffic intensity one also introduces in teletraffic the offered traffic.
By this we mean the number of customers who on average arrive at the system in a time interval of length equal to the mean service time. In the situation above the offered traffic is $\dfrac{\lambda}{\mu}$. Both the traffic intensity and the offered traffic are dimensionless; they are both measured in the unit erlang.

The condition that the $(p_k)$ become stationary probabilities is that the traffic intensity satisfies $\varrho < 1$. If, however, $\varrho \geq 1$, it is easily seen that the queue increases towards infinity, and there does not exist a stationary distribution. We assume in the following that $\varrho < 1$, so the stationary probabilities exist.

1) If $N = 1$, then
$$p_k = \varrho^{k}(1 - \varrho), \qquad \text{for } k \in \mathbb{N}_0.$$

2) If $N = 2$, then
$$p_0 = \frac{1-\varrho}{1+\varrho}, \qquad p_k = 2\varrho^{k}\,\frac{1-\varrho}{1+\varrho} \quad \text{for } k \in \mathbb{N}.$$

3) If $N \geq 3$, the formulas become somewhat complicated, so they are not given here.

The average number of customers at the service is, under the given assumptions,
$$\frac{\varrho}{1-\varrho} \quad \text{for } N = 1, \qquad \sum_{k=1}^{\infty} k\,p_k \quad \text{in general (of course)}.$$

The average number of busy shop assistants is
$$\varrho \quad \text{for } N = 1, \qquad \sum_{k=0}^{N-1} k\,p_k + N\sum_{k=N}^{\infty} p_k \quad \text{in general}.$$

The waiting time of a customer is defined as the time elapsed from his arrival until the service of him starts. The staying time is the time from his arrival until he leaves the system after the service of him. Hence we have the splitting
$$\text{staying time} = \text{waiting time} + \text{service time}.$$

The average waiting time is given by
$$V = \frac{\varrho}{\mu(1-\varrho)} \quad \text{for } N = 1, \qquad V = \frac{p_N}{N\mu\,(1-\varrho)^2} \quad \text{in general}.$$

In the special case of $N = 1$ the average staying time is given by
$$O = V + \frac{1}{\mu} = \frac{1}{\mu(1-\varrho)} = \frac{1}{\mu - \lambda}.$$

The average length of the queue (i.e. the mean number of customers in the queue) is
$$\frac{\varrho^{2}}{1-\varrho} \quad \text{for } N = 1, \qquad \frac{p_N\,\varrho}{(1-\varrho)^2} \quad \text{in general}.$$

1.6 Queueing systems with a finite number of shop assistants and without queues

We consider here the case where

1) the customers arrive according to a Poisson process of intensity $\lambda$,
2) the times of service are exponentially distributed of parameter $\mu$,
3) there are $N$ shop assistants or channels,
4) it is not possible to form a queue.
The difference from the previous section is that if a customer arrives at a time when all shop assistants are busy, then he immediately leaves the system. Therefore, this is also called a system of rejection.

In this case the process is described by the following birth and death process $\{X(t) \mid t \in [0,+\infty[\}$ with a finite number of states $E_0, E_1, \ldots, E_N$, where the intensities are given by
$$\lambda_k = \begin{cases} \lambda, & \text{for } k < N, \\ 0, & \text{for } k \geq N, \end{cases} \qquad \text{and} \qquad \mu_k = k\mu.$$
This process is also irreducible. The corresponding system of differential equations is
$$P_0'(t) = -\lambda\,P_0(t) + \mu\,P_1(t), \qquad \text{for } k = 0,$$
$$P_k'(t) = -(\lambda + k\mu)\,P_k(t) + \lambda\,P_{k-1}(t) + (k+1)\mu\,P_{k+1}(t), \qquad \text{for } 1 \leq k \leq N-1,$$
$$P_N'(t) = -N\mu\,P_N(t) + \lambda\,P_{N-1}(t), \qquad \text{for } k = N.$$

In general, this system is too complicated for a reasonable solution, so instead we use the stationary probabilities, which are here given by Erlang's B-formula,
$$p_k = \frac{\dfrac{1}{k!}\left(\dfrac{\lambda}{\mu}\right)^{k}}{\displaystyle\sum_{n=0}^{N} \frac{1}{n!}\left(\frac{\lambda}{\mu}\right)^{n}}, \qquad \text{for } k = 0, 1, 2, \ldots, N.$$

The average number of customers who are served is of course equal to the average number of busy shop assistants, or channels. The common value is
$$\sum_{k=1}^{N} k\,p_k = \frac{\lambda}{\mu}\,(1 - p_N).$$

We notice that $p_N$ can be interpreted as the probability of rejection. This probability $p_N$ is large when $\lambda \gg \mu$. We get from the integral formula of Section 1.1 that the probability of rejection satisfies
$$\frac{1}{p_N} = \left(\frac{\mu}{\lambda}\right)^{N} \exp\left(\frac{\lambda}{\mu}\right) \int_{\lambda/\mu}^{+\infty} y^{N} e^{-y}\,dy.$$

1.7 Some general types of stochastic processes

Given two stochastic processes $\{X(t) \mid t \in T\}$ and $\{Y(t) \mid t \in T\}$, where we assume that all the moments below exist, we define

1) the mean value function, $m(t) = E\{X(t)\}$, for $t \in T$,
2) the auto-correlation, $R(s,t) := E\{X(s)X(t)\}$, for $s, t \in T$,
3) the auto-covariance, $C(s,t) := \operatorname{Cov}(X(s), X(t))$, for $s, t \in T$,
4) the cross-correlation, $R_{XY}(s,t) := E\{X(s)Y(t)\}$, for $s, t \in T$,
5) the cross-covariance, $C_{XY}(s,t) := \operatorname{Cov}(X(s), Y(t))$, for $s, t \in T$.

A stochastic process $\{X(t) \mid t \in \mathbb{R}\}$ is strictly stationary, if the translated process $\{X(t+h) \mid t \in \mathbb{R}\}$ for every $h \in \mathbb{R}$ has the same distribution as $\{X(t) \mid t \in \mathbb{R}\}$. In this case we have for all $n \in \mathbb{N}$, all $x_1, \ldots, x_n \in \mathbb{R}$ and all $t_1,$
$\ldots, t_n \in \mathbb{R}$ that
$$P\{X(t_1 + h) \leq x_1 \wedge \cdots \wedge X(t_n + h) \leq x_n\}$$
does not depend on $h \in \mathbb{R}$.

Since $P\{X(t) \leq x\}$ does not depend on $t$ for such a process, we have $m(t) = m$, a constant, and the auto-covariance $C(s,t)$ becomes a function of the real variable $s - t$. We therefore write in this case
$$C(s,t) = C(s-t).$$
Analogously, the auto-correlation is also a function only depending on $s - t$, so we write $R(s,t) = R(s-t)$.

Conversely, if $m(t) = m$ and $C(s,t) = C(s-t)$, then we call the stochastic process $\{X(t) \mid t \in \mathbb{R}\}$ weakly stationary.

Let us consider a stochastic process $\{X(t) \mid t \in \mathbb{R}\}$ of mean $0$ and auto-correlation $R(\tau) = E\{X(t+\tau)X(t)\}$. If $R(\tau)$ is absolutely integrable, we define the effect spectrum by
$$S(\omega) = \int_{-\infty}^{+\infty} e^{-i\omega\tau}\,R(\tau)\,d\tau,$$
i.e. as the Fourier transform of $R(\tau)$. Furthermore, if we also assume that $S(\omega)$ is absolutely integrable, then we can apply the Fourier inversion formula to reconstruct $R(\tau)$ from the effect spectrum. In particular,
$$E\{|X(t)|^2\} = R(0) = \frac{1}{2\pi}\int_{-\infty}^{+\infty} S(\omega)\,d\omega.$$

A stochastic process $\{X(t) \mid t \in T\}$ is called a normal process, or a Gaussian process, if for every $n$ and every $t_1, \ldots, t_n \in T$ the distribution of $(X(t_1), \ldots, X(t_n))$ is an $n$-dimensional normal distribution. A normal process is always completely specified by its mean value function $m(t)$ and its auto-covariance function $C(s,t)$.

The most important normal process is the Wiener process $\{W(t) \mid t \geq 0\}$, or the Brownian movement. This is characterized by

1) $W(0) = 0$,
2) $m(t) = 0$,
3) $V\{W(t)\} = \alpha t$, where $\alpha$ is a positive constant,
4) mutually independent increments.

2 The Poisson process

Example 2.1 Let $\{X(t),\ t \in [0,\infty[\}$ be a Poisson process of intensity $\lambda$, and let the random variable $T$ denote the time when the first event occurs. Find the conditional distribution of $T$, given that at time $t_0$ precisely one event has occurred, i.e.
$$P\{T \leq t \mid X(t_0) = 1\}.$$

If $t \in [0, t_0]$, then the conditional distribution is given by
$$P\{T \leq t \mid X(t_0) = 1\} = \frac{P\{X(t) = 1 \wedge X(t_0) - X(t) = 0\}}{P\{X(t_0) = 1\}} = \frac{\lambda t\,e^{-\lambda t} \cdot e^{-\lambda(t_0 - t)}}{\lambda t_0\,e^{-\lambda t_0}} = \frac{t}{t_0},$$
i.e. $T$ is conditionally uniformly distributed over $[0, t_0]$.

Example 2.2 Let $\{X_1(t),\ t \geq 0\}$ and $\{X_2(t),\ t \geq 0\}$ denote two independent Poisson processes of intensity $\lambda_1$ and $\lambda_2$, resp., and let the process $\{Y(t),\ t \geq 0\}$ be defined by
$$Y(t) = X_1(t) + X_2(t).$$
Prove that $\{Y(t),\ t \geq 0\}$ is a Poisson process.

We first identify
$$P_n(t) = P\{X_1(t) = n\} = \frac{(\lambda_1 t)^n}{n!}\,e^{-\lambda_1 t} \qquad \text{and} \qquad Q_n(t) = P\{X_2(t) = n\} = \frac{(\lambda_2 t)^n}{n!}\,e^{-\lambda_2 t}.$$
We get from $X_1(t)$ and $X_2(t)$ being independent that
$$P\{Y(t) = n\} = P\{X_1(t) + X_2(t) = n\} = \sum_{j=0}^{n} P\{X_1(t) = j\}\,P\{X_2(t) = n-j\} = \sum_{j=0}^{n} \frac{(\lambda_1 t)^j}{j!}\cdot\frac{(\lambda_2 t)^{n-j}}{(n-j)!}\,e^{-(\lambda_1+\lambda_2)t}$$
$$= \frac{t^n}{n!}\sum_{j=0}^{n} \binom{n}{j}\lambda_1^{\,j}\lambda_2^{\,n-j}\,e^{-(\lambda_1+\lambda_2)t} = \frac{\left((\lambda_1+\lambda_2)t\right)^n}{n!}\exp\left(-(\lambda_1+\lambda_2)t\right).$$
It follows that $\{Y(t),\ t \geq 0\}$ is also a Poisson process (of intensity $\lambda_1 + \lambda_2$).

Example 2.3 A Geiger counter only records every second particle which arrives at the counter. Assume that the particles arrive according to a Poisson process of intensity $\lambda$. Denote by $N(t)$ the number of particles recorded in $]0,t]$, where we assume that the first recorded particle is the second to arrive.

1. Find $P\{N(t) = n\}$, $n \in \mathbb{N}_0$.
2. Find $E\{N(t)\}$.

Let $T$ denote the time difference between two succeeding recorded arrivals.

3. Find the frequency of $T$.
4. Find the mean $E\{T\}$.

1. It follows from
$$P_k(t) = \frac{(\lambda t)^k}{k!}\,e^{-\lambda t}, \qquad k \in \mathbb{N}_0,$$
that
$$P\{N(t) = n\} = P_{2n}(t) + P_{2n+1}(t) = \left\{\frac{(\lambda t)^{2n}}{(2n)!} + \frac{(\lambda t)^{2n+1}}{(2n+1)!}\right\} e^{-\lambda t}, \qquad n \in \mathbb{N}_0.$$

2. Since $N(t) = \frac{1}{2}\left(X(t) - [X(t)\text{ odd}]\right)$ and $P\{X(t)\text{ odd}\} = e^{-\lambda t}\sinh(\lambda t) = \frac{1}{2}\left(1 - e^{-2\lambda t}\right)$, the mean is
$$E\{N(t)\} = \sum_{n=1}^{\infty} n\,P\{N(t) = n\} = \frac{\lambda t}{2} - \frac{1}{4}\left(1 - e^{-2\lambda t}\right).$$

3. & 4. It follows from $T = T_j + T_{j+1}$ that $T \in \Gamma\!\left(2, \dfrac{1}{\lambda}\right)$; thus the frequency is
$$f(x) = \lambda^2 x\,e^{-\lambda x} \quad \text{for } x > 0, \qquad f(x) = 0 \quad \text{for } x \leq 0,$$
and the mean is $E\{T\} = \dfrac{2}{\lambda}$.

Example 2.4 From a ferry port a ferry is sailing every quarter of an hour. Each ferry can carry $N$ cars. The cars are arriving at the ferry port according to a Poisson process of intensity $\lambda$ (measured in quarter$^{-1}$). Assuming that there is no car in the ferry port immediately after a ferry has sailed at $9^{00}$, one shall

1) find the probability that there is no car waiting at $9^{15}$ (immediately after the departure of the next ferry),
2) find the probability that no car is waiting at $9^{30}$ (immediately after the departure of the following ferry),
3) A motorist arrives at $9^{07\frac{1}{2}}$. What is the probability that he will not catch the ferry at $9^{15}$,
but instead catches the ferry at $9^{30}$?

Measuring $t$ in the unit quarter of an hour we have
$$P\{X(t) = n\} = \frac{(\lambda t)^n}{n!}\,e^{-\lambda t}, \qquad n \in \mathbb{N}_0.$$

1) From $t = 1$ follows that the wanted probability is
$$P\{X(1) \leq N\} = \sum_{n=0}^{N} \frac{\lambda^n}{n!}\,e^{-\lambda}.$$

2) We have two possibilities: a) either there have arrived at most $N$ cars during the first quarter of an hour, so the first ferry takes them all, and at most $N$ cars arrive during the second quarter; or b) there have arrived $N + j$ cars, $1 \leq j \leq N$, during the first quarter, so $j$ cars are left behind, and at most $N - j$ cars arrive during the second quarter. Since the arrivals in the two quarters are independent, the wanted probability is
$$\left(\sum_{n=0}^{N} \frac{\lambda^n}{n!}\,e^{-\lambda}\right)^{2} + \sum_{j=1}^{N} \frac{\lambda^{N+j}}{(N+j)!}\,e^{-\lambda}\sum_{n=0}^{N-j} \frac{\lambda^n}{n!}\,e^{-\lambda}.$$

2) If instead the service is improved as indicated, then the parameters become $N = 1$ with $\lambda$ unchanged and $\mu$ doubled. The average waiting time $V_2$ and the average staying time $O_2$ then follow from the $N = 1$ formulas of Section 1.5.

Remark 4.2 Again we add for completeness the average number of customers, the average number of busy shop assistants, and the average length of the queue. ◊

By comparing the two cases we get $V_1 < V_2$, and on the contrary $O_1 > O_2$, so the question does not have a unique answer. The customer will prefer that the sum of waiting time and service time is as small as possible. Since $O = V + \frac{1}{\mu}$ and $O_1 > O_2$, it follows that the customer will prefer the latter system, while it is far more uncertain what the shop would prefer, because we do not know the costs of each of the two possible improvements.

Example 4.3 We consider an intersection which is not controlled by traffic lights. One has noticed that cars doing a left-hand turn delay the cars which are going straight on. Therefore, one plans to build a left-hand turn lane. Assuming that arrivals and departures of the cars doing the left-hand turn are exponentially distributed with the parameters $\lambda$ and $\mu$, where $\varrho = \dfrac{\lambda}{\mu} = \dfrac{1}{2}$, one shall compute the smallest number of car places of the planned left-hand turn lane, such that the probability is less than 5 % of the event that there are more cars than the new lane can contain.

Here $N = 1$, so the capacity of the system is $\mu$. The stationary probabilities are
$$p_k = \varrho^{k}(1 - \varrho) = \left(\frac{1}{2}\right)^{k+1}, \qquad k \in \mathbb{N}_0.$$
Let $n$ denote the maximum number of cars in the left turn lane. Then we get the condition
$$P\{\text{more than } n \text{ cars}\} = \sum_{k=n+1}^{\infty} \left(\frac{1}{2}\right)^{k+1} = \left(\frac{1}{2}\right)^{n+1} < 5\,\%,$$
thus $2^{n+1} > 20$, which is fulfilled for $n \geq 4$.
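The condition $\left(\frac{1}{2}\right)^{n+1} < 5\,\%$ can be checked mechanically; a small sketch of the search for the smallest sufficient lane size:

```python
rho = 0.5   # traffic intensity lambda/mu from the example
n = 0
# P{more than n cars} = rho^{n+1} for the single-server lane
while rho ** (n + 1) >= 0.05:
    n += 1
print(n)    # smallest sufficient lane size: 4
```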
Example 4.4 Given a queueing system of exponential distribution of arrivals and exponential distribution of service times (the means are called $\frac{1}{\lambda}$ and $\frac{1}{\mu}$, resp.). The number of service places is 2. We furthermore assume that it is possible to form a queue. Assuming that $\frac{1}{\lambda} = 1$ (minute) and $\frac{1}{\mu} = 1$ (minute),

1. find the average waiting time,
2. find the average staying time.

For economic reasons the number of service places is cut down from 2 to 1, while the service at the same time is simplified (so the service time is decreased), such that the customer's average staying time is not prolonged. Assuming that the constant $\lambda$ is unchanged,

3. find the average service time $\frac{1}{\mu_1}$, such that the average staying time in the new system is equal to the average staying time in the previously mentioned system,

4. find in which of the two systems the probability is largest for a customer to wait.

1) Here $N = 2$ and $\varrho = \dfrac{\lambda}{2\mu} = \dfrac{1}{2}$, so $p_0 = \dfrac{1-\varrho}{1+\varrho} = \dfrac{1}{3}$ and $p_2 = 2\varrho^2\,\dfrac{1-\varrho}{1+\varrho} = \dfrac{1}{6}$. The average waiting time is
$$V = \frac{p_2}{2\mu\,(1-\varrho)^2} = \frac{1/6}{2\cdot(1/2)^2} = \frac{1}{3} \text{ minute}.$$

2) The staying time is the waiting time plus the service time, so the average staying time is
$$O = V + \frac{1}{\mu} = \frac{1}{3} + 1 = \frac{4}{3} \text{ minute}.$$

3) In the new system the traffic intensity is $\varrho_1 = \dfrac{\lambda}{\mu_1}$, since $N_1 = 1$. The average staying time is for $N_1 = 1$ given by
$$O_1 = \frac{1}{\mu_1(1-\varrho_1)} = \frac{1}{\mu_1 - \lambda}.$$
We want that $O_1 = O = \dfrac{4}{3}$. Hence $\mu_1 = \lambda + \dfrac{3}{4} = \dfrac{7}{4}$, and the average service time is $\dfrac{1}{\mu_1} = \dfrac{4}{7}$ minute.

4) The probability of waiting in the old system is
$$1 - p_0 - p_1 = 1 - \frac{1}{3} - \frac{1}{3} = \frac{1}{3}.$$
The probability of waiting in the new system is
$$1 - p_0 = 1 - (1 - \varrho_1) = \varrho_1 = \frac{4}{7}.$$
We see by comparison that there is the largest probability of waiting in the new system.
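The arithmetic of Example 4.4 can be reproduced step by step from the Section 1.5 formulas; the sketch below (variable names are mine) also confirms Little's law, queue length $= \lambda V$, as a cross-check.

```python
# Old system: N = 2 service places, lam = mu = 1 (per minute)
lam, mu, N = 1.0, 1.0, 2
rho = lam / (N * mu)                                       # traffic intensity, 1/2
p0 = (1 - rho) / (1 + rho)                                 # = 1/3
p1 = 2 * rho * (1 - rho) / (1 + rho)                       # = 1/3
p2 = 2 * rho**2 * (1 - rho) / (1 + rho)                    # = 1/6
V = p2 / (N * mu * (1 - rho) ** 2)                         # average waiting time, 1/3 min
O = V + 1.0 / mu                                           # average staying time, 4/3 min
queue = p2 * rho / (1 - rho) ** 2                          # average queue length

# New system: N = 1, same lam; choose mu1 so that O1 = 1/(mu1 - lam) equals O
mu1 = lam + 1.0 / O                                        # => mean service time 4/7 min
wait_prob_old = 1 - p0 - p1                                # = 1/3
wait_prob_new = lam / mu1                                  # = rho_1 = 4/7
print(V, O, 1.0 / mu1, wait_prob_old, wait_prob_new)
```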

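Returning to the rejection system of Section 1.6: evaluating Erlang's B-formula directly involves factorials that overflow for large $N$, and a numerically stable recursion $B(k) = \dfrac{a\,B(k-1)}{k + a\,B(k-1)}$, $B(0) = 1$, with $a = \lambda/\mu$, is the standard remedy. The sketch below (function names are mine) compares the recursion against the direct sum from the text.

```python
import math

def erlang_b(N, a):
    """Blocking probability p_N via the stable recursion
    B(0) = 1, B(k) = a*B(k-1) / (k + a*B(k-1)), with a = lambda/mu."""
    b = 1.0
    for k in range(1, N + 1):
        b = a * b / (k + a * b)
    return b

def erlang_b_direct(N, a):
    """Direct form of Erlang's B-formula from Section 1.6."""
    denom = sum(a**n / math.factorial(n) for n in range(N + 1))
    return (a**N / math.factorial(N)) / denom

print(erlang_b(5, 3.0), erlang_b_direct(5, 3.0))
```

As expected, adding channels at fixed offered traffic lowers the blocking probability.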