Chapter 5

Let T be the time you spend in the system; let S_i be the service time of person i in the queue; let R be the remaining service time of the person in service; let S be your service time. Then,
E[T] = E[R + S_1 + S_2 + S_3 + S_4 + S] = E[R] + Σ_{i=1}^{4} E[S_i] + E[S] = 6/μ,
where we have used the lack of memory property to conclude that R is also exponential with rate μ.

The conditional distribution of X, given that X > 1, is the same as the unconditional distribution of 1 + X. Hence, (a) is correct.

The answers to (a), (b), and (c) follow by lack of memory.

Condition on which server initially finishes first. Now,
P{Smith is last | server 1 finishes first} = P{server 1 finishes before server 2} = μ1/(μ1 + μ2), by lack of memory.
Similarly,
P{Smith is last | server 2 finishes first} = μ2/(μ1 + μ2),
and thus
P{Smith is last} = [μ1/(μ1 + μ2)]² + [μ2/(μ1 + μ2)]².

P{X1 < X2 | min(X1, X2) = t}
= P{X1 < X2, min(X1, X2) = t} / P{min(X1, X2) = t}
= P{X1 = t, X2 > t} / [P{X1 = t, X2 > t} + P{X2 = t, X1 > t}]
= f1(t)F̄2(t) / [f1(t)F̄2(t) + f2(t)F̄1(t)].
Dividing through by F̄1(t)F̄2(t) yields the result. (For a more rigorous argument, replace "= t" by "∈ (t, t + ε)" throughout, and then let ε → 0.)

Let X_i have density f_i and tail distribution F̄_i. Then
f(t) = Σ_i P{T = i} f_i(t),  F̄(t) = Σ_i P{T = i} F̄_i(t),
so that
λ(t) = Σ_i P{T = i} f_i(t) / Σ_i P{T = i} F̄_i(t) = Σ_i P{T = i} λ_i(t) F̄_i(t) / Σ_i P{T = i} F̄_i(t).
The result now follows from
P{T = i | X > t} = P{T = i} F̄_i(t) / Σ_j P{T = j} F̄_j(t).

Condition on whether machine 1 is still working at time t, to obtain the answer
1 − e^{−λ1 t} + e^{−λ1 t} λ1/(λ1 + λ2).

(a) Using Equation (6.5), the lack of memory property of the exponential, as well as the fact that the minimum of independent exponentials is exponential with a rate equal to the sum of their individual rates, it follows that
P(A_1) = λ1/(λ1 + ··· + λn)
and, for i > 1,
P(A_i | A_1 ··· A_{i−1}) = λi/(λi + ··· + λn).
Hence,
P{X1 < X2 < ··· < Xn} = Π_{i=1}^{n−1} λi/(λi + ··· + λn).
(b) When n = 2,
P{X1 < X2} = ∫_0^∞ P{X1 < X2 | X1 = x} λ1 e^{−λ1 x} dx = ∫_0^∞ e^{−λ2 x} λ1 e^{−λ1 x} dx = λ1/(λ1 + λ2).

12. (a) P{X1 < X2 < X3}
= P{X1 = min(X1, X2, X3)} P{X2 < X3 | X1 = min(X1, X2, X3)}
= [λ1/(λ1 + λ2 + λ3)] P{X2 < X3 | X1 = min(X1, X2, X3)}
= [λ1/(λ1 + λ2 + λ3)] [λ2/(λ2 + λ3)],
where the final equality follows by the lack of memory property.
(b) P{X2 < X3 | X1 = max(X1, X2, X3)} …

… then E[S] is minimized when P(1) is as large as possible. Hence, because minimizing E[S] is equivalent to minimizing E[M], it follows that E[M] is minimized when jobs 1 and 3 are initially processed.
(b) In this case E[M] is minimized when jobs 1 and 2 are initially processed. In all cases E[M] is minimized when the jobs having the smallest rates are initiated first.

Let C_i denote the cost of the ith link to be constructed, i = 1, ..., n − 1. Note that the first link can be any of the (n choose 2) possible links. Given the first one, the second link must connect one of the 2 cities joined by the first link with one of the n − 2 cities without any links. Thus, given the first constructed link, the next link constructed will be one of 2(n − 2) possible links. Similarly, given the first two links that are constructed, the next one to be constructed will be one of 3(n − 3) possible links, and so on. Since the cost of the first link to be built is the minimum of (n choose 2) exponentials with rate 1, it follows that
E[C_1] = 1/(n choose 2).
By the lack of memory property of the exponential it follows that the amounts by which the costs of the other links exceed C_1 are independent exponentials with rate 1. Therefore, C_2 is equal to C_1 plus the minimum of 2(n − 2) independent exponentials with rate 1, and so
E[C_2] = E[C_1] + 1/[2(n − 2)].
Similar reasoning then gives
E[C_3] = E[C_2] + 1/[3(n − 3)],
and so on.
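As a quick numerical sanity check of the ordering probability derived above (P{X1 < X2 < ··· < Xn} = Π_i λi/(λi + ··· + λn), illustrated for three exponentials in Solution 12(a)), here is a short simulation sketch. It is an addition, not part of the original manual; the rates chosen are arbitrary illustrative values.

```python
# Monte Carlo check (added sketch) of
#   P{X_1 < X_2 < ... < X_n} = prod_i lambda_i / (lambda_i + ... + lambda_n)
# for independent exponentials. Rates below are illustrative only.
import random

def ordered_prob_exact(rates):
    # Product of lambda_i over the sum of the remaining rates (last factor is 1).
    p = 1.0
    for i in range(len(rates) - 1):
        p *= rates[i] / sum(rates[i:])
    return p

def ordered_prob_sim(rates, trials=200_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = [rng.expovariate(r) for r in rates]
        if all(xs[i] < xs[i + 1] for i in range(len(xs) - 1)):
            hits += 1
    return hits / trials

if __name__ == "__main__":
    rates = [1.0, 2.0, 3.0]
    print("exact    :", ordered_prob_exact(rates))   # 1/6 * 2/5 = 0.0667
    print("simulated:", ordered_prob_sim(rates))
```

With rates (1, 2, 3) both numbers should agree to about two decimal places.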
(c) Letting A = X_(2) − X_(1), we have
E[X_(2)] = E[X_(1)] + E[A] = 1/(μ1 + μ2) + [μ1/(μ1 + μ2)](1/μ2) + [μ2/(μ1 + μ2)](1/μ1),
the formula for E[A] being obtained by conditioning on which X_i is largest.
(d) Let I equal 1 if X1 < X2 and let it be 2 otherwise. Since the conditional distribution of A (either exponential with rate μ1 or μ2) is determined by I, which is independent of X_(1), it follows that A is independent of X_(1). Therefore,
Var(X_(2)) = Var(X_(1)) + Var(A).
With p = μ1/(μ1 + μ2) we obtain, upon conditioning on I,
E[A] = p/μ2 + (1 − p)/μ1,
E[A²] = 2p/μ2² + 2(1 − p)/μ1².
Therefore,
Var(A) = 2p/μ2² + 2(1 − p)/μ1² − [p/μ2 + (1 − p)/μ1]².
Thus,
Var(X_(2)) = 1/(μ1 + μ2)² + 2p/μ2² + 2(1 − p)/μ1² − [p/μ2 + (1 − p)/μ1]².

(a) P_1 = …
(b) P_2 = …
(c) E[T] = 1/λ + 1/μ1 + P_1/μ2 + …

E[time] = E[time waiting at 1] + 1/μ1 + E[time waiting at 2] + 1/μ2.
Now,
E[time waiting at 1] = 1/μ1,
E[time waiting at 2] = (1/μ2)[μ1/(μ1 + μ2)].
The last equation follows by conditioning on whether or not the customer waits for server 2. Therefore,
E[time] = 2/μ1 + (1/μ2)[1 + μ1/(μ1 + μ2)].

E[time] = E[time waiting for server 1] + 1/μ1 + E[time waiting for server 2] + 1/μ2.
Now, the time spent waiting for server 1 is the remaining service time of the customer with server 1 plus any additional time due to that customer blocking your entrance. If server 1 finishes before server 2, this additional time will equal the additional service time of the customer with server 2. Therefore,
E[time waiting for server 1] = 1/μ1 + E[Additional] = 1/μ1 + (1/μ2)[μ1/(μ1 + μ2)].
Since when you enter service with server 1 the customer preceding you will be entering service with server 2, it follows that you will have to wait for server 2 if you finish service first. Therefore, conditioning on whether or not you finish first,
E[time waiting for server 2] = (1/μ2)[μ1/(μ1 + μ2)].
Thus,
E[time] = 2/μ1 + (2/μ2)[μ1/(μ1 + μ2)] + 1/μ2.

(a) …
(b) (1/2)^{n−1}: whenever battery 1 is in use and a failure occurs, the probability is 1/2 that it is not battery 1 that has failed.
(c) …
(d) T is the sum of n − 1 independent exponentials with rate 2μ (since each time a failure occurs the time until the next failure is exponential with rate 2μ).
(e) Gamma with parameters n − 1 and 2μ.

Let T_i denote the time between the (i − 1)st and the ith job completion. Then the T_i are independent, with T_i, i = 1, ..., n − 1, being exponential with rate μ1 + μ2. With probability μ1/(μ1 + μ2), T_n is exponential with rate μ2, and with probability μ2/(μ1 + μ2) it is exponential with rate μ1. Therefore,
E[T] = Σ_{i=1}^{n−1} E[T_i] + E[T_n] = (n − 1)/(μ1 + μ2) + [μ1/(μ1 + μ2)](1/μ2) + [μ2/(μ1 + μ2)](1/μ1),
Var(T) = Σ_{i=1}^{n−1} Var(T_i) + Var(T_n) = (n − 1)/(μ1 + μ2)² + Var(T_n).
Now use
Var(T_n) = E[T_n²] − (E[T_n])² = [μ1/(μ1 + μ2)](2/μ2²) + [μ2/(μ1 + μ2)](2/μ1²) − (E[T_n])².

Parts (a) and (b) follow upon integration. For part (c), condition on which of X or Y is larger and use the lack of memory property to conclude that the amount by which it is larger is exponential with rate λ. For instance, for x < 0,
f_{X−Y}(x) dx = P{X < Y} P{−x < Y − X < −x + dx | Y > X} = (1/2) λ e^{λx} dx.
For (d) and (e), condition on I.

(a) …  (b) …  (c) …

(a) …  (b) …  (c) …  (d) …

For both parts, condition on which item fails first.

(c) f_{X+Y}(t) = ∫ f_X(x) f_Y(t − x) dx = …

(b) E[min(X, Y) | X > Y, X − Y > t] = E[min(X, Y) | X > Y], where the final equality follows from (a).

38. Let k = min(n, m), and condition on N_2(t):
P{N_1(t) = n, N_2(t) = m} = Σ_{j=0}^{k} …
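The mean and variance of X_(2) = max(X1, X2) obtained earlier in these solutions (via X_(2) = X_(1) + A with A independent of X_(1)) can be spot-checked by simulation. The following is an added sketch, not part of the manual; the rates are illustrative.

```python
# Added simulation sketch: check E[max(X1,X2)] and Var(max(X1,X2)) for
# independent exponentials against the decomposition X_(2) = X_(1) + A.
import random

def max_moments_exact(mu1, mu2):
    p = mu1 / (mu1 + mu2)                         # P{X1 < X2}
    ea = p / mu2 + (1 - p) / mu1                  # E[A]
    ea2 = 2 * p / mu2**2 + 2 * (1 - p) / mu1**2   # E[A^2]
    mean = 1 / (mu1 + mu2) + ea
    var = 1 / (mu1 + mu2) ** 2 + (ea2 - ea**2)
    return mean, var

def max_moments_sim(mu1, mu2, trials=300_000, seed=2):
    rng = random.Random(seed)
    xs = [max(rng.expovariate(mu1), rng.expovariate(mu2)) for _ in range(trials)]
    m = sum(xs) / trials
    v = sum((x - m) ** 2 for x in xs) / (trials - 1)
    return m, v

if __name__ == "__main__":
    print("exact    :", max_moments_exact(1.0, 2.0))   # mean 7/6, var 33/36
    print("simulated:", max_moments_sim(1.0, 2.0))
```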
39. (a) 196/2.5 = 78.4.
(b) 196/(2.5)² = 31.36.
We use the central limit theorem to justify approximating the life distribution by a normal distribution with mean 78.4 and standard deviation √31.36 = 5.6. In the following, Z is a standard normal random variable.
(c) P{L < 67.2} ≈ P{Z < (67.2 − 78.4)/5.6} = P{Z < −2} = .0227.
(d) P{L > 90} ≈ P{Z > (90 − 78.4)/5.6} = P{Z > 2.07} = .0192.
(e) P{L > 100} ≈ P{Z > (100 − 78.4)/5.6} = P{Z > 3.857} = .00006.

40. The easiest way is to use Definition 5.1. It is easy to see that {N(t), t ≥ 0} will also possess stationary and independent increments. Since the sum of two independent Poisson random variables is also Poisson, it follows that N(t) is a Poisson random variable with mean (λ1 + λ2)t.

λ1/(λ1 + λ2).

(a) E[S_4] = 4/λ.
(b) E[S_4 | N(1) = 2] = 1 + E[time for 2 more events] = 1 + 2/λ.
(c) E[N(4) − N(2) | N(1) = 3] = E[N(4) − N(2)] = 2λ.
The first equality used the independent increments property.

Let S_i denote the service time at server i, i = 1, 2, and let X denote the time until the next arrival. Then, with p denoting the proportion of customers that are served by both servers, we have
p = P{X > S_1 + S_2} = P{X > S_1} P{X > S_1 + S_2 | X > S_1} = [μ1/(μ1 + λ)][μ2/(μ2 + λ)].

(a) …
(b) Let W denote the waiting time and let X denote the time until the first car. Then
E[W] = ∫_0^∞ E[W | X = x] λe^{−λx} dx
= ∫_0^T E[W | X = x] λe^{−λx} dx + ∫_T^∞ E[W | X = x] λe^{−λx} dx
= ∫_0^T (x + E[W]) λe^{−λx} dx + ∫_T^∞ T λe^{−λx} dx.
Hence,
E[W] = T + e^{λT} ∫_0^T x λe^{−λx} dx.

E[N(T)] = E[E[N(T) | T]] = E[λT] = λE[T],
E[TN(T)] = E[E[TN(T) | T]] = E[TλT] = λE[T²],
E[N²(T)] = E[E[N²(T) | T]] = E[λT + (λT)²] = λE[T] + λ²E[T²].
Hence,
Cov(T, N(T)) = λE[T²] − E[T]·λE[T] = λσ²
and
Var(N(T)) = λE[T] + λ²E[T²] − (λE[T])² = λμ + λ²σ².

E[Σ_{i=1}^{N(t)} X_i] = E[N(t)] μ = μλt,
E[N(t) Σ_{i=1}^{N(t)} X_i] = E[E[N(t) Σ_{i=1}^{N(t)} X_i | N(t)]] = E[N(t)·N(t)μ] = μE[N²(t)] = μ(λt + λ²t²).
Therefore,
Cov(N(t), Σ_{i=1}^{N(t)} X_i) = μ(λt + λ²t²) − (λt)(μλt) = μλt.

(a) 1/(2μ) + 1/…
(b) Let T_i denote the time until both servers are busy when you start with i busy servers, i = 0, 1. Then
E[T_0] = 1/λ + E[T_1].
Now, starting with 1 server busy, let T be the time until the first event (arrival or departure); let X = 1 if the first event is an arrival and let it be 0 if it is a departure; and let Y be the additional time after the first event until both servers are busy. Then
E[T_1] = E[T] + E[Y] = 1/(λ + μ) + E[Y | X = 1] λ/(λ + μ) + E[Y | X = 0] μ/(λ + μ).
Since E[Y | X = 1] = 0 and E[Y | X = 0] = E[T_0], the last two equations can be solved for E[T_0] and E[T_1].
(c) Let L_i denote the time until a customer is lost when you start with i busy servers. Then, reasoning as in part (b), conditioning on whether the first event is an arrival or a departure gives equations for the E[L_i] that can be solved.

Given T, the time until the next arrival, N, the number of busy servers found by the next arrival, is a binomial random variable with parameters n and p = e^{−μt}.
(a) E[N] = ∫_0^∞ E[N | T = t] λe^{−λt} dt = ∫_0^∞ n e^{−μt} λe^{−λt} dt = nλ/(λ + μ).
For (b) and (c), you can either condition on T, or use the approach of part (a) of Exercise 11 to obtain
P{N = n − i} = …

(a) P{N(T) − N(s) = 1} = λ(T − s) e^{−λ(T−s)}.
(b) Differentiating the expression in part (a) and then setting it equal to 0 gives
e^{−λ(T−s)} = λ(T − s) e^{−λ(T−s)},
implying that the maximizing value is s = T − 1/λ.
(c) For s = T − 1/λ, we have that λ(T − s) = 1 and thus
P{N(T) − N(s) = 1} = e^{−1}.
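The identities Cov(T, N(T)) = λσ² and Var(N(T)) = λμ + λ²σ², derived above for a Poisson process observed over an independent random time T, can be verified numerically. The sketch below is an addition (not in the manual); T is taken uniform on (0, 2) and λ = 3 purely for illustration.

```python
# Added check of Cov(T, N(T)) = lambda*Var(T) and
# Var(N(T)) = lambda*E[T] + lambda^2*Var(T), with T uniform(0,2).
import random

def count_events(rate, horizon, rng):
    # Number of Poisson(rate) events in [0, horizon], built from
    # exponential interarrival times.
    n, t = 0, rng.expovariate(rate)
    while t <= horizon:
        n += 1
        t += rng.expovariate(rate)
    return n

def simulate(rate=3.0, trials=100_000, seed=3):
    rng = random.Random(seed)
    ts, ns = [], []
    for _ in range(trials):
        T = rng.uniform(0.0, 2.0)
        ts.append(T)
        ns.append(count_events(rate, T, rng))
    mt, mn = sum(ts) / trials, sum(ns) / trials
    cov = sum((t - mt) * (n - mn) for t, n in zip(ts, ns)) / (trials - 1)
    var_n = sum((n - mn) ** 2 for n in ns) / (trials - 1)
    return cov, var_n

if __name__ == "__main__":
    rate, mu, sigma2 = 3.0, 1.0, 1.0 / 3.0   # uniform(0,2): mean 1, variance 1/3
    cov, var_n = simulate(rate)
    print("Cov(T,N(T)) sim:", cov,   " theory:", rate * sigma2)
    print("Var(N(T))   sim:", var_n, " theory:", rate * mu + rate**2 * sigma2)
```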
Let T denote the time until the next train arrives; T is uniform on (0, 1). Note that, conditional on T, X is Poisson with mean 7T.
(a) E[X] = E[E[X | T]] = E[7T] = 7/2.
(b) E[X | T] = 7T, Var(X | T) = 7T. By the conditional variance formula,
Var(X) = 7E[T] + 49 Var(T) = 7/2 + 49/12 = 91/12.

Condition on X, the time of the first accident, to obtain
E[N(t)] = ∫_0^∞ E[N(t) | X = s] βe^{−βs} ds = ∫_0^t [1 + α(t − s)] βe^{−βs} ds.

This is the gambler's ruin probability that, starting with k, the gambler's fortune reaches 2k before 0 when her probability of winning each bet is p = λ1/(λ1 + λ2). The desired probability is
[1 − (λ2/λ1)^k] / [1 − (λ2/λ1)^{2k}].

(a) …  (b) …

(a) P{L_1 = 0} = e^{−λ…}
(b) P{L_1 < x} = …
(c) P{R_1 = …} = e^{−λ…}
(d) P{R_1 > x} = …, and so
E[R_1] = m + ∫ P{R_1 > x} dx = m + …
Now, using that P{L_1 > x} = 1 − P{L_1 ≤ x}, …

P{T = t | N = n} = P{N = n | T = t} P{T = t} / P{N = n} = … ,
where the last equality follows since the probabilities must sum to 1.
(b) The Poisson events are broken into two classes, those that cause failure and those that do not. By Proposition 5.2, this results in two independent Poisson processes with respective rates λp and λ(1 − p). By independence it follows that, given that the first event of the first process occurred at time t, the number of events of the second process by this time is Poisson with mean λ(1 − p)t.

74. (a) Since each item will, independently, be found with probability 1 − e^{−μt}, it follows that the number found by time t is Poisson distributed with mean λ(1 − e^{−μt}). Hence, the total expected return is Rλ(1 − e^{−μt}) − Ct.
(b) Calculus now yields that the maximizing value of t is given by
t = (1/μ) log(Rλμ/C),
provided that Rλμ > C; if the inequality is reversed then t = 0 is best.
(c) Since the number of items not found by any time t is independent of the number found (since each of the Poisson number of items will independently either be counted with probability 1 − e^{−μt} or uncounted with probability e^{−μt}), there is no added gain in letting the decision on whether to stop at time t depend on the number already found.

75. (a) {Y_n} is a Markov chain with transition probabilities given by
P_{0,j} = a_j,  P_{i,j} = …,  where a_j = …
(b) {X_n} is a Markov chain with transition probabilities
P_{i, i+1−j} = …,  P_{i,0} = …

76. Let Y denote the number of customers served in a busy period. Note that, given S, the service time of the initial customer in the busy period, it follows by the argument presented in the text that the conditional distribution of Y − 1 is that of the compound Poisson random variable Σ_{i=1}^{N(S)} Y_i, where the Y_i have the same distribution as does Y. Hence,
E[Y | S] = 1 + λS E[Y],
Var(Y | S) = λS E[Y²].
Therefore,
E[Y] = 1/(1 − λE[S]).
Also, by the conditional variance formula,
Var(Y) = λE[S]E[Y²] + (λE[Y])² Var(S) = λE[S]Var(Y) + λE[S](E[Y])² + (λE[Y])² Var(S),
implying that
Var(Y) = [λE[S](E[Y])² + (λE[Y])² Var(S)] / (1 − λE[S]).

77. (a) …
(b) Conditioning on N yields the solution; namely …
(c) E[W] = …

78. Poisson with mean 63.

79. Consider a Poisson process with rate λ in which an event at time t is counted with probability λ(t)/λ, independently of the past. Clearly such a process will have independent increments. In addition,
P{2 or more counted events in (t, t + h)} ≤ P{2 or more events in (t, t + h)} = o(h)
and
P{1 counted event in (t, t + h)}
= P{1 counted | 1 event} P{1 event} + P{1 counted | ≥ 2 events} P{≥ 2 events}
= [λ(t)/λ](λh + o(h)) + o(h) = λ(t)h + o(h).

80. (a) No. (b) No.
(c) P{T_1 > t} = P{N(t) = 0} = e^{−m(t)}, where m(t) = ∫_0^t λ(s) ds.
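The optimal stopping time in Solution 74(b), t* = (1/μ) log(Rλμ/C), can be checked against a direct numerical maximization of the expected return Rλ(1 − e^{−μt}) − Ct. The following is an added sketch with arbitrary illustrative parameter values.

```python
# Added check that t* = (1/mu)*ln(R*lam*mu/C) maximizes
#   f(t) = R*lam*(1 - exp(-mu*t)) - C*t   when R*lam*mu > C.
import math

def expected_return(t, R, lam, mu, C):
    return R * lam * (1.0 - math.exp(-mu * t)) - C * t

def analytic_optimum(R, lam, mu, C):
    return (1.0 / mu) * math.log(R * lam * mu / C) if R * lam * mu > C else 0.0

def grid_optimum(R, lam, mu, C, t_max=50.0, steps=200_000):
    best_t, best_v = 0.0, expected_return(0.0, R, lam, mu, C)
    for i in range(1, steps + 1):
        t = t_max * i / steps
        v = expected_return(t, R, lam, mu, C)
        if v > best_v:
            best_t, best_v = t, v
    return best_t

if __name__ == "__main__":
    R, lam, mu, C = 10.0, 2.0, 0.5, 1.0
    print("analytic t*:", analytic_optimum(R, lam, mu, C))   # about 4.605
    print("grid     t*:", grid_optimum(R, lam, mu, C))
```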
81. (a) Let S_i denote the time of the ith event, i ≥ 1. For t_1 < t_2 < ··· < t_n < t,
f_{S_1,...,S_n | N(t) = n}(t_1, ..., t_n) = λ(t_1)···λ(t_n) e^{−m(t)} / [e^{−m(t)} (m(t))^n / n!] = n! Π_{i=1}^{n} [λ(t_i)/m(t)],
and the right-hand side is seen to be the joint density function of the order statistics from a set of n independent random variables from the distribution with density function f(x) = λ(x)/m(t), x ≤ t.
(b) Let N(t) denote the number of injuries by time t. Now, given N(t) = n, it follows from part (a) that the n injury instances are independent and identically distributed. The probability (density) that an arbitrary one of those injuries was at s is λ(s)/m(t), and so the probability that the injured party will still be out of work at time t is
p = ∫_0^t P{out of work at t | injured at s} [λ(s)/m(t)] ds = ∫_0^t [1 − F(t − s)] [λ(s)/m(t)] ds.
Hence, as each of the N(t) injured parties has the same probability p of being out of work at t, we see that
E[X(t) | N(t)] = N(t)p,
and thus
E[X(t)] = p E[N(t)] = p m(t) = ∫_0^t [1 − F(t − s)] λ(s) ds.

82. Interpret N as a number of events, and correspond X_i to the ith event. Let I_1, ..., I_k be k nonoverlapping intervals. Say that an event from N is a type j event if its corresponding X lies in I_j, j = 1, ..., k, and say that it is a type k + 1 event otherwise. It then follows that the numbers of type j, j = 1, ..., k, events—call these numbers N(I_j), j = 1, ..., k—are independent Poisson random variables with respective means
E[N(I_j)] = λ P{X ∈ I_j} = λ ∫_{I_j} f(s) ds.
The independence of the N(I_j) establishes that the process {N(t)} has independent increments. Because N(t + h) − N(t) is Poisson distributed with mean
λ ∫_t^{t+h} f(s) ds = λf(t)h + o(h),
it follows that
P{N(t + h) − N(t) = 0} = e^{−[λf(t)h + o(h)]} = 1 − λf(t)h + o(h),
P{N(t + h) − N(t) = 1} = [λf(t)h + o(h)] e^{−[λf(t)h + o(h)]} = λf(t)h + o(h).
As the preceding also implies that
P{N(t + h) − N(t) ≥ 2} = o(h),
the verification is complete.

Since m(t) is increasing, it follows that nonoverlapping time intervals of the {N(t)} process will correspond to nonoverlapping intervals of the {N_0(t)} process. As a result, the independent increment property will also hold for the {N(t)} process. For the remainder we will use the identity
m(t + h) = m(t) + λ(t)h + o(h).
P{N(t + h) − N(t) ≥ 2} = P{N_0[m(t + h)] − N_0[m(t)] ≥ 2} = P{N_0[m(t) + λ(t)h + o(h)] − N_0[m(t)] ≥ 2} = o(λ(t)h + o(h)) = o(h).
P{N(t + h) − N(t) = 1} = P{N_0[m(t) + λ(t)h + o(h)] − N_0[m(t)] = 1} = P{1 event of a Poisson process in an interval of length λ(t)h + o(h)} = λ(t)h + o(h).

There is a record whose value is between t and t + dt if the first X larger than t lies between t and t + dt. From this we see that, independent of all record values less than t, there will be one between t and t + dt with probability λ(t)dt, where λ(t) is the failure rate function given by
λ(t) = f(t)/[1 − F(t)].
Since the counting process of record values has, by the above, independent increments, we can conclude (since there cannot be multiple record values because the X_i are continuous) that it is a nonhomogeneous Poisson process with intensity function λ(t). When f is the exponential density, λ(t) = λ and so the counting process of record values becomes an ordinary Poisson process with rate λ.

$40,000 and $1.6 × 10^…

(a) By conditioning on the type of year,
P{N(t) = n} = P{good} e^{−3t}(3t)^n/n! + P{bad} e^{−5t}(5t)^n/n!.
(b) No.
(c) Yes. The probability of n events in any interval of length t will, by conditioning on the type of year, be as given in (a).
(d) No. Knowing how many storms occur in an interval changes the probability that it is a good year, and this affects the probability distribution of the number of storms in other intervals.
(e) P{good | N(1) = 3}
= P{N(1) = 3 | good}P{good} / [P{N(1) = 3 | good}P{good} + P{N(1) = 3 | bad}P{bad}]
= e^{−3}3³ P{good} / [e^{−3}3³ P{good} + e^{−5}5³ P{bad}].

Cov(X(t), X(t + s)) = Cov(X(t), X(t) + X(t + s) − X(t))
= Cov(X(t), X(t)) + Cov(X(t), X(t + s) − X(t))
= Cov(X(t), X(t)) by independent increments
= Var(X(t)) = λt E[Y²].

Let X(15) denote the daily withdrawal. Its mean and variance are as follows:
E[X(15)] = 12 · 15 · 30 = 5400,
Var[X(15)] = 12 · 15 · [30 · 30 + 50 · 50] = 612,000.
Hence,
P{X(15) ≤ 6000} = P{(X(15) − 5400)/√612,000 ≤ (6000 − 5400)/√612,000} = P{Z ≤ .767} ≈ .78,
where Z is a standard normal, from Table 7.1 of Chapter 2.
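The record-value result above (for i.i.d. exponential(λ) observations, the record values form an ordinary Poisson process with rate λ) lends itself to a quick simulation check: the number of record values not exceeding t should be Poisson with mean λt, so its sample mean and variance should both be near λt. This sketch is an addition, with arbitrary illustrative parameters.

```python
# Added check: record values of i.i.d. exponential(lam) data form a
# Poisson(lam) process, so #{record values <= t} ~ Poisson(lam*t).
import random

def records_up_to(t, lam, n, rng):
    # Count record values <= t in a long i.i.d. exponential(lam) sequence.
    count, running_max = 0, float("-inf")
    for _ in range(n):
        x = rng.expovariate(lam)
        if x > running_max:
            running_max = x
            if x <= t:
                count += 1
            else:
                break   # every later record would exceed t as well
    return count

if __name__ == "__main__":
    lam, t, trials = 1.0, 3.0, 20_000
    rng = random.Random(4)
    counts = [records_up_to(t, lam, n=5_000, rng=rng) for _ in range(trials)]
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / (trials - 1)
    print("mean of N(t):", mean, " (theory:", lam * t, ")")
    print("variance    :", var,  " (theory:", lam * t, ")")
```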
interval changes the probability that it is a ‘good year and this affects the probability dis- tribution of the number of storms in other intervals (©) P{good|N(1) = 3} __P{N(Q) = 3igood} P{good} PING) = Sigood}P{good} + PINT), ‘= Sibad}P {bad} Te (5E)" nl - (38/393 (AF 33 + e537 Cool X(t), X(E + 9) = CoulX(0),X() + XE+ 5) — X(0)] = CoolX(0), X(O] + CoulX(H), XC +5) ~ XO] CoolX(#), X(®)] by independent increments = VarlX()] = MED] Let X(13) denote the daily withdrawal. Its mean and variance areas follows: F[X(15)] = 12-15-30 = 5400 Var[X(15)] = 12-15 - [30-30 + 50-50] = 612,000 Hence, P{X(15) < 6000) “| <0 voI2, 000 Ve12, 000, = P(Z < .767} where Zis a standard normal = 78 from Table 7.1 of Chapter 2. Let T; denote the arrival time of the first type i shock, i= 1,2,3. 90, a 92, POX > 5X2 > = PAT > 5,7, >5,Ta >1,Ta >t = P{Ty > 8,T2 > t,Ts > max(s,t)} me hte ate Prat P(X; > 5} =P{X: > 5,X2 > 0} ee meta To begin, note that Pix > 3x = P(X, > Xa}P(%, — Xz > XalXi > Xa} = P(X, = Xp — Xp > Xl, > Xp + Xa} = PUK, — Mav Nyt > XelXy > Me + Xya) = ayy Hence, ofors $x} Sfx fu} enj2-t Mal) =D 1, if bug icontributes 2 errors by ¢ wert (9, ods and so FIM,(9] = P(N) =2) = De “Ore (2) max(X;, Xz) + min(X, Xs) = Xy + Xo () This can be done by induction max{(Xyy- Xe) = max(Xj,max(Xo, Xp) Xn) min(X,,max(Xq, 1%) —max(min(X;,X2), = Xp4 max(Xp, Xe) = Xy+ max(Xp, min (Xi, Xq)} Now use the induction hypothesis. 94, “Answers and Solutions Asecond method is as follows: Suppose X; < Xp <-+- < Xq. Then the coeffi- 95, cient of X; on the right side is 1 [ 7 ] +[2']-["s]+ =qa-1"* 0, ifn ~ (1, isn and so both sides equal X,. By symmetry the result follows for all other possible orderings of the X's. (0) Taking expectations of (b) where X; is the time of the first event of the i" process yields DAT LDesayrt i rs + LEDs +207 roa oa [3a 9%. @ PX>H = P{no events in a circle of area rf} aot «i rads [” Poc> a lo = [ore = [Pe *2dx byx=ivDr 97. Van to = wi where the last equality follows since Lvir [~ Pade = 1/2 since it represents the probability that a standard variable is greater than its mean. normal random a [ age ata FILING) = nj = L——____— [sore onras Conditioning on L yields FIN@)IN® =n) = FLEIN(S)|N() =n, LIIN® =n) = Ela +L(5~0|N() =n =n + (8 DELLING =n] For (c), use that for any value of L, given that there have been m eventsby time tthe set of n event times are distributed as the set of n independent uniform, (0,t) random variables. Thus, for s < EIN(G)|N( = nl = ns/t EING)N(|L] ELEING@)N@)L, NOL] EING)EIN(@)L,NOIILL Stl E[NP()|L] + L¢- 9ELNG)|L] EIN()ING) + Let Ls + (Ls)? + (¢—s)sL? Thus, CookN(s).N(O) = sm + stm — stn With C= 1/P(V@) =n), we have han (Ale) = Ce-™ 2 pov PA fanio() mT = Ke“ ODA where K does not depend on , But we recognize the preceding as the gamma density with param- eters m +m, + , which is thus the conditional density.
