STOCHASTIC PROCESSES
Second Edition

Sheldon M. Ross
University of California, Berkeley

JOHN WILEY & SONS, INC.
New York * Chichester * Brisbane * Toronto * Singapore

ACQUISITIONS EDITOR Brad Wiley II
MARKETING MANAGER Debra Riegert
SENIOR PRODUCTION EDITOR Tony VenGraitis
MANUFACTURING MANAGER Dorothy Sinclair
TEXT AND COVER DESIGN A Good Thing, Inc.
PRODUCTION COORDINATION Elm Street Publishing Services, Inc.

This book was set in Times Roman by Bi-Comp, Inc. and printed and bound by Courier/Stoughton. The cover was printed by Phoenix Color.

Recognizing the importance of preserving what has been written, it is a policy of John Wiley & Sons, Inc. to have books of enduring value published in the United States printed on acid-free paper, and we exert our best efforts to that end.

The paper in this book was manufactured by a mill whose forest management programs include sustained yield harvesting of its timberlands. Sustained yield harvesting principles ensure that the number of trees cut each year does not exceed the amount of new growth.

Copyright © 1996, by John Wiley & Sons, Inc.
All rights reserved. Published simultaneously in Canada.

Reproduction or translation of any part of this work beyond that permitted by Sections 107 and 108 of the 1976 United States Copyright Act without the permission of the copyright owner is unlawful. Requests for permission or further information should be addressed to the Permissions Department, John Wiley & Sons, Inc.

Library of Congress Cataloging-in-Publication Data:

Ross, Sheldon M.
Stochastic processes / Sheldon M. Ross.—2nd ed.
Includes bibliographical references and index.
ISBN 0-471-12062-6 (acid-free paper)
1. Stochastic processes. I. Title.
QA274.R65 1996
519.2—dc20

Printed in the United States of America
10 9 8 7 6 5 4 3 2

On March 30, 1980, a beautiful six-year-old girl died. This book is dedicated to the memory of
Nichole Pornaras

Preface to the First Edition

This text is a nonmeasure theoretic introduction to stochastic processes, and as such assumes a knowledge of calculus and elementary probability. In it we attempt to present some of the theory of stochastic processes, to indicate its diverse range of applications, and also to give the student some probabilistic intuition and insight in thinking about problems. We have attempted, wherever possible, to view processes from a probabilistic instead of an analytic point of view. This attempt, for instance, has led us to study most processes from a sample path point of view.

I would like to thank Mark Brown, Cyrus Derman, Shun-Chen Niu, Michael Pinedo, and Zvi Schechner for their helpful comments.

SHELDON M. ROSS

Preface to the Second Edition

The second edition of Stochastic Processes includes the following changes:

(i) Additional material in Chapter 2 on compound Poisson random variables, including an identity that can be used to efficiently compute moments, and which leads to an elegant recursive equation for the probability mass function of a nonnegative integer valued compound Poisson random variable;

(ii) A separate chapter (Chapter 6) on martingales, including sections on the Azuma inequality; and

(iii) A new chapter (Chapter 10) on Poisson approximations, including both the Stein-Chen method for bounding the error of these approximations and a method for improving the approximation itself.

In addition, we have added numerous exercises and problems throughout the text. Additions to individual chapters follow:

In Chapter 1, we have new examples on the probabilistic method, the multivariate normal distribution, random walks on graphs, and the complete match problem.
Also, we have new sections on probability inequalities (including Chernoff bounds) and on Bayes estimators (showing that they are almost never unbiased). A proof of the strong law of large numbers is given in the Appendix to this chapter.

New examples on patterns and on memoryless optimal coin tossing strategies are given in Chapter 3.

There is new material in Chapter 4 covering the mean time spent in transient states, as well as examples relating to the Gibbs sampler, the Metropolis algorithm, and the mean cover time in star graphs.

Chapter 5 includes an example on a two-sex population growth model.

Chapter 6 has additional examples illustrating the use of the martingale stopping theorem.

Chapter 7 includes new material on Spitzer's identity and using it to compute mean delays in single-server queues with gamma-distributed interarrival and service times.

Chapter 8 on Brownian motion has been moved to follow the chapter on martingales to allow us to utilize martingales to analyze Brownian motion.

The chapter on stochastic order relations now includes a section on associated random variables, and there are new examples utilizing coupling in coupon collecting and in packing problems.

I would like to thank all those who were kind enough to write and send comments about the first edition, with particular thanks to He Sheng-wu, Stephen Herschkorn, Robert Kertz, James Matis, Erol Pekoz, and Maria Rieders for their many helpful comments.

SHELDON M. ROSS

Contents

CHAPTER 1. PRELIMINARIES 1
1.1. Probability 1
1.2. Random Variables 7
1.3. Expected Value 9
1.4. Moment Generating, Characteristic Functions, and Laplace Transforms 15
1.5. Conditional Expectation 20
1.5.1. Conditional Expectations and Bayes Estimators 33
1.6. The Exponential Distribution, Lack of Memory, and Hazard Rate Functions 35
1.7. Some Probability Inequalities 39
1.8. Limit Theorems 41
1.9. Stochastic Processes 41
Problems 46
References 55
Appendix 56

CHAPTER 2. THE POISSON PROCESS 59
2.1. The Poisson Process 59
2.2. Interarrival and Waiting Time Distributions 64
2.3. Conditional Distribution of the Arrival Times 66
2.3.1. The M/G/1 Busy Period 73
2.4. Nonhomogeneous Poisson Process 78
2.5. Compound Poisson Random Variables and Processes 82
2.5.1. A Compound Poisson Identity 84
2.5.2. Compound Poisson Processes 87
2.6. Conditional Poisson Processes 88
Problems 89
References 97

CHAPTER 3. RENEWAL THEORY 98
3.1. Introduction and Preliminaries 98
3.2. Distribution of N(t) 99
3.3. Some Limit Theorems 102
3.3.1. Wald's Equation 104
3.3.2. Back to Renewal Theory 106
3.4. The Key Renewal Theorem and Applications 109
3.4.1. Alternating Renewal Processes 114
3.4.2. Limiting Mean Excess and Expansion of m(t) 119
3.4.3. Age-Dependent Branching Processes 121
3.5. Delayed Renewal Processes 123
3.6. Renewal Reward Processes 132
3.6.1. A Queueing Application 138
3.7. Regenerative Processes 140
3.7.1. The Symmetric Random Walk and the Arc Sine Laws 142
3.8. Stationary Point Processes 149
Problems 153
References 161

CHAPTER 4. MARKOV CHAINS 163
4.1. Introduction and Examples 163
4.2. Chapman-Kolmogorov Equations and Classification of States 167
4.3. Limit Theorems 173
4.4. Transitions among Classes, the Gambler's Ruin Problem, and Mean Times in Transient States 185
4.5. Branching Processes 191
4.6. Applications of Markov Chains 193
4.6.1. A Markov Chain Model of Algorithmic Efficiency 193
4.6.2. An Application to Runs—A Markov Chain with a Continuous State Space 195
4.6.3. List Ordering Rules—Optimality of the Transposition Rule 198
4.7. Time-Reversible Markov Chains 203
4.8. Semi-Markov Processes 213
Problems 219
References 230

CHAPTER 5. CONTINUOUS-TIME MARKOV CHAINS 231
5.1. Introduction 231
5.2. Continuous-Time Markov Chains 231
5.3. Birth and Death Processes 233
5.4. The Kolmogorov Differential Equations 239
5.4.1. Computing the Transition Probabilities 249
5.5. Limiting Probabilities 251
5.6. Time Reversibility 257
5.6.1. Tandem Queues 262
5.6.2. A Stochastic Population Model 263
5.7. Applications of the Reversed Chain to Queueing Theory 270
5.7.1. Network of Queues 271
5.7.2. The Erlang Loss Formula 275
5.7.3. The M/G/1 Shared Processor System 278
5.8. Uniformization 282
Problems 286
References 294

CHAPTER 6. MARTINGALES 295
Introduction 295
6.1. Martingales 295
6.2. Stopping Times 298
6.3. Azuma's Inequality for Martingales 305
6.4. Submartingales, Supermartingales, and the Martingale Convergence Theorem 313
6.5. A Generalized Azuma Inequality 319
Problems 322
References 327

CHAPTER 7. RANDOM WALKS 328
Introduction 328
7.1. Duality in Random Walks 329
7.2. Some Remarks Concerning Exchangeable Random Variables 338
7.3. Using Martingales to Analyze Random Walks 341
7.4. Applications to G/G/1 Queues and Ruin Problems 344
7.4.1. The G/G/1 Queue 344
7.4.2. A Ruin Problem 347
7.5. Blackwell's Theorem on the Line 349
Problems 352
References 355

CHAPTER 8. BROWNIAN MOTION AND OTHER MARKOV PROCESSES 356
8.1. Introduction and Preliminaries 356
8.2. Hitting Times, Maximum Variable, and Arc Sine Laws 363
8.3. Variations on Brownian Motion 366
8.3.1. Brownian Motion Absorbed at a Value 366
8.3.2. Brownian Motion Reflected at the Origin 368
8.3.3. Geometric Brownian Motion 368
8.3.4. Integrated Brownian Motion 369
8.4. Brownian Motion with Drift 372
8.4.1. Using Martingales to Analyze Brownian Motion 381
8.5. Backward and Forward Diffusion Equations 383
8.6. Applications of the Kolmogorov Equations to Obtaining Limiting Distributions 385
8.6.1. Semi-Markov Processes 385
8.6.2. The M/G/1 Queue 388
8.6.3. A Ruin Problem in Risk Theory 392
8.7. A Markov Shot Noise Process 393
8.8. Stationary Processes 396
Problems 399
References 403

CHAPTER 9. STOCHASTIC ORDER RELATIONS 404
Introduction 404
9.1. Stochastically Larger 404
9.2. Coupling 409
9.2.1. Stochastic Monotonicity Properties of Birth and Death Processes 416
9.2.2. Exponential Convergence in Markov Chains 418
9.3. Hazard Rate Ordering and Applications to Counting Processes 420
9.4. Likelihood Ratio Ordering 428
9.5. Stochastically More Variable 433
9.6. Applications of Variability Orderings 437
9.6.1. Comparison of G/G/1 Queues 439
9.6.2. A Renewal Process Application 440
9.6.3. A Branching Process Application 443
9.7. Associated Random Variables 446
Problems 449
References 456

CHAPTER 10. POISSON APPROXIMATIONS 457
Introduction 457
10.1. Brun's Sieve 457
10.2. The Stein-Chen Method for Bounding the Error of the Poisson Approximation 462
10.3. Improving the Poisson Approximation 467
Problems 470
References 472

ANSWERS AND SOLUTIONS TO SELECTED PROBLEMS 473

INDEX 505

CHAPTER 1

Preliminaries

1.1 PROBABILITY

A basic notion in probability theory is that of the random experiment: an experiment whose outcome cannot be determined in advance. The set of all possible outcomes of an experiment is called the sample space of that experiment, and we denote it by S.

An event is a subset of a sample space, and is said to occur if the outcome of the experiment is an element of that subset. We shall suppose that for each event E of the sample space S a number P(E) is defined and satisfies the following three axioms*:

Axiom (1) $0 \le P(E) \le 1$.

Axiom (2) $P(S) = 1$.

Axiom (3) For any sequence of events $E_1, E_2, \ldots$ that are mutually exclusive, that is, events for which $E_i E_j = \emptyset$ when $i \ne j$ (where $\emptyset$ is the null set),

$$P\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} P(E_i).$$

We refer to P(E) as the probability of the event E.

Some simple consequences of axioms (1), (2), and (3) are:

1.1.1 If $E \subset F$, then $P(E) \le P(F)$.
1.1.2 $P(E^c) = 1 - P(E)$, where $E^c$ is the complement of E.
1.1.3 $P(\bigcup_{i=1}^{n} E_i) = \sum_{i=1}^{n} P(E_i)$ when the $E_i$ are mutually exclusive.
1.1.4 $P(\bigcup_{i=1}^{\infty} E_i) \le \sum_{i=1}^{\infty} P(E_i)$.

The inequality (1.1.4) is known as Boole's inequality.

* Actually P(E) will only be defined for the so-called measurable events of S. But this restriction need not concern us.
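As a quick numerical illustration of Boole's inequality (1.1.4), not part of the original text, the following sketch takes the sample space S = [0, 1) with the uniform probability and three arbitrarily chosen overlapping events; the estimated probability of their union never exceeds the sum of the individual probabilities.

```python
import numpy as np

# Illustrative sketch (not from the text): S = [0, 1) with the uniform probability,
# approximated by Monte Carlo sampling; the events E1, E2, E3 are arbitrary choices.
rng = np.random.default_rng(1)
u = rng.random(100_000)                      # sampled outcomes from S

E1 = u < 0.30
E2 = (u > 0.20) & (u < 0.55)
E3 = u > 0.80

p_union = np.mean(E1 | E2 | E3)              # P(E1 U E2 U E3), exactly 0.75 here
boole   = E1.mean() + E2.mean() + E3.mean()  # right side of (1.1.4), exactly 0.85
print(p_union, boole, p_union <= boole)      # the union bound holds
```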
An important property of the probability function P is that it is continuous. To make this more precise, we need the concept of a limiting event, which we define as follows: A sequence of events $\{E_n, n \ge 1\}$ is said to be an increasing sequence if $E_n \subset E_{n+1}$, $n \ge 1$, and is said to be decreasing if $E_n \supset E_{n+1}$, $n \ge 1$. If $\{E_n, n \ge 1\}$ is an increasing sequence of events, then we define a new event, denoted by $\lim_{n\to\infty} E_n$, by

$$\lim_{n\to\infty} E_n = \bigcup_{n=1}^{\infty} E_n \quad \text{when } E_n \subset E_{n+1},\ n \ge 1.$$

Similarly, if $\{E_n, n \ge 1\}$ is a decreasing sequence, then define $\lim_{n\to\infty} E_n$ by

$$\lim_{n\to\infty} E_n = \bigcap_{n=1}^{\infty} E_n \quad \text{when } E_n \supset E_{n+1},\ n \ge 1.$$

We may now state the following:

PROPOSITION 1.1.1

If $\{E_n, n \ge 1\}$ is either an increasing or decreasing sequence of events, then

$$\lim_{n\to\infty} P(E_n) = P\left(\lim_{n\to\infty} E_n\right).$$

Proof. Suppose, first, that $\{E_n, n \ge 1\}$ is an increasing sequence, and define events $F_n$, $n \ge 1$, by

$$F_1 = E_1, \qquad F_n = E_n \left(\bigcup_{i=1}^{n-1} E_i\right)^c = E_n E_{n-1}^c, \quad n > 1.$$

That is, $F_n$ consists of those points in $E_n$ that are not in any of the earlier $E_i$, $i < n$. It is easy to verify that the $F_n$ are mutually exclusive events such that

$$\bigcup_{i=1}^{\infty} F_i = \bigcup_{i=1}^{\infty} E_i \quad \text{and} \quad \bigcup_{i=1}^{n} F_i = \bigcup_{i=1}^{n} E_i \quad \text{for all } n \ge 1.$$

Thus

$$P\left(\bigcup_{1}^{\infty} E_i\right) = P\left(\bigcup_{1}^{\infty} F_i\right) = \sum_{1}^{\infty} P(F_i) \qquad \text{(by Axiom (3))}$$

$$= \lim_{n\to\infty} \sum_{1}^{n} P(F_i) = \lim_{n\to\infty} P\left(\bigcup_{1}^{n} F_i\right) = \lim_{n\to\infty} P\left(\bigcup_{1}^{n} E_i\right) = \lim_{n\to\infty} P(E_n),$$

which proves the result when $\{E_n, n \ge 1\}$ is increasing.

If $\{E_n, n \ge 1\}$ is a decreasing sequence, then $\{E_n^c, n \ge 1\}$ is an increasing sequence; hence,

$$P\left(\bigcup_{1}^{\infty} E_n^c\right) = \lim_{n\to\infty} P(E_n^c).$$

But, as $\bigcup_{1}^{\infty} E_n^c = \left(\bigcap_{1}^{\infty} E_n\right)^c$, we see that

$$1 - P\left(\bigcap_{1}^{\infty} E_n\right) = \lim_{n\to\infty} \left[1 - P(E_n)\right],$$

or, equivalently,

$$P\left(\bigcap_{1}^{\infty} E_n\right) = \lim_{n\to\infty} P(E_n),$$

which proves the result.

EXAMPLE 1.1(A) Consider a population consisting of individuals able to produce offspring of the same kind. The number of individuals initially present, denoted by $X_0$, is called the size of the zeroth generation. All offspring of the zeroth generation constitute the first generation and their number is denoted by $X_1$. In general, let $X_n$ denote the size of the nth generation. Since $X_n = 0$ implies that $X_{n+1} = 0$, it follows that $P\{X_n = 0\}$ is increasing and thus $\lim_{n\to\infty} P\{X_n = 0\}$ exists. What does it represent?

To answer this, use Proposition 1.1.1 as follows:

$$\lim_{n\to\infty} P\{X_n = 0\} = P\left(\lim_{n\to\infty} \{X_n = 0\}\right) = P\left(\bigcup_{n=1}^{\infty} \{X_n = 0\}\right) = P\{\text{the population ever dies out}\}.$$

That is, the limiting probability that the nth generation is void of individuals is equal to the probability of eventual extinction of the population.
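The following short simulation, which is our own illustration rather than part of the text, makes the limit in Example 1.1(A) concrete. It assumes, purely for the sake of the example, that each individual independently produces a Poisson-distributed number of offspring with mean 0.9 and that $X_0 = 1$; the Monte Carlo estimates of $P\{X_n = 0\}$ then increase in n (up to simulation noise) toward the extinction probability, which equals 1 when the mean number of offspring is at most 1.

```python
import numpy as np

# Hedged sketch: the Poisson(0.9) offspring law and X_0 = 1 are assumptions made
# only for illustration; Example 1.1(A) itself fixes no particular distribution.
rng = np.random.default_rng(0)

def nth_generation_size(n, offspring_mean=0.9, x0=1):
    """Simulate one population and return X_n."""
    x = x0
    for _ in range(n):
        # each of the x current individuals has a Poisson number of offspring
        x = int(rng.poisson(offspring_mean, size=x).sum()) if x > 0 else 0
    return x

def prob_extinct_by(n, runs=20_000):
    """Monte Carlo estimate of P{X_n = 0}."""
    return sum(nth_generation_size(n) == 0 for _ in range(runs)) / runs

for n in (1, 5, 10, 25):
    print(n, prob_extinct_by(n))   # nondecreasing in n, approaching P{extinction} = 1
```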
Proposition 1.1.1 can also be used to prove the Borel-Cantelli lemma.

PROPOSITION 1.1.2

The Borel-Cantelli Lemma. Let $E_1, E_2, \ldots$ denote a sequence of events. If

$$\sum_{i=1}^{\infty} P(E_i) < \infty,$$

then

$$P\{\text{an infinite number of the } E_i \text{ occur}\} = 0.$$

Proof. The event that an infinite number of the $E_i$ occur, called the lim sup of the $E_i$, can be expressed as

$$\limsup_{i\to\infty} E_i = \bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} E_i.$$

This follows since if an infinite number of the $E_i$ occur, then $\bigcup_{i=n}^{\infty} E_i$ occurs for each n, and thus $\bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} E_i$ occurs. On the other hand, if $\bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} E_i$ occurs, then $\bigcup_{i=n}^{\infty} E_i$ occurs for each n, and thus for each n at least one of the $E_i$ occurs where $i \ge n$; hence, an infinite number of the $E_i$ occur.

As $\bigcup_{i=n}^{\infty} E_i$, $n \ge 1$, is a decreasing sequence of events, it follows from Proposition 1.1.1 that

$$P\left(\bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} E_i\right) = \lim_{n\to\infty} P\left(\bigcup_{i=n}^{\infty} E_i\right) \le \lim_{n\to\infty} \sum_{i=n}^{\infty} P(E_i) = 0,$$

and the result is proven.

EXAMPLE 1.1(B) Let $X_1, X_2, \ldots$ be such that

$$P\{X_n = 0\} = 1/n^2 = 1 - P\{X_n = 1\}, \qquad n \ge 1.$$

If we let $E_n = \{X_n = 0\}$, then, as $\sum_n P(E_n) < \infty$, it follows from the Borel-Cantelli lemma that the probability that $X_n$ equals 0 for an infinite number of n is equal to 0. Hence, for all n sufficiently large, $X_n$ must equal 1, and so we may conclude that, with probability 1,

$$\lim_{n\to\infty} X_n = 1.$$

For a converse to the Borel-Cantelli lemma, independence is required.

PROPOSITION 1.1.3

Converse to the Borel-Cantelli Lemma. If $E_1, E_2, \ldots$ are independent events such that

$$\sum_{i=1}^{\infty} P(E_i) = \infty,$$

then

$$P\{\text{an infinite number of the } E_i \text{ occur}\} = 1.$$

Proof.

$$P\{\text{an infinite number of the } E_i \text{ occur}\} = P\left(\bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} E_i\right) = \lim_{n\to\infty} P\left(\bigcup_{i=n}^{\infty} E_i\right) = \lim_{n\to\infty} \left[1 - P\left(\bigcap_{i=n}^{\infty} E_i^c\right)\right].$$

Now,

$$P\left(\bigcap_{i=n}^{\infty} E_i^c\right) = \prod_{i=n}^{\infty} P(E_i^c) \qquad \text{(by independence)}$$

$$= \prod_{i=n}^{\infty} \left(1 - P(E_i)\right) \le \prod_{i=n}^{\infty} e^{-P(E_i)} \qquad \text{(by the inequality } 1 - x \le e^{-x}\text{)}$$

$$= \exp\left(-\sum_{i=n}^{\infty} P(E_i)\right) = 0,$$

since $\sum_{i=n}^{\infty} P(E_i) = \infty$ for all n. Hence the result follows.

EXAMPLE 1.1(C) Let $X_1, X_2, \ldots$ be independent and such that

$$P\{X_n = 0\} = 1/n = 1 - P\{X_n = 1\}, \qquad n \ge 1.$$

If we let $E_n = \{X_n = 0\}$, then, as $\sum_{n=1}^{\infty} P(E_n) = \infty$, it follows from Proposition 1.1.3 that $E_n$ occurs infinitely often. Also, as $\sum_{n=1}^{\infty} P(E_n^c) = \infty$, it follows that $E_n^c$ also occurs infinitely often. Hence, with probability 1, $X_n$ will equal 0 infinitely often and will also equal 1 infinitely often. Hence, with probability 1, $X_n$ will not approach a limiting value as $n \to \infty$.

1.2 RANDOM VARIABLES

Consider a random experiment having sample space S. A random variable X is a function that assigns a real value to each outcome in S. For any set of real numbers A, the probability that X will assume a value that is contained in the set A is equal to the probability that the outcome of the experiment is contained in $X^{-1}(A)$. That is,

$$P\{X \in A\} = P(X^{-1}(A)),$$

where $X^{-1}(A)$ is the event consisting of all points $s \in S$ such that $X(s) \in A$.

The distribution function F of the random variable X is defined for any real number x by

$$F(x) = P\{X \le x\} = P\{X \in (-\infty, x]\}.$$

We shall denote $1 - F(x)$ by $\bar{F}(x)$, and so

$$\bar{F}(a) = P\{X > a\}.$$

A random variable X is said to be discrete if its set of possible values is countable. For discrete random variables,

$$F(x) = \sum_{y \le x} P\{X = y\}.$$

A random variable is called continuous if there exists a function $f(x)$, called the probability density function, such that

$$P\{X \text{ is in } B\} = \int_B f(x)\, dx$$

for every set B. Since $F(x) = \int_{-\infty}^{x} f(y)\, dy$, it follows that

$$f(x) = \frac{d}{dx} F(x).$$

The joint distribution function F of two random variables X and Y is defined by

$$F(x, y) = P\{X \le x,\, Y \le y\}.$$

The distribution functions of X and Y,

$$F_X(x) = P\{X \le x\} \quad \text{and} \quad F_Y(y) = P\{Y \le y\},$$

can be obtained from F(x, y) by

$$F_X(x) = \lim_{y\to\infty} F(x, y), \qquad F_Y(y) = \lim_{x\to\infty} F(x, y).$$
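To tie the last few definitions together, here is a small numerical sketch of our own, not taken from the text. It assumes, purely for illustration, that X and Y are independent exponential random variables with rate 1, so that the joint distribution function factors as $F(x, y) = (1 - e^{-x})(1 - e^{-y})$ for $x, y \ge 0$; letting y grow recovers the marginal distribution function $F_X$, and a numerical derivative of $F_X$ recovers the density $f(x) = e^{-x}$.

```python
import numpy as np

# Hedged sketch: independent exponential(1) variables are an assumed toy choice.
def F_joint(x, y):
    """Joint distribution function F(x, y) = P{X <= x, Y <= y} for x, y >= 0."""
    return (1 - np.exp(-x)) * (1 - np.exp(-y))

def F_X(x):
    """Marginal distribution function of X."""
    return 1 - np.exp(-x)

x = 0.7
for y in (1.0, 5.0, 50.0):
    print(y, F_joint(x, y))        # increases toward F_X(0.7) ~ 0.5034 as y grows

# f(x) = dF/dx, checked with a symmetric difference quotient
h = 1e-5
print((F_X(x + h) - F_X(x - h)) / (2 * h), np.exp(-x))   # both ~ 0.4966
```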
