A First Course in Stochastic Processes. S. Karlin, H. M. Taylor.
1. REVIEW OF BASIC TERMINOLOGY

Let ||a_{ij}|| be an n x n symmetric positive definite matrix, let ||b_{ij}|| be the inverse matrix of ||a_{ij}||, and let B = det||b_{ij}|| be the determinant of ||b_{ij}||. Let m_i, i = 1, ..., n, be any real constants. The random variables X_1, ..., X_n are said to have a joint normal distribution if they possess a probability density function of the form

    p(x_1, \ldots, x_n) = \frac{\sqrt{B}}{(2\pi)^{n/2}} \exp\left( -\frac{1}{2} \sum_{i,j=1}^{n} b_{ij}(x_i - m_i)(x_j - m_j) \right).
The joint characteristic function is

    \phi(t_1, \ldots, t_n) = E\left[ \exp\left( i \sum_{j=1}^{n} t_j X_j \right) \right] = \exp\left( i \sum_{j=1}^{n} m_j t_j - \frac{1}{2} \sum_{j,k=1}^{n} a_{jk} t_j t_k \right).

From this one can compute

    E[X_j] = m_j,  j = 1, ..., n,

and

    E[(X_j - m_j)(X_k - m_k)] = a_{jk},

which justifies the name covariance matrix for the matrix ||a_{jk}||. From the nature of the characteristic function it is easily checked that X_1, ..., X_n have a joint normal distribution if and only if Y = \alpha_1 X_1 + \cdots + \alpha_n X_n has a normal distribution for every choice of real numbers \alpha_1, ..., \alpha_n.
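As a numeric illustration of these moment formulas, the following sketch samples a bivariate case of the joint normal distribution and checks the sample mean and covariance against m and ||a_{ij}||. The particular covariance matrix, means, and sample size are arbitrary illustrative choices, not values from the text.

```python
import math
import random

random.seed(1)

# Illustrative 2x2 symmetric positive definite matrix ||a_ij|| and
# mean vector (m_1, m_2); any such choice would do.
a11, a12, a22 = 1.0, 0.5, 2.0
m1, m2 = 3.0, -1.0

l11 = math.sqrt(a11)                 # Cholesky factor: a = L L^T
l21 = a12 / l11
l22 = math.sqrt(a22 - l21 * l21)

n = 50_000
xs, ys = [], []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    # (X_1, X_2) = m + L z has the joint normal law with covariance a.
    xs.append(m1 + l11 * z1)
    ys.append(m2 + l21 * z1 + l22 * z2)

mean1, mean2 = sum(xs) / n, sum(ys) / n
cov12 = sum((x - mean1) * (y - mean2) for x, y in zip(xs, ys)) / n
# mean1 ~ m_1, mean2 ~ m_2, and cov12 ~ a_12, matching
# E[(X_j - m_j)(X_k - m_k)] = a_jk.
```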
(b) The Multinomial Distribution

This is a discrete joint distribution of r variables in which only nonnegative integer values 0, 1, ..., n are possible. It is defined by

    \Pr\{X_1 = k_1, \ldots, X_r = k_r\} = \frac{n!}{k_1! \cdots k_r! (n - k_1 - \cdots - k_r)!} p_1^{k_1} \cdots p_r^{k_r} q^{n - k_1 - \cdots - k_r},

where p_i > 0, i = 1, ..., r, q = 1 - p_1 - \cdots - p_r \ge 0, and k_1 + \cdots + k_r \le n. The joint generating function is given by

    \phi(s_1, \ldots, s_r) = E[s_1^{X_1} \cdots s_r^{X_r}] = (p_1 s_1 + \cdots + p_r s_r + q)^n.

A sequence {a_n} of real numbers is said to converge to a real number a, written lim_{n \to \infty} a_n = a, if for every positive \epsilon there exists a number N(\epsilon) such that |a_n - a| < \epsilon for all n > N(\epsilon). There are several ways to generalize this concept to random variables. Let Z, Z_1, Z_2, ... be jointly distributed random variables.
(a) Convergence with probability one

We say Z_n converges to Z with probability one if

    \Pr\{\lim_{n \to \infty} Z_n = Z\} = 1.

In words, lim_{n \to \infty} Z_n = Z for a set of outcomes having total probability one.
(b) Convergence in probability

We say Z_n converges to Z in probability if for every positive \epsilon

    \lim_{n \to \infty} \Pr\{|Z_n - Z| > \epsilon\} = 0,

or equivalently, if for every positive \epsilon

    \lim_{n \to \infty} \Pr\{|Z_n - Z| \le \epsilon\} = 1.

In words, by taking n sufficiently large, one can achieve arbitrarily high probability that Z_n is arbitrarily close to Z.
(c) Convergence in quadratic mean

We say Z_n converges to Z in quadratic mean if

    \lim_{n \to \infty} E[|Z_n - Z|^2] = 0.

In words, by making n sufficiently large, one can ensure that Z_n is arbitrarily close to Z in the sense of mean square difference.
(d) Convergence in distribution (= convergence in law)

Let F(t) = \Pr\{Z \le t\} and F_k(t) = \Pr\{Z_k \le t\}, k = 1, 2, .... We say Z_k converges in distribution to Z (or F_k converges in distribution to F) if

    \lim_{k \to \infty} F_k(t) = F(t)

for all t at which F is continuous.
It can be proved that if Z_n converges to Z with probability one, then Z_n converges to Z in probability, and that this in turn implies that Z_n converges to Z in distribution. Thus convergence in distribution is the weakest form of convergence. In fact it can be shown that every family {F_n} of distribution functions contains a sequence {F_{n_k}} that converges to a function F at all t at which F is continuous (the Helly-Bray lemma), but F may not be a proper distribution in that F(\infty) may be less than one.
Many of the basic results of probability theory are in the form of limit theorems, and we will mention a few here. (We do not state these results under the weakest possible hypotheses.)

Let X_1, X_2, ... be independent identically distributed random variables with finite mean m. Let S_n = X_1 + \cdots + X_n and let \bar{X}_n = S_n / n be the sample mean.
Law of Large Numbers (Weak). \bar{X}_n converges in probability to m. That is, for any positive \epsilon

    \lim_{n \to \infty} \Pr\{|S_n/n - m| > \epsilon\} = 0.
Law of Large Numbers (Strong). \bar{X}_n converges to m with probability one. That is,

    \Pr\{\lim_{n \to \infty} \bar{X}_n = m\} = 1.
Central Limit Theorem. Suppose each X_k has the finite variance \sigma^2. Let Z_n = (S_n - nm)/(\sigma\sqrt{n}), and let Z be a normally distributed random variable having mean zero and variance one. Then Z_n converges in distribution to Z. That is, for all real a,

    \lim_{n \to \infty} \Pr\{Z_n \le a\} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{a} e^{-x^2/2}\,dx.
which gives the inequality. If X is a random variable with mean \mu and variance \sigma^2 and we apply (1.16) with Z = (X - \mu)^2, we obtain

    \Pr\{|X - \mu| > \epsilon\} = \Pr\{Z > \epsilon^2\} \le \frac{\sigma^2}{\epsilon^2}

(Chebyshev's inequality).
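A quick exact check of Chebyshev's bound, using a single fair die roll as an arbitrary illustrative choice of X (the distribution is not one used in the text):

```python
# Exact check of Pr{|X - mu| > eps} <= sigma^2 / eps^2 for one fair
# die roll: mu = 3.5 and sigma^2 = 35/12.
values = [1, 2, 3, 4, 5, 6]
prob = 1 / 6
mu = sum(v * prob for v in values)               # 3.5
var = sum((v - mu) ** 2 * prob for v in values)  # 35/12

for eps in (1.0, 1.5, 2.0):
    tail = sum(prob for v in values if abs(v - mu) > eps)
    bound = var / eps ** 2
    assert tail <= bound                         # Chebyshev's inequality
```

For eps = 2 the exact tail probability is 1/3 while the bound is 35/48, so the inequality holds with room to spare, as is typical.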
The Schwarz Inequality. Let X and Y be jointly distributed random variables having finite second moments. Then

    (E[XY])^2 \le E[X^2]\,E[Y^2].

Proof. For all real \lambda,

    0 \le E[(X + \lambda Y)^2] = E[X^2] + 2\lambda E[XY] + \lambda^2 E[Y^2].

Considered as a quadratic function of \lambda, there is, then, at most one real root. Equivalently, the discriminant of the quadratic expression is nonpositive. That is,

    4(E[XY])^2 - 4E[X^2]E[Y^2] \le 0,

which completes the proof.
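The inequality can be checked numerically on any finite joint distribution. The four value pairs below are a made-up illustrative choice, each taken with probability 1/4:

```python
# A bare-hands numeric instance of the Schwarz inequality
# (E[XY])^2 <= E[X^2] E[Y^2] on a small made-up joint distribution.
pairs = [(1.0, 2.0), (-2.0, 0.5), (3.0, -1.0), (0.0, 4.0)]
n = len(pairs)

e_xy = sum(x * y for x, y in pairs) / n
e_x2 = sum(x * x for x, _ in pairs) / n
e_y2 = sum(y * y for _, y in pairs) / n

assert e_xy ** 2 <= e_x2 * e_y2
```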
2. Two Simple Examples of Stochastic Processes

The developments in this book are intended to serve as an introduction to various aspects of stochastic processes. The theory of stochastic processes is concerned with the investigation of the structure of families of random variables X_t, where t is a parameter running over a suitable index set T. Sometimes, when no ambiguity can arise, we write X(t) instead of X_t.

A realization or sample function of a stochastic process {X_t, t \in T} is an assignment, to each t \in T, of a possible value of X_t. The index set T may correspond to discrete units of time T = {0, 1, 2, 3, ...}, and {X_n} could then represent the outcomes at successive trials, like the result of tossing a coin, the successive reactions of a subject to a learning experiment, or successive observations of some characteristic of a population.

The values of the X_t may be one-dimensional, two-dimensional, or n-dimensional, or even more general. In the case where X_n is the outcome of the nth toss of a die, its possible values are contained in the set {1, 2, 3, 4, 5, 6}, and a typical realization of the process would be 5, 1, 3,
2, 2, 4, 1, 6, 3, 6, .... This is shown schematically in Fig. 1, where the ordinate for t = n is the value of X_n.

[Fig. 1]

In this example, the random variables X_n are mutually independent, but generally the random variables X_t are dependent.

Stochastic processes for which T = [0, \infty) are particularly important in applications. Here t can usually be interpreted as time.

We will content ourselves, for the moment, with a very brief discussion of some of the concepts of stochastic processes and two examples thereof; a summary of various types of stochastic processes is presented at the end of the chapter, while the examples themselves will be treated in greater detail in succeeding chapters.
Example 1. A very important example is the celebrated Brownian motion process. This process has the following characteristics:

(a) Suppose t_1 < t_2 < \cdots < t_n.

... Assume f(0) > 0.

Hint: Show first that the joint density function of U and V is

    f_{U,V}(u, v) = f(u) f(u + |v|).

Next, equate this with the product of the marginal densities for U and V.
    \Pr\{X = x \mid X + Y = x + y\} = \frac{\binom{m}{x}\binom{n}{y}}{\binom{m+n}{x+y}}

for all nonnegative integers x and y, where m and n are given positive integers. Assume that \Pr\{X = 0\} and \Pr\{Y = 0\} are strictly positive. Show that both X and Y have binomial distributions with the same parameter p, the other parameters being m and n, respectively.
23. (a) Let X and Y be independent random variables such that

    \Pr\{X = k\} = f(k),  \Pr\{Y = k\} = g(k),  f(k) \ge 0,  g(k) \ge 0,  k = 0, 1, 2, ...,

and suppose

    \Pr\{X = k \mid X + Y = l\} = \binom{l}{k} p^k (1 - p)^{l-k},  k = 0, 1, \ldots, l.

Prove that

    f(k) = \frac{(\lambda\theta)^k}{k!} e^{-\lambda\theta},  g(k) = \frac{\theta^k}{k!} e^{-\theta},  k = 0, 1, 2, ...,

where \lambda = p/(1 - p) and \theta > 0 is arbitrary.

(b) Show that p is determined by the condition p/(1 - p) = E[X]/E[Y].

Hint: Let F(s) = \sum_i f(i)s^i and G(s) = \sum_i g(i)s^i. Establish first the relation

    F(u)G(v) = F(up + (1 - p)v)\,G(up + (1 - p)v).

1. ELEMENTS OF STOCHASTIC PROCESSES
24. Let X be a nonnegative integer-valued random variable with probability generating function f(s) = \sum_{n=0}^{\infty} a_n s^n. After observing X, conduct X binomial trials with probability p of success. Let Y denote the resulting number of successes.

(a) Determine the probability generating function of Y.

(b) Determine the probability generating function of X given that Y = X.

Solution: (a) f(1 - p + ps); (b) f(ps)/f(p).
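The identity in part (a) can be verified exactly by enumeration. The distribution of X and the value of p below are arbitrary illustrative choices:

```python
from math import comb

# Exact check of part (a): if X has pgf f(s) and Y counts successes in
# X Bernoulli(p) trials, then the pgf of Y is f(1 - p + ps).
px = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}       # Pr{X = n}, illustrative
p = 0.35

def f(s):                                   # pgf of X
    return sum(prob * s ** n for n, prob in px.items())

def pr_y(k):                                # Pr{Y = k}, conditioning on X
    return sum(prob * comb(n, k) * p ** k * (1 - p) ** (n - k)
               for n, prob in px.items() if n >= k)

for s in (0.0, 0.3, 0.7, 1.0):
    pgf_y = sum(pr_y(k) * s ** k for k in range(4))
    assert abs(pgf_y - f(1 - p + p * s)) < 1e-12
```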
25. (Continuation of Problem 24.) Suppose that for every p (0 < p < 1) ... > 0.
26. There are at least four schools of thought on the statistical distribution of stock price differences, or more generally, stochastic models for sequences of stock prices. In terms of number of followers, by far the most popular approach is that of the so-called "technical analyst," phrased in terms of short term trends, support and resistance levels, technical rebounds, and so on. Rejecting this technical viewpoint, two other schools agree that sequences of prices describe a random walk, where price changes are statistically independent of previous price history, but these schools disagree in their choice of the appropriate probability distributions. Some authors find price changes to have a normal distribution while the other group finds a distribution with "fatter" tail probabilities, and perhaps even an infinite variance. Finally, a fourth group (overlapping with the preceding two) admits the random walk as a first-order approximation but notes recognizable second-order effects.

This exercise is to show a compatibility between the middle two groups. It has been noted that those that find price changes to be normal typically measure the changes over a fixed number of transactions, while those that find the larger tail probabilities typically measure price changes over a fixed time period that may contain a random number of transactions. Let Z be the price change. Use as the measure of "fatness" (and there could be dispute about this) the coefficient of excess

    \gamma_2 = \frac{m_4}{(m_2)^2} - 3,

where m_k is the kth moment of Z about its mean.

Suppose on each transaction that the price advances by one unit, or lowers by one unit, each with equal probability. Let N be the number of transactions and write Z = X_1 + \cdots + X_N, where the X_k's are independent and identically distributed random variables, each equally likely to be +1 or -1. Compute \gamma_2 for Z: (a) when N is a fixed number a, and (b) when N has a Poisson distribution with mean a.
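A Monte Carlo sketch of the comparison the problem asks for (not the exact analytic computation it requires): with a = 4, an arbitrary illustrative choice, the excess kurtosis of Z is negative for fixed N = a but positive when N is Poisson with mean a, so the random transaction count alone produces fatter tails.

```python
import math
import random

random.seed(7)

a, trials = 4, 100_000

def poisson(lam):              # Knuth's method, adequate for small lam
    limit, k, prod = math.exp(-lam), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

def walk(n):                   # Z = X_1 + ... + X_n, steps of +/-1
    return sum(random.choice((-1, 1)) for _ in range(n))

def excess(zs):                # gamma_2 = m_4 / m_2^2 - 3
    mean = sum(zs) / len(zs)
    m2 = sum((z - mean) ** 2 for z in zs) / len(zs)
    m4 = sum((z - mean) ** 4 for z in zs) / len(zs)
    return m4 / m2 ** 2 - 3

g_fixed = excess([walk(a) for _ in range(trials)])
g_poisson = excess([walk(poisson(a)) for _ in range(trials)])
# Exact values are gamma_2 = -2/a for fixed N and +1/a for Poisson N.
```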
27. Consider an infinite number of urns into which we toss balls independently, in such a way that a ball falls into the kth urn with probability 1/2^k, k = 1, 2, 3, .... For each positive integer N, let Z_N be the number of urns which contain at least one ball after a total of N balls have been tossed. Show that

    E(Z_N) = \sum_{k=1}^{\infty} \left[ 1 - (1 - 2^{-k})^N \right],

and that there exist constants C_1 > 0 and C_2 > 0 such that

    C_1 \log N \le E(Z_N) \le C_2 \log N  for all N.

... where p \ge 0, q \ge 0, r \ge 0, and p + q + r = 1. Conventionally, "symmetric random walk" refers only to the case r = 0, p = q = 1/2.
2. EXAMPLES OF MARKOV CHAINS

Motivated by consideration of certain physical models, we are led to the study of random walks on the set of the nonnegative integers. We classify the different processes by the nature of the zero state. Let us fix attention on the random walk described by (2.2). If p_0 = 1, and therefore r_0 = 0, we have a situation where the zero state acts like a reflecting barrier. Whenever the particle reaches the zero state, the next transition automatically returns it to state one. This corresponds to the physical process in which an elastic wall exists at zero, and the particle bounces off with no after-effects.

If p_0 = 0 and r_0 = 1, then 0 acts as an absorbing barrier. Once the particle reaches zero it remains there forever. If p_0 > 0 and r_0 > 0, then 0 is a partially reflecting barrier.

When the random walk is restricted to a finite number of states S, say 0, 1, 2, ..., a, then both the states 0 and a, independently and in any combination, may be reflecting, absorbing, or partially reflecting barriers. We have already encountered a model (gambler's ruin, involving two adversaries with finite resources) of a random walk confined to the states S where 0 and a are absorbing [see (2.3)].
A classical mathematical model of diffusion through a membrane is the famous Ehrenfest model, namely, a random walk on a finite set of states whereby the boundary states are reflecting. The random walk is restricted to the states i = -a, -a + 1, ..., -1, 0, 1, ..., a, with transition probability matrix

    P_{ij} = \begin{cases} \dfrac{a - i}{2a}, & j = i + 1, \\[4pt] \dfrac{a + i}{2a}, & j = i - 1, \\ 0, & \text{otherwise.} \end{cases}
The physical interpretation of this model is as follows. Imagine two containers containing a total of 2a balls. Suppose the first container, labeled A, holds k balls and the second container, B, holds 2a - k balls. A ball is selected at random (all selections are equally likely) from among the totality of the 2a balls and moved to the other container. Each selection generates a transition of the process. Clearly the balls fluctuate between the two containers with a drift from the one with the larger concentration of balls to the one with the smaller concentration of balls. A physical system which in the main is governed by a set of restoring forces essentially proportional to the distance from an equilibrium position may sometimes be approximated by this Ehrenfest model.
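A quick computational check of this transition matrix: rows sum to one, and the binomial distribution \pi_j = \binom{2a}{j+a}/2^{2a} on the states -a, ..., a is stationary for the chain (a standard fact, verified here numerically; the value a = 5 is an arbitrary choice).

```python
from math import comb

# Ehrenfest chain on states i = -a, ..., a.
a = 5
states = range(-a, a + 1)

def P(i, j):
    if j == i + 1:
        return (a - i) / (2 * a)
    if j == i - 1:
        return (a + i) / (2 * a)
    return 0.0

for i in states:                 # rows sum to one
    assert abs(sum(P(i, j) for j in states) - 1.0) < 1e-12

pi = {j: comb(2 * a, j + a) / 2 ** (2 * a) for j in states}
for j in states:
    # probability flowing into j in one step equals pi_j: pi P = pi
    assert abs(sum(pi[i] * P(i, j) for i in states) - pi[j]) < 1e-12
```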
The classical symmetric random walk in n dimensions admits the following formulation. The state space is identified with the set of all integral lattice points in E^n (Euclidean n-space); that is, a state is an n-tuple k = (k_1, k_2, ..., k_n) of integers. The transition probability matrix is defined by

    P_{k,l} = \begin{cases} \dfrac{1}{2n}, & \sum_{i=1}^{n} |l_i - k_i| = 1, \\ 0, & \text{otherwise.} \end{cases}

Analogous to the one-dimensional case, the symmetric random walk in E^n represents a discrete version of n-dimensional Brownian motion.
C. A DISCRETE QUEUEING MARKOV CHAIN

Customers arrive for service and take their place in a waiting line. During each period of time a single customer is served, provided that at least one customer is present. If no customer awaits service then during this period no service is performed. (We can imagine, for example, a taxi stand at which a cab arrives at fixed time intervals to give service. If no one is present the cab immediately departs.) During a service period new customers may arrive. We suppose the actual number of arrivals in the nth period is a random variable \xi_n whose distribution function is independent of the period and is given by

    \Pr\{k \text{ customers arrive in a service period}\} = a_k,  k = 0, 1, 2, ...,

where a_k \ge 0 and \sum_{k=0}^{\infty} a_k = 1. (2.4)

We also assume the r.v.'s \xi_n are independent. The state of the system at the start of each period is defined to be the number of customers waiting in line for service. If the present state is i, then after the lapse of one period the state is

    j = \begin{cases} i - 1 + \xi & \text{if } i \ge 1, \\ \xi & \text{if } i = 0, \end{cases}  (2.5)

where \xi is the number of new customers having arrived in this period while a single customer was served. In terms of the random variables of the process we can express (2.5) formally as

    X_{n+1} = (X_n - 1)^+ + \xi_n,

where Y^+ = max(Y, 0). In view of (2.4) and (2.5) the transition probability matrix may be readily calculated, and we obtain

    || a_0  a_1  a_2  a_3  ... ||
P = || a_0  a_1  a_2  a_3  ... ||
    || 0    a_0  a_1  a_2  ... ||
    || 0    0    a_0  a_1  ... ||
    || .                       ||
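The matrix can also be written as a function of (i, j), which makes its banded structure easy to verify. The arrival law below is an illustrative choice in which at most two customers arrive per period:

```python
# Transition matrix of the queueing chain X_{n+1} = (X_n - 1)^+ + xi_n.
a = [0.5, 0.3, 0.2]            # a_k = Pr{k customers arrive in a period}

def P(i, j):
    k = j - max(i - 1, 0)      # arrivals needed to carry state i to j
    return a[k] if 0 <= k < len(a) else 0.0

# Rows 0 and 1 coincide (serving does nothing to an empty queue), each
# later row is the previous one shifted right, and every row sums to 1.
assert all(P(0, j) == P(1, j) for j in range(10))
for i in range(6):
    assert abs(sum(P(i, j) for j in range(10)) - 1.0) < 1e-12
```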
It is intuitively clear that if the expected number of new customers, \sum_{k=0}^{\infty} k a_k, that arrive during a service period exceeds 1, then certainly with the passage of time the length of the waiting line increases without limit.

On the other hand, if \sum_{k=0}^{\infty} k a_k < 1, then we shall see that the length of the waiting line approaches an equilibrium (stationary state). If \sum_{k=0}^{\infty} k a_k = 1, a situation of gross instability develops. These statements will be formally elaborated after we have set forth the relevant theory of recurrence (see Section 5, Chapter 3).
D. INVENTORY MODEL

Consider a situation in which a commodity is stocked in order to satisfy a continuing demand. We assume that the replenishing of stock takes place at successive times t_1, t_2, ..., and we assume that the cumulative demand for the commodity over the interval (t_{n-1}, t_n) is a random variable \xi_n whose distribution function is independent of the time period,

    \Pr\{\xi_n = k\} = a_k,  k = 0, 1, 2, ...,  (2.8)

where a_k \ge 0 and \sum_{k=0}^{\infty} a_k = 1. The stock level is examined at the start of each period. An inventory policy is prescribed by specifying two nonnegative critical values s and S > s. The implementation of the inventory policy is as follows: If the available stock quantity is not greater than s, then immediate procurement is done so as to bring the quantity of stock on hand up to the level S. If, however, the available stock
is in excess of s, then no replenishment of stock is undertaken. Let X_n denote the stock on hand just prior to restocking at t_n. The states of the process {X_n} consist of the possible values of the stock size

    S, S - 1, ..., +1, 0, -1, -2, ...,

where a negative value is interpreted as an unfulfilled demand for stock, which will be satisfied immediately upon restocking. According to the rules of the inventory policy,

    X_{n+1} = \begin{cases} X_n - \xi_{n+1} & \text{if } s < X_n \le S, \\ S - \xi_{n+1} & \text{if } X_n \le s, \end{cases}  (2.9)

where \xi_n is the quantity of demand that arises in the nth period, based on the probability law (2.8). If we assume the \xi_n to be mutually independent, then the stock values X_0, X_1, X_2, ... plainly constitute a Markov chain whose transition probability matrix can be calculated in accordance with the relation (2.9).
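A short simulation sketch of the chain defined by (2.9). The demand law {a_k} and the critical levels s, S below are illustrative choices, not values from the text; note how negative states (backlogged demand) can appear and how restocking caps the chain at S.

```python
import random

random.seed(11)

s, S = 2, 5
demand_p = [0.2, 0.3, 0.2, 0.2, 0.1]    # a_k = Pr{xi = k}, k = 0..4

def step(x):
    xi = random.choices(range(5), weights=demand_p)[0]
    # Restock up to S when the level has fallen to s or below,
    # then subtract the period's demand, as in (2.9).
    return (x - xi) if x > s else (S - xi)

x = S
history = []
for _ in range(10_000):
    x = step(x)
    history.append(x)
# Stock never exceeds S; backlog is bounded below by
# (s + 1) - max demand = 3 - 4 = -1 for this demand law.
```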
E. SUCCESS RUNS

Consider a Markov chain on the nonnegative integers with transition probability matrix of the form

           || p_0  q_0  0    0    ... ||
||P_{ij}|| = || p_1  0    q_1  0    ... ||      (2.10)
           || p_2  0    0    q_2  ... ||
           || p_3  0    0    0    ... ||
           || .                       ||

where q_i > 0, p_i > 0, and p_i + q_i = 1, i = 0, 1, 2, .... The zero state plays a distinguished role here in that it can be reached in one transition from any other state, while state i + 1 can be reached only from state i. This example is very easy to compute with, and we will therefore frequently illustrate concepts and results in terms of it.
A special case of this transition matrix arises when one is dealing with success runs resulting from repeated trials, each of which admits two possible outcomes, success (S) or failure (F). More explicitly, consider a sequence of trials with two possible outcomes (S) or (F). Moreover, suppose that in each trial the probability of (S) is \alpha and the probability of (F) is \beta = 1 - \alpha. We say a success run of length r happened at trial n if the outcomes in the preceding r + 1 trials, including the present trial as the last, were respectively F, S, S, ..., S. Let us now label the present state of the process by the length of the success run currently under way. In particular, if the last trial resulted in a failure then the state is zero. Similarly, when the preceding r + 1 trials in order had the outcomes F, S, S, ..., S, the state variable would carry the label r. The process is clearly Markovian (since the individual trials were independent of each other), and its transition matrix has the form (2.10), where

    p_i = \beta,  q_i = \alpha,  i = 0, 1, 2, ....
F. BRANCHING PROCESSES

Suppose an organism at the end of its lifetime produces a random number \xi of offspring with probability distribution

    \Pr\{\xi = k\} = a_k,  k = 0, 1, 2, ...,  (2.11)

where, as usual, a_k \ge 0 and \sum_{k=0}^{\infty} a_k = 1. We assume that all offspring act independently of each other and at the end of their lifetimes (for simplicity, the lifespans of all organisms are assumed to be the same) individually have progeny in accordance with the probability distribution (2.11), thus propagating their species. The process {X_n}, where X_n is the population size at the nth generation, is a Markov chain. In fact, the only relevant knowledge regarding the distribution of X_{n+1} given X_0, X_1, ..., X_n is the value of X_n.
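A minimal simulation sketch of such a branching (Galton-Watson) process; the offspring law below is an arbitrary illustrative choice with mean 0.5 + 2(0.3) = 1.1, not a distribution from the text.

```python
import random

random.seed(5)

a = [0.2, 0.5, 0.3]            # a_k = Pr{xi = k}, k = 0, 1, 2

def next_generation(x):
    # Given X_n = x, the next size is a sum of x i.i.d. offspring
    # counts; earlier generations are irrelevant (the Markov property).
    return sum(random.choices(range(3), weights=a)[0] for _ in range(x))

x = 1                          # start from a single ancestor
sizes = [x]
for _ in range(30):
    x = next_generation(x)
    sizes.append(x)
    if x == 0:                 # extinction: 0 is an absorbing state
        break
```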
... there exist m, n \ge 1 such that P^m_{ij} > 0 and P^n_{ji} > 0. Let v \ge 0. We obtain, by the usual argument (see page 60),

    P^{m+n+v}_{ii} \ge P^m_{ij} P^v_{jj} P^n_{ji},

and, on summing,

    \sum_{v=0}^{\infty} P^{m+n+v}_{ii} \ge P^m_{ij} \left( \sum_{v=0}^{\infty} P^v_{jj} \right) P^n_{ji}.

Hence if \sum_v P^v_{jj} diverges, then \sum_v P^v_{ii} also diverges. This corollary proves that recurrence, like periodicity, is a class property; that is, all states in an equivalence class are either recurrent or nonrecurrent.

Remark. The expected number of returns to state i, given X_0 = i, is \sum_{n=1}^{\infty} P^n_{ii}. Thus Theorem 5.1 states that a state is recurrent if and only if the expected number of returns is infinite.
6. Examples of Recurrent Markov Chains
Example 1. Consider first the one-dimensional random walk on the positive and negative integers, where at each transition the particle moves with probability p one unit to the right and with probability q one unit to the left (p + q = 1). Hence

    P^{2n}_{00} = \binom{2n}{n} p^n q^n,  n = 1, 2, ...,  and  P^{2n+1}_{00} = 0.  (6.1)

We appeal now to Stirling's formula,

    n! \sim n^{n + 1/2} e^{-n} \sqrt{2\pi}.  (6.2)

Applying (6.2) to (6.1), we obtain

    P^{2n}_{00} \sim \frac{(4pq)^n}{\sqrt{\pi n}}.

It is readily verified that p(1 - p) = pq \le 1/4, with equality holding if and only if p = q = 1/2. Hence \sum_n P^{2n}_{00} = \infty if and only if p = 1/2, and from Theorem 5.1, the one-dimensional random walk is recurrent if and only if p = q = 1/2. Remember that recurrence is a class property. Intuitively, if p \ne q there is positive probability that a particle initially at the origin will drift to +\infty if p > q (to -\infty if p < q) without ever returning to the origin.
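The dichotomy is visible numerically in the partial sums of the asymptotic terms (4pq)^n / sqrt(pi n): for p = 1/2 they grow without bound (like 2 sqrt(N/pi)), while for any p != 1/2 they are dominated by a convergent geometric series. The values p = 0.6 and the cutoffs below are arbitrary illustrative choices.

```python
import math

def partial_sum(p, N):
    q = 1 - p
    return sum((4 * p * q) ** n / math.sqrt(math.pi * n)
               for n in range(1, N + 1))

sym_small = partial_sum(0.5, 1_000)      # ~ 2 sqrt(1000/pi)
sym_large = partial_sum(0.5, 100_000)    # roughly 10x larger: divergence
biased = partial_sum(0.6, 100_000)       # 4pq = 0.96 < 1: bounded by 24
```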
    f^{(1)}_{00} = p_0,  f^{(n)}_{00} = (1 - p_0)(1 - p_1) \cdots (1 - p_{n-2})\,p_{n-1},  n \ge 2.  (6.19)

Let

    u_k = (1 - p_0)(1 - p_1) \cdots (1 - p_k),  k = 0, 1, 2, ....

Then if we sum the f^{(n)}_{00}'s we have

    \sum_{n=1}^{N} f^{(n)}_{00} = (1 - u_0) + (u_0 - u_1) + \cdots + (u_{N-2} - u_{N-1}) = 1 - u_{N-1},

so that

    \sum_{n=1}^{\infty} f^{(n)}_{00} = 1 - \lim_{k \to \infty} u_k.
To complete our argument we need the following:

Lemma 6.1. If 0 \le p_k < 1, then \lim_{n \to \infty} \prod_{k=0}^{n} (1 - p_k) > 0 if and only if \sum_{k=0}^{\infty} p_k < \infty.
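The telescoping identity and the dichotomy of the lemma can both be checked numerically. This sketch uses the p_k notation of (6.19); the two sequences {p_k} below are illustrative choices, one with divergent sum (certain return) and one with convergent sum (return probability less than one).

```python
# Numeric check: with u_k = (1 - p_0)...(1 - p_k), the first-return
# probabilities f_00^(n) of the chain (2.10) sum to 1 - lim u_k.
def return_prob(p, N):
    """Return (sum_{n=1}^N f_00^(n), u_{N-1}) for p(k) = p_k."""
    total = p(0)               # f^(1) = p_0
    u = 1 - p(0)               # u_0
    for n in range(2, N + 1):
        total += u * p(n - 1)  # f^(n) = (1-p_0)...(1-p_{n-2}) p_{n-1}
        u *= 1 - p(n - 1)      # advance to u_{n-1}
    return total, u

# Constant p_k = 1/2: u_k -> 0, so return to state 0 is certain.
tot1, u1 = return_prob(lambda k: 0.5, 200)
# p_k = 1/(k+2)^2, with sum p_k < infinity: u_k stays bounded away
# from 0, so the total return probability is strictly less than one.
tot2, u2 = return_prob(lambda k: 1 / (k + 2) ** 2, 200)
```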