markov-chains
Published by elgianka on Jan 22, 2011

6. Appendix: probability and measure
6.6 Uniqueness of probabilities and independence of σ-algebras

For both discrete-time and continuous-time Markov chains we have given definitions which specify the probabilities of certain events determined by the process. From these specified probabilities we have often deduced explicitly the values of other probabilities, for example hitting probabilities. In this section we shall show, in measure-theoretic terms, that our definitions determine the probabilities of all events depending on the process. The constructive approach we have taken should make this seem obvious, but it is illuminating to see what has to be done.

Let Ω be a set. A π-system 𝒜 on Ω is a collection of subsets of Ω which is closed under finite intersections; thus

A, B ∈ 𝒜 ⇒ A ∩ B ∈ 𝒜.

We denote as usual by σ(𝒜) the σ-algebra generated by 𝒜. If σ(𝒜) = ℱ we say that 𝒜 generates ℱ.

Theorem 6.6.1. Let (Ω, ℱ) be a measurable space. Let P₁ and P₂ be probability measures on (Ω, ℱ) which agree on a π-system 𝒜 generating ℱ. Then P₁ = P₂.

Proof. Consider

𝒟 = {A ∈ ℱ : P₁(A) = P₂(A)}.

We have assumed that 𝒜 ⊆ 𝒟. Moreover, since P₁ and P₂ are probability measures, 𝒟 has the following properties:

(i) Ω ∈ 𝒟;

(ii) (A, B ∈ 𝒟 and A ⊆ B) ⇒ B \ A ∈ 𝒟;

(iii) (Aₙ ∈ 𝒟, Aₙ ↑ A) ⇒ A ∈ 𝒟.

Any collection of subsets having these properties is called a d-system. Since 𝒜 generates ℱ, the result now follows from the following lemma. □

Lemma 6.6.2 (Dynkin's π-system lemma). Let 𝒜 be a π-system and let 𝒟 be a d-system. Suppose 𝒜 ⊆ 𝒟. Then σ(𝒜) ⊆ 𝒟.

Proof. Any intersection of d-systems is again a d-system, so we may without loss assume that 𝒟 is the smallest d-system containing 𝒜. You may easily check that any d-system which is also a π-system is necessarily a σ-algebra, so it suffices to show 𝒟 is a π-system. This we do in two stages.

Consider first

𝒟₁ = {A ∈ 𝒟 : A ∩ B ∈ 𝒟 for all B ∈ 𝒜}.

Since 𝒜 is a π-system, 𝒜 ⊆ 𝒟₁. You may easily check that 𝒟₁ is a d-system, because 𝒟 is a d-system. Since 𝒟 is the smallest d-system containing 𝒜, this shows 𝒟₁ = 𝒟.

Next consider

𝒟₂ = {A ∈ 𝒟 : A ∩ B ∈ 𝒟 for all B ∈ 𝒟}.

Since 𝒟₁ = 𝒟, 𝒜 ⊆ 𝒟₂. You can easily check that 𝒟₂ is also a d-system. Hence also 𝒟₂ = 𝒟. But this shows 𝒟 is a π-system. □
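On a finite set the theorem can be checked by brute force. The following sketch (a hypothetical toy example, not from the text) takes two measures that agree on a one-set π-system and verifies that they agree on the whole generated σ-algebra, even though they differ on sets outside it.

```python
from itertools import combinations

def generate_sigma_algebra(omega, gen):
    """Close gen under complement and pairwise union (enough on a finite set)."""
    sets = {frozenset(a) for a in gen} | {frozenset(), frozenset(omega)}
    changed = True
    while changed:
        changed = False
        for a in list(sets):
            if frozenset(omega) - a not in sets:
                sets.add(frozenset(omega) - a)
                changed = True
        for a, b in combinations(list(sets), 2):
            if a | b not in sets:
                sets.add(a | b)
                changed = True
    return sets

omega = {1, 2, 3, 4}
A = [{1, 2}]                      # a single set is trivially a pi-system
F = generate_sigma_algebra(omega, A)   # = {empty, {1,2}, {3,4}, omega}

# Two different point-mass assignments that agree on A.
p1 = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
p2 = {1: 0.50, 2: 0.00, 3: 0.10, 4: 0.40}

def measure(p, s):
    return sum(p[x] for x in s)

# Agreement on the pi-system forces agreement on all of sigma(A) ...
assert all(abs(measure(p1, s) - measure(p2, s)) < 1e-12 for s in F)
# ... but says nothing about sets outside sigma(A), e.g. {1}.
assert measure(p1, {1}) != measure(p2, {1})
```

Note that the theorem is doing real work only because σ(𝒜) here is strictly smaller than the power set: the two measures disagree on {1} yet coincide on every event the π-system generates.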

The notion of independence used in advanced probability is the independence of σ-algebras. Suppose that (Ω, ℱ, P) is a probability space and ℱ₁ and ℱ₂ are sub-σ-algebras of ℱ. We say that ℱ₁ and ℱ₂ are independent if

P(A₁ ∩ A₂) = P(A₁)P(A₂)  for all A₁ ∈ ℱ₁, A₂ ∈ ℱ₂.

The usual means of establishing such independence is the following corollary of Theorem 6.6.1.

Theorem 6.6.3. Let 𝒜₁ be a π-system generating ℱ₁ and let 𝒜₂ be a π-system generating ℱ₂. Suppose that

P(A₁ ∩ A₂) = P(A₁)P(A₂)  for all A₁ ∈ 𝒜₁, A₂ ∈ 𝒜₂.

Then ℱ₁ and ℱ₂ are independent.


Proof. There are two steps. First fix A₂ ∈ 𝒜₂ with P(A₂) > 0 and consider the probability measure

P̃(A) = P(A ∩ A₂)/P(A₂).

We have assumed that P̃(A) = P(A) for all A ∈ 𝒜₁, so, by Theorem 6.6.1, P̃ = P on ℱ₁. Next fix A₁ ∈ ℱ₁ with P(A₁) > 0 and consider the probability measure

P̂(A) = P(A₁ ∩ A)/P(A₁).

We showed in the first step that P̂(A) = P(A) for all A ∈ 𝒜₂, so, by Theorem 6.6.1, P̂ = P on ℱ₂. Hence ℱ₁ and ℱ₂ are independent. □
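Theorem 6.6.3 can likewise be checked numerically on a small space. In this sketch (a hypothetical example, not from the text) Ω models two independent biased coin flips, and 𝒜₁, 𝒜₂ are one-set π-systems generating the σ-algebras of the first and second flip.

```python
from itertools import product

omega = set(product("HT", repeat=2))
# Product measure of a 0.7-biased and a 0.4-biased coin.
P = {(a, b): (0.7 if a == "H" else 0.3) * (0.4 if b == "H" else 0.6)
     for (a, b) in omega}

def prob(s):
    return sum(P[w] for w in s)

S1 = {w for w in omega if w[0] == "H"}   # generator of F1 (first flip)
S2 = {w for w in omega if w[1] == "H"}   # generator of F2 (second flip)

# sigma({S}) = {empty, S, complement of S, omega} for a single generator S.
F1 = [set(), S1, omega - S1, omega]
F2 = [set(), S2, omega - S2, omega]

# Independence on the pi-systems: P(S1 & S2) = P(S1) P(S2) ...
assert abs(prob(S1 & S2) - prob(S1) * prob(S2)) < 1e-12
# ... extends, as the theorem asserts, to the full sigma-algebras.
assert all(abs(prob(B1 & B2) - prob(B1) * prob(B2)) < 1e-12
           for B1 in F1 for B2 in F2)
```

Checking the single pair of generating events is one equation; the theorem then delivers the sixteen equations for the full σ-algebras for free.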

We now review some points in the main text where Theorems 6.6.1 and 6.6.3 are relevant.

In Theorem 1.1.1 we showed that our definition of a discrete-time Markov chain (Xₙ)ₙ≥₀ with initial distribution λ and transition matrix P determines the probabilities of all events of the form

{X₀ = i₀, X₁ = i₁, ..., Xₙ = iₙ}.

But subsequently we made explicit calculations for probabilities of events which were not of this form, such as the event that (Xₙ)ₙ≥₀ visits a set of states A. We note now that the events {X₀ = i₀, ..., Xₙ = iₙ} form a π-system which generates the σ-algebra σ(Xₙ : n ≥ 0). Hence, by Theorem 6.6.1, our definition determines (in principle) the probabilities of all events in this σ-algebra.
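The way cylinder probabilities determine other events can be made concrete by summation. The following sketch, on a hypothetical 3-state chain (not from the text), computes P(X₀ = i₀, ..., Xₙ = iₙ) = λ_{i₀} p_{i₀i₁} ⋯ p_{i_{n−1}iₙ} for every path, then sums cylinders to recover both the law of Xₙ and a hitting probability.

```python
import numpy as np
from itertools import product

# Hypothetical 3-state transition matrix and initial distribution.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.4, 0.6]])
lam = np.array([1.0, 0.0, 0.0])   # start in state 0

def cylinder_prob(path):
    """P(X0 = path[0], ..., Xn = path[n]) = lam_{i0} p_{i0 i1} ... p_{i(n-1) in}."""
    p = lam[path[0]]
    for i, j in zip(path, path[1:]):
        p *= P[i, j]
    return p

n = 3
paths = list(product(range(3), repeat=n + 1))

# Summing cylinders over the final coordinate recovers the law of Xn ...
dist = np.zeros(3)
for path in paths:
    dist[path[-1]] += cylinder_prob(path)
assert np.allclose(dist, lam @ np.linalg.matrix_power(P, n))

# ... and summing over paths that visit state 2 gives the probability
# of an event NOT of cylinder form: "the chain hits 2 by time n".
hit = sum(cylinder_prob(path) for path in paths if 2 in path)
```

For events over a finite horizon this summation is exact; Theorem 6.6.1 is what extends the determination to the full σ-algebra σ(Xₙ : n ≥ 0), where brute-force summation is no longer available.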

In our general discussion of continuous-time random processes in Section 2.2 we claimed that for a right-continuous process (Xₜ)ₜ≥₀ the probabilities of events of the form

{X_{t₀} = i₀, ..., X_{tₙ} = iₙ}

for all n ≥ 0 determined the probabilities of all events depending on (Xₜ)ₜ≥₀. Now events of the form {X_{t₀} = i₀, ..., X_{tₙ} = iₙ} form a π-system which generates the σ-algebra σ(Xₜ : t ≥ 0). So Theorem 6.6.1 justifies (a precise version of) this claim. The point about right-continuity is that without such an assumption an event such as

{Xₜ = i for some t > 0},

which might reasonably be considered to depend on (Xₜ)ₜ≥₀, is not necessarily measurable with respect to σ(Xₜ : t ≥ 0). An argument given in


Section 2.2 shows that this event is measurable in the right-continuous case.
We conclude that, without some assumption like right-continuity, general
continuous-time processes are unreasonable.
Consider now the method of describing a minimal right-continuous process (Xₜ)ₜ≥₀ via its jump process (Yₙ)ₙ≥₀ and holding times (Sₙ)ₙ≥₁. Let us take ℱ = σ(Xₜ : t ≥ 0). Then Lemmas 6.5.1 and 6.5.2 show that (Yₙ)ₙ≥₀ and (Sₙ)ₙ≥₁ are ℱ-measurable. Thus 𝒢 ⊆ ℱ where

𝒢 = σ((Yₙ)ₙ≥₀, (Sₙ)ₙ≥₁).

On the other hand, for all i ∈ I

{Xₜ = i} = ⋃_{n≥0} ({Jₙ ≤ t < Jₙ₊₁} ∩ {Yₙ = i}) ∈ 𝒢,

where Jₙ = S₁ + ⋯ + Sₙ is the time of the nth jump, so also ℱ ⊆ 𝒢.
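The identity above says that, on a finite initial segment of the path, Xₜ is recovered from the jump chain and holding times by Xₜ = Yₙ for Jₙ ≤ t < Jₙ₊₁. A minimal sketch (hypothetical helper names, not from the text):

```python
import bisect

def state_at(t, Y, S):
    """Return Y_n where J_n <= t < J_{n+1}, with J_n = S_1 + ... + S_n.

    Right-continuous in t by construction; clamps to the last recorded
    state beyond the final listed jump (a finite-path simplification).
    """
    J = [0.0]
    for s in S:
        J.append(J[-1] + s)
    n = bisect.bisect_right(J, t) - 1
    return Y[min(n, len(Y) - 1)]

Y = ["a", "b", "c"]   # jump chain
S = [1.5, 0.5]        # holding times, so J = [0, 1.5, 2.0]

assert state_at(0.0, Y, S) == "a"
assert state_at(1.4, Y, S) == "a"
assert state_at(1.5, Y, S) == "b"   # right-continuity at the jump time
assert state_at(2.0, Y, S) == "c"
```

The assertion at t = 1.5 is exactly where right-continuity enters: the value at a jump time is the post-jump state, which is what makes the map from (Y, S) back to X well defined.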

A useful π-system generating 𝒢 is given by sets of the form

B = {Y₀ = i₀, ..., Yₙ = iₙ, S₁ > s₁, ..., Sₙ > sₙ}.

Our jump chain/holding time definition of the continuous-time chain (Xₜ)ₜ≥₀ with initial distribution λ and generator matrix Q may be read as stating that, for such events,

ℙ(B) = λ_{i₀} π_{i₀i₁} ⋯ π_{i_{n−1}iₙ} e^{−q_{i₀}s₁} ⋯ e^{−q_{i_{n−1}}sₙ},

where (π_{ij}) is the jump matrix of Q. Then, by Theorem 6.6.1, this definition determines ℙ on 𝒢 and hence on ℱ.
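The display above is a finite product, so it is easy to evaluate. This sketch, for a hypothetical 3-state generator Q (not from the text), builds the jump matrix π with entries q_{ij}/q_i off the diagonal and multiplies jump probabilities against the exponential tails of the holding times.

```python
import numpy as np

# Hypothetical generator matrix: rows sum to zero, off-diagonals >= 0.
Q = np.array([[-2.0,  1.0,  1.0],
              [ 3.0, -4.0,  1.0],
              [ 0.0,  2.0, -2.0]])
q = -np.diag(Q)                   # holding rates q_i
Pi = Q / q[:, None]               # off-diagonal entries become q_ij / q_i
np.fill_diagonal(Pi, 0.0)         # jump chain never stays put

lam = np.array([1.0, 0.0, 0.0])   # start in state 0

def prob_B(states, times):
    """P(Y0 = i0, ..., Yn = in, S1 > s1, ..., Sn > sn) as in the display."""
    p = lam[states[0]]
    for k, s in enumerate(times):
        i, j = states[k], states[k + 1]
        p *= Pi[i, j] * np.exp(-q[i] * s)
    return p

# One jump: lam_0 * pi_01 * exp(-q_0 * s1) = 1 * 0.5 * exp(-2).
assert np.isclose(prob_B((0, 1), (1.0,)), 0.5 * np.exp(-2.0))
```

Since the sets B form a π-system generating 𝒢, Theorem 6.6.1 says these finitely many product formulas are the whole specification of the law of the chain.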
Finally, we consider the strong Markov property, Theorem 6.5.4. Assume that (Xₜ)ₜ≥₀ is Markov(λ, Q) and that T is a stopping time of (Xₜ)ₜ≥₀. On the set Ω̃ = {T < ζ} define X̃ₜ = X_{T+t} and let ℱ̃ = σ(X̃ₜ : t ≥ 0); write (Ỹₙ)ₙ≥₀ and (S̃ₙ)ₙ≥₁ for the jump chain and holding times of (X̃ₜ)ₜ≥₀ and set

𝒢̃ = σ((Ỹₙ)ₙ≥₀, (S̃ₙ)ₙ≥₁).

Thus ℱ̃ and 𝒢̃ are σ-algebras on Ω̃, and coincide by the same argument as for ℱ = 𝒢. Set

B̃ = {Ỹ₀ = i₀, ..., Ỹₙ = iₙ, S̃₁ > s₁, ..., S̃ₙ > sₙ}.

Then the conclusion of the strong Markov property states that

ℙ(B̃ | T < ζ, X_T = i) = ℙᵢ(B)

with B as above, and that

ℙ(C̃ ∩ A | T < ζ, X_T = i) = ℙ(C̃ | T < ζ, X_T = i) ℙ(A | T < ζ, X_T = i)

for all C̃ ∈ ℱ̃ and A ∈ ℱ_T. By Theorem 6.6.3 it suffices to prove the independence assertion for the case C̃ = B̃, which is what we did in the proof of Theorem 6.5.4.
