J. M. Akinpelu
- The parameter t is often interpreted as time.
- X(t) is called the state of the process at time t.
- The set T is called the index set of the process.
  - If T is countable, the stochastic process is said to be a discrete-time process.
  - If T is an interval of the real line, the stochastic process is said to be a continuous-time process.
- The state space is the set of all possible values that the random variables X(t) can assume.
Examples:
1. Winnings of a gambler at successive games of blackjack; T = {1, 2, …, 10}, state space = set of integers.
2. Temperatures in Laurel, MD on September 17, 2008; T = [12 a.m. Wed., 12 a.m. Thurs.), state space = set of real numbers.

A stochastic process {X_n, n = 0, 1, 2, …} is called a Markov chain provided that

P{X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, …, X_1 = i_1, X_0 = i_0}
    = P{X_{n+1} = j | X_n = i}

for all states i_0, i_1, …, i_{n−1}, i, j and all n ≥ 0.

We restrict attention to Markov chains for which the transition probabilities

P_ij = P{X_{n+1} = j | X_n = i}

are independent of n, and for which the state space E ⊆ {0, 1, 2, …} (which is equivalent to saying that E is finite or countable). Such a Markov chain is called time-homogeneous (stationary).

Since
- probabilities are nonnegative, and
- the process must make a transition into some state at each time in T,

we have

P_ij ≥ 0 for all i, j;    Σ_j P_ij = 1 for all i.
 
Let P be a square matrix with entries P_ij defined for all i, j. Then P is a Markov matrix (stochastic matrix) provided that
- for each i, j ∈ E, P_ij ≥ 0;
- for each i ∈ E, Σ_j P_ij = 1.
By the Markov property,

P{X_2 = k, X_1 = j | X_0 = i}
    = P{X_2 = k | X_1 = j, X_0 = i} P{X_1 = j | X_0 = i}
    = P{X_2 = k | X_1 = j} P{X_1 = j | X_0 = i}.

More generally,

P{X_1 = i_1, …, X_n = i_n | X_0 = i_0} = P_{i_0 i_1} P_{i_1 i_2} ⋯ P_{i_{n−1} i_n}.
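This product formula is easy to check numerically. A minimal sketch, assuming NumPy; the matrix is the three-state example used later in these slides:

```python
import numpy as np

# Transition matrix of the three-state example used later in these slides
P = np.array([[1/2, 1/4, 1/4],
              [1/2, 0.0, 1/2],
              [1/4, 1/4, 1/2]])

def path_probability(P, path):
    """P{X_1 = i_1, ..., X_n = i_n | X_0 = i_0} as a product of one-step
    transition probabilities (states are 0-indexed here)."""
    return np.prod([P[a, b] for a, b in zip(path, path[1:])])

# Probability of the path 0 -> 1 -> 2 -> 0, given X_0 = 0:
prob = path_probability(P, [0, 1, 2, 0])   # (1/4)(1/2)(1/4) = 1/32
```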
Example:

P  = [ 1/2  1/4  1/4
       1/2   0   1/2
       1/4  1/4  1/2 ]

P² = [ 7/16  3/16  3/8
       3/8   1/4   3/8
       3/8   3/16  7/16 ]

Hence, P{X_2 = 2 | X_0 = 3} = P²(3, 2) = 3/16.
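The two-step probabilities can be reproduced in a few lines; a sketch assuming NumPy (states 1, 2, 3 become indices 0, 1, 2):

```python
import numpy as np

P = np.array([[1/2, 1/4, 1/4],
              [1/2, 0.0, 1/2],
              [1/4, 1/4, 1/2]])

P2 = np.linalg.matrix_power(P, 2)   # two-step transition matrix P^2

# P{X_2 = 2 | X_0 = 3} is the (3, 2) entry, i.e. P2[2, 1] with 0-indexing
p_32 = P2[2, 1]                     # 3/16
```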
The n-step transition probabilities are

P_ij^(n) = P{X_n = j | X_0 = i} = (Pⁿ)_ij.

They satisfy the Chapman–Kolmogorov equations:

P_ij^(n+m) = Σ_k P_ik^(n) P_kj^(m)   for all n, m ≥ 0 and all i, j ∈ E.
Let π_j(0) = P{X_0 = j} denote the initial distribution. Then the distribution of the state at time n is

π(n) = π(0) Pⁿ.
A state j is recurrent if

P{X_n = j for some n ≥ 1 | X_0 = j} = 1.

- A recurrent state is called recurrent positive if the expected time to return to it is finite.
- Otherwise, it is called recurrent null.
- A state j has period d if P_jj^(n) = 0 whenever n is not divisible by d, and d is the largest integer with this property. A state with period 1 is said to be aperiodic.
- Recurrent positive, aperiodic states are called ergodic. A Markov chain is called ergodic if all of its states are ergodic.
A state j is transient if

P{X_n = j for some n ≥ 1 | X_0 = j} < 1,

i.e., there is a positive probability of never returning to that state.
[Diagram: classification of states — recurrent (positive or null) vs. transient; absorbing vs. nonabsorbing.]
State j is
- absorbing if and only if P_jj = 1;
- recurrent if and only if Σ_{n=1}^∞ P_jj^(n) = ∞;
- transient if and only if Σ_{n=1}^∞ P_jj^(n) < ∞.
The transition matrix of a Markov chain with n states:

P = [ P_11  P_12  …  P_1n
      P_21  P_22  …  P_2n
        ⋮                ⋮
      P_n1  P_n2  …  P_nn ]
Example: a Markov chain on the state space {1, 2, 3, 4, 5}. Two rows of its transition matrix P are legible:

( 1/4  1/2   0   1/4   0 )
( 1/3   0   1/3   0   1/3 )

[The remaining rows of P and the state-transition diagram are illegible in the source.]
Note that:
- {1, 3, 5} and {1, 2, 3, 4, 5} are closed.
- {1, 3, 5} is irreducible.
Restricting P to the rows and columns for the states in {1, 3, 5} yields a matrix [entries illegible in the source] that is a Markov matrix corresponding to the state space {1, 3, 5}. Since the set is finite, all states in the set are recurrent positive.
Theorem: For an irreducible, aperiodic Markov chain, the limits

π_j = lim_{n→∞} P_ij^(n) = lim_{n→∞} π_j(n)

exist and are independent of the initial state. Moreover, either:
a. all states are transient or all states are recurrent null, in which case π_j = 0 for all j, or
b. all states are recurrent positive (i.e., ergodic), in which case π_j > 0 for all j and the π_j are uniquely determined by the equations

π_j = Σ_i π_i P_ij,    Σ_j π_j = 1.
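For a finite chain these balance equations are a linear system; a minimal sketch assuming NumPy, using the three-state example matrix from these slides (one balance equation is redundant, so the normalization Σ_j π_j = 1 is appended):

```python
import numpy as np

P = np.array([[1/2, 1/4, 1/4],
              [1/2, 0.0, 1/2],
              [1/4, 1/4, 1/2]])
n = P.shape[0]

# pi = pi P  <=>  (P^T - I) pi^T = 0; append the constraint sum(pi) = 1
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)   # least squares solves the
                                             # (consistent) overdetermined system
```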
It follows that for recurrent positive states, the limiting probabilities have two interpretations:
- the limiting distribution of the state at time n, as n → ∞;
- the long-run proportion of time that the process spends in each state.

In particular, for large n, P{process is in state j at time n} ≈ π_j.
Example:

P = [ 1/2  1/4  1/4
      1/2   0   1/2
      1/4  1/4  1/2 ]
P³ = [ 13/32   13/64   25/64
       13/32    3/16   13/32
       25/64   13/64   13/32 ]

P⁵ = [ 205/512    205/1024   409/1024
       205/512     51/256    205/512
       409/1024   205/1024   205/512 ]

The rows are converging to the limiting distribution π = (2/5, 1/5, 2/5).
Theorem: In a finite-state, irreducible, aperiodic Markov chain, the rows of Pⁿ converge to π.
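This convergence is easy to see numerically; a sketch assuming NumPy, with the same example matrix:

```python
import numpy as np

P = np.array([[1/2, 1/4, 1/4],
              [1/2, 0.0, 1/2],
              [1/4, 1/4, 1/2]])

Pn = np.linalg.matrix_power(P, 50)   # a high power of P
# Every row of P^50 is (to machine precision) the limiting
# distribution (0.4, 0.2, 0.4).
```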
For transient states, let P_T denote the restriction of P to the transient states. The expected number of time periods that the process spends in transient state j, given that it starts in transient state i, is the (i, j) entry of

S = (I − P_T)^{-1},

where I is the identity matrix.
Example:

P = [ 1/2  1/2   0    0    0
      [row illegible in the source]
      1/3  1/3  1/3   0    0
       0    0    0   1/4  3/4
      1/4   0    0   1/2  1/4 ]

The transient states are 4 and 5. Restricting P to them gives

P_T = [ 1/4  3/4        I − P_T = [  3/4  −3/4
        1/2  1/4 ]                  −1/2   3/4 ]

S = (I − P_T)^{-1} = [  4   4
                       8/3   4 ]

Thus, for example, starting in state 4 the process spends an expected 4 periods in state 4 and 4 periods in state 5.
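The 2×2 inversion is small enough to check directly; a sketch assuming NumPy, with P_T the restriction of P to the transient states 4 and 5:

```python
import numpy as np

# Restriction of P to the transient states {4, 5}
PT = np.array([[1/4, 3/4],
               [1/2, 1/4]])

# Expected numbers of visits to the transient states: S = (I - P_T)^{-1}
S = np.linalg.inv(np.eye(2) - PT)
```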
All states can be reached from each other, so the Markov chain is irreducible. Consequently, either all states are transient, all are recurrent positive, or all are recurrent null. To check for a limiting distribution, we solve

π = π P.
Any solution is of the form

π_n = (1/q) (p/q)^{n−1} π_0,   n = 1, 2, …

If p < q, then p/q < 1 and

Σ_{n=0}^∞ π_n = π_0 [1 + (1/q) Σ_{n=1}^∞ (p/q)^{n−1}] = π_0 · 2(1 − p)/(1 − 2p),

and setting this sum equal to 1 yields

π_n = (1 − 2p) / (2(1 − p)),                                     if n = 0
π_n = [(1 − 2p) / (2(1 − p))] · (1/(1 − p)) (p/(1 − p))^{n−1},   if n ≥ 1.
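A numerical sanity check of these formulas, assuming the reflecting random walk they appear to describe (P_01 = 1, and P_{i,i+1} = p, P_{i,i−1} = q = 1 − p for i ≥ 1); the truncation level N is only an artifact of the check:

```python
import numpy as np

p = 0.3
q = 1 - p            # p < q, so the chain is recurrent positive
N = 200              # truncation level; the tail mass beyond N is negligible

# Claimed limiting distribution
pi = np.empty(N)
pi[0] = (1 - 2*p) / (2 * (1 - p))
pi[1:] = pi[0] * (1/q) * (p/q) ** np.arange(N - 1)   # (p/q)^(n-1), n >= 1

# Transition matrix of the (truncated) reflecting random walk
P = np.zeros((N, N))
P[0, 1] = 1.0
for i in range(1, N - 1):
    P[i, i - 1] = q
    P[i, i + 1] = p
P[N - 1, N - 2] = q   # last row is truncated; ignore it in the balance check

balance_ok = np.allclose((pi @ P)[:N - 2], pi[:N - 2])   # pi = pi P away from N
total = pi.sum()                                         # should be ~ 1
```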
If p ≥ q, then we can use the following theorem to determine whether the states are transient or recurrent null.

Theorem: Let X be an irreducible Markov chain with transition matrix P, and let Q be the matrix obtained from P by deleting the row and the column for some fixed state. Then all states are recurrent if and only if the only solution of

y_j = Σ_k Q_jk y_k,   0 ≤ y_j ≤ 1,

is y_j = 0 for all j.
We eliminate state 0 and solve for y, obtaining:

y_j = 0 for all j,               if p = q
y_j = 1 − (q/p)^j for all j,     if p > q

Hence all states are recurrent null if p = q, and transient if p > q.
Example: Let X be a Markov chain with state space E = {0, 1, 2, …, N} and transition matrix P.

[The transition matrix and the limiting-distribution formulas for this finite-state example are illegible in the source; the surviving fragments involve (1 − p/q) and powers of p/q, with a separate case for p = q.]
Slotted Aloha is a network random access method.
- Used in packet radio and satellite communications, in which satellites communicate by radio.
- Time is divided into "slots" of one packet duration.
- When a node has a packet to send, it waits until the start of the next slot to send it.
- If no other node attempts transmission during that time slot, the transmission is successful.
- If a collision occurs, each collided packet is retransmitted with probability p in each successive slot until it is successfully transmitted.

[Figure: a sequence of numbered time slots.]
Slotted Aloha can be represented as a Markov chain. Assume that i packets arrive in a time slot with probability a_i, and at most one packet arrives at each node in each time slot. If X_n is the number of backlogged packets at the beginning of slot n, then for n ≥ 1:

P_{n,n−1} = a_0 · n p (1 − p)^{n−1}
P_{n,n}   = a_0 [1 − n p (1 − p)^{n−1}] + a_1 (1 − p)^n
P_{n,n+1} = a_1 [1 − (1 − p)^n]
P_{n,n+i} = a_i,   i ≥ 2
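These transition probabilities should form a proper distribution over the next states. A sketch that checks this, assuming the model above; the arrival distribution used here is a made-up example:

```python
def aloha_row(n, a, p):
    """One row of the slotted-Aloha backlog chain, for backlog n >= 1.
    a[i] = probability that i new packets arrive in a slot;
    p    = retransmission probability of each backlogged packet.
    Returns {next_state: probability}."""
    row = {n - 1: a[0] * n * p * (1 - p) ** (n - 1),              # lone success
           n:     a[0] * (1 - n * p * (1 - p) ** (n - 1))
                  + a[1] * (1 - p) ** n,                          # no net change
           n + 1: a[1] * (1 - (1 - p) ** n)}                      # arrival collides
    for i in range(2, len(a)):    # i >= 2 arrivals: backlog grows by i
        row[n + i] = a[i]
    return row

a = [0.5, 0.3, 0.15, 0.05]        # example arrival distribution (sums to 1)
row = aloha_row(4, a, p=0.2)
total = sum(row.values())         # each row of a Markov matrix sums to 1
```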
Using Markov chain analysis, one can show that, if a_0 + a_1 < 1, then slotted Aloha is unstable, i.e., all states are transient, and the number of packets that have not been successfully transmitted grows without bound (see Ross, pg. 21).
Assign discussion problems to two students.
- Ross, Chapter 4: 5, 14, 29, 45
- Prove that for a random walk, all of the states are recurrent null if p = 1/2 and transient if p > 1/2.
Let X be a Markov chain with state space {1, 2}, initial distribution π(0) = (1/3, 2/3), and transition matrix

P = [ 0.5  0.5
      0.3  0.7 ]

1. Calculate
   a. P{X_1 = 1}
   b. P{X_3 = 2 | X_1 = 1}
   c. the limiting distribution π
2. Classify the states in this Markov chain.
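The calculations in part 1 can be checked numerically; a sketch assuming NumPy (states 1 and 2 become indices 0 and 1):

```python
import numpy as np

pi0 = np.array([1/3, 2/3])                 # initial distribution pi(0)
P = np.array([[0.5, 0.5],
              [0.3, 0.7]])

p_x1_is_1 = (pi0 @ P)[0]                   # P{X_1 = 1} = (pi(0) P)_1
p_x3_2_given_x1_1 = np.linalg.matrix_power(P, 2)[0, 1]   # P{X_3 = 2 | X_1 = 1}
pi = np.linalg.matrix_power(P, 100)[0]     # limiting distribution (both rows agree)
```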