MS&E 121: Introduction to Stochastic Modeling
January 29, 2004

Discrete Time Markov Chains (DTMCs)

Discrete-Time Stochastic Processes
A stochastic process {X_t, t ∈ T} is a collection of random variables. X_t is the state of the process at time t. T is the index set of the process; most often it represents time. The state space of the stochastic process is the set of possible values X_t can assume.

If T is countable, {X_t, t ∈ T} is a discrete-time stochastic process.

Example: X_t is the amount of inventory on hand at the end of the day, for day t, t ∈ T = {0, 1, 2, …}.

We will denote a discrete-time stochastic process by X_0, X_1, X_2, …. Denote its state space by E = {0, …, S}. If X_n = i, we say the process is in state i at time n. (We will use the notation X_n and X_t interchangeably.)
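The inventory example above can be sketched as a short simulation. This is a minimal illustration, not part of the lecture notes: the shelf capacity of 5 units, the restock-every-morning policy, and the uniform demand distribution are all illustrative assumptions.

```python
import random

# Illustrative sketch of a discrete-time stochastic process: end-of-day
# inventory. Capacity, restocking rule, and demand law are assumed here.
random.seed(0)

S = 5           # assumed shelf capacity: restock to 5 units each morning
history = []    # history[t] is X_t, inventory at the end of day t

for t in range(10):
    demand = random.randint(0, 3)  # assumed daily demand: uniform on {0,1,2,3}
    X = max(S - demand, 0)         # restock to S, then subtract the day's demand
    history.append(X)

print(history)  # one realization of X_0, X_1, ..., X_9; here each X_t is in {2,...,5}
```

Each run of the loop produces one realization of the process; the state space here is a subset of {0, …, S}.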
 
Overview of DTMC Lectures
- Definition of a Discrete Time Markov Chain
- The Markov Property
- Formulation of a Discrete Time Markov Chain
- Transition probabilities: matrix representation & transition diagrams
- Choosing the state space for a Markov Chain
- Examples: taxi queuing system, (s,S) inventory system
- Example: weather forecasting
- Example: gambler's ruin

Overview of DTMC Lectures, continued
- Characterizing the behavior of a Markov Chain over time
- N-step transition probabilities: definition and computation
- Unconditional and initial probability distributions
- Classification of states of a Markov Chain
- Steady state probabilities: proof of existence and computation
- Average costs and rewards in the long run
- Example: brand switching/market share
- Example: taxi queue
 
 A simple Markov chain
EXAMPLE 1: Gambler’s Ruin
Suppose I go to Sands Casino with $2 in my pocket. I'm going to play a series of blackjack games in which I bet $1 each game. With probability p I win the game, and I lose with probability 1-p. I quit playing as soon as I have $4 or I run out of money.

After game t I have a certain amount of cash -- call it X_t. My cash position after the next game is a random variable X_{t+1}.

The key property of this example is that my cash position after game t+1 depends only on my cash position after game t (and how I did in game t+1). It does not depend on the history of outcomes before game t. This is the essence of the Markov Property.
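The Gambler's Ruin chain described above is easy to simulate directly. A minimal sketch (the function name and the fixed seed are choices made here for reproducibility, not part of the notes):

```python
import random

# Simulate the Gambler's Ruin chain: start with $2, bet $1 per game,
# win each game with probability p, stop as soon as cash hits $0 or $4.
def gamblers_ruin(p, start=2, target=4, seed=42):
    rng = random.Random(seed)   # seeded so the sample path is reproducible
    x = start
    path = [x]                  # path[t] records X_t, cash after game t
    while 0 < x < target:
        x += 1 if rng.random() < p else -1
        path.append(x)
    return path

path = gamblers_ruin(p=0.5)
print(path)  # one sample path X_0, X_1, ..., ending at 0 or 4
```

Each call produces one sample path of the chain; running it many times with different seeds would estimate, say, the probability of reaching $4 before ruin.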
 
Discrete-Time Markov Chains
MARKOV PROPERTY

A discrete-time stochastic process X_0, X_1, X_2, … is a Markov Chain if, for all t and for every sequence of states i_0, i_1, …, i_{t-1}, i, j,

P(X_{t+1} = j | X_t = i, X_{t-1} = i_{t-1}, …, X_0 = i_0) = P(X_{t+1} = j | X_t = i)

In words: the conditional distribution of the future state, given the past states and the present state, is independent of the past states and depends only on the present state.

STATIONARITY ASSUMPTION (stationary transition probabilities)

We also assume the one-step transition probabilities do not depend on t, so we may write

p_ij = P(X_{t+1} = j | X_t = i) = P(X_1 = j | X_0 = i)
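Under the stationarity assumption, the numbers p_ij can be collected into a transition matrix, one row per current state. As an illustration (using the Gambler's Ruin chain from Example 1, with states E = {0, 1, 2, 3, 4}; the function below is a sketch written for these notes, where 0 and 4 are absorbing):

```python
# Build the transition matrix P for the Gambler's Ruin chain:
# from state i in {1,2,3}, move to i+1 with probability p (win $1)
# and to i-1 with probability 1-p (lose $1); 0 and target are absorbing.
def transition_matrix(p, target=4):
    n = target + 1
    P = [[0.0] * n for _ in range(n)]
    P[0][0] = 1.0            # ruined: stay at $0 forever
    P[target][target] = 1.0  # reached the goal: stay at $target
    for i in range(1, target):
        P[i][i + 1] = p      # win the game with probability p
        P[i][i - 1] = 1 - p  # lose the game with probability 1 - p
    return P

P = transition_matrix(p=0.5)
for row in P:
    print(row)
# Row i lists P(X_{t+1} = j | X_t = i) for each j, so every row sums to 1.
```

Because the p_ij do not depend on t, this single matrix fully describes the one-step behavior of the chain from any time onward.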
