
Markov chain and transition probabilities (class 5)

• A random process {X(t)} is called a Markov process if
  P[X(t_{n+1}) = x_{n+1} | X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0]
    = P[X(t_{n+1}) = x_{n+1} | X(t_n) = x_n]
for all t_0 < t_1 < ... < t_n < t_{n+1}. In other words, if the future behaviour
of a process depends only on the present and not on
the past, then the process is called a Markov
process. The values x_0, x_1, ..., x_{n+1} are called the states of the Markov
process. Random processes with the Markov property
which take discrete values, whether t is discrete or
continuous, are called Markov chains. The Poisson
process discussed earlier is a continuous-time
Markov chain. In this section we will discuss
discrete-time Markov chains. The outcomes are
called the states of the Markov chain. If X_n has the
outcome j, that is X_n = j, the process is said to be at
state j at the nth trial.
• If for all n,
  P[X_n = a_n | X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, ..., X_0 = a_0]
    = P[X_n = a_n | X_{n-1} = a_{n-1}],
then the process {X_n} is called a Markov chain. The conditional
probability P[X_n = a_j | X_{n-1} = a_i] is called the one-step transition probability from state
a_i to state a_j at the nth step, and is denoted by p_ij(n-1, n). If the one-step
transition probability does not depend on the step, that is
p_ij(n-1, n) = p_ij(m-1, m), then the
Markov chain is called a homogeneous Markov chain, or the chain
is said to have stationary transition probabilities.
• When the Markov chain is homogeneous, the one-step transition
probability is denoted by p_ij. The matrix P = (p_ij) is called the one-step
transition probability matrix, or tpm for short.
• Remark 1: The tpm of a Markov chain is a stochastic matrix, since
p_ij >= 0 and sum_j p_ij = 1 for all i. That is, the sum of all the elements of any row of the
tpm is 1. The conditional probability that the process is in state a_j at
step n, given that it was in state a_i at step 0, that is P[X_n = a_j | X_0 = a_i], is called the n-step
transition probability and is denoted by p_ij^(n).
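The stochastic-matrix property above can be checked directly in code. The two-state tpm below is a made-up example, not one from these notes:

```python
# A hypothetical two-state chain (states 0 and 1) with an assumed tpm,
# used only to verify the stochastic-matrix property: every entry is
# non-negative and every row sums to 1.
P = [
    [0.8, 0.2],  # transition probabilities out of state 0
    [0.4, 0.6],  # transition probabilities out of state 1
]

for row in P:
    assert all(p >= 0 for p in row), "entries must be non-negative"
    assert abs(sum(row) - 1.0) < 1e-12, "each row must sum to 1"

print("P is a valid stochastic matrix")
```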
• Remark 2: If the probability that the process is in state a_i is p_i
at an arbitrary step n, then the row vector p^(n) = (p_1, p_2, ..., p_k) is called the
probability distribution of the process at that time. In
particular, p^(0) is the initial probability distribution.
• Remark 3: The transition probability matrix together with
the initial probability distribution completely specifies a
Markov chain {X_n}.
• Chapman–Kolmogorov theorem
• If P is the tpm of a homogeneous Markov chain, then the n-step
tpm P^(n) is equal to P^n. That is, [p_ij^(n)] = [p_ij]^n.
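A minimal sketch of the theorem in code: the n-step tpm is obtained by raising P to the nth power. The 2×2 tpm and the choice n = 3 are assumptions made for illustration:

```python
# Chapman–Kolmogorov in practice: the n-step tpm is the nth matrix
# power of the one-step tpm P. Pure-Python matrix helpers, no libraries.

def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """Return P**n by repeated multiplication (n >= 1)."""
    result = P
    for _ in range(n - 1):
        result = mat_mul(result, P)
    return result

P = [[0.8, 0.2],   # a made-up homogeneous 2-state tpm
     [0.4, 0.6]]

P3 = mat_pow(P, 3)   # three-step transition probabilities
print(P3[0][1])      # p_01^(3): state 0 -> state 1 in exactly 3 steps
```

Note that each row of P^n is again a probability distribution, so P^n is itself a stochastic matrix.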
• Definition: A stochastic matrix P is said to be a regular matrix
if all the entries of P^m (for some positive integer m) are positive.
A homogeneous Markov chain is said to be regular if its tpm is
regular.
• We state below two theorems without proof.
• 1. If p^(n) is the state probability distribution of the process at an
arbitrary step n, then the distribution after one step is p^(n+1) = p^(n) P. If p^(0) is the initial state
probability distribution, then after one step the probability
distribution is p^(1) = p^(0) P, and p^(2) = p^(1) P = p^(0) P^2. Continuing in this way, we get p^(n) = p^(0) P^n.
• 2. If a homogeneous Markov chain is regular, then every sequence
of state probability distributions approaches a unique fixed
probability distribution called the stationary (steady-state)
distribution of the Markov chain.
• That is, lim_{n -> infinity} p^(n) = π, where p^(n), the state probability distribution at step n, and
π, the steady-state distribution, are row vectors.
• 3. Moreover, if P is the tpm of the regular chain, then πP = π. Using this
property, together with the condition that the components of π sum to 1, π can be found.
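The convergence and the fixed-point property πP = π can both be illustrated by iterating p^(n+1) = p^(n) P. The 2-state tpm and the initial distribution below are assumptions chosen for the sketch:

```python
# Approximating the steady-state distribution of a regular chain by
# iterating p(n+1) = p(n) P. Tpm and starting point are made up.

def step(p, P):
    """One step of the chain: row vector p times matrix P."""
    n = len(P)
    return [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.8, 0.2],
     [0.4, 0.6]]

p = [1.0, 0.0]          # initial distribution p(0): start in state 0
for _ in range(100):    # iterate until p has effectively converged
    p = step(p, P)

# Verify the fixed-point property pi P = pi.
pi_next = step(p, P)
assert all(abs(a - b) < 1e-10 for a, b in zip(p, pi_next))
print(p)
```

For this particular P, solving πP = π with π_0 + π_1 = 1 by hand gives π = (2/3, 1/3), which is what the iteration converges to regardless of the starting distribution.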
• Remark 1
• Suppose X_n takes the values 1, 2, ..., k. Then
  p_11 + p_12 + ... + p_1k = sum_{j=1}^{k} P[X_n = j | X_{n-1} = 1]
    = P[X_n = 1 or 2 or ... or k | X_{n-1} = 1] = 1.
• The first row of the tpm is
• [p_11  p_12  ...  p_1k]
• Remark 2
• Suppose X_0, X_1, X_2, ... take the values 1, 2, 3.
• Then if the chain is homogeneous, the one-step tpm is
  P = [ p_11  p_12  p_13 ]
      [ p_21  p_22  p_23 ]
      [ p_31  p_32  p_33 ]
• Remark 3
• The two-step transition probabilities are given by P^(2) = P · P = P^2.
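A two-step probability can also be computed directly as a sum over the intermediate state, p_ij^(2) = sum_k p_ik p_kj, which is exactly the (i, j) entry of P·P. The 3-state tpm below is a made-up example matching the setting of Remark 2:

```python
# Two-step transition probability via the intermediate-state sum:
# p_ij^(2) = sum over k of p_ik * p_kj. The 3-state tpm is assumed.

P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

i, j = 0, 2   # probability of going from state 1 to state 3 in 2 steps
p2_ij = sum(P[i][k] * P[k][j] for k in range(3))
print(p2_ij)
```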
