Applications
Example – Flight cancellation concerns: Daily Flight Cancellations Data
State
0 no cancellations
1 one cancellation
2 two cancellations
3 more than 2 cancellations
• Questions:
  • If there are no cancellations initially, what is the probability that there will be at least one cancellation after 2 days?
  • Calculate the steady-state expected loss due to cancellation of flights
    • Through simultaneous equations
    • Through simulation
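A minimal NumPy sketch of all three calculations. The slides do not give the flight data, so the transition matrix and the per-state losses below are hypothetical placeholders:

```python
import numpy as np

# Hypothetical daily transition matrix over states 0..3
# (0, 1, 2, >2 cancellations) -- not from the slides.
P = np.array([
    [0.70, 0.20, 0.07, 0.03],
    [0.50, 0.30, 0.15, 0.05],
    [0.40, 0.30, 0.20, 0.10],
    [0.30, 0.30, 0.20, 0.20],
])

# Probability of at least one cancellation after 2 days, starting in
# state 0: one minus the (0, 0) entry of P^2.
P2 = np.linalg.matrix_power(P, 2)
p_at_least_one = 1 - P2[0, 0]          # 1 - 0.627 = 0.373 for this matrix

# Steady state via simultaneous equations: pi P = pi with sum(pi) = 1.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Steady state via simulation: long-run fraction of time in each state.
rng = np.random.default_rng(0)
state, counts = 0, np.zeros(n)
for _ in range(200_000):
    state = rng.choice(n, p=P[state])
    counts[state] += 1
pi_sim = counts / counts.sum()

# Steady-state expected daily loss, with a hypothetical loss per state.
loss = np.array([0.0, 10.0, 20.0, 35.0])
expected_loss = pi @ loss
```

The simulation estimate `pi_sim` should agree with the equation-based `pi` to within sampling error, which is the point of asking for both methods.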
Retention Probability and Customer
Lifetime Value using Markov Chains
• Let {0, 1, 2, …, s} be the states of the Markov chain, in which {1, 2, …, s} represent different customer segments and state 0 represents the non-customer segment
• Retention Probability: the probability of retaining the customer

  $R_t = 1 - \dfrac{\pi_0 (1 - P_{00})}{1 - \pi_0}$

  where $R_t$ is the steady-state retention probability
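As a worked instance of the formula, with illustrative values for $\pi_0$ and $P_{00}$ (assumptions, not values from the slides):

```python
# Steady-state retention probability R_t = 1 - pi0 * (1 - P00) / (1 - pi0).
# Illustrative inputs -- these are assumptions, not values from the slides.
pi0 = 0.2   # steady-state probability of the non-customer state 0
P00 = 0.9   # probability that a non-customer remains a non-customer
R_t = 1 - pi0 * (1 - P00) / (1 - pi0)   # = 1 - 0.02/0.8 = 0.975
```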
Retention Probability and Customer
Lifetime Value using Markov Chains
• Customer Lifetime Value (CLV): net present value of the future margin generated from a customer or customer segment
• Customer Lifetime Value for N periods:

  $CLV = \sum_{t=0}^{N} \dfrac{\mathbf{P_I}\, \mathbf{P}^t\, \mathbf{R}}{(1 + i)^t}$

  • $\mathbf{P_I}$: initial state probability vector
  • $\mathbf{P}$: transition probability matrix
  • $\mathbf{R}$: reward vector
  • $i$: interest rate
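A short NumPy sketch of the N-period CLV sum. The 3-state chain, initial vector, reward vector, and interest rate below are all illustrative assumptions, not numbers from the slides:

```python
import numpy as np

# Illustrative chain: state 0 = non-customer, states 1-2 = customer segments.
P = np.array([
    [0.80, 0.15, 0.05],
    [0.30, 0.50, 0.20],
    [0.20, 0.30, 0.50],
])
P_I = np.array([0.0, 0.7, 0.3])     # initial state probability vector
R = np.array([0.0, 100.0, 250.0])   # margin (reward) per period in each state
i, N = 0.10, 20                     # interest rate and number of periods

# CLV = sum_{t=0}^{N} P_I P^t R / (1 + i)^t
clv = sum((P_I @ np.linalg.matrix_power(P, t)) @ R / (1 + i) ** t
          for t in range(N + 1))
```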
Example – Retention of customers in Data services

State
0       no customers
1 to 4  different customer segments
• For a recurrent state: $F_{ii} = \sum_{n=1}^{\infty} f_{ii}^{(n)} = 1$
• Mean recurrence time: $\mu_{ii} = \sum_{n=1}^{\infty} n \times f_{ii}^{(n)}$
• Positive recurrent state: if $\mu_{ii} < \infty$ (i.e. finite mean recurrence time)
• Null-recurrent state: if $\mu_{ii} = \infty$ (i.e. infinite mean recurrence time)
Periodic State
• A periodic state is a special case of a recurrent state
• Let $d(i)$ be the greatest common divisor of all n such that $P_{ii}^{(n)} > 0$
• Aperiodic state: $d(i) = 1$
• Periodic state: $d(i) \geq 2$

        1  2  3
    1 [ 0  1  0 ]
P = 2 [ 0  0  1 ]
    3 [ 1  0  0 ]

$P_{11}^{(2)} = 0$, but for n equal to multiples of 3, $P_{11}^{(n)} = 1 > 0$, so $d(1) = 3$ and state 1 is periodic.
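The periodicity of state 1 can be checked numerically. This sketch uses the cycle matrix from the slide:

```python
import numpy as np
from functools import reduce
from math import gcd

# The 3-state cycle 1 -> 2 -> 3 -> 1 from the slide.
P = np.array([
    [0, 1, 0],
    [0, 0, 1],
    [1, 0, 0],
], dtype=float)

# P_11^(n) is the (0, 0) entry of P^n: it is 1 when n is a multiple
# of 3 and 0 otherwise.
returns = [np.linalg.matrix_power(P, n)[0, 0] for n in range(1, 10)]

# d(1) = gcd of all n with P_11^(n) > 0, which is 3: state 1 is periodic.
period = reduce(gcd, [n for n, p in enumerate(returns, start=1) if p > 0])
```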
Ergodic Markov Chain
• A state i of a Markov chain is ergodic when it is
positive recurrent and aperiodic
• A Markov chain in which all states are ergodic is an ergodic Markov chain
• An ergodic Markov chain has a unique stationary distribution $\pi$ that satisfies $\pi = \pi \mathbf{P}$ and $\sum_{k=1}^{m} \pi_k = 1$
Limiting Probability
• The limiting probability is $\lim_{n \to \infty} P_{ij}^{(n)}$
• The limiting probability may depend on the initial state and need not be unique
• The stationary distribution is unique and does not depend on the initial state
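For an ergodic chain the two notions coincide: every row of $\mathbf{P}^n$ converges to the same stationary distribution. A quick numerical check with an illustrative 3-state ergodic chain (not from the slides):

```python
import numpy as np

# Illustrative ergodic chain: irreducible and aperiodic.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.4, 0.5],
])

# For large n every row of P^n is (numerically) identical, so the
# limiting probability is independent of the initial state.
Pn = np.linalg.matrix_power(P, 100)
pi = Pn[0]          # any row: the limiting = stationary distribution
```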
Markov Chains with Absorbing
States
• Absorbing state: $P_{ii} = 1$
• An absorbing-state Markov chain is a Markov chain in which there is at least one state k such that $P_{kk} = 1$
• Absorbing-state Markov chains are not ergodic, since the other states will be transient (i.e. $\sum_{n=1}^{\infty} P_{ii}^{(n)} < \infty$ for a transient state i)
• The transition matrix corresponding to an absorbing-state Markov chain is not a regular matrix and thus does not have a stationary distribution
  • i.e. $\mathbf{P_I} \mathbf{P}^n$ may not converge to a unique value and may depend on the initial distribution
• The long-run probability of finding the system in the transient states is zero
Canonical Form of the Transition
Matrix of an Absorbing State Markov
Chain
• I = identity matrix (transitions between absorbing states)
• 0 = matrix in which all elements are zero (i.e. from absorbing states to transient states)
• R = matrix whose elements represent the probability of absorption from a transient state into an absorbing state
• Q = matrix whose elements represent transitions between transient states

With the absorbing states (A) ordered before the transient states (T):

$\mathbf{P} = \begin{pmatrix} \mathbf{I} & \mathbf{0} \\ \mathbf{R} & \mathbf{Q} \end{pmatrix}, \qquad \mathbf{P}^n = \begin{pmatrix} \mathbf{I} & \mathbf{0} \\ \sum_{k=0}^{n-1} \mathbf{Q}^k \mathbf{R} & \mathbf{Q}^n \end{pmatrix}$
Fundamental Matrix
• For large n, $\sum_{k=0}^{n-1} \mathbf{Q}^k \mathbf{R}$ converges to $(\mathbf{I} - \mathbf{Q})^{-1} \mathbf{R}$, the matrix of absorption probabilities from each transient state into each absorbing state
• $\mathbf{M} = (\mathbf{I} - \mathbf{Q})^{-1} = \sum_{k=0}^{\infty} \mathbf{Q}^k$ is called the fundamental matrix
• The expected time $E_{ij}$ to reach absorbing state j from transient state i satisfies $E_{ij} = 1 + \sum_k P_{ik} E_{kj}$, with $E_{jj} = 0$
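A minimal sketch of the fundamental-matrix computation, using an illustrative chain with one absorbing state and two transient states (the Q and R blocks are assumptions, not from the slides):

```python
import numpy as np

# Canonical-form blocks for an illustrative absorbing chain:
# one absorbing state, two transient states.
Q = np.array([[0.5, 0.3],    # transient -> transient
              [0.2, 0.4]])
R = np.array([[0.2],         # transient -> absorbing
              [0.4]])

# Fundamental matrix M = (I - Q)^{-1} = sum_{k>=0} Q^k.
M = np.linalg.inv(np.eye(2) - Q)

# Absorption probabilities: with a single absorbing state, every row of
# M @ R must equal 1.
B = M @ R

# Row sums of M: expected number of steps spent in transient states
# before absorption, starting from each transient state.
steps = M.sum(axis=1)
```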
Example – how long does it take for
NPA problem to become worse
State 1  NPA is less than 1%
State 2  NPA is between 1% and 2%
State 3  NPA is between 2% and 3%
State 4  NPA is between 3% and 4%
State 5  NPA is between 4% and 5%
State 6  NPA is between 5% and 6%
State 7  NPA is greater than 6%
Transition probability matrix (based on monthly data)
States State1 State2 State3 State4 State5 State6 State7
1 0.95 0.05 0 0 0 0 0
2 0.1 0.85 0.05 0 0 0 0
3 0 0.1 0.8 0.1 0 0 0
4 0 0 0.15 0.7 0.15 0 0
5 0 0 0 0.15 0.65 0.2 0
6 0 0 0 0 0.2 0.6 0.2
7 0 0 0 0 0 0.1 0.9
Question
• Calculate the expected duration (in months) for the process to reach state 7 from state 4

  $E_{i7} = 1 + \sum_k P_{ik} E_{k7} \quad \forall i \neq 7, \qquad E_{77} = 0$
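The slide's transition matrix is fully specified, so the recursion can be solved directly. Treating state 7 as the target, the system reduces to a linear solve over states 1-6:

```python
import numpy as np

# Monthly NPA transition matrix from the slide (states 1..7).
P = np.array([
    [0.95, 0.05, 0.00, 0.00, 0.00, 0.00, 0.00],
    [0.10, 0.85, 0.05, 0.00, 0.00, 0.00, 0.00],
    [0.00, 0.10, 0.80, 0.10, 0.00, 0.00, 0.00],
    [0.00, 0.00, 0.15, 0.70, 0.15, 0.00, 0.00],
    [0.00, 0.00, 0.00, 0.15, 0.65, 0.20, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.20, 0.60, 0.20],
    [0.00, 0.00, 0.00, 0.00, 0.00, 0.10, 0.90],
])

# E_i7 = 1 + sum_k P_ik E_k7 with E_77 = 0 is equivalent to
# (I - Q) E = 1, where Q restricts P to states 1..6.
Q = P[:6, :6]
E = np.linalg.solve(np.eye(6) - Q, np.ones(6))
E_47 = E[3]     # expected months from state 4 to state 7: ~206.7
```

Because the states are ordered by worsening NPA, the expected time to reach state 7 decreases monotonically from state 1 to state 6.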