Markov Processes

Prabhat Mittal
profmittal@yahoo.co.in
Introduction

A Markov process (chain) is a stochastic (probabilistic) process which has the property that the probability of a transition from a given state to any future state depends only on the present state and not on the manner in which it was reached.

Management Applications

• It is widely used in examining and predicting the behavior of consumers in terms of their brand loyalty and their switching patterns to other brands.
• It is also useful in the study of stock market price movements and in analyzing accounts receivable that will ultimately become bad debts.
• Markov processes have also been employed in the study of equipment maintenance and failure problems.

Markov processes have thus become a versatile tool for solving management problems, especially in the area of marketing.

Basic Concepts of a Markov Process

A Markov process is a sequence of n experiments in which each experiment has m possible outcomes a1, a2, …, am. Each individual outcome is called a state, and the probability that a particular outcome occurs depends only on the outcome of the preceding experiment.

The probability of moving from one state to another, or remaining in the same state, in a single time period is called a transition probability. Because the probability of moving to a given state depends on what happened in the preceding state, the transition probability is a conditional probability.

Transition Matrix

The transition probabilities between the states a1, a2, …, am can be arranged in a matrix P:

             a1    a2   ...   am
    a1  [  p11   p12   ...  p1m  ]
    a2  [  p21   p22   ...  p2m  ]
    ...
    am  [  pm1   pm2   ...  pmm  ]

P is a square matrix whose elements are non-negative probabilities and whose rows each sum to unity: Σj pij = 1 for i = 1, 2, …, m, with 0 ≤ pij ≤ 1. Such a matrix is called a one-step transition probability matrix.
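The row-sum property above can be checked mechanically. Below is a minimal Python sketch (illustrative only, not part of the original slides) that validates whether a matrix is a one-step transition matrix:

```python
# Validate a (row-)stochastic transition matrix: every entry must lie
# in [0, 1] and every row must sum to 1 (within floating-point tolerance).
def is_transition_matrix(P, tol=1e-9):
    return all(
        all(0.0 <= p <= 1.0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

P = [[0.8, 0.2],
     [0.6, 0.4]]
print(is_transition_matrix(P))             # True
print(is_transition_matrix([[0.5, 0.6]]))  # row sums to 1.1 -> False
```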

Transition Matrix (ctd.)

In general, any matrix whose elements are non-negative and whose rows (or columns) each sum to one is called a transition matrix, a stochastic matrix, or a probability matrix. Thus a transition matrix is a square stochastic matrix (since the number of rows equals the number of columns), and it gives a complete description of the Markov process. A zero element in the matrix indicates that the corresponding transition is impossible.

Transition Diagram

The transition probabilities can also be represented by two types of diagrams: a transition diagram and a probability tree diagram. For example, the matrix

              x1    x2    x3
    x1  [  0     0.3   0.7  ]
    x2  [  0.3   0.4   0.3  ]
    x3  [  0.3   0.7   0    ]

can be drawn as a transition diagram, which shows the transition probabilities or shifts that can occur in any particular situation. The arrows from each state indicate the possible states to which the process can move from that state; a zero element means no arrow is drawn for that transition.

Probability Tree Diagram

The same transition probabilities can be laid out as a probability tree diagram: from each starting state x1, x2, x3, branches lead to each possible next state, labelled with the corresponding transition probability.

Example-I

Two manufacturers A and B are competing with each other in a restricted market. Over the years, A's customers have exhibited a high degree of loyalty, as measured by the fact that they purchase A's product 80 percent of the time. Former customers purchasing the product from B have switched back to A 60 percent of the time.
(a) Construct and interpret the state transition matrix in terms of retention and loss, and retention and gain.
(b) Calculate the probability of a customer purchasing A's product at the end of the second period.

Example-I (Solution)

Transition matrix (each row shows retention and loss; each column shows retention and gain):

                                Next purchase (n = 1)
                                     A      B
    Present purchase (n = 0)  A  [ 0.80   0.20 ]
                              B  [ 0.60   0.40 ]

P(A1|A0) = 0.80 is the probability that a customer using A at present will use A in the future; P(B1|A0) = 0.20 is the probability that a customer using A at present will switch to B. The same information can be shown as a transition diagram or a probability tree diagram.

Second stage: P2 = P·P gives the two-period transition matrix:

                                Next purchase (n = 2)
                                     A      B
    Present purchase (n = 0)  A  [ 0.76   0.24 ]
                              B  [ 0.72   0.28 ]

Thus the probability of a customer purchasing A's product at the end of the second period, given that the customer purchases A now, is 0.76.
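The two-period matrix in Example-I is just the matrix product P·P. A minimal Python sketch (illustrative, not from the slides) that reproduces the calculation:

```python
# Plain list-of-lists matrix multiplication.
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

P = [[0.80, 0.20],   # row A: retention 0.80, loss to B 0.20
     [0.60, 0.40]]   # row B: gain back to A 0.60, retention 0.40

P2 = matmul(P, P)    # two-step transition probabilities
print([[round(x, 2) for x in row] for row in P2])
# [[0.76, 0.24], [0.72, 0.28]]
```

The (A, A) entry, 0.76 = 0.8·0.8 + 0.2·0.6, sums the two paths A→A→A and A→B→A.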

Steady State (Equilibrium) Conditions

Given that the process being modeled as a Markov process has certain properties, it is possible to analyze its long-run behavior and determine the probabilities of outcomes after steady-state conditions have been reached. After the Markov process has been in operation for a long time, a given outcome will tend to occur a fixed percentage of the time, regardless of the present state. For a Markov chain to reach steady-state conditions, the chain must be
• ergodic
• regular

An ergodic Markov chain has the property that it is possible to go from any state to any other state in a finite number of steps. A regular chain is a special type of ergodic chain: it has a transition matrix P such that some power of P contains only non-zero positive probabilities. Thus all regular chains are ergodic, but not all ergodic chains are regular. The easiest way to check whether an ergodic chain is regular is to keep squaring the transition matrix P until all zeros are removed.
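The "keep squaring P until all zeros are removed" test can be sketched directly in Python (illustrative; the cutoff `max_squarings` is an assumption, since a non-regular chain would otherwise be squared forever):

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def is_regular(P, max_squarings=10):
    # A chain is regular iff some power of P has all strictly positive
    # entries.  For stochastic matrices, once a power is all-positive
    # every higher power stays positive, so inspecting P, P^2, P^4, ...
    # (repeated squaring) is sufficient.
    M = P
    for _ in range(max_squarings):
        if all(x > 0 for row in M for x in row):
            return True
        M = matmul(M, M)
    return False

print(is_regular([[0.8, 0.2], [0.6, 0.4]]))  # True: already all positive
print(is_regular([[0.0, 1.0], [1.0, 0.0]]))  # False: powers alternate, zeros never leave
```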

Example-II

Determine whether the following transition matrix is an ergodic Markov chain:

                        Future states
                        1     2     3     4
    Present state  1 [ 1/3   1/3   0    1/3 ]
                   2 [  0    1/4   1/2  1/4 ]
                   3 [ 1/2   1/4   1/4   0  ]
                   4 [  0    1/3   0    2/3 ]

Example-II (Solution)

Check whether it is possible to go from each state to all other states and back. From state 1 it is possible to go directly to every other state except state 3; state 3 can still be reached via state 2 (1 → 2 → 3). From state 2 it is possible to go to states 3 and 4 but not directly to state 1; state 1 can be reached via state 3 (2 → 3 → 1). From state 3, state 1 can be approached directly. Similarly, from state 4 it is possible to reach state 1 through states 2 and 3 (4 → 2 → 3 → 1). Hence the above transition matrix is an ergodic Markov chain, since it is possible to go from state 1 to all other states and from all other states back to state 1.
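The state-by-state reachability argument in Example-II can be automated as a breadth-first search over transitions with positive probability. A sketch in Python (the numeric entries are the Example-II values as best reconstructed from the slide, so treat them as illustrative):

```python
from collections import deque

def is_ergodic(P):
    # Ergodic (irreducible): every state can reach every other state
    # by following transitions with positive probability.
    n = len(P)

    def reachable(start):
        seen, queue = {start}, deque([start])
        while queue:
            i = queue.popleft()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    queue.append(j)
        return seen

    return all(len(reachable(i)) == n for i in range(n))

# Example-II matrix (reconstructed values)
P = [[1/3, 1/3, 0,   1/3],
     [0,   1/4, 1/2, 1/4],
     [1/2, 1/4, 1/4, 0  ],
     [0,   1/3, 0,   2/3]]
print(is_ergodic(P))  # True
```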

Example-III

Determine whether the following transition matrix is an ergodic and regular Markov chain (× denotes some positive probability):

                        Future states
                       1   2   3   4
    Present state  1 [ 0   ×   ×   0 ]
                   2 [ ×   0   0   × ]
                   3 [ ×   0   0   × ]
                   4 [ 0   ×   ×   0 ]

Example-III (Solution)

Squaring the matrix gives

           1   2   3   4
    1  [ ×   0   0   × ]
    2  [ 0   ×   ×   0 ]
    3  [ 0   ×   ×   0 ]
    4  [ ×   0   0   × ]

and P4, P8, … repeat this same pattern: P raised to an even power gives the pattern above, while P raised to an odd power gives the original pattern. Since no power of P has all non-zero positive elements, the matrix is not regular. It is, however, ergodic, since it is possible to go from state 1 to state 2 or 3, from state 2 to state 1 or 4, from state 3 to state 1 or 4, and from state 4 to state 2 or 3, so every state can reach every other state.
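The alternating zero pattern of Example-III can be demonstrated numerically. In the sketch below, 0.5 is used as a stand-in for each × (any positive values summing to one per row would show the same pattern); this is an illustration, not the slides' own computation:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Example-III pattern with 0.5 standing in for each positive entry.
P = [[0.0, 0.5, 0.5, 0.0],
     [0.5, 0.0, 0.0, 0.5],
     [0.5, 0.0, 0.0, 0.5],
     [0.0, 0.5, 0.5, 0.0]]

P2 = matmul(P, P)
pattern = [[1 if x > 0 else 0 for x in row] for row in P2]
print(pattern)
# [[1, 0, 0, 1], [0, 1, 1, 0], [0, 1, 1, 0], [1, 0, 0, 1]]
```

States {1, 4} and {2, 3} form two blocks the chain bounces between, so every even power keeps one set of zeros and every odd power keeps the other: the chain is periodic, hence not regular.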

Example-IV

The number of units of an item that are withdrawn from inventory on a day-to-day basis is a Markov chain process in which requirements for tomorrow depend on today's requirements:

                    Tomorrow
                   5     10    12
    Today   5  [ 0.6   0.4   0.0 ]
           10  [ 0.3   0.3   0.4 ]
           12  [ 0.1   0.3   0.6 ]

(a) Develop a two-day transition matrix.
(b) Comment on how a two-day transition matrix might be helpful to a manager who is responsible for inventory management.

Example-IV (Solution)

Two-day transition matrix:

                         Two days later
                        5      10     12
    P(2) = P·P =  5 [ 0.48   0.36   0.16 ]
                 10 [ 0.31   0.33   0.36 ]
                 12 [ 0.21   0.31   0.48 ]

Imagine that each morning a manager must place an order for inventory replenishment and that, as a result of delivery time requirements, an order placed today arrives two days later. The two-day transition matrix can then be used to guide ordering decisions. For example, if today the manager experiences a demand for 5 units, then two days later (when replenishment stock arrives in response to today's order) the probability of needing five units is 0.48, ten units is 0.36, and twelve units is 0.16.
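Example-IV's two-day matrix is again P·P. A short Python sketch (illustrative, not from the slides) that reproduces the row for a demand of 5 units today:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# States are daily demand levels of 5, 10 and 12 units (Example-IV).
P = [[0.6, 0.4, 0.0],
     [0.3, 0.3, 0.4],
     [0.1, 0.3, 0.6]]

P2 = matmul(P, P)
print([round(x, 2) for x in P2[0]])  # demand 5 today -> two days later
# [0.48, 0.36, 0.16]
```

A manager placing an order today, for delivery in two days, would read off this row to judge how much stock is likely to be needed when the order arrives.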

Example-V

A manufacturing company has a certain piece of equipment that is inspected at the end of each day and classified as just overhauled, good, fair or inoperative. If the item is inoperative it is overhauled, a procedure that takes one day. Let us denote the four classifications as states 1, 2, 3 and 4 respectively. Assume that the working condition of the equipment follows a Markov process with the following transition matrix:

                  Tomorrow
                 1     2     3     4
    Today  1 [ 0    3/4   1/4    0  ]
           2 [ 0    1/2   1/2    0  ]
           3 [ 0     0    1/2   1/2 ]
           4 [ 1     0     0     0  ]

If it costs Rs. 125 to overhaul a machine and Rs. 75 in production is lost if a machine is found inoperative, use the steady-state probabilities to compute the expected per-day cost of maintenance.

Example-V (Solution)

The given matrix P represents an ergodic, regular Markov process, so it will reach a steady-state equilibrium. Let p1, p2, p3 and p4 be the steady-state probabilities representing the proportion of time that the machine will be in states 1, 2, 3 and 4 respectively. Use the steady-state equation R = RP:

    (p1, p2, p3, p4) = (p1, p2, p3, p4) · P

Example-V (Solution) ctd.

Finding the steady-state probabilities requires solving the simultaneous equations

    p1 = p4
    p2 = 3/4 p1 + 1/2 p2
    p3 = 1/4 p1 + 1/2 p2 + 1/2 p3
    p4 = 1/2 p3

together with p1 + p2 + p3 + p4 = 1. Solving these equations gives the steady-state probabilities p1 = 2/11, p2 = 3/11, p3 = 4/11 and p4 = 2/11. Thus, on average, two out of every 11 days the machine will be overhauled, three out of every 11 days it will be in good condition, four out of every 11 days it will be in fair condition, and two out of every 11 days it will be found inoperative at the end of the day. Hence the average cost per day of maintenance will be

    (2/11)(125) + (2/11)(75) = Rs. 36.36

Excel Worksheet

[Excel worksheet illustrating the steady-state calculation]
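The claimed steady-state vector can be verified exactly with rational arithmetic: it must satisfy p = pP and sum to one, and the expected cost follows directly. A Python sketch (illustrative, not part of the slides):

```python
from fractions import Fraction as F

# Example-V transition matrix: states 1=just overhauled, 2=good, 3=fair, 4=inoperative.
P = [[F(0), F(3, 4), F(1, 4), F(0)],
     [F(0), F(1, 2), F(1, 2), F(0)],
     [F(0), F(0),    F(1, 2), F(1, 2)],
     [F(1), F(0),    F(0),    F(0)]]

p = [F(2, 11), F(3, 11), F(4, 11), F(2, 11)]  # claimed steady state

# Check p = pP (left multiplication by the row vector p) and the normalization.
pP = [sum(p[i] * P[i][j] for i in range(4)) for j in range(4)]
print(pP == p, sum(p) == 1)       # True True

# Expected per-day cost: overhaul on 2/11 of days, inoperative on 2/11 of days.
cost = p[0] * 125 + p[3] * 75
print(cost, float(cost))          # 400/11 ~ 36.36
```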

Example-VI

There are three dairies A, B and C in a small town which supply all the milk consumed in the town. It is known by all the dairies that consumers switch from one dairy to another due to advertising, price and dissatisfaction. All these dairies maintain records of the number of their customers and the dairy from which they obtained each new customer. The following table illustrates the flow of customers over an observation period of one month. Assume that the matrix of transition probabilities remains fairly stable, that at the beginning of period one the market shares are A = 25%, B = 45% and C = 30%, and that the initial consumer sample is composed of 1000 respondents distributed over the three dairies. Construct the state transition probability matrix to analyze the problem.

            Period one          Changes during the period    Period two
    Dairy   No. of customers    Number lost   Number gained  No. of customers
    A            250                 50             62             262
    B            450                 60             53             443
    C            300                 55             50             295

Example-VI (Solution)

To determine the retention probabilities (the probability of remaining in the same state), the customers retained over the period under review are divided by the number of customers at the beginning of the period:

            Period one          Changes during the period
    Dairy   No. of customers    Number lost   Number retained   Probability of retention
    A            250                 50             200           200/250 = 0.80
    B            450                 60             390           390/450 = 0.86
    C            300                 55             245           245/300 = 0.81

Example-VI (Solution) ctd.

To determine the gain and loss probabilities, it is necessary to show the gains and losses among the dairies in order to complete the matrix of transition probabilities:

            Period one          Gain from         Loss to          Period two
    Dairy   No. of customers    A    B    C      A    B    C       No. of customers
    A            250            0   35   27      0   25   25            262
    B            450           25    0   28     35    0   25            443
    C            300           25   25    0     27   28    0            295

Matrix of transition probabilities (rows: retention and loss; columns: retention and gain):

                  A                  B                  C
    A  [ 200/250 = 0.800    25/250 = 0.100    25/250 = 0.100 ]
    B  [  35/450 = 0.078   390/450 = 0.867    25/450 = 0.055 ]
    C  [  27/300 = 0.090    28/300 = 0.093   245/300 = 0.817 ]

• The first row shows that dairy A retains 80% of its customers, loses 10% (25 customers) to dairy B and loses 10% (25) to dairy C.
• The second row shows that dairy B loses about 8% to dairy A, retains 86.7% of its own customers and loses 5.5% to dairy C.
• The third row shows that dairy C loses 9% of its customers to dairy A, loses 9.3% to dairy B and retains 81.7% of its own customers.
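The Example-VI matrix is built by dividing each customer flow by the row's starting total, and multiplying the initial market shares by it reproduces the period-two shares. A Python sketch (illustrative; the dictionary layout is my own, the counts are from the tables above):

```python
# Customer flows for Example-VI: rows are the "from" dairy, columns the
# "to" dairy; diagonal entries are the customers retained.
counts = {
    'A': {'A': 200, 'B': 25,  'C': 25},   # 250 customers at start
    'B': {'A': 35,  'B': 390, 'C': 25},   # 450 customers at start
    'C': {'A': 27,  'B': 28,  'C': 245},  # 300 customers at start
}

# Transition probabilities: divide each flow by the row total.
P = {d: {e: n / sum(row.values()) for e, n in row.items()}
     for d, row in counts.items()}
print(round(P['A']['A'], 2), round(P['B']['A'], 2), round(P['C']['B'], 3))
# 0.8 0.08 0.093

# Period-two market shares from the initial shares A=0.25, B=0.45, C=0.30.
shares = {'A': 0.25, 'B': 0.45, 'C': 0.30}
nxt = {e: sum(shares[d] * P[d][e] for d in 'ABC') for e in 'ABC'}
print({e: round(s, 3) for e, s in nxt.items()})
# {'A': 0.262, 'B': 0.443, 'C': 0.295}
```

The resulting shares 0.262, 0.443, 0.295 match the period-two customer counts 262, 443 and 295 out of the 1000 respondents.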
