
600: Lecture 32

Markov Chains

Scott Sheffield

MIT

Outline

Markov chains

Examples


Markov chains

- Consider a sequence of random variables X0, X1, X2, . . . each
  taking values in the same state space, which for now we take
  to be a finite set that we label by {0, 1, . . . , M}.

- Interpret Xn as the state of the system at time n.

- The sequence is called a Markov chain if we have a fixed
  collection of numbers Pij (one for each pair
  i, j ∈ {0, 1, . . . , M}) such that whenever the system is in state
  i, there is probability Pij that the system will next be in state j.

- Precisely,

  P{Xn+1 = j | Xn = i, Xn−1 = in−1, . . . , X1 = i1, X0 = i0} = Pij.

- Kind of an "almost memoryless" property: the probability
  distribution for the next state depends only on the current state
  (and not on the rest of the state history).
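This definition translates directly into a short simulation. A minimal Python sketch; the function name is my own, and the two-state matrix is the rainy/sunny chain used later in the lecture:

```python
import random

def simulate_chain(P, start, n_steps, rng=random):
    """Simulate n_steps of a Markov chain with transition matrix P
    (a list of rows, each summing to one), starting from `start`."""
    states = [start]
    for _ in range(n_steps):
        i = states[-1]
        r = rng.random()
        cum = 0.0
        # Pick the next state j with probability P[i][j].
        for j, p in enumerate(P[i]):
            cum += p
            if r < cum:
                states.append(j)
                break
        else:
            states.append(len(P[i]) - 1)  # guard against float rounding
    return states

# Two-state example: 0 = rainy, 1 = sunny.
P = [[0.5, 0.5],
     [0.2, 0.8]]
path = simulate_chain(P, start=0, n_steps=10)
```

Note that the next state is drawn using only the current state, which is exactly the "almost memoryless" property above.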

Simple example

- Consider a simple weather model with two states: rainy and sunny.

- If it's rainy one day, there's a .5 chance it will be rainy the
  next day, and a .5 chance it will be sunny.

- If it's sunny one day, there's a .8 chance it will be sunny the
  next day, and a .2 chance it will be rainy.

- In this climate, sun tends to last longer than rain.

- Given that it is rainy today, how many days do I expect to
  have to wait to see a sunny day?

- Given that it is sunny today, how many days do I expect to
  have to wait to see a rainy day?

- Over the long haul, what fraction of days are sunny?
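These questions can be answered with a short calculation, using the geometric waiting times and the balance idea developed later in the lecture. A Python sketch with variable names of my own choosing:

```python
# Each day, we leave the rainy state with probability .5 and leave
# the sunny state with probability .2, so the waiting time to leave
# is geometric and the expected waits are the reciprocals.
p_rain_to_sun = 0.5
p_sun_to_rain = 0.2
expected_wait_for_sun = 1 / p_rain_to_sun   # 2.0 days
expected_wait_for_rain = 1 / p_sun_to_rain  # 5.0 days

# Long-run fraction of sunny days: transitions into sun must balance
# transitions out of sun, so pi_sun * .2 = pi_rain * .5, and together
# with pi_rain + pi_sun = 1 this gives pi_sun = 5/7.
pi_sun = p_rain_to_sun / (p_rain_to_sun + p_sun_to_rain)
```

So starting from rain we expect to wait 2 days for sun, from sun 5 days for rain, and in the long run 5/7 of days are sunny, matching the stationary distribution computed at the end of the lecture.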

Matrix representation

- To describe a Markov chain, we need to specify Pij for each pair
  i, j ∈ {0, 1, . . . , M}.

- It is convenient to represent the collection of transition
  probabilities Pij as a matrix:

      | P00  P01  ...  P0M |
  A = | P10  P11  ...  P1M |
      | ...                |
      | PM0  PM1  ...  PMM |

- For this to make sense, we need Σ_{j=0}^{M} Pij = 1 for each i.
  That is, the rows sum to one.

Transitions via matrices

- Suppose that pi is the probability that the system is in state i at
  time zero.

- What does the following product represent?

                        | P00  P01  ...  P0M |
  ( p0  p1  ...  pM )   | P10  P11  ...  P1M |
                        | ...                |
                        | PM0  PM1  ...  PMM |

- How about the following product?

  ( p0  p1  ...  pM ) A^n
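Both products can be evaluated with NumPy (assumed available). Here p0 puts all mass on state zero of the rainy/sunny chain, so the first product gives the distribution after one step and the second the distribution after n steps:

```python
import numpy as np

A = np.array([[0.5, 0.5],
              [0.2, 0.8]])   # rows sum to one
p0 = np.array([1.0, 0.0])    # start rainy with probability one

p1 = p0 @ A                               # distribution at time one
pn = p0 @ np.linalg.matrix_power(A, 10)   # distribution at time ten
```

Left-multiplying a row distribution by A advances time by one step; by A^n, by n steps.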

Powers of transition matrix

- We write Pij^(n) for the probability to go from state i to state j
  over n steps.

- From the matrix point of view, the matrix of n-step transition
  probabilities is the nth power of the one-step matrix:

  [ Pij^(n) ] = A^n,

  where A = [ Pij ] is the one-step transition matrix.
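A quick NumPy check of this identity on the two-state weather chain, comparing an entry of A^2 against the explicit sum over the intermediate state:

```python
import numpy as np

A = np.array([[0.5, 0.5],
              [0.2, 0.8]])

# The n-step transition probabilities are the entries of A**n.
A2 = np.linalg.matrix_power(A, 2)

# P^(2)_{00}: rainy -> rainy in two steps, passing through either
# rain (.5 * .5) or sun (.5 * .2).
p2_00 = 0.5 * 0.5 + 0.5 * 0.2   # ≈ 0.35
```

The matrix-power entry and the hand sum over intermediate states agree, which is the Chapman–Kolmogorov identity in miniature.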

Questions

- What does it mean if all of the rows of A are identical?

- Answer: the state sequence Xi consists of i.i.d. random variables.

- What if the matrix is the identity?

- Answer: the state never changes.

- What if each Pij is either one or zero?

- Answer: the state evolution is deterministic.

Outline

Markov chains

Examples


Simple example

- Recall the weather model: if it's rainy one day, there's a .5
  chance it will be rainy the next day and a .5 chance it will be
  sunny. If it's sunny one day, there's a .8 chance it will be sunny
  the next day and a .2 chance it will be rainy.

- Let rainy be state zero and sunny be state one, and write the
  transition matrix as

      | .5  .5 |
  A = | .2  .8 |

- Note that

        | .35  .65 |
  A^2 = | .26  .74 |

- Can compute

         | .285719  .714281 |
  A^10 = | .285713  .714287 |

Does relationship status have the Markov property?

- Consider states such as: in a relationship, engaged, married.

- A Markov model implies that the time spent in any state (e.g., a
  marriage) before leaving it is a geometric random variable.

- Not true... Can we make a better model with more states?

Outline

Markov chains

Examples


Ergodic Markov chains

- Say a Markov chain is ergodic if some power of the transition
  matrix has all non-zero entries.

- Turns out that if the chain has this property, then
  πj := lim_{n→∞} Pij^(n) exists, and the πj are the unique
  non-negative solutions of πj = Σ_{k=0}^{M} πk Pkj that sum to one.

- This means that the row vector

  π = ( π0  π1  ...  πM )

  satisfies πA = π.

- We call π the stationary distribution of the Markov chain.

- One can solve the system of linear equations
  πj = Σ_{k=0}^{M} πk Pkj to compute the values πj. Equivalent to
  considering A fixed and solving πA = π. Or solving π(A − I) = 0.
  This determines π up to a multiplicative constant, and the fact
  that Σ_j πj = 1 determines the constant.
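One way to carry out this computation numerically (NumPy assumed): replace one of the balance equations by the normalization constraint, which turns π(A − I) = 0 plus Σ πj = 1 into a square linear system. The matrix here is the lecture's weather chain:

```python
import numpy as np

A = np.array([[0.5, 0.5],
              [0.2, 0.8]])
M = A.shape[0]

# pi (A - I) = 0 transposed gives (A - I)^T pi^T = 0; swap the last
# balance equation for the normalization pi_0 + ... + pi_M = 1.
system = (A - np.eye(M)).T
system[-1, :] = 1.0
rhs = np.zeros(M)
rhs[-1] = 1.0
pi = np.linalg.solve(system, rhs)
```

Dropping one balance equation is harmless because the balance equations are linearly dependent (they determine π only up to the multiplicative constant noted above).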

Simple example

- If

      | .5  .5 |
  A = | .2  .8 |

  then we know πA = π, i.e.,

                | .5  .5 |
  ( π0  π1 )    | .2  .8 |  = ( π0  π1 ) = π.

- We also know that π0 + π1 = 1. Solving these equations gives
  π0 = 2/7 and π1 = 5/7, so π = ( 2/7  5/7 ).

- Indeed,

                 | .5  .5 |
  ( 2/7  5/7 )   | .2  .8 |  = ( 2/7  5/7 ) = π.

- Recall that

         | .285719  .714281 |     | 2/7  5/7 |
  A^10 = | .285713  .714287 |  ≈  | 2/7  5/7 |
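The verification πA = π can also be done exactly with Python's fractions module, writing the lecture's .5/.5/.2/.8 entries as fractions:

```python
from fractions import Fraction as F

A = [[F(1, 2), F(1, 2)],
     [F(1, 5), F(4, 5)]]
pi = [F(2, 7), F(5, 7)]

# Check the stationarity equations pi_j = sum_k pi_k P_kj exactly.
pi_next = [sum(pi[k] * A[k][j] for k in range(2)) for j in range(2)]
```

Exact arithmetic confirms that ( 2/7  5/7 ) is left unchanged by one step of the chain, with no floating-point tolerance needed.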
