
Ben Gaehle (1)

Linear Algebra
Markov Chains and Weather Prediction

Abstract
A Markov Chain is commonly used to predict the long-term behavior of a sequence of
dependent events. This idea can be applied to simplified weather prediction,
although the accuracy is low for immediate forecasts.

A Markov Chain is a mathematical device used to predict the distant future. It applies
only when events are dependent, that is, when the next time period's state is affected by
the current state. A basic Markov Chain consists of a current state, often compiled into a
state vector c, and a stochastic matrix P, an n×n collection of probability vectors, where
n is the number of states. The current state is used as the basis for all future states.
The resulting Markov Chain is thus formed where

c1 = Pc0,  c2 = Pc1,  ...,  c(k+1) = Pck,  where k is a non-negative integer.
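The recurrence above can be sketched in plain Python (a minimal illustration, not from the paper; the two-state matrix and the helper name `markov_step` are hypothetical):

```python
# Minimal sketch of the recurrence c_{k+1} = P c_k, using plain Python
# lists (no libraries). P is column-stochastic: each column sums to 1.
def markov_step(P, c):
    """Return the next state vector P*c, with P stored as a list of rows."""
    return [sum(P[i][j] * c[j] for j in range(len(c))) for i in range(len(P))]

# A hypothetical 2-state example: start entirely in state 0.
P = [[0.9, 0.5],
     [0.1, 0.5]]
c0 = [1.0, 0.0]
c1 = markov_step(P, c0)   # -> [0.9, 0.1]
```

Note that the result is again a probability vector: its entries are nonnegative and sum to 1, which is guaranteed because the columns of P each sum to 1.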

One commonly used example of Markov Chains is weather forecasting. In a
simplified example, suppose a particular location has two possible weather states per
day: dry or wet. This results in a two-entry state vector c. In this example, the first
day (k = 0) is assumed to be dry, so c0 = [1, 0]^T. If the current day is dry, then the
weather on future days can be predicted once the next day's probabilities of being dry or
wet are known (a probability vector). When these probability vectors are compiled, the
result is the stochastic matrix P. Observe the given next-day weather probabilities below,
based on the current state; the values are arbitrary, but each column sums to 1.

          D     W
P = [ 3/4   2/3 ]  D      where D signifies dry and W signifies wet
    [ 1/4   1/3 ]  W
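As a quick sanity check (a plain-Python sketch, not part of the original paper), the columns of this P can be verified to sum to 1:

```python
# The paper's stochastic matrix P, stored as a list of rows.
# Column j holds the next-day probabilities given today's state j.
P = [[3/4, 2/3],   # P(dry tomorrow | dry today), P(dry tomorrow | wet today)
     [1/4, 1/3]]   # P(wet tomorrow | dry today), P(wet tomorrow | wet today)

for j in range(2):
    column_sum = P[0][j] + P[1][j]
    assert abs(column_sum - 1.0) < 1e-12   # each column sums to 1
```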
The stochastic matrix shows that if the current day is dry, then there is a .75
chance that the next day will also be dry. Using the definition of a Markov Chain
(c1 = Pc0, c2 = Pc1, c(k+1) = Pck), the weather on future days can be loosely predicted.
Thus, with the initial state c0 stated earlier, c1, c2, ..., ck can all be determined
from the c0 vector.
c1 = [ 3/4   2/3 ] [ 1 ] = [ .75 ]
     [ 1/4   1/3 ] [ 0 ]   [ .25 ]

Then, using c1 as the new state vector, c2 can be found.

c2 = [ 3/4   2/3 ] [ .75 ] = [ .729 ]
     [ 1/4   1/3 ] [ .25 ]   [ .271 ]

As k increases, the state vectors eventually approach a steady-state vector s.
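Iterating the chain a few steps reproduces these values (a plain-Python sketch; the loop and variable names are illustrative):

```python
# Iterate c_{k+1} = P c_k starting from c0 = [1, 0] (day 0 is dry).
P = [[3/4, 2/3],
     [1/4, 1/3]]
c = [1.0, 0.0]
history = []
for _ in range(3):
    # One multiplication by P: next state from the current state.
    c = [sum(P[i][j] * c[j] for j in range(2)) for i in range(2)]
    history.append(c)
# history[0] is c1 = [0.75, 0.25]; history[1] is c2 ~ [0.729, 0.271]
```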
For large k,

ck = [ 3/4   2/3 ] [ .727 ] = [ .727 ]
     [ 1/4   1/3 ] [ .273 ]   [ .273 ]

The steady-state vector s is the vector for which Ps = s. This is the most important
feature of a Markov Chain: determining the long-term behavior of events. In this example,
when the initial state is dry and the stochastic matrix P is given, the state vectors
converge to a single state in which each day has a .727 chance of being dry and a .273
chance of being wet. Thus, Markov Chains can be used to predict the overall tendencies of
weather for a given area, but they do a poor job of predicting the immediate forecast due
to the sporadic nature of weather, especially in areas known for rapid weather change
such as Missouri.
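The steady-state claim can be checked numerically (a plain-Python sketch, not from the paper): repeated multiplication by P converges here because P has no zero entries, and the limit is exactly [8/11, 3/11] ≈ [.727, .273].

```python
# Approximate the steady-state vector s (with P s = s) by repeated
# multiplication, starting from the "dry" initial state.
P = [[3/4, 2/3],
     [1/4, 1/3]]
s = [1.0, 0.0]
for _ in range(100):
    s = [sum(P[i][j] * s[j] for j in range(2)) for i in range(2)]
# s converges to [8/11, 3/11], i.e. approximately [0.727, 0.273]
```

Solving Ps = s with s1 + s2 = 1 by hand gives the same exact answer: (3/4)s1 + (2/3)(1 - s1) = s1 yields s1 = 8/11.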
