
Spreadsheet Modeling

& Decision Analysis


A Practical Introduction to
Management Science
5th edition

Cliff T. Ragsdale
Chapter 11

Time Series Forecasting


Introduction to Time Series Analysis
 A time-series is a set of observations on a quantitative
variable collected over time.
 Examples
 Dow Jones Industrial Averages
 Historical data on sales, inventory, customer
counts, interest rates, costs, etc.
 Businesses are often very interested in forecasting
time series variables.
 Often, independent variables are not available to build
a regression model of a time series variable.
 In time series analysis, we analyze the past behavior
of a variable in order to predict its future behavior.
Some Time Series Terms
 Stationary Data - a time series variable exhibiting
no significant upward or downward trend over
time.
 Nonstationary Data - a time series variable
exhibiting a significant upward or downward
trend over time.
 Seasonal Data - a time series variable exhibiting a repeating pattern at regular intervals over time.
Approaching Time Series Analysis
 There are many, many different time series
techniques.
 It is usually impossible to know which technique
will be best for a particular data set.
 It is customary to try out several different
techniques and select the one that seems to
work best.
 To be an effective time series modeler, you need
to keep several time series techniques in your
“tool box.”
Measuring Accuracy
 We need a way to compare different time series techniques for a given data set.
 Four common techniques are:

– the mean absolute deviation:  MAD = ( Σ_{i=1}^{n} |Y_i − Ŷ_i| ) / n

– the mean absolute percent error:  MAPE = (100/n) Σ_{i=1}^{n} |Y_i − Ŷ_i| / Y_i

– the mean square error:  MSE = ( Σ_{i=1}^{n} (Y_i − Ŷ_i)² ) / n

– the root mean square error:  RMSE = √MSE
We will focus on the MSE.
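 For illustration, here is a minimal Python sketch (not part of the book, which builds these measures in Excel) of the four accuracy measures; the two lists are assumed to hold the actual values and the corresponding forecasts.

    import math

    def mad(actual, forecast):
        # mean absolute deviation
        return sum(abs(y - f) for y, f in zip(actual, forecast)) / len(actual)

    def mape(actual, forecast):
        # mean absolute percent error
        return (100 / len(actual)) * sum(abs(y - f) / y for y, f in zip(actual, forecast))

    def mse(actual, forecast):
        # mean square error (the measure this chapter focuses on)
        return sum((y - f) ** 2 for y, f in zip(actual, forecast)) / len(actual)

    def rmse(actual, forecast):
        # root mean square error
        return math.sqrt(mse(actual, forecast))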
Extrapolation Models
 Extrapolation models try to account for the past
behavior of a time series variable in an effort to
predict the future behavior of the variable.

Ŷ_{t+1} = f(Y_t, Y_{t−1}, Y_{t−2}, …)

 We'll first talk about several extrapolation techniques that are appropriate for stationary data.
An Example
 Electra-City is a retail store that sells audio and
video equipment for the home and car.
 Each month the manager of the store must order
merchandise from a distant warehouse.
 Currently, the manager is trying to estimate how
many VCRs the store is likely to sell in the next
month.
 He has collected 24 months of data.
 See file Fig11-1.xls
Moving Averages

Ŷ_{t+1} = (Y_t + Y_{t−1} + … + Y_{t−k+1}) / k
 No general method exists for determining k.


 We must try out several k values to see what works
best.
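 As a rough illustration (the book implements this model in Fig11-2.xls), a k-period moving average forecast in Python might look like the sketch below; the sales list is a made-up placeholder, not the Electra-City data.

    def moving_average_forecast(series, k):
        # next-period forecast = average of the k most recent observations
        return sum(series[-k:]) / k

    sales = [37, 33, 35, 36]                      # hypothetical recent monthly sales
    print(moving_average_forecast(sales, k=2))    # (35 + 36) / 2 = 35.5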
Implementing the Model

See file Fig11-2.xls


A Comment on Comparing MSE Values

 Care should be taken when comparing MSE


values of two different forecasting techniques.
 The lowest MSE may result from a technique that
fits older values very well but fits recent values
poorly.
 It is sometimes wise to compute the MSE using
only the most recent values.
Forecasting With
The Moving Average Model
Forecasts for time periods 25 and 26 at time period 24:

Ŷ_25 = (Y_24 + Y_23) / 2 = (36 + 35) / 2 = 35.5

Ŷ_26 = (Ŷ_25 + Y_24) / 2 = (35.5 + 36) / 2 = 35.75
Weighted Moving Average
 The moving average technique assigns equal weight to each of the k most recent observations:
Ŷ_{t+1} = (1/k)Y_t + (1/k)Y_{t−1} + … + (1/k)Y_{t−k+1}
 The weighted moving average technique allows
for different weights to be assigned to previous
observations.
Ŷ_{t+1} = w_1·Y_t + w_2·Y_{t−1} + … + w_k·Y_{t−k+1}

where 0 ≤ w_i ≤ 1 and Σ w_i = 1
 We must determine values for k and the w_i.
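 A minimal Python sketch of the weighted moving average (the book does this in Fig11-4.xls, with Solver choosing the weights); the sales figures are hypothetical, and the weights 0.291/0.709 are the values quoted on the forecasting slide that follows.

    def weighted_moving_average_forecast(series, weights):
        # weights[0] applies to the most recent observation, weights[1] to the
        # one before it, and so on; the weights are assumed to sum to 1
        k = len(weights)
        recent = list(reversed(series[-k:]))
        return sum(w * y for w, y in zip(weights, recent))

    sales = [37, 33, 35, 36]                                        # hypothetical data
    print(weighted_moving_average_forecast(sales, [0.291, 0.709]))  # about 35.29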


Implementing the Model

See file Fig11-4.xls


Forecasting With The Weighted
Moving Average Model
Forecasts for time periods 25 and 26 at time period 24:

Ŷ_25 = w_1·Y_24 + w_2·Y_23 = 0.291 × 36 + 0.709 × 35 = 35.29

Ŷ_26 = w_1·Ŷ_25 + w_2·Y_24 = 0.291 × 35.29 + 0.709 × 36 = 35.79
Exponential Smoothing
Ŷ_{t+1} = Ŷ_t + α(Y_t − Ŷ_t)

where 0 ≤ α ≤ 1

 It can be shown that the above equation is equivalent to:

Ŷ_{t+1} = αY_t + α(1 − α)Y_{t−1} + α(1 − α)²Y_{t−2} + … + α(1 − α)ⁿY_{t−n} + …
Examples of Two
Exponential Smoothing Functions
[Chart: Number of VCRs Sold (Units Sold) over Time Periods 1–25, with exponential smoothing forecasts for alpha = 0.1 and alpha = 0.9]
Implementing the Model

See file Fig11-8.xls


Forecasting With The
Exponential Smoothing Model
Forecasts for time periods 25 and 26 at time period 24:

Ŷ_25 = Ŷ_24 + α(Y_24 − Ŷ_24) = 35.74 + 0.268(36 − 35.74) = 35.81

Ŷ_26 = Ŷ_25 + α(Y_25 − Ŷ_25) = Ŷ_25 + α(Ŷ_25 − Ŷ_25) = Ŷ_25 = 35.81

Note that Ŷ_t = 35.81 for t = 25, 26, 27, …
Seasonality

 Seasonality is a regular, repeating pattern


in time series data.
 May be additive or multiplicative in
nature...
Stationary Seasonal Effects
Additive Seasonal Effects

[Chart: time series with additive seasonal effects, Time Periods 1–25]

Multiplicative Seasonal Effects

[Chart: time series with multiplicative seasonal effects, Time Periods 1–25]
Stationary Data With
Additive Seasonal Effects
Ŷ_{t+n} = E_t + S_{t+n−p}
where
E_t = α(Y_t − S_{t−p}) + (1 − α)E_{t−1}
S_t = β(Y_t − E_t) + (1 − β)S_{t−p}
0 ≤ α ≤ 1, 0 ≤ β ≤ 1
p represents the number of seasonal periods

 Et is the expected level at time period t.


 St is the seasonal factor for time period t.
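 A minimal Python sketch of this model, under simple assumptions that are not from the book: the level is initialized as the mean of the first p observations, the seasonal factors as deviations from that mean, and alpha, beta are fixed rather than optimized with Solver (as in Fig11-13.xls).

    def additive_seasonal_smoothing(series, p, alpha, beta):
        level = sum(series[:p]) / p                   # initial expected level
        seasons = [y - level for y in series[:p]]     # initial seasonal factors
        for t, y in enumerate(series[p:], start=p):
            prev_level = level
            level = alpha * (y - seasons[t - p]) + (1 - alpha) * prev_level
            seasons.append(beta * (y - level) + (1 - beta) * seasons[t - p])
        # forecast n periods beyond the last observation: E_t + S_{t+n-p}
        return lambda n: level + seasons[len(series) + n - p - 1]

    quarterly = [410, 340, 420, 330, 430, 350, 440, 340]   # hypothetical quarterly data
    forecast = additive_seasonal_smoothing(quarterly, p=4, alpha=0.5, beta=0.5)
    print([round(forecast(n), 1) for n in (1, 2, 3, 4)])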
Implementing the Model

See file Fig11-13.xls


Forecasting With The Additive
Seasonal Effects Model
Forecasts for time periods 25 to 28 at time period 24:
Ŷ24 n  E 24  S 24  n  4

Ŷ25  E 24  S21  354.44  8.45  363.00


Ŷ26  E 24  S22  354.44  17.82  336.73

Ŷ27  E 24  S 23  354.44  46.58  401.13


Ŷ28  E 24  S24  354.44  31.73  322.81
Stationary Data With
Multiplicative Seasonal Effects

Ŷ_{t+n} = E_t × S_{t+n−p}
where
E_t = α(Y_t / S_{t−p}) + (1 − α)E_{t−1}
S_t = β(Y_t / E_t) + (1 − β)S_{t−p}
0 ≤ α ≤ 1, 0 ≤ β ≤ 1
p represents the number of seasonal periods

 Et is the expected level at time period t.


 St is the seasonal factor for time period t.
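 The same sketch adapts directly to multiplicative effects: subtraction and addition of the seasonal factors become division and multiplication (again a simplified initialization, not the Solver-based fit of Fig11-16.xls).

    def multiplicative_seasonal_smoothing(series, p, alpha, beta):
        level = sum(series[:p]) / p
        seasons = [y / level for y in series[:p]]
        for t, y in enumerate(series[p:], start=p):
            prev_level = level
            level = alpha * (y / seasons[t - p]) + (1 - alpha) * prev_level
            seasons.append(beta * (y / level) + (1 - beta) * seasons[t - p])
        # forecast n periods beyond the last observation: E_t * S_{t+n-p}
        return lambda n: level * seasons[len(series) + n - p - 1]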
Implementing the Model

See file Fig11-16.xls


Forecasting With The Multiplicative
Seasonal Effects Model
Forecasts for time periods 25 to 28 at time period 24:

Ŷ24 n  E 24  S 24 n  4

Ŷ25  E 24  S21  353.95  1.015  359.13


Ŷ26  E 24  S22  354.44  0.946  334.94

Ŷ27  E 24  S23  354.44  1.133  400.99


Ŷ28  E 24  S 24  354.44  0.912  322.95
Trend Models

 Trend is the long-term sweep or general


direction of movement in a time series.
 We’ll now consider some nonstationary time
series techniques that are appropriate for
data exhibiting upward or downward trends.
An Example
 WaterCraft Inc. is a manufacturer of personal
water crafts (also known as jet skis).
 The company has enjoyed a fairly steady
growth in sales of its products.
 The officers of the company are preparing sales
and manufacturing plans for the coming year.
 Forecasts are needed of the level of sales that
the company expects to achieve each quarter.
 See file Fig11-19.xls
Double Moving Average
Ŷ_{t+n} = E_t + nT_t
where
E_t = 2M_t − D_t
T_t = 2(M_t − D_t) / (k − 1)
M_t = (Y_t + Y_{t−1} + … + Y_{t−k+1}) / k
D_t = (M_t + M_{t−1} + … + M_{t−k+1}) / k

 Et is the expected base level at time period t.


 Tt is the expected trend at time period t.
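 A minimal Python sketch of the double moving average forecast (the book's version is the worksheet in Fig11-20.xls); the quarterly sales list is a hypothetical stand-in for the WaterCraft data.

    def double_moving_average_forecast(series, k, n):
        # M: k-period moving average of the data; D: k-period moving average of M
        m = [sum(series[i - k + 1:i + 1]) / k for i in range(k - 1, len(series))]
        d = [sum(m[i - k + 1:i + 1]) / k for i in range(k - 1, len(m))]
        level = 2 * m[-1] - d[-1]                 # E_t = 2M_t - D_t
        trend = 2 * (m[-1] - d[-1]) / (k - 1)     # T_t = 2(M_t - D_t)/(k - 1)
        return level + n * trend                  # forecast = E_t + nT_t

    sales = [684, 584, 765, 892, 885, 677, 1006, 1122]   # hypothetical quarterly sales
    print(round(double_moving_average_forecast(sales, k=4, n=1), 1))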
Implementing the Model

See file Fig11-20.xls


Forecasting With The
Double Moving Average Model
Forecasts for time periods 21 to 24 at time period 20:

Ŷ20 n  E 20  nT20

Ŷ21  E 20  1T20  2385.33  1 139.9  2525.23


Ŷ22  E 20  2T20  2385.33  2  139.9  2665.13
Ŷ23  E 20  3T20  2385.33  3  139.9  2805.03
Ŷ24  E 20  4T20  2385.33  4  139.9  2944.94
Double Exponential Smoothing
(Holt’s Method)

Ŷ_{t+n} = E_t + nT_t
where
E_t = αY_t + (1 − α)(E_{t−1} + T_{t−1})
T_t = β(E_t − E_{t−1}) + (1 − β)T_{t−1}
0 ≤ α ≤ 1 and 0 ≤ β ≤ 1

 Et is the expected base level at time period t.


 Tt is the expected trend at time period t.
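 A minimal Python sketch of Holt's method under simple assumptions not taken from the book: the level starts at the first observation, the trend at the first difference, and alpha, beta are fixed rather than tuned with Solver (Fig11-22.xls).

    def holt_forecast(series, alpha, beta, n):
        level, trend = series[0], series[1] - series[0]
        for y in series[1:]:
            prev_level = level
            level = alpha * y + (1 - alpha) * (prev_level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
        return level + n * trend                  # forecast = E_t + nT_t

    sales = [684, 584, 765, 892, 885, 677, 1006, 1122]   # hypothetical quarterly sales
    print(round(holt_forecast(sales, alpha=0.5, beta=0.5, n=1), 1))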
Implementing the Model

See file Fig11-22.xls


Forecasting With Holt’s Model
Forecasts for time periods 21 to 24 at time period 20:
Ŷ_{20+n} = E_20 + nT_20

Ŷ_21 = E_20 + 1·T_20 = 2336.8 + 1 × 152.1 = 2488.9
Ŷ_22 = E_20 + 2·T_20 = 2336.8 + 2 × 152.1 = 2641.0
Ŷ_23 = E_20 + 3·T_20 = 2336.8 + 3 × 152.1 = 2793.1
Ŷ_24 = E_20 + 4·T_20 = 2336.8 + 4 × 152.1 = 2945.2
Holt-Winter’s Method For
Additive Seasonal Effects
Ŷ_{t+n} = E_t + nT_t + S_{t+n−p}
where
E_t = α(Y_t − S_{t−p}) + (1 − α)(E_{t−1} + T_{t−1})
T_t = β(E_t − E_{t−1}) + (1 − β)T_{t−1}
S_t = γ(Y_t − E_t) + (1 − γ)S_{t−p}
0 ≤ α ≤ 1, 0 ≤ β ≤ 1, 0 ≤ γ ≤ 1
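 A minimal Python sketch of Holt-Winter's additive method; the initialization (level = mean of the first season, zero starting trend, deviations as seasonal factors) is a simplifying assumption, and in the book alpha, beta, gamma are chosen with Solver (Fig11-25.xls).

    def holt_winters_additive(series, p, alpha, beta, gamma):
        level = sum(series[:p]) / p
        trend = 0.0
        seasons = [y - level for y in series[:p]]
        for t, y in enumerate(series[p:], start=p):
            prev_level = level
            level = alpha * (y - seasons[t - p]) + (1 - alpha) * (prev_level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
            seasons.append(gamma * (y - level) + (1 - gamma) * seasons[t - p])
        # forecast n periods ahead: E_t + nT_t + S_{t+n-p}
        return lambda n: level + n * trend + seasons[len(series) + n - p - 1]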
Implementing the Model

See file Fig11-25.xls


Forecasting With Holt-Winter’s
Additive Seasonal Effects Method
Forecasts for time periods 21 to 24 at time period 20:

Ŷ20 n  E 20  nT20  S20 n  4

Ŷ21  E 20  1 T20  S17  2253.3  1 154.3  262.66  2670.3


Ŷ22  E 20  2  T20  S18  2253.3  2  154.3  312.59  2249.3
Ŷ23  E 20  3  T20  S19  2253.3  3  154.3  205.40  2921.6

Ŷ24  E 20  4  T20  S 20  2253.3  4  154.3  386.12  3256.6


Holt-Winter’s Method For
Multiplicative Seasonal Effects

Ŷ_{t+n} = (E_t + nT_t)·S_{t+n−p}
where
E_t = α(Y_t / S_{t−p}) + (1 − α)(E_{t−1} + T_{t−1})
T_t = β(E_t − E_{t−1}) + (1 − β)T_{t−1}
S_t = γ(Y_t / E_t) + (1 − γ)S_{t−p}
0 ≤ α ≤ 1, 0 ≤ β ≤ 1, 0 ≤ γ ≤ 1
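 As with the stationary models, only the way the seasonal factors enter changes in the multiplicative variant; a sketch under the same simplifying assumptions as above:

    def holt_winters_multiplicative(series, p, alpha, beta, gamma):
        level = sum(series[:p]) / p
        trend = 0.0
        seasons = [y / level for y in series[:p]]
        for t, y in enumerate(series[p:], start=p):
            prev_level = level
            level = alpha * (y / seasons[t - p]) + (1 - alpha) * (prev_level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
            seasons.append(gamma * (y / level) + (1 - gamma) * seasons[t - p])
        # forecast n periods ahead: (E_t + nT_t) * S_{t+n-p}
        return lambda n: (level + n * trend) * seasons[len(series) + n - p - 1]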
Implementing the Model

See file Fig11-28.xls


Forecasting With Holt-Winter’s
Multiplicative Seasonal Effects Method
Forecasts for time periods 21 to 24 at time period 20:

Ŷ20 n   E 20  nT20  S20 n  4


Ŷ21  (E 20  1T20 ) S17  (2217.6  1 137.3)1.152  2713.7

Ŷ22  (E 20  2T20 ) S18  (2217.6  2  137.3)0.849  2114.9

Ŷ23  (E 20  3T20 ) S19  (2217.6  3 137.3)1.103  2900.5

Ŷ24  (E 20  4T20 ) S 20  (2217.6  4  137.3)1.190  3293.9


The Linear Trend Model

Y t  b0  b1X1t
where X1t  t

For example:
X11  1, X12  2, X13  3, 
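 Outside of Excel, the same model can be fit by ordinary least squares; the sketch below uses numpy.polyfit with the time index as the X variable, on hypothetical data rather than the WaterCraft series (which the book fits in Fig11-31.xls).

    import numpy as np

    sales = np.array([684, 584, 765, 892, 885, 677, 1006, 1122], float)  # hypothetical
    t = np.arange(1, len(sales) + 1)          # X_1t = t = 1, 2, 3, ...
    b1, b0 = np.polyfit(t, sales, deg=1)      # slope b1 and intercept b0
    next_t = len(sales) + 1
    print(round(b0 + b1 * next_t, 1))         # trend forecast for the next period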
Implementing the Model

See file Fig11-31.xls


Forecasting With
The Linear Trend Model
Forecasts for time periods 21 to 24 at time period 20:

Ŷ_21 = b_0 + b_1·X_1,21 = 375.1 + 92.6255 × 21 = 2320.3

Ŷ_22 = b_0 + b_1·X_1,22 = 375.1 + 92.6255 × 22 = 2412.9

Ŷ_23 = b_0 + b_1·X_1,23 = 375.1 + 92.6255 × 23 = 2505.6

Ŷ_24 = b_0 + b_1·X_1,24 = 375.1 + 92.6255 × 24 = 2598.2
The TREND() Function
TREND(Y-range, X-range, X-value for prediction)
where:
Y-range is the spreadsheet range containing the dependent
Y variable,
X-range is the spreadsheet range containing the
independent X variable(s),
X-value for prediction is a cell (or cells) containing the
values for the independent X variable(s) for which we want
an estimated value of Y.

 Note: The TREND() function is dynamically updated whenever any inputs to the function change. However, it does not provide the statistical information provided by the regression tool. It is best to use these two approaches to regression in conjunction with one another.
The Quadratic Trend Model

Ŷ_t = b_0 + b_1·X_1t + b_2·X_2t

where X_1t = t and X_2t = t²
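 The quadratic model fits the same way with a second-degree polynomial; numpy.polyfit returns the coefficients highest power first (again on hypothetical data, not the book's).

    import numpy as np

    sales = np.array([684, 584, 765, 892, 885, 677, 1006, 1122], float)  # hypothetical
    t = np.arange(1, len(sales) + 1)
    b2, b1, b0 = np.polyfit(t, sales, deg=2)  # coefficients of t^2, t, and the constant
    next_t = len(sales) + 1
    print(round(b0 + b1 * next_t + b2 * next_t ** 2, 1))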


Implementing the Model

See file Fig11-34.xls


Forecasting With
The Quadratic Trend Model
Forecasts for time periods 21 to 24 at time period 20:

Ŷ_21 = b_0 + b_1·X_1,21 + b_2·X_2,21 = 653.67 + 16.671 × 21 + 3.617 × 21² = 2598.9

Ŷ_22 = b_0 + b_1·X_1,22 + b_2·X_2,22 = 653.67 + 16.671 × 22 + 3.617 × 22² = 2771.1

Ŷ_23 = b_0 + b_1·X_1,23 + b_2·X_2,23 = 653.67 + 16.671 × 23 + 3.617 × 23² = 2950.4

Ŷ_24 = b_0 + b_1·X_1,24 + b_2·X_2,24 = 653.67 + 16.671 × 24 + 3.617 × 24² = 3137.1


Computing Multiplicative
Seasonal Indices
 We can compute multiplicative seasonal
adjustment indices for period p as follows:
S_p = ( Σ_i Y_i / Ŷ_i ) / n_p, for all i occurring in season p

where n_p is the number of observations occurring in season p

 The final forecast for period i is then

Ŷ_i adjusted = Ŷ_i × S_p, for any i occurring in season p
Implementing the Model

See file Fig11-37.xls


Forecasting With Seasonal Factors
Applied To The Quadratic Trend Model
Forecasts for time periods 21 to 24 at time period 20:

Ŷ_21 = (b_0 + b_1·X_1,21 + b_2·X_2,21) × S_1 = 2598.9 × 105.7% = 2747.8

Ŷ_22 = (b_0 + b_1·X_1,22 + b_2·X_2,22) × S_2 = 2771.1 × 80.1% = 2219.6

Ŷ_23 = (b_0 + b_1·X_1,23 + b_2·X_2,23) × S_3 = 2950.5 × 103.1% = 3041.4

Ŷ_24 = (b_0 + b_1·X_1,24 + b_2·X_2,24) × S_4 = 3137.2 × 111.1% = 3486.1


Summary of the Calculation and Use
of Seasonal Indices
1. Create a trend model and calculate the estimated value (Ŷ_t) for each observation in the sample.
2. For each observation, calculate the ratio of the actual value to the predicted trend value: Y_t / Ŷ_t. (For additive effects, compute the difference: Y_t − Ŷ_t.)
3. For each season, compute the average of the ratios calculated in step 2. These are the seasonal indices.
4. Multiply any forecast produced by the trend model by the appropriate seasonal index calculated in step 3. (For additive seasonal effects, add the appropriate factor to the forecast.)
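 A minimal Python sketch of these four steps, using a linear trend in step 1 and multiplicative (ratio-based) indices; the quarterly data are hypothetical.

    import numpy as np

    sales = np.array([684, 584, 765, 892, 885, 677, 1006, 1122], float)  # hypothetical
    p = 4                                          # four seasons (quarters)
    t = np.arange(1, len(sales) + 1)

    b1, b0 = np.polyfit(t, sales, deg=1)           # step 1: trend model
    trend = b0 + b1 * t
    ratios = sales / trend                         # step 2: actual / predicted

    indices = [ratios[q::p].mean() for q in range(p)]   # step 3: average by season

    next_t = len(sales) + 1                        # step 4: adjust a trend forecast
    season = (next_t - 1) % p                      # 0-based season of the next period
    print(round((b0 + b1 * next_t) * indices[season], 1))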
Refining the Seasonal Indices
 Note that Solver can be used to simultaneously
determine the optimal values of the seasonal
indices and the parameters of the trend model
being used.
 There is no guarantee that this will produce a
better forecast, but it should produce a model
that fits the data better in terms of the MSE.

See file Fig11-39.xls


Seasonal Regression Models
 Indicator variables may also be used in regression
models to represent seasonal effects.
 If there are p seasons, we need p -1 indicator variables.

 Our example problem involves quarterly data, so p=4


and we define the following 3 indicator variables:
X_3t = 1 if Y_t is an observation from quarter 1, 0 otherwise
X_4t = 1 if Y_t is an observation from quarter 2, 0 otherwise
X_5t = 1 if Y_t is an observation from quarter 3, 0 otherwise
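 A minimal Python sketch of the resulting regression (quadratic trend plus the three quarterly indicators), fit with numpy's least-squares routine on hypothetical data; the book estimates it with Excel's regression tool in Fig11-42.xls.

    import numpy as np

    sales = np.array([684, 584, 765, 892, 885, 677, 1006, 1122], float)  # hypothetical
    t = np.arange(1, len(sales) + 1)
    q = (t - 1) % 4                           # 0, 1, 2, 3 = quarters 1 through 4
    X = np.column_stack([
        np.ones_like(sales),                  # intercept b0
        t, t ** 2,                            # X_1t = t and X_2t = t^2
        (q == 0).astype(float),               # X_3t: quarter 1 indicator
        (q == 1).astype(float),               # X_4t: quarter 2 indicator
        (q == 2).astype(float),               # X_5t: quarter 3 indicator
    ])
    b, *_ = np.linalg.lstsq(X, sales, rcond=None)
    x_next = np.array([1, 9, 81, 1, 0, 0], float)   # period 9 falls in quarter 1
    print(round(float(x_next @ b), 1))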
Implementing the Model
 The regression function is:
Ŷ_t = b_0 + b_1·X_1t + b_2·X_2t + b_3·X_3t + b_4·X_4t + b_5·X_5t

where X_1t = t and X_2t = t²

See file Fig11-42.xls


Forecasting With The
Seasonal Regression Model
Forecasts for time periods 21 to 24 at time period 20:

Ŷ_21 = 824.471 + 17.319(21) + 3.485(21)² − 86.805(1) − 424.736(0) − 123.453(0) = 2638.5

Ŷ_22 = 824.471 + 17.319(22) + 3.485(22)² − 86.805(0) − 424.736(1) − 123.453(0) = 2467.7

Ŷ_23 = 824.471 + 17.319(23) + 3.485(23)² − 86.805(0) − 424.736(0) − 123.453(1) = 2943.2

Ŷ_24 = 824.471 + 17.319(24) + 3.485(24)² − 86.805(0) − 424.736(0) − 123.453(0) = 3247.8


Crystal Ball (CB) Predictor

 CB Predictor is an add-in that simplifies


the process of performing time series
analysis in Excel.
 A trial version of CB Predictor is available
on the CD-ROM accompanying this book.
 For more information on CB Predictor see:
http://www.decisioneering.com

See file Fig11-46.xls


Combining Forecasts
 It is also possible to combine forecasts to create a composite forecast.
 Suppose we used three different forecasting methods on a given data
set.

 Denote the predicted value of time period t using


each method as follows:
F_1t, F_2t, and F_3t

 We could create a composite forecast as follows:


Ŷ_t = b_0 + b_1·F_1t + b_2·F_2t + b_3·F_3t
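 A minimal Python sketch of such a composite: regress the actual values on the three methods' forecasts and use the fitted coefficients as weights (all numbers below are made up for illustration).

    import numpy as np

    actual = np.array([35, 36, 34, 37, 36, 35], float)        # hypothetical actuals
    f1 = np.array([34.5, 35.8, 34.6, 36.2, 36.1, 35.3])       # method 1 forecasts
    f2 = np.array([35.2, 36.4, 33.9, 36.8, 35.7, 34.8])       # method 2 forecasts
    f3 = np.array([34.9, 35.5, 34.2, 37.1, 36.3, 35.0])       # method 3 forecasts

    X = np.column_stack([np.ones_like(f1), f1, f2, f3])
    b, *_ = np.linalg.lstsq(X, actual, rcond=None)            # b0, b1, b2, b3
    print(np.round(X @ b, 2))                                  # composite forecasts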
End of Chapter 11
