
IE3120 Manufacturing Logistics
The Forecasting Function

Production Planning Framework
Data time series patterns

You should know that…

• Forecasting is the starting point of all planning.
• Forecasts are always wrong (when there are uncertainties).
• Aggregate forecasts are more accurate than detailed forecasts.
• The further into the future, the less reliable the forecast will be.
Quantitative forecasting models

• Causal models
  – Predict a future variable as a function of other variables
  – Y = f(x1, x2, …, xn)

• Time series models
  – Predict a future variable as a function of past values of that variable
  – yt = f(yt−1, yt−2, …)
Time Series Components

• Trend – growth or decline
• Seasonality – repeats at a fixed interval
• Cycles – similar to seasonality, but length and magnitude may vary
• Randomness – no recognizable pattern
Notations

• Dt is the observed value of demand during period t.
• Ft is the forecast made in period t−1 for period t.
• Fs,t is the forecast made in period s for period t.
• et is the forecast error at period t, where et = Ft − Dt.
Evaluating Forecasts

• Mean absolute deviation (MAD)

  MAD = (1/n) Σ(i=1..n) |ei|

• Mean squared error (MSE)

  MSE = (1/n) Σ(i=1..n) ei²

• Mean absolute percentage error (MAPE)

  MAPE = [(1/n) Σ(i=1..n) |ei / Di|] × 100%
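The three metrics above can be sketched in a few lines; this is a minimal illustration, assuming the errors ei = Fi − Di have already been computed from forecasts and observed demands.

```python
def mad(errors):
    """Mean absolute deviation: average of |e_i|."""
    return sum(abs(e) for e in errors) / len(errors)

def mse(errors):
    """Mean squared error: average of e_i squared."""
    return sum(e * e for e in errors) / len(errors)

def mape(errors, demands):
    """Mean absolute percentage error: average of |e_i / D_i|, in percent."""
    return 100.0 * sum(abs(e / d) for e, d in zip(errors, demands)) / len(errors)

# Illustrative numbers (not from the slides):
forecasts = [210, 205, 230, 200]
demands = [200, 250, 175, 186]
errors = [f - d for f, d in zip(forecasts, demands)]
print(mad(errors), mse(errors), mape(errors, demands))
```

Note that MAPE penalizes errors relative to the size of demand, so it is the only one of the three that is scale-free.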
Evaluating Forecasts: Example

• Which is better?
Forecasting Stationary Series

• A stationary time series has the form

  Dt = μ + εt

  where μ is a constant and εt is a random variable with mean 0 and variance σ².

• Two common methods used:
  – Moving average
  – Exponential smoothing
Moving Average Forecast

• Use the data from the last N periods to predict future demand:

  Ft = (1/N) Σ(i=t−N..t−1) Di

• The one-step-ahead forecast is the same as the multi-step-ahead forecast, since demand is stationary.
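A minimal sketch of the formula above: the forecast for any future period is simply the mean of the most recent N observations.

```python
def moving_average_forecast(demands, N):
    """N-period moving average: F_t = (1/N) * sum of the last N demands."""
    if len(demands) < N:
        raise ValueError("need at least N observations")
    return sum(demands[-N:]) / N

# Illustrative data: forecast from the last 3 of 6 observations
demands = [200, 250, 175, 186, 225, 285]
print(moving_average_forecast(demands, 3))  # mean of 186, 225, 285 -> 232.0
```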
Some Comments on Moving Average

• Advantages
  – Easily understood
  – Easily computed
  – Provides stable forecasts

• Disadvantages
  – Requires saving the last N data points
  – Always lags behind if there is actually a trend
Moving Average Forecast Lagging a Trend
Exponential Smoothing

Use the weighted average of the last period's demand and the last forecast to predict future demand:

  Ft = αDt−1 + (1 − α)Ft−1

where 0 < α < 1, and α is generally kept small (around 0.1 to 0.2) for stability of forecasts.
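The smoothing recursion can be sketched as follows; this assumes an initial forecast f0 must be supplied to start the recursion.

```python
def exponential_smoothing(demands, alpha, f0):
    """Return one-step-ahead forecasts F_1..F_n for demands D_1..D_n,
    using F_t = alpha * D_{t-1} + (1 - alpha) * F_{t-1}, with F_1 = f0."""
    forecasts = [f0]
    for d in demands[:-1]:
        forecasts.append(alpha * d + (1 - alpha) * forecasts[-1])
    return forecasts

# Illustrative run with alpha = 0.1 and initial forecast 200
fs = exponential_smoothing([200, 250, 175, 186], 0.1, 200)
print(fs)
```

Unrolling the recursion shows why a small α gives stability: each forecast is a geometrically decaying weighted average of all past demands, so old observations are never fully forgotten.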
Weights in Exponential Smoothing
Comparison of Exponential Smoothing and Moving Average

Similarities
• Demand process is assumed stationary
• Only need one parameter (N or α)
• Both will lag behind a trend if it exists

Differences
• Exponential smoothing is a weighted average over all past data points; the moving average uses only the latest N data points
• Memory use – which one uses less memory?
Trend-Based Forecasting Methods

• Double exponential smoothing using Holt's method
  – Assumes a linear growth trend in the data
  – Separate smoothing constants for the 'slope' and 'intercept' components
Holt's Double Exponential Smoothing: Updating Equations

  St = αDt + (1 − α)(St−1 + Gt−1)
  Gt = β(St − St−1) + (1 − β)Gt−1

where St is the intercept (level) estimate, Gt is the slope (trend) estimate, and the τ-step-ahead forecast made in period t is Ft,t+τ = St + τGt.
Holt's Method Example

Month   Demand
1       200
2       250
3       175
4       186
5       225
6       285
7       305
8       190

How do we compute the slope and intercept? We need initial values S0 and G0. Assume S0 = 200, G0 = 10, and α = 0.1, β = 0.1, then compute St, Gt, and the one-step-ahead forecast Ft for each month.
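The table can be filled in by iterating the two updating equations; this is a sketch using the slide's data and initial values S0 = 200, G0 = 10, α = β = 0.1.

```python
def holt(demands, alpha, beta, s0, g0):
    """Holt's double exponential smoothing.
    Returns lists of intercepts S_t, slopes G_t, and one-step forecasts F_t."""
    S, G, F = [s0], [g0], []
    for d in demands:
        F.append(S[-1] + G[-1])  # forecast made last period: S_{t-1} + G_{t-1}
        s_new = alpha * d + (1 - alpha) * (S[-1] + G[-1])
        g_new = beta * (s_new - S[-1]) + (1 - beta) * G[-1]
        S.append(s_new)
        G.append(g_new)
    return S[1:], G[1:], F

demand = [200, 250, 175, 186, 225, 285, 305, 190]
S, G, F = holt(demand, 0.1, 0.1, 200, 10)
# First step by hand: S1 = 0.1*200 + 0.9*(200+10) = 209.0,
#                     G1 = 0.1*(209-200) + 0.9*10  = 9.9,  F1 = 200+10 = 210
```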
Time Series with Seasonal Patterns

[Figure: demand plotted against month, showing a repeating seasonal pattern]
Forecasting for Seasonal Series

• Seasonality corresponds to a pattern in the data that repeats at regular intervals.
• Multiplicative seasonal factors: c1, c2, …, cN, where i = 1 is the first period of the season, i = 2 the second period, etc.
  – ci = 1.25 implies 25% higher than the baseline on average.
  – ci = 0.75 implies 25% lower than the baseline on average.
  – Σ ci = N.
Estimating Seasonal Factors

• Compute the sample mean of the entire data set (should be at least several seasons of data).
• Divide each observation by the sample mean. (This gives a factor for each observation.)
• Average the factors for similar periods in a season.
De-seasonalizing the Data Series

• To remove seasonality from a series, divide each observation in the series by the appropriate seasonal factor.
• The resulting series then has no seasonal effect, and we can apply any other appropriate forecasting method to it.
• Once a forecast is made, it is multiplied by the appropriate seasonal factor to produce a forecast for the original series.
An Example: Data with Seasonal Patterns

            Week 1   Week 2   Week 3   Week 4
Monday       16.2     17.3     14.6     16.1
Tuesday      12.2     11.5     13.1     11.8
Wednesday    14.2     15.0     13.0     12.9
Thursday     17.3     17.6     16.9     16.6
Friday       22.5     23.5     21.9     24.3
De-seasonalized Data

            Week 1   Week 2   Week 3   Week 4
Monday       16.57    17.70    14.94    16.47
Tuesday      16.49    15.54    17.70    15.95
Wednesday    16.93    17.88    15.50    15.38
Thursday     16.61    16.90    16.23    15.94
Friday       16.03    16.74    15.60    17.31
Example Forecast

            Week 1   Week 2   Week 3   Week 4   Week 5
Monday       16.57    17.70    14.94    16.47
Tuesday      16.49    15.54    17.70    15.95
Wednesday    16.93    17.88    15.50    15.38
Thursday     16.61    16.90    16.23    15.94
Friday       16.03    16.74    15.60    17.31

Forecast the de-seasonalized series into Week 5, then multiply by the seasonal factors to obtain the Week 5 forecast for the original series.