Forecasting
Introduction
• Forecasts affect decisions and activities throughout an organization:

  Accounting              Cost/profit estimates
  Finance                 Cash flow and funding
  Human Resources         Hiring/recruiting/training
  Marketing               Pricing, promotion, strategy
  MIS                     IT/IS systems, services
  Operations              Schedules, workloads
  Product/service design  New products and services
Features of Forecasts
• Assume that the causal system of the past will continue into the future
• Forecasts are rarely perfect
• Forecast accuracy decreases as the time horizon increases
Elements of a Good Forecast
• Timely
• Accurate
• Reliable
• Written
• Cost-effective
Steps in the Forecasting Process
1. Identify the purpose of the forecast
2. Collect historical data
3. Plot the data and identify patterns
7. Is the accuracy of the forecast acceptable?
   Yes: 8a. Forecast over the planning horizon
   No: 8b. Select a new forecast model or adjust the parameters of the existing model
9. Adjust the forecast based on additional qualitative information and insight
10. Monitor results and measure forecast accuracy
Types of Forecasts
• 1. Judgmental models
• A judgmental model makes its forecast based on qualitative or subjective factors: experts' opinions, individual experience and judgment, and other subjective inputs. This type of model is useful when quantitative data are unavailable or when subjective factors are especially important. Common judgmental methods include:
• Executive opinions
• Sales-force opinions
• Consumer surveys
• Outside opinion
• Delphi method
  – Opinions of managers and staff
  – Achieves a consensus forecast
Forecasting Models
• Exponential smoothing
• Trend projections
• Multiple regression analysis
Scatter Diagram
[Figure: scatter diagrams of demand for stereos plotted against time]
Time series forecasting models
• A time series is based on evenly spaced (weekly, monthly, quarterly, and so on) data points.
• Forecasting time series data implies that future values are predicted from past values only, no matter how potentially valuable other influencing variables might be.
• A time series typically has four components: trend, seasonality, cycles, and random variation.
• Analyzing a time series means splitting past data into components and then projecting them forward. The components are as follows:
1. Trend (T) is the gradual upward or downward movement of the data over time.
2. Seasonality (S) is a pattern of demand fluctuation above or below the trend line that recurs every year.
3. Cycles (C) are patterns in the data that occur every several years. They are usually tied to the business cycle.
4. Random variations (R) are "blips" in the data caused by chance and unusual situations; they do not follow any noticeable pattern.
• The following figure shows a time series and its components.
Time series forecasting models
[Figure: actual demand curve over time, with trend line and seasonal peaks]
Forecast Variations
[Figure: forecast variations showing cycles, seasonal variations, and irregular variation]
Naive Forecasts
• A naive forecast uses the most recent actual value as the next period's forecast: "We sold 250 wheels last week, so next week we should sell 250 wheels."
Naïve Forecasts
• Simple to use
• Virtually no cost
• Quick and easy to prepare
• Data analysis is nonexistent
• Easily understandable
• Cannot provide high accuracy
• Can be a standard for accuracy
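The rule above is simple enough to sketch in a few lines; the function below (a minimal illustration, not from the slides) just returns the last observed value as the forecast for the next period.

```python
def naive_forecast(history):
    """Naive forecast: next period's forecast is the most recent actual value."""
    if not history:
        raise ValueError("need at least one observation")
    return history[-1]

# The slide's example: 250 wheels sold last week -> forecast 250 for next week
print(naive_forecast([240, 255, 250]))  # -> 250
```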
Moving Average
• This method is applied when market demand stays fairly steady over time.
• An n-period moving average is defined as

  Moving average = (sum of demands in the previous n periods) / n

• With each passing month, the earliest month's data are dropped and the most recent month's data are added. This tends to smooth out short-term irregularities in the data series.
• Example 1 The 3-month moving averages of shed sales at Wallace Garden Supply are shown in the following table:
Moving Average
• Three-month Moving Average of actual shed sales
Month Actual shed sales 3-month Moving Average
Jan 10
Feb 12
Mar 13
Apr 16 (10+12+13)/3 = 11.67
May 19 (12+13+16)/3 = 13.67
Jun 23 (13+16+19)/3 = 16
Jul 26 (16+19+23)/3 = 19.33
Aug 30 (19+23+26)/3 = 22.67
Sep 28 (23+26+30)/3 = 26.33
Oct 18 (26+30+28)/3 = 28
Nov 16 (30+28+18)/3 = 25.33
Dec 14 (28+18+16)/3 = 20.67
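The forecasts in the table above can be reproduced with a short script (a sketch using the shed-sales data from Example 1):

```python
def moving_average(demand, n=3):
    """Forecast for each period as the mean of the previous n demands."""
    return [sum(demand[t - n:t]) / n for t in range(n, len(demand) + 1)]

sales = [10, 12, 13, 16, 19, 23, 26, 30, 28, 18, 16]  # Jan-Nov actual shed sales
forecasts = moving_average(sales, n=3)                # forecasts for Apr-Dec
print([round(f, 2) for f in forecasts])
```

The first forecast, (10 + 12 + 13)/3 = 11.67, is for April, and the last, (28 + 18 + 16)/3 = 20.67, is for December, matching the table.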
Weighted moving average (WMA)
• When there is a trend or pattern, weights are used to place more emphasis on recent values. This makes the technique more responsive, since later periods are weighted more heavily.
• Usually the latest period receives the heaviest weight, so that the forecast reflects a large unusual change in the demand or sales pattern more quickly. Mathematically, an n-period weighted moving average (WMA) is defined as

  WMA = (sum of (weight for period i) x (demand in period i)) / (sum of weights)

• A 3-month weighted moving average can be used to forecast storage shed sales for Wallace Garden Supply.
• Example 2 Weighted moving average forecasts for the Wallace Garden Supply problem are shown in the table below:
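A sketch of the WMA computation for one period follows. The specific weights 1, 2, 3 (heaviest on the most recent month) are an assumed choice for illustration, since the slide's own weight table is not reproduced here.

```python
def weighted_moving_average(demand, weights):
    """n-period WMA: sum(weight_i * demand_i) / sum(weights).
    Weights are listed oldest-first and applied to the last len(weights) demands."""
    n = len(weights)
    window = demand[-n:]
    return sum(w * d for w, d in zip(weights, window)) / sum(weights)

# Wallace Garden Supply sales for Jan-Mar; forecast for Apr with assumed weights 1, 2, 3
print(weighted_moving_average([10, 12, 13], weights=[1, 2, 3]))  # (10 + 24 + 39)/6
```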
Exponential Smoothing
• New forecast = last period's forecast + alpha x (last period's actual demand - last period's forecast). For example, with a previous forecast of 162, an actual demand of 155, and alpha = 0.3, the new forecast would be

  F3 = 162 + 0.3(155 - 162) = 159.9 = 160 (rounded off)
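The update rule above can be written as a one-line function (a minimal sketch):

```python
def exp_smooth(last_forecast, last_actual, alpha):
    """New forecast = last forecast + alpha * (last actual - last forecast)."""
    return last_forecast + alpha * (last_actual - last_forecast)

# The slide's numbers: previous forecast 162, actual demand 155, alpha = 0.3
f3 = exp_smooth(162, 155, 0.3)
print(round(f3, 1), round(f3))  # 159.9, which rounds off to 160
```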
Forecast Error
Forecast error = actual demand - forecasted value
Mean absolute deviation (MAD) is a measure of the overall forecast error, defined by

  MAD = (sum of |forecast errors| over n periods) / n

• Mean squared error (MSE) is the average of the squared differences between the observed and the forecasted values.
• Mean absolute percentage error (MAPE) is the average of the absolute differences between the observed and the forecasted values, expressed as a percentage of the observed values.
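The three error measures can be sketched directly from their definitions (the demand and forecast series below are hypothetical, purely for illustration):

```python
def mad(actual, forecast):
    """Mean absolute deviation: mean of |actual - forecast|."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mse(actual, forecast):
    """Mean squared error: mean of (actual - forecast)^2."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    """Mean absolute percentage error, in percent of the actual values."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

actual = [100, 110, 120]   # hypothetical observed demand
forecast = [90, 115, 120]  # hypothetical forecasts
print(mad(actual, forecast), round(mse(actual, forecast), 2), round(mape(actual, forecast), 2))
```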
• The forecasted value depends on the smoothing constant alpha. This is shown by the example below for alpha = 0.2 and alpha = 0.5.
• Example 4 The port of Baltimore has unloaded large quantities of grain from ships during the past eight quarters. The operations manager of the port wants to test exponential smoothing to see how well it works in predicting tonnage unloaded. He assumes that the forecast of grain unloaded in the first quarter was 175.
Exponential Smoothing Model
Exponential smoothing forecasts with absolute deviations and MAD.
Here MAD with alpha = 0.2 is 82/8 = 10.25, and MAD with alpha = 0.5 is 12.5. Since MAD with alpha = 0.2 is smaller, we prefer alpha = 0.2.
Exponential Smoothing with Trend Adjustment
Forecast including trend (FITt) = new forecast (Ft) + trend correction (Tt)
The trend correction Tt is defined as

  Tt = Tt-1 + beta(Ft - Ft-1)

where Tt = smoothed trend for the current period t and beta = trend smoothing constant (0 <= beta <= 1).
The three steps below can be followed to calculate a trend-adjusted forecast for the demand data:

  Month    1   2   3   4   5   6   7   8   9
  Demand  12  17  20  19  24  26  31  32  36

The smoothing constants are assumed to be alpha = 0.2 and beta = 0.4, and 11 units is taken as the initial forecast for month 1.
Exponential Smoothing with Trend Adjustment
Step 0 Forecast for month 2: F2 = F1 + alpha(demand for month 1 - F1) = 11 + 0.2(12 - 11) = 11.0 + 0.2 = 11.2 units.
Step 1 Assume T1 = 0. Then T2 = T1 + beta(F2 - F1) = 0 + 0.4(11.2 - 11.0) = 0.08.
Step 2 FIT2 = F2 + T2 = 11.2 + 0.08 = 11.28.
• In the same way we do the calculations for the other months. The results are given in the following table.
Forecast by exponential smoothing with trend adjustment for an example problem.
In the figure above, series 1 represents the actual demand, series 2 the forecast including trend, and series 3 the forecast without trend.
The trend smoothing constant beta resembles the constant alpha in that a high value is more responsive to recent changes in trend, while a low value gives less weight to the most recent trends and smooths out the trend.
Values of beta can be found by trial and error, with MAD used as a measure of comparison.
Simple exponential smoothing is often referred to as first-order smoothing, and trend-adjusted smoothing is called second-order or double smoothing. Other exponential smoothing models, such as seasonally adjusted and triple smoothing, are also in use but are not included here.
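The three steps above can be sketched for the month data (alpha = 0.2, beta = 0.4, initial forecast 11, initial trend 0):

```python
def trend_adjusted_forecasts(demand, alpha, beta, f1, t1=0.0):
    """Return lists (F, T, FIT) for periods 1..len(demand), where
    F[t] = F[t-1] + alpha*(D[t-1] - F[t-1]),
    T[t] = T[t-1] + beta*(F[t] - F[t-1]),
    FIT[t] = F[t] + T[t] (0-based indexing inside the lists)."""
    F, T = [f1], [t1]
    for t in range(1, len(demand)):
        f_new = F[-1] + alpha * (demand[t - 1] - F[-1])
        T.append(T[-1] + beta * (f_new - F[-1]))
        F.append(f_new)
    FIT = [f + tr for f, tr in zip(F, T)]
    return F, T, FIT

demand = [12, 17, 20, 19, 24, 26, 31, 32, 36]
F, T, FIT = trend_adjusted_forecasts(demand, alpha=0.2, beta=0.4, f1=11.0)
print(round(F[1], 2), round(T[1], 2), round(FIT[1], 2))  # 11.2, 0.08, 11.28
```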
Trend Projections
• Often we try to predict the value of a dependent variable from the given value of an independent variable. For instance:

  Dependent variable      Independent variable
  Sales of product        Price of product
  Automobile sales        Interest rate
  Total production cost   No. of units produced

• The trend projection technique fits a trend line to a series of historical data points and then projects the line for medium- to long-range forecasts.
• Though several mathematical trend equations (e.g. quadratic and exponential) are available, we will restrict our discussion to a linear trend (simple linear regression only).
• To develop this linear trend, the least squares method (a statistical method) may be applied.
• This method fits a straight line that minimizes the sum of the squares of the vertical distances from the line to each of the actual observations.
• The figure below illustrates the least squares approach.
Trend Projections
[Figure: least squares method for finding the best-fitting straight line, showing the squared vertical distances from each data point to the line]
A least squares line is described in terms of its Y-intercept (the height at which it intercepts the Y-axis) and its slope (the tangent of the angle of the line). Computation of the Y-intercept and the slope leads to the equation of the line:

  yi = a + b*xi + ei

where ei is an error term representing the fact that, for a given value of xi, the value of yi may not exactly equal a + b*xi.
Trend Projections
Suppose we have data points (x1, y1), (x2, y2), ..., (xn, yn). We need to select the values of a and b in such a way that the sum of the squared distances

  ei = yi - (a + b*xi),  i = 1, 2, ..., n

is minimized. That is, we need to determine the values of a and b that minimize the function

  F(a, b) = sum over i of ei^2 = sum over i of (yi - a - b*xi)^2

Equating the partial derivatives of this function with respect to a and b to zero, we obtain their optimal values a* and b* as follows (the proof is shown in Appendix 1):

  b* = sum((xi - xbar)(yi - ybar)) / sum((xi - xbar)^2);  a* = ybar - b*xbar   (1.1)

where xbar = average of all the xi's and ybar = average of all the yi's.
Trend Projections
The first equation in (1.1) can be rewritten by expanding the sums, giving

  b* = (sum(xi*yi) - n*xbar*ybar) / (sum(xi^2) - n*xbar^2);  a* = ybar - b*xbar   (1.2)

Thus, using equation (1.2), we can find the values of a and b and hence the equation of the least squares regression line Y = a* + b*X.
Example 6 The demand data for electrical generators over the period 2000-2006 at the Midwestern Manufacturing Company are given in the following table:
Table 1.6 Demand data for electrical generators of Midwestern Manufacturing Company

  Year   Electrical generators sold
  2000   74
  2001   79
  2002   80
  2003   90
  2004   105
  2005   142
  2006   122

Find the equation of the least squares regression line and hence estimate the demands for 2007 and 2008.
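A sketch of the least squares computation for Example 6, using formula (1.2). Note that the slide rounds intermediate values (giving 10.53 and 56.74), while exact arithmetic gives 10.54 and 56.71; the 2007 forecast of 141 is the same either way.

```python
def least_squares(x, y):
    """Fit y = a + b*x using b = (Sxy - n*xbar*ybar) / (Sxx - n*xbar^2), a = ybar - b*xbar."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum(xi * yi for xi, yi in zip(x, y)) - n * xbar * ybar) / \
        (sum(xi * xi for xi in x) - n * xbar ** 2)
    a = ybar - b * xbar
    return a, b

years = [1, 2, 3, 4, 5, 6, 7]            # 2000 -> 1, ..., 2006 -> 7
sold = [74, 79, 80, 90, 105, 142, 122]
a, b = least_squares(years, sold)
print(round(a, 2), round(b, 2))          # intercept and slope
print(round(a + b * 8))                  # demand estimate for 2007 -> 141
```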
Trend Projections
• Solution. Designating 2000 as year 1, 2001 as year 2, and so on, the trend calculations are shown in the following table:
Table 7 Trend calculations of demand data
Now

  xbar = sum(xi)/7 = 28/7 = 4;  ybar = sum(yi)/7 = 692/7 = 98.86

  b* = (sum(xi*yi) - n*xbar*ybar) / (sum(xi^2) - n*xbar^2) = (3063 - 7(4)(98.86)) / (140 - 7(4^2)) = 294.92/28 = 10.53 (approx.)

  a* = ybar - b*xbar = 98.86 - 10.53(4) = 56.74
Trend Projections
• Hence the equation of the least squares regression line is Y = 56.74 + 10.53X, which is shown together with the actual demand line in the following figure:
[Figure: actual demand (series 1) and fitted trend line (series 2) over periods 1-7]
• The years 2007 and 2008 are denoted by periods 8 and 9 respectively, so the sales forecasts for these years are
• Sales forecast for 2007 = 56.74 + 10.53(8) = 140.98 = 141 (approx.)
• Sales forecast for 2008 = 56.74 + 10.53(9) = 151.51 = 152 (approx.)
Seasonal Variations
• Time series forecasting such as that described above depends on the trend of data over a series of time observations. Sometimes, recurring variations at certain seasons make a seasonal adjustment in the trend-line forecast necessary. For example, demand for coal and fuel oil usually peaks during cold winter months. The seasonal index, defined by

  Seasonal index = (average two-year demand for the month) / (average monthly demand)

where

  average monthly demand = (average two-year demand) / 12,

is used to forecast demands for different periods.
• Example 7 Monthly sales of one brand of telephone answering machine of a company for the two most recent years are given in the following table:
Seasonal Variations
Table 8 Answering machine sales and seasonal indices
Seasonal Variations
Consider the last two years' demands as follows:

  Year (X)   Demand (Y)
  1          1055
  2          1201

Using the seasonal indices from the above table and an expected third-year annual demand of 1347 units, we can forecast the monthly demand (rounded values) for that year.
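The monthly computation can be sketched as follows. The January index of 0.957 used here is a hypothetical stand-in, since the seasonal indices of Table 8 are not reproduced in the text.

```python
def monthly_forecast(annual_demand, seasonal_index):
    """Monthly forecast = (annual demand / 12) * seasonal index for that month."""
    return annual_demand / 12 * seasonal_index

# Expected third-year annual demand of 1347 units; hypothetical January index 0.957
print(round(monthly_forecast(1347, 0.957)))
```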
Regression Analysis (Example 8: Triple A Construction)
Here

  xbar = sum(xi)/n = 18/6 = 3;  ybar = sum(yi)/n = 15/6 = 2.5

  b = (sum(xi*yi) - n*xbar*ybar) / (sum(xi^2) - n*xbar^2) = (51.5 - 6(3)(2.5)) / (80 - 6(3^2)) = 0.25

  a = ybar - b*xbar = 2.5 - 3(0.25) = 1.75

Figure 7 Regression line of a numerical example problem
If the local chamber of commerce predicts that the Albany-area payroll will be 6 hundred million dollars next year, an estimate of sales for Triple A is found with the above regression equation as follows:

  Sales = 1.75 + 0.25(6) = 3.25, i.e. $325,000
Standard Error of the Estimate

  S(y,x) = sqrt( sum((yi - Yc)^2) / (n - 2) )

where yi = the y-value of data point i, Yc = a + b*xi (the value of the dependent variable computed from the regression equation for the corresponding xi), and n = the number of data points.
The standard error for the previous example is given by

  S(y,x) = sqrt( (0 + 0.25 + 0.0625 + 0.0625 + 0 + 0) / (6 - 2) ) = sqrt(0.375/4) = 0.306

Thus the standard error of the estimate is $30,600 in sales.
An equivalent computational form is

  S(y,x) = sqrt( (sum(y^2) - a*sum(y) - b*sum(xy)) / (n - 2) )

The proof is given in Appendix 2.
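A sketch of the computation follows. The (x, y) pairs below are an illustrative reconstruction chosen to be consistent with the sums quoted in the text (sum x = 18, sum y = 15, sum xy = 51.5, sum x^2 = 80); they reproduce the squared errors 0, 0.25, 0.0625, 0.0625, 0, 0 shown above.

```python
import math

def std_error_of_estimate(x, y, a, b):
    """S(y,x) = sqrt( sum((y_i - (a + b*x_i))^2) / (n - 2) )."""
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return math.sqrt(sse / (len(x) - 2))

x = [1, 3, 4, 2, 1, 7]                # illustrative payroll values
y = [2.0, 3.0, 2.5, 2.0, 2.0, 3.5]    # illustrative sales values
s = std_error_of_estimate(x, y, a=1.75, b=0.25)
print(round(s, 3))  # -> 0.306, i.e. $30,600 when sales are in hundreds of thousands
```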
Correlation coefficient for regression lines
[Figure 8(a): perfect positive correlation (r = 1); Figure 8(b): positive correlation (0 < r < 1); Figure 8(c): perfect negative correlation (r = -1); Figure 8(d): no correlation (r = 0)]
Correlation coefficient for regression lines

  r = (n*sum(xy) - sum(x)*sum(y)) / sqrt( (n*sum(x^2) - (sum x)^2) * (n*sum(y^2) - (sum y)^2) )

For the Triple A Construction Company (Example 8), the value of r is obtained by substituting the sums computed earlier into this formula.
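The formula can be sketched directly (using the same illustrative Triple A-style data reconstructed from the quoted sums):

```python
import math

def correlation(x, y):
    """r = (n*Sxy - Sx*Sy) / sqrt((n*Sxx - Sx^2) * (n*Syy - Sy^2))."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(a * b for a, b in zip(x, y))
    sxx = sum(a * a for a in x)
    syy = sum(b * b for b in y)
    return (n * sxy - sx * sy) / math.sqrt((n * sxx - sx ** 2) * (n * syy - sy ** 2))

x = [1, 3, 4, 2, 1, 7]                # illustrative payroll values
y = [2.0, 3.0, 2.5, 2.0, 2.0, 3.5]    # illustrative sales values
print(round(correlation(x, y), 3))    # a strong positive correlation
```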
Multiple Regression Analysis
Multiple regression models are an extension of the simple linear regression model. They allow building a model with several independent variables. The model for two independent variables is

  Y = a + b1*X1 + b2*X2

where Y = the dependent variable, a = the Y-intercept, Xi (i = 1, 2) = independent variable i, and bi = the slope for independent variable Xi (i = 1, 2).
The mathematics of multiple regression is fairly complex, so the formulae for a, b1 and b2 are not included here.
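Although the slide omits the formulae, the coefficients can be obtained by solving the least squares normal equations. The sketch below does this for two independent variables with hypothetical data generated from Y = 1 + 2*X1 + 3*X2, so the fit should recover those coefficients.

```python
def solve3(A, v):
    """Solve a 3x3 linear system A*x = v by Gaussian elimination with partial pivoting."""
    M = [row[:] + [rhs] for row, rhs in zip(A, v)]  # augmented matrix
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            M[r] = [m - f * p for m, p in zip(M[r], M[col])]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def multiple_regression(x1, x2, y):
    """Fit Y = a + b1*X1 + b2*X2 by solving the least squares normal equations."""
    n = len(y)
    A = [[n, sum(x1), sum(x2)],
         [sum(x1), sum(u * u for u in x1), sum(u * v for u, v in zip(x1, x2))],
         [sum(x2), sum(u * v for u, v in zip(x1, x2)), sum(v * v for v in x2)]]
    rhs = [sum(y), sum(u * w for u, w in zip(x1, y)), sum(v * w for v, w in zip(x2, y))]
    return solve3(A, rhs)  # returns [a, b1, b2]

x1 = [1, 2, 3, 4, 5, 6]                              # hypothetical X1
x2 = [2, 1, 4, 3, 6, 5]                              # hypothetical X2
y = [1 + 2 * u + 3 * v for u, v in zip(x1, x2)]      # noise-free Y
a, b1, b2 = multiple_regression(x1, x2, y)
print(round(a, 6), round(b1, 6), round(b2, 6))       # approximately 1, 2, 3
```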
Monitoring and Controlling Forecasts
• A tracking signal is the running sum of forecast errors (RSFE) divided by MAD. Positive tracking signals indicate that demand is greater than the forecast, while negative tracking signals mean that demand is less than the forecast.
• A good tracking signal is one with a low RSFE and about as much positive as negative error. Small deviations are acceptable, but the positive and negative ones should balance one another so that the tracking signal centers closely around zero.
• A plot of the tracking signal is shown in the following figure.
Monitoring and Controlling Forecasts
• Once tracking signals are calculated, they are compared to predetermined control limits, i.e. upper and lower tracking limits. There is no single answer for setting the tracking limits.
• George Plossl and Oliver Wight, two inventory-control experts, suggest using maximums of 4 MADs (for high-volume stock items) and 8 MADs (for lower-volume items). Other forecasters suggest slightly lower ranges.
• One MAD is equivalent to about 0.8 standard deviations, so 2 MADs = 1.6 standard deviations, 3 MADs = 2.4 standard deviations, and 4 MADs = 3.2 standard deviations. From the normal distribution table, the cumulative probabilities at 1.6, 2.4 and 3.2 standard deviations (that is, at 2, 3 and 4 MADs) are 94.52%, 99.18% and 99.93% respectively.
• Example 9 A bakery's quarterly sales of a product (in thousands), as well as forecast demand and error computations, are shown in the following table. The objective is to compute the tracking signal and determine whether the forecasts are performing adequately.
Monitoring and Controlling Forecasts
Table 10 Data to calculate Tracking signal for a product of a bakery
Quar Forecast Actual Error RSFE Absolute Cum. MAD Tracking
ter Demand Demand forecast error error Signal
1 100 90 -10 -10 10 10 10.0 -1
2 100 95 -5 -15 5 15 7.5 -2
3 100 115 +15 0 15 30 10.0 0
4 110 100 -10 -10 10 40 10.0 -1
5 110 125 +15 +5 15 55 11.0 +0.5
6 110 140 +30 +35 30 85 14.2 +2.5
Tracking signal = RSFE / MAD = 35 / 14.2 = 2.5
This tracking signal is within acceptable limits. It can be seen that it drifted from -2.0 to +2.5.
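The table's computations can be reproduced as follows (a sketch using Example 9's forecast and actual demand columns):

```python
def tracking_signals(forecast, actual):
    """Return the tracking signal (RSFE / MAD) for each period."""
    rsfe, abs_cum, signals = 0.0, 0.0, []
    for t, (f, a) in enumerate(zip(forecast, actual), start=1):
        err = a - f               # forecast error = actual - forecast
        rsfe += err               # running sum of forecast errors
        abs_cum += abs(err)       # cumulative absolute error
        mad = abs_cum / t
        signals.append(rsfe / mad)
    return signals

forecast = [100, 100, 100, 110, 110, 110]
actual = [90, 95, 115, 100, 125, 140]
print([round(s, 1) for s in tracking_signals(forecast, actual)])
# matches the table: -1.0, -2.0, 0.0, -1.0, 0.5, 2.5
```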
• Adaptive Smoothing
• Adaptive smoothing refers to computer monitoring of tracking signals and self-adjustment if a signal exceeds its limit.
• In exponential smoothing, the coefficients are first selected based on values that minimize forecast error, and are then adjusted whenever the computer notes an errant tracking signal.
The Delphi Technique