
JOURNAL OF MULTI-CRITERIA DECISION ANALYSIS

J. Multi-Crit. Decis. Anal. 8: 162–180 (1999)

Response Surface Methodology as a Sensitivity Analysis Tool in Decision Analysis
KENNETH W. BAUER Jr.(a), GREGORY S. PARNELL(b,c,*) and DAVID A. MEYERS(a)

(a) Department of Operational Sciences, Graduate School of Engineering, Air Force Institute of Technology, 2950 P Street, Wright-Patterson AFB, OH 45433-7765, USA
(b) Department of Systems Engineering, United States Military Academy, West Point, NY 10996-1779, USA
(c) Toffler Associates, 302 Harbor's Point, 40 Beach Street, Manchester, MA 01944, USA

* Correspondence to: Department of Systems Engineering, United States Military Academy, West Point, NY 10996-1779, USA. e-mail: FG7526@exmail.usma.edu

ABSTRACT
This paper proposes the use of Response Surface Methodology (RSM) as a sensitivity analysis tool for single
attribute and multi-attribute decision analysis (DA). It is shown that any single or multi-attribute decision analysis
value (or utility) function can be transformed to a response function of the uncertain variables. A sensitivity
analysis framework designed to facilitate simultaneous perturbation of a number of uncertain variables is
proposed. Specifically, RSM is used with influence diagrams, but the methodology can also be used with decision
trees. This approach is illustrated with the well-known Oil Wildcatter Problem. This new framework exceeds the
current sensitivity analysis capability of decision analysis software tools. The approach is shown to be more
efficient than current DA sensitivity analysis techniques and can provide improved insights for decision makers.
Copyright © 1999 John Wiley & Sons, Ltd. Received 8 October 1997; accepted 6 April 1999.

KEY WORDS: decision analysis; Multi-attribute Utility Theory; Multi-attribute Preference Theory; influence
diagrams; multi-criteria decision analysis; response surface methodology; experimental design;
regression; statistics

1. INTRODUCTION

The objective of this article is to show that the use of Response Surface Methodology (RSM) can facilitate effective and efficient sensitivity analysis for decision analysis (DA) models. Effectiveness (to within an acceptable tolerance of error) is measured against current methods. Efficiency is evaluated through resource utilization (analyst time) as compared with current practice. Specifically, RSM is employed in the use of influence diagram models. However, the approach is the same for decision tree models.

The purpose of using RSM as a sensitivity analysis tool is to improve the analyst's understanding of the sensitivity between the output of the model (response) and its associated input parameters, and then use this information to improve the decision model. There are two major benefits of using RSM for sensitivity analysis on a decision analysis problem. The first is a substantial savings in the total number of influence diagram solution (decision tree) cycles required for thorough sensitivity analysis. As the model increases in complexity, the savings can be significant. The second major benefit is the added insights available to decision makers from the use of RSM. These insights are gained through the estimation of specific coefficients (effects) of significant variables and the identification of the interaction between variables. These insights are invisible to one-way sensitivity analysis approaches.

The preliminary results show that using RSM for sensitivity analysis on decision analysis problems is a sensible approach. Further, RSM substantially increases the explanatory power of sensitivity analysis by offering the decision analyst an improved tool and presenting the decision maker a more useful sensitivity analysis.

This paper is organized as follows. In Section 2, influence diagrams and their subsequent sensitivity analysis are briefly described. An overview of RSM and an introduction to the basic paradigm are also provided. Then, it is shown that any single or multi-attribute decision analysis value (or utility) function can be transformed to a response function of the uncertain variables.


Finally, the Oil Wildcatter Problem is introduced, which will be used to illustrate the proposed methodology. Next, the sensitivity analysis framework is presented. Various screening experiments are described and standard DA sensitivity analysis procedures are applied to the problem. In Section 3, RSM is applied to generate an expanded sensitivity analysis. Section 4 compares the DA sensitivity analysis techniques with the present RSM sensitivity analysis. Section 5 concludes this paper.

2. BACKGROUND

In this section, we briefly discuss influence diagrams and standard DA sensitivity analysis procedures. We also provide an overview of RSM and its use within a general sensitivity analysis procedure. The section ends with an expository problem description.

2.1. Influence diagrams
Howard provided an excellent summary of the practice of decision analysis and identified the central role of influence diagrams in the decision analysis process (Howard, 1988). Howard and Matheson (1983) developed the concept of an influence diagram to provide a problem description that could be solved by a computer and yet be understood by people. An influence diagram is a graphical representation of the problem situation condensed to the decisions, the important uncertain variables and the values (see Figure 1). Decision nodes are depicted as rectangles, uncertain event nodes as circles, and value nodes as rounded rectangles. These three components are pictorially represented as nodes connected with arcs that describe their relationships. Decision analysts use the influence diagram to communicate with the decision maker and use an influence diagram or decision tree algorithm to perform the decision analysis.

Shachter (1986) describes the influence diagram as '... intuitive enough to communicate with decision makers and experts and, at the same time, precise enough for normative analysis.' In the same paper, Shachter proposed an algorithm for evaluating influence diagrams. The capability to solve the influence diagram was an important step for decision analysis. The influence diagram provides analysts with a good communication tool in a convenient structure for computer solution techniques. There are several decision analysis software packages on the market. One such software package, DPL, solves the influence diagram and its associated decision tree. The solution outputs include optimal decision policies, cumulative risk profiles and sensitivity analyses (ADA, 1995).

Figure 1. Influence diagram

2.2. Standard sensitivity analysis in decision analysis
Sensitivity analysis approaches in decision analysis have been largely ad hoc. Rios Insua and French (1991) review the sensitivity analysis literature, propose a framework for multi-objective decision making, and examine distance based tools for sensitivity analysis. Lowell (1994) provides a recent survey of the sensitivity analysis literature and proposes a new sensitivity analysis, sensitivity to relevance.


Sensitivity to relevance shows the effects of modelling relevance (or conditional dependence) between the uncertain variables.

In DA sensitivity analysis, typically one variable is varied at a time. DPL displays this type of analysis via a device known as a rainbow diagram, since the changes in optimal policy are denoted by colour changes (ADA, 1995; pp. 468-474). It is also common to plot several one-way sensitivity analyses on one graph. This technique is known as a tornado diagram, since a 'tornado' shape is obtained by sorting the variables by their variation in value function and plotting them top to bottom by decreasing variation (Howard, 1988; ADA, 1995; pp. 474-481). An analyst typically examines the diagram to determine the most significant variables. Significance can be assessed by considering the contribution to the response (a large percentage of the response is due to the variable), contribution to risk (the range of uncertainty in the variable generates a large range in the response), and contribution to indecision (the variable impacts the choice of the best decision). In our experience, selection of the significant variables is based on analyst and decision maker judgement. As analysts, we may want to delete variables that do not affect the above three criteria. However, our decision makers sometimes prefer to keep variables in the model to respond to stakeholder perceptions that the variables are important.

Next, time permitting, two-way sensitivity analyses are performed. Two-dimensional graphs are plotted and the preferred strategy regions are identified. These plots are called strategy region graphs (Clemen, 1991; pp. 121-138). (A rainbow diagram, tornado diagram and strategy region graph are illustrated in the example in Section 3.) McNamee and Celona (1990) describe joint sensitivity analysis when the variables are probabilistically dependent. A general limitation of the technique (and most DA software) is the lack of an automated two-way (or higher) sensitivity analysis.

Although performing sensitivity analysis on any one variable can be an effective screening tool for many problems, stopping there could miss a large part of the potential information to be gleaned from a model. If the problem contains many variables, what is the effect of three or four being slightly different than anticipated? If the answer does not change the recommended action, it does not matter. If it does change the recommendation, the decision maker needs to know. One approach would be to exhaustively enumerate the possibilities and run the model at all the possible combinations of variable settings. This approach may be reasonable for small problems with a few variables, but quickly becomes impractical for large problems.

2.3. Response Surface Methodology
RSM is a collection of mathematical and statistical tools used in investigative experimentation by scientists and engineers. When applied to models, such as influence diagrams or decision trees, it is helpful to think of the model as being characterized by a feature of interest called an output or a response. This response arises in reaction to system inputs or independent variables. In this paper, the response is the value used to solve the influence diagram (decision tree) and the independent variables will be the various probabilities and outcomes of the uncertain variables which parameterize the influence diagram. Generally, there will be a different response function for each alternative. Consider a situation in which a response, y, can be approximated over a region of operability, R, by either a first- or second-order polynomial function of its inputs. If the input variables are $\xi_1, \xi_2, \ldots, \xi_k$, then a typical region of operability is defined by a set of inequalities $L_l \le \xi_l \le U_l$, $l = 1, \ldots, k$, where $L_l$ and $U_l$ are the lower and upper bounds of the lth input or factor, respectively. Further, it is often convenient to standardize or code the variables as

$$x_l = \frac{2(\xi_l - \xi_{l0})}{U_l - L_l}$$

for $l = 1, \ldots, k$, where $\xi_{l0}$ is the midpoint between $L_l$ and $U_l$. This coding maps the upper (lower) bound to +1 (-1). There are advantages in accomplishing this transformation. The variables are unitless, which aids in interpretation. Further, it is easy to array the experimental design settings into an array called a design matrix (described below) in such a way as to produce orthogonal columns that in turn lead to uncorrelated estimates of model parameters (see Equation (1)), which also greatly aids interpretation. Now, the response is

$$y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i<j} \beta_{ij} x_i x_j + \sum_{i=1}^{k} \beta_{ii} x_i^2 \qquad (1)$$

where the $\beta$'s are coefficients that must be estimated from the data.

Expression (1) can be thought of as a Taylor's series expansion (of degree 2) of some true underlying function.

The purpose of using RSM as a sensitivity analysis tool is to improve the analyst's understanding of the sensitivity between the output of the model (response) and its associated input variables and then use this information to improve the decision model. There are two major benefits of using RSM for sensitivity analysis on a decision analysis problem. The first is a substantial reduction in the total number of influence diagram solution (decision tree) cycles required for thorough sensitivity analysis. As the model increases in complexity, the reduction can be significant. The second major benefit is the added insights available to decision makers from the use of RSM. These insights are gained through the estimation of specific coefficients (effects) of significant variables and the identification of the interaction between variables. These insights are invisible to one-way sensitivity analysis approaches.

The analyst must choose a numeric perturbation range for each variable. These ranges define the region of operability, R. Depending on the decision model development, the perturbation range may vary from differential perturbations (small perturbations about the nominal values) to large perturbations (due to our preliminary knowledge of the variables). If the perturbations are large, it is more likely that the first-order model will be inadequate (as reflected by the goodness-of-fit test and other diagnostics). This means that, over R, certain variables have a nonlinear impact on the response, or changes in one variable are related to the levels of another. Later in our model development, as variable ranges become smaller, we could define a smaller R and redo the RSM analysis. This may result in a model with fewer interaction terms. In this paper, Equation (1) is used to approximate the expected value of alternatives in an influence diagram. Often, Equation (1) is called a metamodel.

Box and Draper (1987) advocate the use of an iterative approach to building models of the type portrayed in (1). At each stage, the experimenter needs to make decisions about which input variables to include, what levels to set the inputs to and what order of model to estimate. For example, if it is decided to observe n realizations of the underlying system or model with the express purpose of estimating a function of the form (1), then the settings of the factors can be arranged in a design matrix

$$X = \begin{bmatrix} 1 & x_{11} & \cdots & x_{1k}^2 \\ 1 & x_{21} & \cdots & x_{2k}^2 \\ \vdots & \vdots & & \vdots \\ 1 & x_{n1} & \cdots & x_{nk}^2 \end{bmatrix}$$

Each row represents a set of experimental conditions as well as transformations of these conditions. If the responses are arrayed in a vector $Y = (y_1, \ldots, y_n)'$, then the coefficients $\hat{\beta} = (\hat{\beta}_0, \hat{\beta}_1, \ldots, \hat{\beta}_k, \ldots, \hat{\beta}_{ij}, \ldots, \hat{\beta}_{kk})'$ can be estimated by the well-known least-squares formula

$$\hat{\beta} = (X'X)^{-1}X'Y$$

The resolution of a design reflects the ability of an experimental design to render clear estimates of effects of differing orders. For example, a resolution IV design can render estimates of interactions which are 'clear' of single-factor effect estimates. However, in designs of this resolution, certain interaction estimates may be 'confounded' with other interaction estimates. This means that the effect that is actually computed reflects both of the interactions in question. For example, in a resolution III design, it is impossible to distinguish between the effects of first-order factors (x_i) and certain two-way interactive effects (x_i x_j); however, such a design does not confound first-order effects with one another. Excellent RSM references are Meyers (1976), Box and Draper (1987) and Khuri and Cornell (1987).

RSM has been used extensively to perform sensitivity analysis in computer simulation. Simulation models typically have many input variables and usually only a fraction of them are critical over the design region. Meidt and Bauer (1992) take advantage of this property and use RSM to create metamodels for simulation. The flow of sequential experimentation in this metamodelling scheme is shown in Figure 2. The purpose of this paper is to demonstrate the usefulness of this general scheme when applied in the sensitivity analysis of an influence diagram. Decision analysis models are typically characterized by a large variable (factor) set. Rather than initially attempting to set up extremely large experimental designs, in which we vary all the variables, we employ a 'group screening' strategy.


Figure 2. RSM metamodelling paradigm

We examine the structure of the model and arrive at groups based on function. Different analysts may select different initial groupings. As long as the regression results are interpreted correctly, the final groupings of the sensitive variables should be the same.

Group screening combines sets of factors that are believed to exhibit like behaviour in the model. Each group is treated as a single variable (each variable in the group is changed by the same amount) and regressed against the responses using a low resolution design. A Plackett-Burman or similar low resolution fractional factorial design could be used to ensure efficiency. This technique would be extremely beneficial where the effects of uncertain variables are closely correlated in a large model. It can drastically reduce the number of runs required while identifying the significant variable groups. Smallwood and Morris (1980) developed a grouping technique along these lines, but outside the RSM framework.

Once the group screening narrows the field to critical groups of input variables, another low resolution design is run on the main factors remaining. From here, a first-order (linear) model can be used to estimate the response surface over a small area of interest. There may be a lack of fit, and a second-order model can then be used to better fit the surface. Using this methodology, valuable insight is gained into the response function over a range of inputs.

Stone (1988) used RSM as an analysis tool in his study of the nonconformity of parts in the US Air Force Supply System. He developed a decision model for determining the best policy for sampling parts in the inventory. The model was displayed as an influence diagram and solved using INDIA (INDIA, 1991). A post-solution analysis was performed using RSM. For his decision problem, Stone found that RSM explored the interaction effects between the variables in a structured, consistent and reliable way. His research provided the initial groundwork for applying RSM to decision analysis problems.


Bauer et al. (1995) compared the use of RSM and current decision analysis sensitivity tools in a Department of Defence force structure decision problem. In their application, RSM significantly reduced the analysis man hours and provided a more insightful analysis.

2.4. Converting single and multi-attribute decision analysis problems to the Response Surface Paradigm
To use the Response Surface Paradigm, we must be able to identify a response function of the form y = f(X) (see Equation (1)), where y is the response and X is a vector of the system inputs or independent variables. Using the following four cases, we demonstrate that we can define a response function of the form y = f(X) for all single and multi-attribute decision analysis problems.

2.4.1. Single attribute
For single attribute decision analysis, y is the single attribute. For example, y might be the net present value of the alternatives. The vector X is the vector of the input variables that we would assess using one-way sensitivity analysis (e.g. a tornado diagram). Therefore, using RSM we can develop a response function for each alternative of the form y = f(X). For a single attribute, we usually refer to f(X) as the value model.

2.4.2. Single attribute utility
When we use utility, we convert our single attribute to a utility, u = u(y). From the above, y = f(X), therefore u = u(f(X)) = g(X). Again, if we let u be our response, we have a function of the required form.

2.4.3. Multi-attribute Preference Theory
Without uncertainty, we have Multi-attribute Preference Theory (Kirkwood, 1997; pp. 227-270). We can write the value function in the form

v(z) = g(v_1(z_1), v_2(z_2), ..., v_n(z_n))

where v(z) is the value function, g( ) is any general function, the v_i(z_i) are the single-dimensional value functions, and z = (z_1, z_2, ..., z_n) is the vector of scores for the n attributes. But z_i is a function of the input variables X (see Section 2.4.1), therefore z_i = h_i(X). Substituting, we get

y = v(z) = g(v_1(h_1(X)), v_2(h_2(X)), ..., v_n(h_n(X))) = f(X)

Again, if we let y be our response (v(z) is a scalar), we have a function of the required form.

2.4.4. Multi-attribute Utility Theory
With uncertainty, we have Multi-attribute Utility Theory (Kirkwood, 1997; pp. 245-259). We can write the utility function in the form

u(z) = g(u_1(z_1), u_2(z_2), ..., u_n(z_n))

where u(z) is the multi-attribute utility function, the u_i(z_i) are the single-dimensional utility functions, and z = (z_1, z_2, ..., z_n) is the vector of scores for each attribute. But z is a function of the input variables X (see Section 2.4.1), therefore z_i = h_i(X). Substituting, we get

y = u(z) = g(u_1(h_1(X)), u_2(h_2(X)), ..., u_n(h_n(X))) = f(X)

Again, if we let y = u(z) be our response, we have a function of the required form.

2.5. Illustrative problem
Since we have shown in the above section that we can obtain a response function for all single and multi-attribute decision analysis problems, for simplicity we use a single attribute decision analysis problem to illustrate the methodology. Raiffa's Oil Wildcatter Problem is chosen for its familiarity to decision analysts (Raiffa, 1968).

The problem consists of two decisions faced by an oil wildcatter: whether to make preliminary tests on a particular drilling site, and whether to drill or not. In this problem, the test decision is to be made between an experimental seismic test costing $3000, a core sample test costing $10,000 and no testing. The two tests provide information on the seismic structure (and, therefore, the probability of various amounts of oil being found) at the drill site. The potential amount of oil is discretized into three classifications: dry, wet and soaking. These three amounts are outcomes with varying revenue; a dry well will have no revenue, a wet well will produce $120,000, while a soaking well will give $270,000. There is a variable drilling cost, which is discretized as $40,000, $50,000 or $70,000. The expected profit is the revenue (from the amount of oil produced) minus the costs of drilling and/or testing. The influence diagram of the Oil Wildcatter Model is shown in Figure 1.
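To make the response surface paradigm of Section 2.3 concrete, the sketch below codes a set of raw inputs onto [-1, +1], builds the design matrix for the second-order model of Equation (1), and estimates the coefficients with the least-squares formula given above. It is an illustration only: the bounds and the model_response function are invented stand-ins for an influence diagram solution cycle, not values from the Oil Wildcatter Problem.

```python
import numpy as np
from itertools import combinations

def code_inputs(xi, lower, upper):
    """Map raw inputs in [lower, upper] onto coded variables in [-1, +1]."""
    mid = (lower + upper) / 2.0
    return 2.0 * (xi - mid) / (upper - lower)

def second_order_design(x):
    """Design matrix of Equation (1): intercept, linear, two-way interaction and quadratic columns."""
    n, k = x.shape
    cols = [np.ones(n)]
    cols += [x[:, i] for i in range(k)]
    cols += [x[:, i] * x[:, j] for i, j in combinations(range(k), 2)]
    cols += [x[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

# Hypothetical stand-in for one influence diagram solution cycle; in the paper
# this number would come from solving the decision model (e.g. in DPL).
def model_response(raw):
    probability, revenue, cost = raw
    return probability * revenue - cost

lower = np.array([0.1, 100.0, 40.0])   # assumed lower bounds of the region R
upper = np.array([0.5, 270.0, 70.0])   # assumed upper bounds of the region R

rng = np.random.default_rng(1)
raw_settings = rng.uniform(lower, upper, size=(20, 3))          # 20 experimental runs
X = second_order_design(code_inputs(raw_settings, lower, upper))
Y = np.array([model_response(r) for r in raw_settings])

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)                # least-squares estimates of Equation (1)
print(np.round(beta_hat, 3))
```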


3. SENSITIVITY ANALYSIS

In the following sections we present a sensitivity analysis of the Raiffa Oil Wildcatter Problem. Our overall strategy is to accomplish this analysis within the general framework of the RSM metamodelling paradigm described in Section 2.3. First, we form two main groups of variables: probability and value variables (described below). Perturbing these variables as a group, we form a standard tornado diagram to assess the possibility of screening one (or perhaps both) groups of variables from consideration in a sensitivity context. If neither group perturbation produces a change in the optimal decision policy, we could consider the optimal decision as being robust relative to the magnitudes of the perturbations being examined. The next step in the paradigm involves decomposing the remaining groups into subgroupings and/or individual variables and performing the appropriate analysis. Finally, we contrast the use of rainbow diagrams versus RSM to generate two-way strategy regions.

3.1. Grouping variables and initial perturbations
Initially, the variables were placed into two main groups. The probability variable group included the 24 conditional and marginal probabilities associated with the possible outcomes of each node. The value variable group included nine variables, made up of the possible drilling and testing costs and the amount of oil revenue. Perturbations of these variables were limited by the relationships within each group. For instance, within the probability variable group, each discrete distribution's probabilities must sum to 1. We array these probabilities into a vector. This vector of probabilities was varied by changing the largest probability (+/-0.1) and reducing the other probabilities proportionally. The value perturbations were accomplished by using a change of 10% of the largest outcome (e.g. the greatest amount of oil outcome is $270,000 and hence the perturbation was +/-$27,000). For illustration, we have used +/-0.1 for probabilities and +/-10% for values. For decision analysis practice, other systematic schemes have been proposed; for instance, one common practice is to use the 10th and 90th percentile values for each variable (McNamee and Celona, 1990).

The initial solution shows that the optimal expected profit is realized through no testing and deciding to drill. The expected profit was $40,000. A decision tree showing the optimal decision policy is shown in Figure 3.

We use a tornado diagram to show the effects of perturbing one main group at a time. This one-way sensitivity analysis displays the effect of the perturbations on the overall expected profit and any changes in decision policy. The shaded areas indicate that the optimal decision policy has changed. DPL evaluates the model at three points: the original base case and two endpoints on either side of the base case. The shaded area shows only that the change in policy occurred somewhere between the base case and the endpoint evaluation. We note that in the usual application of tornado diagrams, single variables are perturbed individually. Here, we refer to group effects.

This initial group analysis gives some insight into the effect of changing the probability or value parameters (see Figure 4). Perturbing any of these parameters can lead to a significant change in expected profit and/or decision policy. For example, we did not delete the probability group because there was a significant change in NPV, from -25% to +20%. Clearly, additional groups would be required.

3.2. Expanded groups
We decided on seven subgroups: four probabilities and three value nodes. The seven new groups are defined by using the initial nodes in the influence diagram. The probability group has one variable for each chance node. These new variables are:

Oil revenue probability perturbation (ORP)
Seismic structure probability perturbation (SSP)
Experimental seismic test probability perturbation (ETP)
Drilling cost probability perturbation (DCP)

The value group has one variable for each node with cost or profit values:

Drilling cost perturbation (DC)
Test cost perturbation (ETC)
Oil revenue perturbation (OR)
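The proportional renormalization used for the probability groups in Section 3.1 can be written down compactly. The sketch below is our reading of that scheme, not code from the original study: the largest probability in a discrete distribution is shifted by +/-0.1 and the remaining probabilities are rescaled so the distribution still sums to 1. The example numbers are illustrative, not the assessed Oil Wildcatter probabilities.

```python
# Proportional renormalization of a discrete distribution after perturbing its
# largest probability; assumes the largest probability is strictly less than 1.
def perturb_distribution(probs, delta):
    """Shift the largest probability by delta and rescale the rest proportionally."""
    probs = list(probs)
    i_max = max(range(len(probs)), key=lambda i: probs[i])
    new_max = probs[i_max] + delta
    rest = 1.0 - probs[i_max]              # total mass of the other outcomes
    scale = (1.0 - new_max) / rest         # proportional rescaling factor
    return [new_max if i == i_max else p * scale for i, p in enumerate(probs)]

# Example: an illustrative dry/wet/soaking distribution.
base = [0.5, 0.3, 0.2]
print(perturb_distribution(base, +0.1))    # [0.6, 0.24, 0.16], still sums to 1
print(perturb_distribution(base, -0.1))    # [0.4, 0.36, 0.24], still sums to 1
```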


Figure 3. Optimal decision policy of initial solution

Figure 4. Tornado diagram for initial groups


Figure 5. Rainbow diagram for ORP

3.3. One-way analysis
These groups were perturbed in the same manner as the initial groupings. The probability perturbations were +/-0.1 from the significant element in each vector. The value perturbations were again +/-10% of the largest possible outcome in each distribution. A rainbow diagram displaying the effects of ORP variation is shown in Figure 5. The tornado diagram for this case is shown in Figure 6. Of the seven new variables, the following exhibited the greatest effect on expected profit for the preferred decision policy:

Oil revenue probability perturbation (ORP)
Oil revenue perturbation (OR)
Drilling cost perturbation (DC)

3.4. Two-way analysis
Clemen (1991) discusses various methods for the construction of strategy regions. The DPL rainbow diagram option was used to do two-way sensitivity analysis. There are alternative methods of generating strategy regions using standard software. A rainbow diagram displays the model solutions across a range of input settings (up to 21, using DPL) for a single variable. If there is a specified relationship between two variables, they can be varied at the same time by defining one variable in terms of the other, or by defining both variables in terms of a third (ADA, 1995). As a reasonable alternative, we employed a method in which we produced a rainbow diagram of one variable across the interval -0.1 to 0.1 and generated this rainbow diagram ten separate times (across the range of the second variable). The second variable was incremented between runs to establish the strategy region graph seen in Figure 7. We display only one of the three possible strategy region graphs using the three remaining variables given above. The extraction of the data from the rainbow diagrams after each run was a time consuming process, but it did provide the means to gather two-way sensitivity information.


If the most likely values of the uncertain variables are near the break point between the decision to test or not to test using experimental seismic methods, the optimal decision would be highly sensitive to the actual values of these variables. Notice that as the estimates approach the bottom left corner of Figure 7, the level of confidence increases in experimental seismic testing being the best decision.

Figure 6. Tornado diagram of seven factors perturbed one at a time

Figure 7. Strategy region based on perturbation of oil revenue (OR) vs. probability of oil revenue (ORP)


Figure 8. Oil Wildcatter problem

Table I. Initial fractional screening design (resolution III)

Run   ORP  ORPP  DCP  DCPP  SSPP  TCP  ESTPP     No test   Core sample   Exp seismic
 1     -1    -1   -1     1     1    1     -1        50.8      44.775        46.8
 2      1    -1   -1    -1    -1    1      1        82.2      71.2          78.2
 3     -1     1   -1    -1     1   -1      1        16.2      20.8          25.3724
 4      1     1   -1     1    -1   -1     -1        38.3      29.8          36.8
 5     -1    -1    1     1    -1   -1      1        36.8      27.8          34.8
 6      1    -1    1    -1     1   -1     -1        68.2      64.475        66.2
 7     -1     1    1    -1    -1    1     -1         2.2       1.17          4.03963
 8      1     1    1     1     1    1      1        24.8      30.07         33.4701
 9      0     0    0     0     0    0      0        40        34.3          37

(Columns ORP to ESTPP are the seven input factors in coded units; the last three columns give the expected profit of each test alternative.)
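Table I can be generated with a handful of lines. The sketch below is one conventional construction of an eight-run 2^(7-4) resolution III design (it is not taken from the paper's software): columns 1-3 form a full 2^3 factorial and columns 4-7 are products of the first three, which matches the +/-1 pattern of Table I up to the ordering of the runs; a centre point of zeros is appended as the ninth run.

```python
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=3)))  # 2^3 factorial in factors 1-3
d = base[:, 0] * base[:, 1]                # generator: column 4 = 1 x 2
e = base[:, 0] * base[:, 2]                # generator: column 5 = 1 x 3
f = base[:, 1] * base[:, 2]                # generator: column 6 = 2 x 3
g = base[:, 0] * base[:, 1] * base[:, 2]   # generator: column 7 = 1 x 2 x 3

design = np.column_stack([base, d, e, f, g])
design = np.vstack([design, np.zeros(7)])  # append the centre point (run 9)
print(design.astype(int))
```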

4. THE RSM APPROACH TO SENSITIVITY ANALYSIS

To demonstrate the use of RSM, we begin by screening the seven groups described previously. First, a 2^(7-4) (resolution III) fractional design was used to screen these seven groups for significant factors. Hence, the seven main factors, discussed above, are not confounded with other main factors. This design requires eight runs of the model. Additionally, a centre point was included in the design to better evaluate whether a first-order model would be sufficient. The nine runs were made using DPL (Figure 8 is the modified influence diagram that facilitated this portion of the analysis) and the design matrix was regressed against the expected profit response for each alternative. The three decision alternatives are: (1) no test, (2) a core sample test and (3) an experimental seismic test. The design matrix and associated responses are shown in Table I.

As will be described in this section, four of the seven main factors were significant in terms of approximating the response surface for the no test option.


The intercept estimate was the same as our initial solution of 40 and indicated that a first-order model might be sufficient for this option. The regression analysis of the remaining options showed less promise relative to first-order models. The resulting residual analysis suggested that the first-order model was not sufficient to approximate the core sample test and experimental seismic test options. The residuals for these options indicated a large amount of error at the centre point, which was unacceptable for our needs.

In all three cases, OR, ORP and DC were the most significant factors in the estimated response surface. Therefore, a 2^3 full factorial design was run using these three variables as the three factors plus all of their interactions. Based on residual analysis, we chose to examine second-order models (with quadratic terms) for the two test options. The search for a design that could minimize the additional runs of the model led to a central composite design. Central composite designs are discussed by Box and Draper (1987). One problem with using a deterministic model is that replications of the centre point (base case) are meaningless, since they would produce the same response in all cases (the base case solution); hence, designs with desirable qualities such as uniform precision and rotatability (see Meyers, 1976; p. 153) are not obtainable. However, a central composite design would provide the orthogonality for our regression analysis and take advantage of the full factorial design points, which have already been run. The idea is to augment the original factorial arrangement with axial points. These points are taken along each variable's axis by setting the other variables to zero. Six axial points were added to our previous full factorial design. The final design matrix is shown in Table II.

An example of the regression analysis is depicted in the ANOVA table and corresponding residual analysis found in Table III and Figure 9. In this application of regression it is important to remember that there was no stochastic error; hence, standard errors, t statistics and p values cannot be interpreted in the usual fashion. These measures are only useful in that they reflect the amount of total variation explained by the various factors, e.g. large t statistics are typically associated with factors that exhibit a significant effect. In the end, a second-order approximation for each of the three decision alternatives fit the surfaces quite accurately. The t statistics in the regression table are directly proportional to the proportion of the total sums of squares explained by each variable. In Table III, we see that only the intercept, ORP, OR, DC and ORP*OR (X12) terms explain meaningful fractions of the total sums of squares. That is, the t values for these terms are the only ones significantly different from 0.

Figure 9. Residual plot of No test alternative response surface approximation

Table II. Central composite design with one centre point

Model columns: I (intercept), the three factors X1 (ORPP), X2 (ORP), X3 (DCP), their products X12, X13, X23, X123 and squares X1^2, X2^2, X3^2. The last three columns give the expected profit of each alternative.

Run   I    X1      X2      X3      X12   X13   X1^2   X23   X2^2   X123   X3^2     No test    Core sample   Exp seismic
 1    1   -1      -1      -1       1     1     1      1     1      -1     1        50.3       40.37         47.3
 2    1    1      -1      -1      -1    -1     1      1     1       1     1        16.7       15.43         18.088
 3    1   -1       1      -1      -1     1     1     -1     1       1     1        82.7       72.7          79.7
 4    1    1       1      -1       1    -1     1     -1     1      -1     1        38.3       32.17         35.3
 5    1   -1      -1       1       1    -1     1     -1     1       1     1        36.3       31.55         33.3
 6    1    1      -1       1      -1     1     1     -1     1      -1     1         2.7        7.73         11.004
 7    1   -1       1       1      -1    -1     1      1     1      -1     1        68.7       58.7          65.7
 8    1    1       1       1       1     1     1      1     1       1     1        24.3       24.47         25.692
 9    1    0       0       0       0     0     0      0     0       0     0        40         34.3          37
18    1    1.68    0       0       0     0     2.83   0     0       0     0         7.2051    10.1663       13.2313
19    1   -1.68    0       0       0     0     2.83   0     0       0     0        72.7949    62.7949       69.7949
20    1    0       1.68    0       0     0     0      0     2.83    0     0        62.7042    52.7042       59.7042
21    1    0      -1.68    0       0     0     0      0     2.83    0     0        17.2958    16.5907       19.0603
22    1    0       0       1.68    0     0     0      0     0       0     2.83     28.2275    27.3542       28.2681
23    1    0       0      -1.68    0     0     0      0     0       0     2.83     51.7725    41.7725       48.7725
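The layout of Table II follows the standard central composite construction: the eight factorial points already run, one centre point, and six axial points at +/-1.68 (about 8^(1/4), so the squared columns show 2.83). A minimal sketch of that augmentation, not taken from the original analysis, is:

```python
import itertools
import numpy as np

factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)  # 8 factorial runs

alpha = 1.68                       # axial distance, approximately (2^3)^(1/4)
axial = np.zeros((6, 3))
for i in range(3):
    axial[2 * i, i] = alpha        # +alpha on axis i, other factors at 0
    axial[2 * i + 1, i] = -alpha   # -alpha on axis i

centre = np.zeros((1, 3))
ccd = np.vstack([factorial, centre, axial])   # 8 + 1 + 6 = 15 runs, as in Table II
print(ccd)
```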
Table III. Regression analysis for No test option on central composite design

Regression statistics (CCD, No test option): Multiple R = 1; R square = 1; Adjusted R square = 1; Standard error = 4.232E-05; Observations = 15.

ANOVA
Source        df    SS           MS           F            Signif F
Regression    10    8409.4758    840.94758    4.695E+11    1.089E-23
Residual       4    7.164E-09    1.791E-09
Total         14    8409.4758

Coefficient estimates
Term         Coeff        Standard error   t statistic   p value
Intercept     40          4.21E-05          950702       7.345E-24
ORP          -19.5        1.15E-05         -1702755      7.137E-25
OR            13.5        1.15E-05          1178831      3.107E-24
DC            -6.99999    1.15E-05         -611245       4.298E-23
X12           -2.7        1.5E-05          -180448       5.659E-21
X13            4.441E-16  1.496E-05         2.968E-11    1
X1^2           0          1.72E-05          0            1
X23            4.441E-16  1.49E-05          2.968E-11    1
X2^2           0          1.72E-05          0            1
X123           4.441E-16  1.496E-05         2.968E-11    1
X3^2           0          1.72E-05          0            1

Residual output
Obs   Predicted     Actual      Residual
 1    50.299974     50.3         2.63E-05
 2    16.700003     16.7        -3.348E-06
 3    82.699972     82.7         2.789E-05
 4    38.300002     38.3        -1.766E-06
 5    36.299998     36.3         1.766E-06
 6     2.7000279     2.7        -2.789E-05
 7    68.699997     68.7         3.348E-06
 8    24.300026     24.3        -2.63E-05
 9    40            40           0
10     7.2050647     7.2051      3.526E-05
11    72.794935     72.7949    -3.526E-05
12    62.704202     62.7042    -1.881E-06
13    17.295798     17.2958     1.881E-06
14    28.227471     28.2275     2.918E-05
15    51.772529     51.7725    -2.918E-05
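The Table III fit can be checked by regressing the 'No test' responses of Table II on the eleven model columns by ordinary least squares. The following sketch does exactly that; the factor settings and responses are transcribed from Table II, and x1, x2, x3 correspond to the ORP, OR and DC rows of Table III.

```python
import numpy as np

# (x1, x2, x3) settings of the 15 runs of Table II (factorial, centre, axial).
x = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
    [ 0,  0,  0],
    [ 1.68, 0, 0], [-1.68, 0, 0], [0,  1.68, 0], [0, -1.68, 0],
    [0, 0,  1.68], [0, 0, -1.68],
])
# 'No test' column of Table II, in the same run order.
y_no_test = np.array([50.3, 16.7, 82.7, 38.3, 36.3, 2.7, 68.7, 24.3, 40.0,
                      7.2051, 72.7949, 62.7042, 17.2958, 28.2275, 51.7725])

x1, x2, x3 = x[:, 0], x[:, 1], x[:, 2]
X = np.column_stack([np.ones(len(y_no_test)), x1, x2, x3,
                     x1 * x2, x1 * x3, x1**2, x2 * x3, x2**2, x1 * x2 * x3, x3**2])

beta, *_ = np.linalg.lstsq(X, y_no_test, rcond=None)
# Expect roughly: intercept 40, x1 (ORP) -19.5, x2 (OR) 13.5, x3 (DC) -7,
# x1*x2 -2.7, and near-zero values for the remaining terms, as in Table III.
print(beta.round(2))
```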

Table IV. Response surface coefficients

Factors        No test    Core sample    Seismic test
Constant         40.0        35.25          37.09
ORP             -19.5       -15.53         -16.92
OR               13.5        11.25          12.09
DC               -7.0        -4.57          -5.80
OR*ORP           -2.7        -3.25          -4.11
ORP*DC                        0.93           1.41
OR*DC
ORP*OR*DC
OR^2
ORP^2                                        1.04
DC^2

(Blank cells indicate coefficients that were approximately zero and were omitted.)

Table V. One-way sensitivity analysis using RSM equations

Range of perturbation

ORP           -0.1   -0.08  -0.06  -0.04  -0.02   0      0.02   0.04   0.06   0.08   0.1
No test        59.5   55.6   51.7   47.8   43.9   40     36.1   32.2   28.3   24.4   20.5
Core sample    50.8   47.7   44.6   41.5   38.4   35.3   32.1   29.0   25.9   22.8   19.7
Exp seismic    55.1   51.3   47.6   44.0   40.5   37.1   33.7   30.5   27.3   24.2   21.2
Max            59.5   55.6   51.7   47.8   43.9   40.0   36.1   32.2   28.3   24.4   21.2

OR            -27    -21    -15    -9     -3      0      3      9      15     21     27
No test        26.5   29.5   32.5   35.5   38.5   40.0   41.5   44.5   47.5   50.5   53.5
Core sample    24.0   26.5   29     31.5   34     35.25  36.5   39     41.5   44     46.5
Exp seismic    26.0   28.3   30.7   33.2   35.8   37.1   38.4   41.2   44.1   47.1   50.2
Max            26.5   29.5   32.5   35.5   38.5   40.0   41.5   44.5   47.5   50.5   53.5

DC            -7     -5.6   -4.2   -2.8   -1.4    0      1.4    2.8    4.2    5.6    7
No test        47     45.6   44.2   42.8   41.4   40     38.6   37.2   35.8   34.4   33
Core sample    39.8   38.9   38.0   37.1   36.2   35.3   34.3   33.4   32.5   31.6   30.7
Exp seismic    43.9   42.4   40.9   39.6   38.3   37.1   36.0   34.9   34.0   33.1   32.3
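Table V can be reproduced by evaluating the coded equations of Table IV directly, as in the sketch below. The coefficients are our transcription of Table IV, and the lone quadratic term (ORP^2 = 1.04) is assumed to belong to the experimental seismic equation, which is the assignment that matches the Table V values; OR and DC are held at their base (coded zero) settings while ORP is varied.

```python
import numpy as np

def no_test(orp, orr, dc):
    return 40.0 - 19.5 * orp + 13.5 * orr - 7.0 * dc - 2.7 * orp * orr

def core_sample(orp, orr, dc):
    return 35.25 - 15.53 * orp + 11.25 * orr - 4.57 * dc - 3.25 * orp * orr + 0.93 * orp * dc

def exp_seismic(orp, orr, dc):
    return (37.09 - 16.92 * orp + 12.09 * orr - 5.80 * dc
            - 4.11 * orp * orr + 1.41 * orp * dc + 1.04 * orp**2)

# Coded ORP from -1 to +1 corresponds to the raw range -0.1 to +0.1.
for orp in np.linspace(-1.0, 1.0, 11):
    profits = {'No test': no_test(orp, 0, 0),
               'Core sample': core_sample(orp, 0, 0),
               'Exp seismic': exp_seismic(orp, 0, 0)}
    best = max(profits, key=profits.get)
    print(f"ORP={0.1 * orp:+.2f}  " +
          "  ".join(f"{k}={v:5.1f}" for k, v in profits.items()) + f"  best={best}")
```

The printed values agree with the ORP block of Table V to the one-decimal rounding used there, with the preferred alternative switching from No test to Exp seismic at the top of the range.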

Figure 9 shows that the fit is good since the residuals are small and there is no systematic pattern. The response surface approximations are shown in Table IV. We included those coefficients that were not approximately zero. These equations are the expected profit response for each of the three options. They are shown here in their coded form. One quick accuracy check for the centre point can be accomplished by comparing the intercept with the initial centre point (e.g. 40 = 40, 34.3 is approximately 35.25, and 37 is approximately 37.09). These response surface equations can be used to create standard sensitivity analysis tools. Tornado diagrams are used to assess one-way sensitivity, while strategy regions are used to examine two-way sensitivity. Both of these tools provide useful pictorial representations of critical variables but do not explicitly identify the interactions between variables. The table of response surface coefficients offers two benefits. First, the table identifies the linear effects of each variable as well as the interactive effects. Second, the table allows direct comparisons of effects across variables without requiring an overlay of tornado diagrams.

4.1. One-way analysis
Table V shows the results of comparing the three alternative RSM equations when the factors are perturbed one at a time. The preferred alternative is determined by evaluating:


Max(E(profit)_{No test}, E(profit)_{Core sample}, E(profit)_{Exp seismic})    (5)

This calculation produces data that very closely matches the rainbow diagram in Figure 5 and the tornado diagram shown in Figure 6.

4.2. Using response surface approximations to generate strategy regions
The strategy region graph introduced in Figure 7 is an excellent tool for presenting two-way analysis. The new approach was to use the three RSM equations to estimate each alternative's expected profit. Each of the equations was evaluated across the region of interest. The alternatives were compared to generate a strategy region graph like that shown previously in Figure 7. The result shows that there are only two optimal options in this region (No test and Experimental seismic testing). Therefore, the expected profit for the core sample test is never the highest in this region. The line separating the strategy regions for the RSM equations depicted in Figure 10 was constructed by solving

E(profit)_{No test} - E(profit)_{Exp seismic} = 0

5. ANALYSIS

5.1. Comparison of one-way sensitivity analyses
The objective of one-way sensitivity analysis is to identify significant factors and their impact on the optimal policy. The first technique applied to generate this one-way analysis was the tornado diagram. The tornado diagram was produced in DPL, which varies one factor at a time and evaluates the model at the base case and the endpoints. This required 14 additional runs of the model (the endpoints of seven factors). The resulting tornado diagram, shown in Figure 6, indicates that changes in one of the seven factors would affect the optimal policy.

ORP impacted the optimal decision and had the greatest impact on the overall expected profit. The other factors are not critical to the overall decision in this range, when taken one at a time. These variables are listed here in descending order of impact on the overall expected profit: ORP, OR, DC, DCP, SSP, ETC and ETP.

Figure 10. Strategy region comparison for model vs. RSM
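A strategy region such as Figure 10 can be rebuilt from the fitted equations without further influence diagram runs. The sketch below is an illustration using our transcription of the Table IV coefficients, with DC held at its base value and the ORP^2 term assigned to the seismic equation; it scans the coded ORP-OR box and reports where the preferred alternative switches between No test and Experimental seismic testing (core sampling is never preferred in this region, as noted in Section 4.2).

```python
import numpy as np

def no_test(orp, orr):
    return 40.0 - 19.5 * orp + 13.5 * orr - 2.7 * orp * orr

def exp_seismic(orp, orr):
    return 37.09 - 16.92 * orp + 12.09 * orr - 4.11 * orp * orr + 1.04 * orp**2

grid = np.linspace(-1.0, 1.0, 21)   # coded values: ORP spans +/-0.1, OR spans +/-27
for orp in grid:
    # Sign changes of the profit difference mark the strategy region boundary.
    diff = [no_test(orp, orr) - exp_seismic(orp, orr) for orr in grid]
    crossings = np.where(np.diff(np.sign(diff)) != 0)[0]
    if len(crossings):
        label = f"boundary near OR (coded) = {grid[crossings[0]]:+.1f}"
    else:
        label = "No test preferred across the whole OR range"
    print(f"ORP (coded) = {orp:+.2f}: {label}")
```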


It is important to realize that tornado diagrams show changes in decision policy (either decision), not just the initial decision. The analyst should run the model at specific points of interest to clarify the information gained from the tornado diagram. This will require some additional runs of the model, but should ensure that the correct conclusions are drawn. Also, since (in typical software) the tornado diagram does not specify where the decision policy changes, it is necessary to systematically narrow the range to pinpoint this crossover. Again this requires a number of additional runs of the model (rainbow diagram), depending on the accuracy required.

The top three variables in the tornado diagram were ranked in the same order as suggested by examination of the coefficients of the RSM equations, providing validation of our use of RSM. The resulting equations are highly significant in a least-squares (curve fit) sense. It is a simple exercise to plug in the variables at the base case, where all of our coded variables are zero. From this, it is established that the maximum of the three alternatives is E(profit)_{No test}. One-way sensitivity analysis can then be accomplished by varying one factor at a time and comparing the three expected utilities. This point of view would indicate that there are only three significant variables: ORP, OR and DC. Obviously, there is some information lost at this point due to the approximation. The question is how much. Upon closer inspection of the tornado diagram, our coefficients of regression have picked out the most significant factors in predicting the optimal test decision. Again, there is only one variable that affects the initial test decision (ORP). With this in mind, the information lost may not be as critical as it initially appears.

Some indication of accuracy can be calculated after some clarification of the tornado diagram. It was found that the decision policy change-over point predicted by the approximations agrees to within 0.01 of the factor perturbation. This level of accuracy is far beyond that required for the purpose of sensitivity analysis in this model. The cost, in terms of model runs, is the eight fractional screening design runs plus the 2^3 full factorial design runs plus the six axial points in the central composite design. These 22 runs exceed the 14 runs needed for a basic tornado diagram, but the tornado diagram analysis required an additional ten or more runs to clarify the results. RSM's frugal use of model runs has only begun to be exploited, since no further runs are required to develop additional strategy region graphs. The RSM equations can be used in their current form to predict a response to any number of changes in the parameters across the evaluated design region. The usefulness of the approximations formulated using RSM is seen again in the two-way analysis discussed in the next section.

5.2. Discussion of two-way sensitivity analysis
Standard sensitivity analysis leads to an examination of two-way perturbations. The obvious factors of interest are those that proved significant in the one-way analysis, but other variables may have significant two-way interactions. As discussed previously, DPL rainbow diagrams were used to collect the data required for the two-way sensitivity analysis. Each rainbow diagram required eleven runs of the model. The model was evaluated at ten intervals of OR, ranging from -27 to 27. ORP was set across the range (from -0.1 to 0.1) for these runs. It would take 11 such rainbow diagrams to complete a strategy region as seen in Figure 7. The total cost, in terms of model runs (influence diagram solution cycles), is 110 per strategy region. In addition, we generated the strategy region by using the data off each of the eleven rainbow diagrams. This would amount to a significant number of influence diagram evaluations and substantial analyst time if there were a number of significant factors involved. Further, in the case where there are five significant factors, two-way sensitivity analysis would require

$$\binom{5}{2} = 10 \ \text{strategy region graphs.}$$

These ten strategy region graphs would demand a total of 1100 model runs plus the time spent pulling 1100 data points off the rainbow diagrams. Another disadvantage of using the rainbow diagram is that the optimal policy change is shown, but the actual optimal policy is not specifically identified. This requires model runs to determine the optimal decision policy. At this point in the analysis, the initial method required nearly 160 model runs as compared with RSM's 22. Obviously, RSM can lead to a significant savings in time and model runs. However, the RSM equations are approximations; hence, some accuracy has been forfeited.
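The run counts quoted in this section follow from simple arithmetic; the short sketch below restates them, with all numbers taken from the text.

```python
from math import comb

graphs = comb(5, 2)              # two-way strategy regions among five significant factors
rainbow_total = graphs * 110     # 110 influence diagram solution cycles per region
rsm_total = 8 + 8 + 6            # screening runs + 2^3 factorial runs + axial points

print(graphs, rainbow_total, rsm_total)   # 10 1100 22
```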


There were several strategy regions generated with both methods for comparison. The results were similar from an accuracy viewpoint. The RSM method was accurate to within $1000 of the expected profit results and agreed with the rainbow diagram based strategy regions to within 0.01 of the decision crossover point. A comparison of strategy regions for the two methods is shown in Figure 10. As before, the result shows that there are only two optimal options in this region (No test and Experimental seismic testing). Therefore, the expected profit for the core sample test is never the highest in this region. These strategy regions would be used to locate areas where the decision policy is sensitive to changes in two variables. Therefore, this accuracy appears sufficient for sensitivity analysis in decision analysis. The 22 RSM runs gave response surface equations that could be used to produce sensitivity analysis on any combination of the seven group factors involved. This efficiency would be more pronounced as the model increases in complexity. Finally, there is one additional advantage of the RSM equations for each decision alternative. If there are no interaction terms, the analyst is assured that no two-way sensitivity analysis is required. This is an important insight on the problem structure and can result in savings of analyst time.

6. CONCLUSION

This paper illustrated the use of RSM as a sensitivity analysis tool for DA. It was shown that any single or multi-attribute DA value (or utility) function can be transformed to a response function of the uncertain variables. A sensitivity analysis framework designed to facilitate simultaneous perturbation of a number of uncertain variables was presented. RSM was used with influence diagram models, but the methodology can also be used with decision tree models. The present approach was illustrated with the well-known Oil Wildcatter Problem.

RSM can be used efficiently and effectively in the sensitivity analysis of decision analysis problems. The RSM approach is more efficient than current DA sensitivity analysis techniques and the RSM coefficients can provide improved insights for decision makers. RSM offers the potential to significantly reduce the time to perform the sensitivity analysis. For the illustration, the one-way and two-way sensitivity analysis results were very close. The minor loss of accuracy by using the RSM approach for sensitivity analysis is offset by the analyst time savings. In addition, RSM coefficients (which show the direct impact of the variable on the alternative's value/utility) can provide important insights for the decision makers.

REFERENCES

ADA Decision Systems, DPL Advanced Version User Guide, ADA Decision Systems, 2710 Sand Hill Road, Menlo Park, CA 94025, 1995.
Bauer, K.W., Parnell, G.S. and Meyers, D.A., 'Identifying key uncertainty relationships in decision analysis via designed experiments', Proceedings of the NATO AC/243 (Panel 7) Symposium on 'Coping with Uncertainty in Defence Decision Making', The Hague, The Netherlands, 16-18 January 1995.
Box, G.E.P. and Draper, N.R., Empirical Model-Building and Response Surfaces, New York: Wiley, 1987.
Clemen, R.T., Making Hard Decisions: An Introduction to Decision Analysis, Boston: PWS-Kent Publishing Company, 1991.
Howard, R.A., 'Decision analysis: practice and promise', Manage. Sci., 34, 679-695 (1988).
Howard, R.A. and Matheson, J.E., Readings on The Principles and Applications of Decision Analysis, Vol. I, General Collection, Strategic Decisions Group, 1983.
Khuri, A.I. and Cornell, J.A., Response Surfaces: Designs and Analyses, New York: Marcel Dekker, 1987.
Kirkwood, C.W., Strategic Decision Making: Multiobjective Decision Analysis with Spreadsheets, Belmont, CA: Wadsworth Publishing, 1997.
Lowell, D.G., 'Sensitivity to relevance in decision analysis', Ph.D. Dissertation, Department of Engineering-Economic Systems, Stanford University, 1994.
McNamee, P. and Celona, J., Decision Analysis with Supertree, 2nd edn, South San Francisco, CA: The Scientific Press, 1990, p. 155.
Meyers, R.H., Response Surface Methodology, Virginia Polytechnic Institute and State University, 1976.
Meidt, G.J. and Bauer, K.W. Jr., 'PCRSM: A decision support system for simulation metamodel construction', Simulation, 59, 183-191 (1992).
Raiffa, H., Decision Analysis: Introductory Lectures on Choices under Uncertainty, Menlo Park, CA: Addison-Wesley, 1968.
Rios Insua, D. and French, S., 'A framework for sensitivity analysis in discrete multi-objective decision making', Eur. J. Oper. Res., 54, 176-190 (1991).
Shachter, R.D., 'Evaluating influence diagrams', Oper. Res., 34, 871-882 (1986).
significantly reduce the time to perform the sensi- Res., 34, 871 – 882 (1986).


Smallwood, R.D. and Morris, P.A., 'A task force decision analysis', Oper. Res., 28, 61-80 (1980).
Stone, M.A., 'Decreasing nonconformance of parts in the air force supply system', MS Thesis, AFIT/GOR/MA/88D-6, School of Engineering, Air Force Institute of Technology (AU), Wright-Patterson AFB, OH, December 1988 (AD-A206162).
INDIA, User's Guide, Version 2.0, Student guide, Decision Focus Incorporated, Boston: PWS-Kent Publishing Company, 1991.
