by
Matthew V. Cilli
A DISSERTATION
DOCTOR OF PHILOSOPHY
ADVISORY COMMITTEE
Abstract
The U.S. Department of Defense (DoD) has recently revised the defense acquisition system to address suspected root causes of unwanted acquisition outcomes. This dissertation begins with an in-depth review and interpretation of the revised defense acquisition system as put forth in the 7 January 2015 DoDI 5000.02. One of the major changes in the revised acquisition system is an increased emphasis on trade-offs made between capability requirements and lifecycle costs early in the acquisition process in order to ensure realistic program baselines are established such that associated lifecycle costs of a contemplated system are affordable within future budgets. Given that systems engineering trade-off
analyses will play a pivotal role in future defense acquisition efforts, this research
employed a two phase, exploratory sequential and embedded mixed methods approach
(Creswell, 2014) to take an in-depth look at the state of literature surrounding systems
engineering trade-off analyses, identify potential pitfalls associated with the typical
execution of a systems engineering trade-off analysis, quantify the risk that potential
pitfalls pose to acquisition decision quality, suggest remedies to mitigate the risk of each
pitfall, and measure the potential usefulness of contemplated innovations that may help
improve the quality of future systems engineering trade-off analyses. In the first phase of
this mixed methods study, qualitative data were captured through field observations and direct interviews of practitioners conducting systems engineering trade analyses. In the second phase, a larger sample of systems engineering professionals involved in defense acquisition was surveyed to help interpret the qualitative findings of the first phase. The
survey instrument was designed using Survey Monkey and was deployed through a link
posted on several groups within LinkedIn and was also sent directly via email to those with
known experience in this research area. The survey was open for a two month period and
collected responses from 181 participants. The findings and recommendations of this research are presented in the chapters that follow.
Acknowledgments
I would like to thank my committee chair, Dr. Robert Cloutier, who fearlessly jumped into
my PhD pursuit midway through the journey to provide critical leadership and stability
during an unexpected change in faculty. His counsel helped me to navigate the unknown
terrain and always kept me focused on the next checkpoint. I would
also like to express my deep gratitude to my technical advisor, Dr. Gregory Parnell, for the
patient direction and expert advice he provided to me over the last few years. Through his
guidance I learned the value of grounding research in theory while simultaneously building a bridge to practice in order to see the most satisfying returns. I
would also like to thank Dr. Teresa Zigh for her friendly and timely guidance through the
Stevens Institute of Technology Committees for the Protection of Human Subjects process
and for her suggestions regarding writing style. I would like to recognize the major
contributions that Mr. Clifford Marini made to this research as he spearheaded all computer
code development efforts needed to explore some of the innovations introduced by this
dissertation. In a similar vein, I’d like to recognize Mr. Rich Swanson for his role in some
of the coding efforts as well as his service as a trusted sounding board. I would like to
thank the U.S. Army Armament Research, Development and Engineering Center for their financial support and wealth of case studies. Finally, I'd like
to thank my wonderful wife, Val, for her loving inspiration, hopeful encouragement, and steadfast support.
Table of Contents
Acknowledgments............................................................................................................... v
1.2.2 U.S. DoD Success Rates for New Product Developments ........................... 7
2.3 Literature Review Findings Regarding the State of Previous Research ............ 78
2.3.9 The Digital Thread & Lessons Learned From EIS Efforts ......................... 91
2.4.2 Crosswalk Between Research Gaps & Dissertation Research Questions ... 98
3.5 Population Size, Sample Size, and Confidence Interval .................................. 103
4.1 Perceptions Regarding Recent Changes in Defense Acquisition System ........ 109
4.2 Perceived Level of Difficulty Associated with Early SE Trade-Off Analyses 111
4.7 Perceived Usefulness of Stakeholder Value Scatterplot With Uncertainty ..... 130
Vita.................................................................................................................................. 249
List of Tables
Table 1-1. Costs Associated with Five Cancelled Development Programs .................... 10
Table 1-4. Comparing Revised Expectations of AoAs with Previous Expectations ....... 34
Table 1-6. Comparing New Expectations for Establishing a Program Office ................. 39
Table 2-1. Ulrich’s List of Decisions Associated with Product Development ................ 69
Table 2-2. Scoring Criteria for Documents Within Master Reference Library ............... 74
Table 2-4. Selected References – Analyze the Decision Information Activity .............. 76
Table 2-5. Selected References – Make and Manage Decisions Activity ....................... 77
Table 2-6. Hazelrigg's Assessment Selection Methods Against Required Properties .... 87
Table 4-1. 40 Potential Pitfalls and Their Perceived Risks ........................................... 118
Table 5-1. Crosswalk Between Potential Pitfalls and Detailed Recommendations....... 136
Table 6-1. Crosswalk Between SE Terms, INCOSE SE Handbook & ISO15288 ........ 144
Table 6-2. Illustrating the One-to-One Mapping of Objective and Measure ................. 154
Table 6-5. End and Inflection Points for sUAV Functional Performance Value .......... 164
Table 6-7. Descriptions for Buzzard I, Buzzard II, Cardinal I, Cardinal II ................... 171
Table 6-8. Descriptions of Crow I, Crow II, Pigeon I, and Pigeon II ............................ 172
Table 6-9. Descriptions of Robin I, Robin II, Dove I, and Dove II ............................... 173
Table 6-10. Generic Physical Elements to Fundamental Objectives Mapping ............. 174
Table 6-12. Consequence Scorecard Example for sUAV Case Study........................... 180
Table 7-1. Restatement of Research Questions & Hypotheses (Revisited) .................. 205
List of Figures
Figure 1-1. Success Rates for New Product Development Efforts .................................... 6
Figure 1-4. Visualizing Inputs, Outputs, & Transformations of the Def Acq Sys .......... 16
Figure 1-5. Visualizing Wholes, Parts, & Relationships of the Def Acq Sys.................. 17
Figure 1-6. Visualizing Boundary, Interior, & Exterior of the Def Acq Sys ................... 19
Figure 1-7. Visualizing Structure, Function, & Process of the Def Acq Sys .................. 22
Figure 1-8. Visualizing Emergence, Hierarchy, & Openness of the Def Acq Sys .......... 25
Figure 1-9. Visualizing Executing Agents & Tasks Throughout the Def Acq Sys ......... 27
Figure 1-10. Comparing Usage of Key Terms in Old and New DoDI 5000.02 .............. 30
Figure 1-11. Visualizing Key Changes in the Def Acq Sys Using a Swimlane Chart .... 40
Figure 1-12. Visualizing Key Changes in the Def Acq Sys Using a Systemigram ......... 41
Figure 2-5. Keyword Cloud for Decision Analysis Journal 2009 - 2013 ........................ 59
Figure 2-6. Keyword Cloud for Operations Research Journal 2009 -2013 ..................... 60
Figure 2-7. Keyword Cloud for Management Science Journal 2009 - 2013 ..................... 63
Figure 2-9. Word Cloud of Information Visualization Journal Keywords 2009 - 2013 . 66
Figure 3-1. SIT CPHA Approval Letter for Study Protocol .......................................... 102
Figure 4-1. Perceptions Regarding Recent Changes in Defense Acquisition Process .. 110
Figure 4-3. Cluster of Perceived Risk Posed by Potential Pitfalls ................................. 117
Figure 6-3. Tradeoff Studies Throughout the Systems Development Life Cycle ......... 145
Figure 6-4. Key Properties of a High Quality Set of Fundamental Objectives ............. 152
Figure 6-6. Crosswalk - Fundamental Objectives & Stakeholder Need Statements ..... 154
Figure 6-10. Graphical Representation of Value Functions for sUAV example ........... 165
Figure 6-12. Assessment Flow Diagram (AFD) for a Hypothetical Activity ................. 176
Figure 6-18. Value Scatterplot for the sUAV Example ................................................. 191
Figure 6-19. Weightings as Generated by Focus Group 1 and Focus Group 2 ............. 194
Figure 6-22. sUAV Performance Value Sensitivity to Changes in Swing Weight ....... 198
Figure 6-25. Integration Pattern of Models & Actors of the ISEDM Process. .............. 203
Figure 7-1. Summary of the ISEDM Process in Illustration Form ................................ 210
1.1 Introduction
The U.S. Department of Defense (DoD) has recently revised the defense acquisition system
to address suspected root causes of unwanted acquisition outcomes. One of the major changes in the revised acquisition system is an increased emphasis on trade-offs made between capability requirements and lifecycle costs early in the acquisition
process in order to ensure realistic program baselines are established and appropriately
vetted to ensure that associated lifecycle costs of a contemplated system are deemed to be
affordable within future budgets. Given that systems engineering trade-off analyses will
play a pivotal role in future defense acquisition efforts, this research employed a two-phase,
exploratory sequential and embedded mixed methods study (Creswell, 2014) to take an in-
depth look at the state of literature surrounding systems engineering trade-off analyses,
identify potential pitfalls associated with the typical execution of a systems engineering
trade-off analysis, quantify the risk that potential pitfalls pose to acquisition decision
quality, suggest remedies to mitigate the risk of each pitfall, and measure the potential
usefulness of contemplated innovations that may help improve the quality of future systems
engineering trade-off analyses.¹ In the first phase, the needs of requirements writers and product development decision makers were explored through field observations and direct interviews as they sought to understand the system level cost, schedule, and performance consequences of a set of contemplated requirements. From these case studies, theories for improved processes were generated and a description of the emerging process was documented. In the second phase of this research, a larger sample of systems engineering professionals involved in defense acquisition was surveyed to help interpret the qualitative findings of the first phase.

¹ This chapter is an extended version of Cilli, M., Parnell, G., Cloutier, R., and Zigh, T. (2016, Expected). A Systems Engineering Perspective of the Revised Defense Acquisition System. Systems Engineering, XX(X), XXX-XXX. Accepted for publication 11/29/2015.
New product development is a difficult task in any domain. Such an endeavor within the
U.S. defense sector may be especially challenging as the process involves interactions
between industrial suppliers of goods and services with multiple government entities often
trying to balance competing objectives. Decisions associated with how to best balance these competing objectives are often made in an environment of high visibility, dispersed accountability, and high uncertainty. New product development for the U.S. Department of Defense (DoD) is achieved through the defense acquisition system. The defense acquisition system has long been criticized for its shortcomings and opportunities for improvement. Despite its shortcomings, the defense acquisition system
has produced many weapons that have performed well in battle. As the author of Arming
the Eagle states, "...for the most part weapons systems, developed in one manner or
another, worked when they had to, and military and national goals were usually achieved."
(Jones 1999)
Past successes aside, the defense acquisition system must provide more affordable systems
as a matter of national security. Gansler describes the urgent need for improvement as
follows:
"Cynics might point out that there has long been a need for reform of the Defense
Department's acquisition processes. Hundreds of studies, reports, and even entire books
have been written about both the need and the actions required. Yet the costs of weapons
and services necessary for the nation's security have continued to rise, and the time to
deliver them has been increasing. It is fair to ask, "Why will this time be different?" The
answer is that the nation's security has reached the tipping point - the point at which
covering the full range of national security considerations (terrorism at home and abroad,
expeditionary conflicts, regional wars, activities associated with stability and control in
unstable areas of the world, potential peer competitors, and nuclear deterrence) has simply
become unaffordable in an era in which the nation must spend an increasing share of its [resources on the care of its aging] population, the rebuilding of its deteriorating infrastructure, and the repayment of debts incurred during the economic collapse in the first decade of the twenty-first century."

After decades of reform efforts without evidence of improved acquisition outcomes, the U.S. Government understands that
this time must be different. This research takes a thorough look at the revised defense
acquisition system and uncovers reasons to believe that this time may indeed be different.
Changes from the previous defense acquisition process to the current process are
significant and may be cause for some cautious optimism in the U.S. This research will
show how the architects of the revised defense acquisition system have increased emphasis
on systems engineering activities applied early in the lifecycle so that meaningful trade-
offs between capability requirements and lifecycle costs can be explored as requirements
are being written to ensure realistic program baselines are established such that associated
lifecycle costs will likely fit within future budgets. If this time is to be truly different, early
identification of gaps that are likely to emerge as the defense acquisition community
attempts to execute the new process is vital. These potential gaps are captured as emerging research needs throughout this dissertation.

The next section of this dissertation presents data that show that new product developments
are indeed difficult in any domain and goes on to explain how such endeavors in the defense
sector may be especially difficult. The dissertation then presents original analysis of data
collected from multiple sources and quantifies current success rates of new product
development within the DoD. Potential root causes of the unwanted outcomes are
discussed as are the recent changes in the defense acquisition system enacted to address
the potential root causes. Two systems thinking methodologies are then applied in an in-depth examination and interpretation of the revised defense acquisition system as put forth in the 7 January 2015 DoDI 5000.02. The
result is a novel and compelling view that clearly elucidates the core activities, actors,
interfaces, and purpose of this complex process. The research identifies essential
differences between the current and the previous defense acquisition processes and associated activities, and wraps up by discussing process gaps likely to emerge as the defense acquisition community attempts to execute the new process.

New product development is a difficult undertaking in any market across the globe. The Product Development and Management Association (PDMA) 2012 survey found that approximately 40% of all new product developments fail to achieve market success. The
failure rate jumps to about 54% if the new product development includes high levels of
innovation. Since the PDMA started conducting this assessment in 1990, the new product
development failure rate has consistently been assessed near the 40% mark (1990 - 42%,
1995 - 41%, 2004 - 41%) (Markham & Lee 2013). These data suggest that high failure rates are a persistent feature of new product development.
Figure 1-1. Success Rates for New Product Development Efforts with Various Levels of Innovation
(author's original graphic using data from (Markham & Lee 2013))
Beyond failing to meet market success, new product development efforts of any kind often
exhibit other unwanted symptoms of duress such as cost overruns and schedule slips.
Figure 1-1 was created using the data from the PDMA 2012 survey and it shows a
relationship between the degree of innovation and success rate as assessed across four
measures - on time, on budget, met technical objectives, met market objectives. Note that
only 44% of new product developments with a nominal degree of innovation met their development schedule and 49% met their allotted development budget. For products with
higher levels of innovation, development schedules are met by only 29% of the projects
and development budgets are adhered to by a mere 32% of the projects (Markham & Lee 2013). This record of poor outcomes for development projects around the world and across
industries indicates that new product development efforts are tremendously difficult tasks,
regardless of market.
While new product developments may be difficult tasks in all domains, such endeavors in
the defense sector are especially difficult as the process involves interactions between
industrial suppliers of goods and services with multiple government entities, often trying to balance competing objectives. The defense industry is a major sector of the U.S. economy, but because it has essentially a single buyer (the Department of Defense) and a small group of major suppliers, it does not behave like a typical commercial market.

Recently DoD issued Performance of the Defense Acquisition System: Annual Report
2013. In the forward of the annual report, the Under Secretary of Defense Acquisition,
Technology and Logistics begins by describing the motivation for measuring acquisition
process performance.
While the United States achieves its national security missions by equipping its military
forces with the best weapons systems in the world, questions continue about the
performance of the defense acquisition system. How effective is it? How can that
effectiveness be objectively measured? Can we use those measures to affect behaviors with
appropriate incentives or determine which policies and procedures improve results and which do not?

Performance of the Defense Acquisition System: Annual Report 2013 presents some such measures; this section supplements them with additional measures that could help round out an assessment of acquisition performance.
As one measure of new product development performance within DoD, consider the data compiled by the GAO (Sullivan 2014). Analysis of this information reveals that of the 87 programs in DoD’s 2013 major defense acquisition portfolio, about 36% are at or below their first full estimate of total acquisition cost. This seems
to be on par with worldwide, multi-industry records for new products involving moderate
to high innovation discussed above. To get a more complete picture of new product
development outcomes with regards to cost, however, the data must be assessed beyond a
simple pass/fail measure. Figure 1-2 illustrates the distribution of cost performance of the
87 programs as a dot plot with associated descriptive statistics. Note that although a good
portion of the programs are at or below cost expectations, the total cost growth for those
that miss can be quite large with at least six projects missing their cost targets by about
300% or more.
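The shift described above, from a single pass/fail percentage to a distributional view of cost growth, can be sketched in a few lines of Python. The figures below are hypothetical stand-ins chosen for illustration only; they are not the 87-program GAO dataset behind Figure 1-2.

```python
# Hypothetical percent cost growth (relative to first full estimate) for a
# small portfolio of development programs -- illustrative values only.
cost_growth = [-12.0, -5.0, 0.0, 3.5, 8.0, 15.0, 42.0, 95.0, 180.0, 310.0]

# Simple pass/fail measure: share of programs at or below the first estimate.
at_or_below = sum(1 for g in cost_growth if g <= 0) / len(cost_growth)

# Distributional measures: mean and median tell very different stories when
# a few programs overrun by hundreds of percent.
mean_growth = sum(cost_growth) / len(cost_growth)
sorted_g = sorted(cost_growth)
n = len(sorted_g)
median_growth = (sorted_g[n // 2 - 1] + sorted_g[n // 2]) / 2  # n is even here

print(f"At or below first estimate: {at_or_below:.0%}")
print(f"Mean growth:   {mean_growth:.1f}%")
print(f"Median growth: {median_growth:.1f}%")
```

In this sketch the mean is pulled far above the median by a handful of large overruns, which is exactly the pattern a dot plot with descriptive statistics exposes and a lone pass/fail percentage conceals.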
Figure 1-2. Dotplot of Percent Change in Total Acquisition Costs From First Full Estimates
(author's original graphic using data from (Sullivan 2014))
By itself, this cost growth measure provides an incomplete assessment of cost performance. According to the "Performance of the Defense Acquisition System 2013 Annual Report" (Kendall 2013), about 30 major defense acquisition programs have been cancelled between 1995 and 2013. While it is neither realistic nor desirable to expect all research and development projects to successfully reach production, a high efficiency defense acquisition process should identify doomed concepts before investing major resources. To get a sense of the magnitude of costs associated with the
cancelled programs, Table 1-1 provides the costs of five of the thirty cancelled
development programs. These data were obtained from GAO-14-77 (Chaplain et al. 2014)
and they suggest that it is not uncommon for DoD to spend several billion dollars on a
development program prior to termination. The opportunity cost associated with sinking
extensive funding into development programs that will eventually be terminated is that this
funding could otherwise be spent elsewhere within the development portfolio and
potentially increase the pace at which superior capabilities are delivered to the soldier.

Unwanted trends in defense acquisition are not limited to cost issues. A recent National Research Council (NRC) report (Pre-Milestone A and Early-Phase Systems Engineering: A Retrospective Review and Benefits for Future Air Force Systems Acquisition) (Board &
others 2008) observed that in an era where development time of commercial technology
products has significantly fallen, the development time of major weapon systems has
dramatically risen. Page fifteen of the NRC report provides data that illustrates this
observation. That same data was used to create Figure 1-3 (Board & others 2008). Notice
that of the major programs sampled by the NRC report, the development time required by
recent programs is two to three times greater than the development time of historical
programs.
The NRC report provides an initial list of possible causes for such schedule increases.
Many causes for this trend have been suggested, including the increased complexity of
the tasks and the systems involved from both technological and human/organizational
perspectives; funding instability; loss of “mission urgency” after the end of the Cold
War; bureaucracy, which increases cost and schedule but not value; and the need to
satisfy the demands of an increasingly diverse user community. (Board & others 2008)
The next section of this chapter further explores the potential root causes of these unwanted trends as identified in the many published studies of Defense Acquisition. Such reports often identify potential root causes and offer some recommendations. In general, these reports call for early, robust, and rigorous assessments of a broad range of system level alternatives across a thorough set of stakeholder value criteria to include life-cycle costs, schedule, and performance. Table 1-2 provides summary observations from several of these reports.
Document: GAO-06-883, Stronger Practices Needed to Improve Technology Transition Process (Found 2007)
Observation: The services’ annual senior-level project portfolio reviews “do not include formal assessment of many of the technical and business criteria…such as determining if the costs, benefits, and risks are well understood and technology is affordable.”

Document: Defense Acquisition Performance Assessment Project (DAPA), January 2006 (Kadish et al. 2006)
Observation: "...the budget, acquisition and requirements processes are not connected organizationally at any level below the Deputy Secretary of Defense."

Document: GAO-07-388, An Integrated Portfolio Management Approach to Weapon System Investments Could Improve DoD's Acquisition Outcomes (Sullivan 2007)
Observation: "While DoD’s JCIDS process provides a framework for reviewing and validating the initial need for proposed capabilities, it does not focus on the cost and feasibility of acquiring the capability to be developed and fielded…Milestone A are often skipped…this practice eliminates a key opportunity for decision makers to assess the early product knowledge needed to establish a business case that is based on realistic cost, schedule, and performance expectations."

Document: GAO-08-619, Defense Acquisitions: A Knowledge-Based Funding Approach Could Improve Major Weapon System Program Outcomes (Sullivan 2008)
Observation: "DoD's inability to allocate funding effectively to programs is largely driven by the acceptance of unrealistic cost estimates and a failure to balance needs based on available resources...Development costs for major acquisition programs are often underestimated at program initiation ... in large part because the estimates are based on limited knowledge and optimistic assumptions about system requirements and critical technologies."

Document: Getting to Best: Reforming the Defense Acquisition Enterprise. A Business Imperative for Change, From the Task Force On Defense Acquisition Law and Oversight (Augustine & others 2009)
Observation: "The principle shortcomings of the existing requirements process are that: 1) it does not couple needs for specific future systems to an overall national defense strategy; and 2) requirements are largely determined by the military services without realistic input as to what is technically feasible from an engineering perspective…"

Document: GAO-09-665, Many Analyses of Alternatives Have Not Provided a Robust Assessment of Weapon System Options (Sullivan 2010)
Observation: "Although an AOA is just one of several inputs required to initiate a weapon system program, a robust AOA can be a key element to ensure that new programs have a sound, executable business case. Many of the AOAs that GAO reviewed did not effectively consider a broad range of alternatives for addressing a warfighting need or assess technical and other risks associated with each alternative."
Taken together, the criticisms listed in Table 1-2 show that the defense acquisition system
did not have sufficient mechanisms in place to link the voice of the end user with the voice
of the design engineer or the voice of the budgetary analyst. Before the recent updates (to
be discussed in the next section), the defense acquisition system was an open-loop system in which capability requirements were generated by the military end user. Due to the open loop nature of the process however, these capability requirements were not informed by technical and cost realities associated with the physical systems that might satisfy them. Requirements writers sometimes sensed the risk of the open loop process and tried to close the loop by contacting individual subject matter experts on the technical side of the defense acquisition enterprise. In many cases, however, a draft requirements document, if subjected to early systems engineering analyses, would reveal that no conceivable system could simultaneously meet all threshold requirements within the requirements document. But because such analyses were rarely done, programs were initiated to pursue systems that addressed the ill-conceived requirements. These programs would quickly show predictable symptoms of duress such as cost overruns and schedule slips.

The recent changes in the defense acquisition system strive to put an end to the inefficient
acquisition scenario described above by moving away from the open loop requirements
rigorous assessments of a broad range of system level alternatives across a thorough set of
stakeholder value criteria to include life-cycle costs, schedule, and performance. These
changes in the defense acquisition system are discussed in the following two sections.
Since 2009, Congress and DoD have taken significant actions to reverse the unwanted trends and improve outcomes of the DoD's new product development process. Among these actions:

November 2013: DoD issued Interim DoD Instruction 5000.02, Operation of the Defense Acquisition System

January 2015: DoD issued DoD Instruction 5000.02, Operation of the Defense Acquisition System
This introductory chapter will show how the new legislation and associated initiatives
make organizational and policy changes in order to place increased emphasis on systems
engineering early in the lifecycle in an effort to balance the often competing trades of performance, life cycle cost, and schedule. The chapter then examines the revised defense acquisition system described in 7 January 2015 DoDI 5000.02 (Instruction 2015) through two systems thinking methodologies to produce a novel view of the core activities, actors, interfaces, and purpose of this complex process.
A systems thinking tool known as the Conceptagon (Boardman & Sauser 2008; Boardman
& Sauser 2013; Boardman 2010; Boardman et al. 2009) is employed in this section to
structure the examination and interpretation of the updated defense acquisition system. The
Conceptagon has twenty-one elements organized as seven triads: inputs, outputs, and transformations; wholes, parts, and relationships; boundary, interior, and exterior; structure, function, and process; emergence, hierarchy, and openness; command, control, and communication; and variety, parsimony, and harmony.
Each one of these triads will be applied as a lens through which the updated defense
acquisition system will be examined. Observations made through the application of each
lens will be captured as elements of a Systemigram - a soft systems method that combines
simple shapes with short phrases to portray strategic intent of a system of interest on a
single page in order to stimulate creative yet focused discussion about the core aspects of
a system (Boardman & Sauser 2008; Boardman & Sauser 2013). With the application of
each Conceptagon triad lens, additional strands of the Systemigram will be added to
gradually build a full, rich description of the defense acquisition system. This technique
enables the authors to pull long threads of interacting system elements and communicate them on a single page.

Systems exist to transform inputs into outputs. The defense acquisition system is no
exception as it transforms and sustains capability requirements and national resources into
enhanced military capabilities. Figure 1-4 provides a sketch of this transformation. The
January 7, 2015 DoD Instruction 5000.02 Operation of the Defense Acquisition System is
a 154 page document that "provides the detailed procedures that guide the operation of the defense acquisition system."
Figure 1-4. Visualizing Inputs, Outputs, and Transformations of the Defense Acquisition System
Through the Use of a Systemigram
The defense acquisition system is part of a larger whole, an extended acquisition enterprise.
The extended acquisition enterprise seeks to yield enhanced military capabilities in order
to make a more robust military action option available to national leaders striving to achieve national security objectives. Effective execution of this extended national security enterprise requires strong interaction between
the defense acquisition system, the Joint Capabilities and Integration Development System
(JCIDS) and the Planning, Programming, Budgeting, and Execution (PPBE) process.
Figure 1-5 depicts this relationship between the major elements of the extended acquisition
enterprise.
Figure 1-5. Visualizing Wholes, Parts, and Relationships of the Defense Acquisition System Through
the Use of a Systemigram
7 January 2015 DoDI 5000.02 describes the importance of this interrelatedness as follows:
Acquisition, requirements, and budgeting, are closely related and must operate simultaneously
with full cooperation and in close coordination. Validated “Capability Requirements” provide
the basis for defining the products that will be acquired through the acquisition system and the
budgeting process determines Department priorities and resource allocations and provides the
funds necessary to execute planned programs. Throughout a product’s life cycle, adjustments
may have to be made to keep the three processes aligned. Capability requirements may have
to be adjusted to conform to technical and fiscal reality. Acquisition programs may have to
adjust to changing requirements and funding availability. Budgeted funds may have to be [realigned to reflect changes in programs] and priorities. Stable capability requirements and funding are important to successful program
execution. Those responsible for the three processes at the DoD level and within the DoD
Components must work closely together to adapt to changing circumstances as needed, and to keep the three processes aligned.

The JCIDS and PPBE processes are considered to be exterior to the defense acquisition system but
within the boundaries of the extended acquisition enterprise. Interior to the defense
acquisition system are the Materiel Solution Analysis Phase, the Technology Maturity &
Risk Reduction Phase, the Engineering & Manufacturing Development Phase, the
Production & Deployment Phase, and the Operations & Support / Disposal Phase. Figure
1-6 represents these phases as smaller ovals within the larger oval of the defense acquisition
system.
Figure 1-6. Visualizing Boundary, Interior, and Exterior of the Defense Acquisition System Through
the Use of a Systemigram
Although the system of interest of this research is the defense acquisition system, Figure
1-7 illustrates the structure, function, and process of this system as well as the closely
related PPBE and JCIDS processes so that interactions with the defense acquisition system can be seen in context.

The defense acquisition system begins with a Materiel Development Decision (MDD) in which validated capability requirements are considered in light of the Future Years Defense Program (FYDP) generated within the PPBE Process.
A Defense Acquisition Program is initiated if, at the MDD point, it is determined that a
capability gap is best filled with a new product. Throughout the follow-on phase, the
Materiel Solution Analysis Phase, alternative concepts are assessed in order to find the
solution space that best balances competing objectives of performance, life cycle cost, and schedule. The phase culminates in the Milestone A decision point where a specific conceptual design is selected and resources
are mobilized to execute the Technology Maturation and Risk Reduction Phase. Activities
within this phase are conducted to mature technologies associated with the selected
conceptual design and to reduce uncertainty around performance and cost estimates
through modeling and simulation and testing activities. These activities inform the crafting
of the Capabilities Development Document (CDD) and the development of the Request for
Proposal (RFP) soliciting Industry Partners for the follow-on phase. This phase culminates
with the decision to release the RFP, execution of a proposal review and selection process,
and the decision to award the development contract(s) at the Milestone B decision point. Milestone B initiates the Engineering and Manufacturing Development (EMD) Phase, which contains activities that generate detailed designs, multiple prototypes,
extensive test data, and production plans. EMD concludes with an initial production
decision conducted at Milestone C followed by full rate production decisions within the
Production and Deployment Phase. When combined with new equipment training, the produced and deployed items deliver the intended operational capability.
The Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01H establishes and
describes the JCIDS process. As indicated in Figure 1-7, the JCIDS process begins with
high level strategy and guidance in the National Security Strategy, National Defense
Strategy, Quadrennial Defense Review, Defense Planning Guidance (DPG), etc., informing assessments that identify capability gaps. Substantial capability gaps trigger the formulation of an Initial Capabilities Document (ICD), which forms the basis for follow-on capability requirements documents (CDDs and CPDs) generated to guide the development, procurement, and fielding of solutions that satisfy the capability requirements
and close associated capability gaps. Interaction between the JCIDS process and the
defense acquisition system will be explored more fully in later sections of this dissertation.
Figure 1-7. Visualizing Structure, Function, and Process of the Defense Acquisition System and the
Extended Acquisition Enterprise Through the Use of a Systemigram
DoD Directive 7045.14 (January 25, 2013) describes the PPBE process.
The process starts with a Planning phase that accepts inputs from the National Security
Strategy, threat assessment, and the Quadrennial Defense Review as appropriate to inform
the development of a National Defense Strategy that guides the establishment of planning guidance. The Planning phase is followed by a Resourcing phase that has a programming element, a budgeting element, and an execution element. Within the programming element, DoD Components accept planning guidance and develop program proposals that feed a program review; this programming information simultaneously serves as initial inputs to budget estimates of the budgeting element, which in turn feed a budget review. The budget review is conducted in coordination with the program review, and the coordinated output feeds the publication of decision documents. Published decision documents inform each DoD Component's Program Objective Memorandum (POM), which communicates the
final resource allocation decisions of the Military Departments. This information is pushed
to the FYDP as the approved program and financial plan for the DoD expressed in cost,
manpower, and force structure dimensions over a 5-year period (force structure for 8-year
period). Execution reviews are conducted annually to assess the degree to which programs
and financing are meeting planning guidance. Adjustments to the decision documents are made as needed based on the findings of these execution reviews.
Openness to changes that occur outside the system of interest (the defense acquisition
system) boundary is one of the essential features envisioned by the architects of the updated
defense acquisition system. The architects of the updated defense acquisition system
emphasize that the inputs to this system are not static and that successful execution of the
extended defense acquisition enterprise requires strong interaction and coordination among
the requirements, budgeting, and acquisition elements throughout the process. Feedback
loops established between the defense acquisition system and the other systems of the
extended defense acquisition enterprise seek to capture changes in the affordability goals,
the operational needs, or the technology opportunities inputs (changes caused by external
environment or by the acquisition process itself) and then inform those same external
systems of the consequences of the sensed changes so that the external systems can adjust
accordingly.
The Systemigram of Figure 1-8 offers a representation of this feedback loop notion. Notice
the arrows originating from each capability document and terminating at the appropriate
acquisition phase to indicate a guiding action. Note too how the feedback loop is completed
by connecting the output of each acquisition phase with the subsequent capability
document. For instance, activities within the Materiel Solution Phase inform the crafting
of the draft CDD, activities of the Technology Maturity & Risk Reduction Phase inform
the development of the validated CDD, and the activities of the Engineering &
Manufacturing Development Phase inform the writing of the CPD. Taken together, the
arrows flowing between the acquisition phases within the defense acquisition system, the JCIDS process, and the PPBE process form a feedback loop structure that represents the openness of the defense acquisition system.
Figure 1-8. Visualizing Emergence, Hierarchy, and Openness of the Defense Acquisition System
Through the Use of a Systemigram
With this openness to change comes a particular form of organization known as hierarchy.
This hierarchy organizational scheme is evident in that each system element is charged
with a particular span of care, talent is distributed across each system element, relationships
between elements internal and external to the system are authentic, and the discovery cycle operates continuously. With hierarchy comes the closely related systems concept of emergence. The defense acquisition system displays emergent attributes in that each
element of the defense acquisition system matures a new product development design from
its previous state but the value of the military capability delivered through the execution of
the defense acquisition system as a whole cannot be measured by simply summing the value obtained upon the completion of each acquisition system element (Boardman & Sauser, 2008; Boardman & Sauser, 2013; Boardman, 2010; Boardman et al., 2009).
DoDI 5000.02 provides a rich description of how elements of command, control, and
communication are embedded within the defense acquisition system. The document
identifies parties responsible for decision making, how those decisions (commands) are
captured and communicated in key documents and how subsequent decisions are informed
by feedback loops (control). Since the feedback loops have already been discussed in the
Emergence, Hierarchy, Openness section above, this section will focus on command and
communication aspects of the defense acquisition system. Towards that end, the key
activities of the first four phases of the defense acquisition system and the executing agents
for each of those activities are identified in the swim lane chart of Figure 1-9 and the paragraphs that follow.
Figure 1-9. Visualizing Executing Agents & Associated Tasks Throughout the Defense Acquisition
System Through the Use of a Swim-Lane Chart
DECISION AUTHORITY: The Decision Authority is responsible for the major program decisions
across the process to include MDD, MS A, CDD Validation Decision, the RFP Release Decision,
MS B, Long Lead Item Procurement Decisions, MS C, and Full Rate Production Decisions.
DIRECTOR, COST ASSESSMENT AND PROGRAM EVALUATION (DCAPE): The DCAPE
prepares the Analysis of Alternatives (AoA) Study Guidance as part of the Materiel Solution
Analysis Phase and conducts Independent Cost Estimates within the EMD phase.
DOD COMPONENTS: DoD Components are responsible for conducting an affordability analysis
to yield tentative affordability goals prior to MDD and for refining those affordability goals through
the Materiel Solution Analysis Phase and for providing binding affordability caps in the Technology
Maturation and Risk Reduction Phase. The DoD Component conducts early systems engineering
(SE) analyses prior to MDD to gauge the degree to which considered materiel alternatives
potentially address capability gaps and to gain a rough order of magnitude understanding of the life
cycle cost implications of each materiel alternative and to understand the anticipated development
time required for each alternative. The DoD Component provides the results of the early SE
analyses to the Decision Authority in preparation for MDD and consumes the results of the early
SE analyses to inform the planning and execution of an AoA during the Materiel Solution Analysis
Phase "... informed by and in support of the AoA, to support selection of a preferred materiel solution
and development of the draft Capability Development Document (or equivalent requirements
document)." The DoD Component prepares an AoA study plan and conducts an AoA in the Materiel
Solution Analysis Phase and updates the AoA as needed in the Technology Maturation & Risk
Reduction (TMRR) Phase. The DoD Component estimates lifecycle costs for considered
alternatives in the Materiel Solution Analysis Phase and updates those estimates for alternatives
pursued through the Technology Maturation and Risk Reduction Phase and further refines those
estimates for detailed designs developed in the Engineering and Manufacturing Development Phase.
PROGRAM OFFICE: A Program Office is established during the Materiel Solution Analysis
Phase. During this initial phase, the Program Manager (PM) is responsible for conducting early SE
Assessments, crafting an Acquisition Strategy, preparing a preliminary reliability, availability,
maintainability, and cost rationale report, and generating a Request for Proposal (RFP) for the
TMRR Phase. In the TMRR Phase, the Program Manager conducts systems engineering trade-off
analyses to search for a solution space that strikes a compelling balance between capability, cost,
and schedule. Additionally in this phase, the PM updates the acquisition strategy, creates initial
reliability growth curves, conducts a preliminary design review, generates an RFP for EMD, and
creates an integrated test program and communicates the detailed elements of the integrated test
program through the Test and Evaluation Master Plan (TEMP). In the EMD phase, the PM updates
all the artifacts of TMRR phase based on results of the detailed design activities, conducts a critical
design review, and generates an RFP for the Production phase.
INDUSTRY: Industry potentially participates in the Materiel Solution Analysis phase through
competitive concept definition studies and also prepares proposals for the TMRR phase in response
to an RFP put forth by the PM. Industry is often responsible for maturing technologies and creating
preliminary designs in the TMRR phase and prepares proposals for the EMD phase. In the EMD
phase, Industry is creating detailed designs, building prototypes, executing development testing, and
preparing proposals for the production & deployment phase where Industry produces the item.
The defense acquisition system seeks to bring harmony to the process at large by bounding
approaches that do not adequately consider technical feasibility, turn a deaf ear to
operational utility, or downplay the investment needed to achieve the desired end-state. Within
the bounds of the framework, however, ample variety exists as one generates creative
alternatives to execute the defense acquisition system tailored to the acquisition context at
hand.
While these generic decision points and milestones are standard, MDAs have full latitude
to tailor programs in the most effective and efficient structure possible, to include
eliminating phases and combining or eliminating milestones and decision points, unless constrained by statute.
The previous section of this document provided an interpretation of the revised defense
acquisition system as described in 7 January 2015 DoDI 5000.02. This section seeks to
compare and contrast the previous and current defense acquisition systems in order to
identify areas of increased emphasis and unpack those areas to the extent required to gain insight into the intent of the revisions.
The changes from the previous version of DoDI 5000.02 to the current instruction are
significant. The current document sets "affordability" as one of the central themes and
cites the early application of systems engineering assessments and trade-off analyses used
in concert with a robust Analysis of Alternatives (AoA) as the model for achieving the
desired ends. A word count analysis of the previous and current versions of DoDI 5000.02
supports this observation. As seen in Figure 1-10, the term "affordability" appeared only
4 times in the 2008 version of DoDI 5000.02 whereas the current offering of DoDI 5000.02
uses the word "affordability" 108 times. Figure 1-10 also shows that the term "AoA" appears more than twice as often in the new document, the term "Systems Engineering" appears almost twice as often in the latest release of DoDI 5000.02, and the term "Trade-Off" (or "Trade-off", or "tradeoff") appears nine times as often in the new document as it did in the
old. The remainder of this section will go beyond a word count and will discuss more fully
the reforms put forth in the January 2015 DoDI 5000.02 document as they apply to the systems engineering community.
Figure 1-10. Comparing Usage of Key Terms in Old and New DoDI 5000.02
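A term-count comparison like the one summarized in Figure 1-10 can be reproduced with a few lines of Python. The sketch below is illustrative only; the term patterns and sample text are assumptions, not the counting procedure actually used for the figure. Note how a loose pattern folds the "Trade-Off", "Trade-off", and "tradeoff" spelling variants into a single count.

```python
import re

# Loose patterns for the terms tracked in Figure 1-10. The "trade-off"
# pattern folds "Trade-Off", "Trade-off", and "tradeoff" into one count.
TERM_PATTERNS = {
    "affordability": r"affordability",
    "AoA": r"\bAoA\b",
    "systems engineering": r"systems\s+engineering",
    "trade-off": r"trade[\s-]*off",
}

def count_terms(text):
    """Return case-insensitive occurrence counts for each tracked term."""
    return {term: len(re.findall(pattern, text, flags=re.IGNORECASE))
            for term, pattern in TERM_PATTERNS.items()}

# Tiny stand-in for the instruction text; real use would read the two
# extracted DoDI 5000.02 documents and compare the resulting counts.
sample = ("Affordability constraints shape each trade-off. "
          "Systems engineering tradeoffs inform the AoA and affordability goals.")
print(count_terms(sample))
```

Running the same function over the extracted text of both instruction versions yields the before-and-after counts charted in Figure 1-10.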
1.2.6.1 Affordability
Affordability Analysis has received particular emphasis in the updated Directives and
Instructions. The 7 January 2015 DoDI 5000.02 dedicates a five-page enclosure to the topic of affordability analysis, underscoring its importance to the requirements and acquisition communities. The Department has a long history of starting programs that proved to be unaffordable. The result of this practice has been costly program cancelations and truncated buys when programs could not be supported within reasonable expectations for future budgets. Affordability constraints for procurement and sustainment will be derived early in program planning processes. These constraints will be used to ensure capability requirements prioritization and cost tradeoffs occur early in the acquisition process.
The intent of this policy is to require affordability analysis that addresses the total life cycle
of the planned program, including beyond the Future Years Defense Program (FYDP), consistent with the policy in DoD Directive 5000.01 (Reference (a)). Affordability within the FYDP is part of the certification required by 10 U.S.C. 2366b (Reference (g)) for Major Defense Acquisition Programs (MDAPs) at and beyond Milestone B. Assessing life-cycle affordability of new and upgraded systems is also crucial to framing Analyses of Alternatives (AoAs), guiding capability requirements and engineering tradeoffs, and setting realistic program baselines to control life-cycle costs and help instill more cost-conscious behavior. This requires close collaboration and ongoing communication with the requirements community on the cost and risk implications of desired capabilities.
It is clear from just these two paragraphs that the architects of the revised defense
acquisition system require that meaningful trade-offs between capability requirements and
lifecycle costs be explored early and often in order to ensure realistic program baselines
are established such that associated lifecycle costs will likely fit within future budgets. The affordability analysis should be started early enough that it can inform early capability requirements trades and the selection of alternatives to
be considered during the AoA. Affordability constraints are not required before the MDD;
however, conducting some analysis before that point is beneficial. The best opportunity
for ensuring that a program will be affordable is through tailoring capability requirements
before and during the AoA(s) and early development. Thus, the Components will
incorporate estimated funding streams for future programs within their affordability
analyses at the earliest conceptual point and specify those estimates at the MDD and subsequent decision points.
Based on the Component’s affordability analysis and recommendations, the MDA will set affordability goals and caps at the following points:
(a) At MDD. Tentative affordability cost goals (e.g., total funding, annual funding) and inventory goals to help scope the AoA and provide targets around which to consider alternatives.
(b) At Milestone A. Affordability goals for unit procurement and sustainment costs.
(c) At the Development RFP Release Decision Point, Milestone B, and Beyond. Binding affordability caps for unit procurement and sustainment costs.
Notice how tentative affordability cost goals are expected to inform MDD and how, by the Development RFP Release Decision Point, affordability analysis products are to be used as binding caps.
The fundamental ends that are to be achieved by the defense acquisition system are
programs that deliver compelling military capabilities within the constraints of available
resources. As a means to these ends, the current DoDI 5000.02 instructs that meaningful
trade-offs between capability requirements and lifecycle costs be conducted. As such, the
architects of the revised defense acquisition system have amplified the instruction
associated with one of the tasks where such trades should be explored - the Analysis of
Alternatives (AoA). The new instruction signals the increased emphasis on the AoA by
dedicating an entire enclosure to the topic (Enclosure 9 of 7 January 2015 DoDI 5000.02).
To gain an understanding of the differences between the way the old and the new DoDI
5000.02 documents treat AoAs, Table 1-4 provides a side-by-side comparison of the relevant passages.
DEC 2008 DoDI 5000.02, Pages 58-59
Assess the extent to which the AoA:
a. Illuminated capability advantages and disadvantages;
b. Considered joint operational plans;
c. Examined sufficient feasible alternatives;
d. Discussed key assumptions and variables and sensitivity to changes in these;
e. Calculated costs; and,
f. Assessed the following:
(1) Technology risk and maturity;
(2) Alternative ways to improve the energy efficiency of DoD tactical systems with end items that create a demand for energy, consistent with mission requirements and cost effectiveness; and
(3) Appropriate system training to ensure that effective and efficient training is provided with the system.

JAN 2015 DoDI 5000.02, Page 126
Assess:
(1) The extent to which the AoA:
(a) Examines sufficient feasible alternatives.
(b) Considers tradeoffs among cost, schedule, sustainment, and required capabilities for each alternative considered.
(c) Achieves the affordability goals established at the MDD and with what risks.
(d) Uses sound methodology.
(e) Discusses key assumptions and variables and sensitivity to changes in these.
(f) Bases conclusions or recommendations, if any, on the results of the analysis.
(g) Considers the fully burdened cost of energy (FBCE), where FBCE is a discriminator among alternatives.
(2) Whether additional analysis is required.
(3) How the AoA results will be used to influence the direction of the program.
Note how the new DoDI 5000.02 offering provides a mandate for the AoA to consider
trade-offs, achieve affordability goals, use sound methodology, and use results to influence
direction of the program - key topics that were met with silence in the 2008 version of the instruction. The 7 January 2015 DoDI 5000.02 reinforces these key topics by providing supporting detail elsewhere in the instruction.
Much discussion regarding the execution of trade-off analyses has been added to the
Systems Engineering Enclosure of 7 January DoDI 5000.02. Table 1-5 provides a high
level summary of the topics addressed within the Systems Engineering Enclosure of the
2015 DoDI 5000.02 compared to the 2008 DoDI 5000.02. Discussion particularly
pertinent to trade-off analyses can be found in the added topics of Development Planning and Systems Engineering Trade-Off Analyses.
Table 1-5. Comparing Topics Covered Within the Systems Engineering Enclosure of Old and New
DoDI 5000.02
As discussed above, the Development Planning section is a new addition to the Systems
Engineering Enclosure within DoDI 5000.02. The Development Planning section of the enclosure reads, in part:
a) The decision to invest in a new program, to mature technologies, and to begin system design must be based on early systems engineering analyses indicating that candidate materiel solution approaches are technically feasible and have the potential to address capability gaps.
b) During the Materiel Solution Analysis Phase, the Components will conduct early systems engineering analyses.
c) In preparation for Milestone A, and to provide the technical basis for executing the Technology Maturation and Risk Reduction Phase, the Program Manager will develop the technical approach for acquiring the product. This technical approach will address technical risks. The results will be incorporated in the SEP {Systems Engineering Plan}.
Note the call for systems engineering analyses prior to MDD and then again prior to
Milestone A. The text describes how systems engineering analyses are interrelated with
the AoA in that they "inform an Analysis of Alternatives" and are "informed by and in
support of the AoA" and together are meant to assess candidate materiel solution
approaches at the system level across capability gaps of interest, desired operational attributes, technology maturity within the development planning horizon, and presumably life-cycle costs relative to the affordability goals set forth in the Affordability Analysis Enclosure of 7 January 2015 DoDI 5000.02 and as discussed in the Trade-Off Analyses section within the Systems Engineering enclosure of 7 January 2015 DoDI 5000.02. Beyond instructing DoD components and the Program Manager to use systems engineering
analyses and assessments in the earliest parts of a program, this section on trade-off
analyses instructs Program Managers to play a pivotal role in the shaping of the draft CDD
through the use of trade-off analyses. The Systems Engineering Trade-off Analyses section
reads as follows:
a) During the acquisition life cycle, the Program Manager will conduct systems engineering trade-off analyses assessing system affordability as a function of capability requirements, major design parameters, and schedule. The results will be provided to the MDA and will identify major affordability drivers and show how the program will remain within affordability constraints.
The architects of the revised defense acquisition system instruct Program Managers to
create models that capture and communicate the relationship between the array of design
choices available to the engineer and the consequences of such choices when viewed at the system level.
With the increased emphasis placed on early application of systems engineering analyses, the timing for establishing a program office has received additional attention in the latest release of DoDI 5000.02. Table 1-6
compares the language used to describe the recommended timing for standing up a program
office.
Notice how the January 2015 DoDI 5000.02 uses very direct language instructing Program
Offices be established and Program Managers be selected during the Materiel Solution Analysis Phase.
This section has established that the changes between the 2008 version of DoDI 5000.02
and the 2015 version are not trivial. Figures 1-11 and 1-12 provide a quick reference
summary of the major changes from a systems engineering perspective. Figure 1-11 uses
red text within a swim-lane chart of the defense acquisition system to highlight the new
tasks associated with the early application of systems engineering assessments and trade-
off analyses used in concert with a robust AoA in order to achieve the desired ends of
compelling military capabilities within affordability goals and constraints.
Figure 1-11. Visualizing Key Changes in the Defense Acquisition System Through the Use of a
Swim-Lane Chart
The Systemigram of Figure 1-12 shows the significant differences between the 2008 DoDI
5000.02 and the 2015 DoDI 5000.02 as red arrows or shapes. Note the red arrows
extending from the affordability analysis to the MDD and to the activities associated with
the first two phases of the acquisition process. Also notice that the arrows flowing between
the acquisition phases within the defense acquisition system, the JCIDS process, and the
PPBE process form a feedback loop structure. These informing actions are depicted in
Figure 1-12 as red arrows to indicate DoDI 5000.02’s increased emphasis on such
interfaces, the defense acquisition system’s openness to change and mechanisms to adapt.
From the interpretation of the 7 January 2015 DoDI 5000.02 and related documents
provided above, it is clear that the updated acquisition system places increased emphasis
on systems engineering activities early in the lifecycle to seek system solution space that
best balances operational performance with affordability and schedule in the presence of
uncertainty. The swim-lane chart and Systemigram of Figures 1-11 and 1-12 respectively
had elements that were depicted in red. These elements pertain to the early activities central to the revised process. The increased emphasis on systems engineering early in the life cycle is a positive and necessary change in order to achieve envisioned goals, but
this change alone will likely prove insufficient if the systems engineers charged with
executing the new elements of the process are not experienced, high performing, well
trained, appropriately placed, and adequately equipped. The bullet list below provides five
broad research questions sparked by this significant change in the defense acquisition process.
1. How will DoD best attract, retain, and develop high quality systems engineers with
strong analytical skills and proven systems thinking capability? Similarly, how will
DoD’s industry partners and support contractors best attract, retain, and develop
such talent?
2. Are the organizations charged with executing the early systems trade-off activities adequately staffed, and how will they best access systems engineering talent from industry and support contractors as well as from within DoD?
3. How will systems engineers within the DoD Component interface with systems engineers in industry and support contractor organizations?
4. Within the DoD structure, where should the responsibility and authority to mobilize and fund these early systems engineering activities reside?
5. Are there tools and techniques available to help the executing agents conduct and communicate high quality systems engineering trade-off analyses?
The first research question is a timely question as DoD is facing an aging civilian
workforce, shrinking budgets, severe travel restrictions, low support for professional
conference participation, pay freezes, furloughs, sequestrations, and a pay scale that caps
out earlier than many top performing senior systems engineers would be willing to accept
as competitive compensation. This environment is not conducive to attracting the top talent
that DoD needs to successfully execute this vital but complex front end of the acquisition
process. When it comes to recruiting and retaining systems engineering talent, growing
sectors of the U.S. economy pose increasingly stiff competition to the slowing U.S. defense
sector. This competition for systems engineering talent impacts DoD’s support contractors’ recruiting efforts as well.
The second and third broad research questions are related to inter- and intra-organizational interfaces. Questions regarding the organizational placement of the systems engineers executing early systems engineering trades will likely emerge, as will
questions regarding the mechanisms to facilitate efficient internal and external interfaces
between the systems engineers and the balance of the community to include support contractors and industry partners.
Of all the program elements within the DoD budget, early systems engineering activities seem to fall within the
Research, Development, Testing, and Evaluation (RDT&E) element. The specific funding
category within the RDT&E budget element most appropriate for conducting systems level
assessments early in the life-cycle is less clear. The fourth broad research question listed above probes this resourcing ambiguity.
The fifth research question underlines the new demands being placed on the SE community
through the updated defense acquisition system. Defense acquisition professionals are
anxious to comply with the new defense acquisition instructions but are facing significant
obstacles. The large number of variables involved with the exploration of emerging
requirements and their combined relationship to materiel systems causes compliance with
the new defense acquisition instructions to be a very complex task. Consequently, requests
for tools and techniques that can help model the relationship between physical design
decisions and consequences of those decisions at the system level as measured across all
elements of stakeholder value with robust trade space visualization capabilities are growing
more frequent. The balance of this dissertation seeks to unpack this fifth broad research question by taking an in-depth look at the state of literature surrounding systems engineering trade-off analyses, identifying potential pitfalls associated with the typical execution of a systems engineering trade-off analysis, quantifying the risk that potential pitfalls pose to acquisition decision quality, suggesting remedies to mitigate the risk of each pitfall, and measuring the potential usefulness of contemplated innovations that may help improve the quality of future systems engineering trade-off analyses.
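Tools of the kind requested above commonly support trade space exploration by screening out dominated design alternatives before decision makers weigh the survivors. The sketch below is a generic illustration of that idea, not a reconstruction of any tool examined in this research; the alternative names, life cycle costs, and performance scores are invented for the example.

```python
def pareto_front(alternatives):
    """Return the alternatives not dominated by any other.

    Each alternative is (name, cost, performance); lower cost and higher
    performance are better. A dominates B when A is at least as good on
    both objectives and strictly better on at least one.
    """
    front = []
    for name, cost, perf in alternatives:
        dominated = any(
            (c <= cost and p >= perf) and (c < cost or p > perf)
            for _, c, p in alternatives
        )
        if not dominated:
            front.append((name, cost, perf))
    return front

# Hypothetical alternatives: (name, life cycle cost in $M, performance score).
designs = [("A", 120, 0.70), ("B", 150, 0.85), ("C", 140, 0.65), ("D", 180, 0.86)]
print(pareto_front(designs))
```

Here design C is screened out because design A costs less and performs better; the remaining non-dominated alternatives form the trade space frontier that scatterplot visualizations typically display.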
The intent of this two phase, exploratory sequential and embedded mixed methods
approach (Creswell, 2014) is, in the first phase, to explore the needs of requirements writers
and product development decision makers through field observations and direct interviews
as they seek to understand system level cost, schedule, and performance consequences of
a set of contemplated requirements. From these case studies, theories for improved
processes will be generated and a description of the emerging process will be documented.
In the second phase of this research, a larger sample of systems engineering professionals involved with the U.S. defense acquisition process will be surveyed to help interpret qualitative findings of the first phase. The reason for collecting quantitative data in the second phase is to help generalize and test the qualitative findings of the first.
Question 1
Question 2
How can systems engineering trade-off analyses go wrong? How many potential
pitfalls can be identified? What are the potential pitfalls associated with such
analyses? How likely are the pitfalls to occur? How severe are the consequences
of each pitfall?
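Question 2's likelihood and severity framing lends itself to a simple ordinal risk score. As a hedged illustration only: the pitfall names and the 1-5 scales below are hypothetical, and scoring risk as likelihood times severity is merely a common ranking convention, not a method this research commits to.

```python
# Hypothetical pitfalls rated on 1-5 ordinal scales as (likelihood, severity);
# risk is scored as likelihood x severity, a common ranking convention.
pitfalls = {
    "objectives stated as alternatives": (4, 5),
    "uncertainty ignored in estimates": (3, 4),
    "stakeholder values omitted": (2, 5),
}

# Rank pitfalls from highest to lowest risk score.
ranked = sorted(
    ((likelihood * severity, name) for name, (likelihood, severity) in pitfalls.items()),
    reverse=True,
)
for score, name in ranked:
    print(f"{score:3d}  {name}")
```

A ranking of this kind makes explicit which pitfalls warrant remediation first, which is the spirit of the likelihood and severity questions above.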
Question 3
What steps can be taken to avoid potential pitfalls associated with systems engineering trade-off analyses?
Question 4
quality? How does the variety and quality of visualization techniques used to communicate trade-off analysis results affect decision quality?
Question 5
How does the extent to which uncertainty is captured and the way in which the uncertainty is communicated affect decision quality?
Hypothesis A
A significant portion of systems engineering professionals involved with the U.S. defense acquisition process find the large number of variables associated with such analyses difficult to manage.
Hypothesis B
There are many potential pitfalls associated with systems engineering tradeoff analyses.
Hypothesis C
A significant portion of systems engineering professionals involved with the U.S. defense acquisition process would find an integrated systems engineering trade-off analysis framework useful.
Hypothesis D
A significant portion of systems engineering professionals involved with the U.S. defense acquisition process would find the stakeholder value modeling approach useful.
Hypothesis E
A significant portion of systems engineering professionals involved with the U.S. defense acquisition process would find stakeholder value scatterplot visualizations useful.
Clearly show why systems engineering trade-off analyses are emerging as a pivotal activity in future defense acquisition efforts.
Be fully immersed within several trade studies for an extended period of time to observe analysis methods and data inputs. Present the case study with sufficient detail to enable reproducibility.
Reviewed current success rates of new product development within the U.S. Department of Defense, using the Performance of the Defense Acquisition System: Annual Report 2013 to provide a more complete picture.
2015 DoDI 5000.02. Applied each of the seven triads of the Conceptagon as a lens through which the defense acquisition system was examined, documenting observations under each Conceptagon lens. Gradually built a full, rich description of the defense acquisition system. Clearly identified the essential differences between the current (2015) and the previous (2008) defense acquisition process.
Built a master list of potential pitfalls associated with the typical execution of a systems engineering trade-off analysis. Quantified perceived risk that potential pitfalls pose to acquisition decision quality. Mapped pitfalls and suggested remedies onto a systems engineering trade-off analysis process map.
Chapter 1 has established the problem statement and has introduced this research design.
Chapter 2 will explore current literature across several functional disciplines to reveal
related themes, applicable theories, methods, and tools while identifying research gaps and opportunities. Chapter 3 describes the exploratory sequential & embedded mixed methods study approach (Creswell, 2014) used to explore the research questions and test the hypotheses identified earlier in this chapter. Research questions and hypotheses are revisited in Chapter 7, where conclusions are drawn, contributions are summarized, and future work is recommended.
As a first step in the research process seeking improved defense acquisition outcomes
through innovations to systems engineering trade-off analyses, this literature review will
establish the state of previous research and identify research gaps that could potentially be
filled by the research associated with this dissertation. This chapter is organized into five
sections with section two having four subsections as described in the outline below.
(Randolph, 2009) and (Creswell, 2014) were used as primary resources for designing the literature review, which covers work by professionals involved with new product development pertaining to at least one aspect of systems
engineering trade-off decision practices and theories. The target audience is primarily the
dissertation review committee and secondarily other academics in the field of systems engineering. In the future, this literature review may be refined to tailor it to the needs of practitioners and defense
acquisition policy makers. The findings are presented in a conceptual format synthesized
through a decision management process lens. The central output of this literature review is
the synthesis of findings and the link to the balance of this dissertation research. A
potentially useful by-product of the literature review is a list of citations to over three-
The general approach to building the dissertation reference library, organizing its content,
and culling the master library to form a selected set of references is summarized in the
process map of Figure 2-1. Each element of this process will be discussed in some detail
following a short description of the tools used for searching, retrieving, and managing the
provided access to a vast number of databases, full-text journals, and e-books. Linking
the Samuel C. Williams Library to Google Scholar through Google Scholar's settings page
proved to be an efficient and effective way to search and retrieve applicable literature.
Literature was collected, organized, and stored using the Qiqqa reference and PDF
management tool. PDF files and associated metadata were stored on a personal computer
hard drive and backed up on the Qiqqa Cloud. Qiqqa captures text highlighting, comments,
and annotations made to each PDF file and generates annotation reports of selected
elements of the library or the entire library if desired. Qiqqa captures review status and
applicability rating of each PDF in the library and allows sorting of the library on 12
different dimensions: search score, title, author, year, rating, favorite, recently read,
recently added, reading stage, has PDF, has BibTeX, and pages. Qiqqa links with MS Word for automatic
As shown in Figure 2-3, the initial step taken on the quest for relevant literature was to
generate a list of keywords to drive an electronic keyword search for relevant references.
The keyword list is presented in the sections that follow as a series of word clouds that
show the relative frequency with which keywords appear in a representative journal over the
five-year period 2009 to 2013. The six representative professional journals used for this explorative
keyword clouds give a sense of the breadth of topics covered in the functional discipline
and highlight hot topics. Presenting the breadth of current topics in a given functional
discipline provides context for the more in-depth discussion of a subset of the current topics
Figure 2-3. Identifying Keywords Section Focus Within Overall Literature Review Process Map
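The frequency tallies behind such keyword clouds can be sketched in a few lines. This is a minimal illustration, not the tooling used for this literature review, and the sample article keywords below are invented:

```python
from collections import Counter

def keyword_frequencies(articles):
    """Tally keyword frequencies across articles.

    `articles` is a list of keyword lists, one per article (e.g. the
    author-supplied keywords harvested from a journal's 2009-2013 issues).
    Returns a Counter mapping each normalized keyword to its count, which
    a word-cloud tool can use to size each term.
    """
    counts = Counter()
    for keywords in articles:
        for kw in keywords:
            counts[kw.strip().lower()] += 1
    return counts

# Invented sample: three articles with author keywords.
articles = [
    ["Requirements", "Architecture", "Risk"],
    ["architecture", "Complexity"],
    ["Risk", "Decision making", "Architecture"],
]
freqs = keyword_frequencies(articles)
print(freqs.most_common(2))  # [('architecture', 3), ('risk', 2)]
```

A real pipeline would also apply the consolidation rules described later in this section before tallying.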
Massachusetts Institute of Technology, and the RAND Corporation circa 1950 (Buede
2011). The National Council on Systems Engineering (NCOSE) was established in 1990
and membership in this professional organization has grown to 8,000 members worldwide,
qualification. (Sage & Rouse 2011; Boardman & Sauser 2008; Buede 2011)
Conducting a keyword capture of the Journal of Systems Engineering for the five-year
period 2009 to 2013 reveals the number of topics currently in play in the field of Systems
Engineering, as shown in Figure 2-4 below. Note that the keywords dealing with requirements,
architectures, and design are prominent, as are notions of complexity and scale. Risk and
Decision Analysis seem to be
The roots of decision analysis are set in the works of Blaise Pascal, Pierre de Fermat,
Jacob Bernoulli, and Thomas Bayes, whose seminal work established concepts of probability,
utility, expected value, and the mathematics of updating probabilities given new information.
Footnote 2: The phrase “Systems Engineering” was excluded (it dominated the keyword cloud and
added no new information); plurals were merged with the singular, most frequent form;
variations of “Systems of Systems” were consolidated; variations of DoDAF were consolidated;
architectural framework, architecture framework, and architectural pattern were consolidated
into “Architecture”; risk analysis, risk assessment, risk management, risk modeling, risk
perception, and risk-based were consolidated into “Risk”; decision analysis, decision making
process, decision support process, decision support systems, decision support tools, decision
support, decision theory, and decision trees were consolidated into “Decision making”.
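The consolidation just described amounts to a variant-to-canonical keyword mapping. A minimal sketch, showing only a few illustrative rules rather than the full set used for the clouds:

```python
# Illustrative subset of the consolidation rules; maps variant phrases
# (lowercased) to their canonical keyword. Not the complete rule set.
CONSOLIDATIONS = {
    "risk analysis": "Risk",
    "risk assessment": "Risk",
    "risk management": "Risk",
    "decision analysis": "Decision making",
    "decision trees": "Decision making",
    "decision support systems": "Decision making",
    "architecture framework": "Architecture",
    "architectural framework": "Architecture",
}

def consolidate(keyword):
    """Map a raw keyword to its canonical form, or return it unchanged."""
    key = keyword.strip().lower()
    return CONSOLIDATIONS.get(key, keyword.strip())

print(consolidate("Risk Assessment"))  # Risk
print(consolidate("Queuing theory"))   # Queuing theory (no rule; unchanged)
```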
In the mid-1940s, John von Neumann and Oskar Morgenstern established the theory of
decision analysis, and Ward Edwards created a psychology field of study around the theory
in the mid-1950s. Howard Raiffa carried the field forward, authoring the first book on
decision analysis in 1968 and co-authoring the first multiple-objective decision analysis
book with Ralph Keeney in 1976 (Parnell et al. 2013; Edwards et al. 2007).
Foundational concepts for single objective decision analysis across a wide range of
business and policy decisions include decision trees, influence diagrams, sensitivity
analyses, probability distributions, and Monte Carlo simulations (Clemen & Reilly 2001).
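One of these building blocks, Monte Carlo simulation of an alternative's uncertain payoff, can be sketched in a few lines; the payoff probabilities and amounts below are invented for illustration:

```python
import random

def expected_value(outcome_sampler, trials=100_000, seed=1):
    """Estimate a decision alternative's expected value by Monte Carlo
    simulation: draw the uncertain outcome many times and average."""
    rng = random.Random(seed)
    return sum(outcome_sampler(rng) for _ in range(trials)) / trials

# Illustrative alternative: 60% chance of a 100-unit payoff, else a 20-unit loss.
def risky_option(rng):
    return 100.0 if rng.random() < 0.6 else -20.0

ev = expected_value(risky_option)
print(ev)  # close to the analytic value 0.6*100 + 0.4*(-20) = 52
```

The same loop underlies the cost and schedule risk simulations discussed later in this review.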
For design trade decisions involving multiple competing objectives, Value Focused
(Parnell et al. 2011; Parnell et al. 2013; Keeney & Keeney 2009; Kirkwood 1996)
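For the multiple-objective case, the additive value model at the heart of many of these references can be sketched as follows; the weights, bounds, and value functions are invented for illustration, not taken from the cited works:

```python
def additive_value(alternative, measures):
    """Score an alternative with an additive value model: a weighted sum of
    single-dimensional value functions, one per objective. `measures` is a
    list of (weight, value_function, attribute) tuples; weights are assumed
    to sum to 1."""
    return sum(w * v(alternative[attr]) for w, v, attr in measures)

# Illustrative trade between range and cost; the value functions are linear
# normalizations over assumed min/max bounds.
measures = [
    (0.7, lambda x: (x - 50) / (150 - 50), "range_km"),  # more range is better
    (0.3, lambda x: (200 - x) / (200 - 80), "cost_k"),   # less cost is better
]
a = {"range_km": 120, "cost_k": 140}
b = {"range_km": 90, "cost_k": 95}
print(additive_value(a, measures))
print(additive_value(b, measures))
```

With these assumed weights, alternative a scores higher (0.64) than b; changing the weights is exactly the sensitivity analysis a trade study would perform.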
The keyword analysis of the Decision Analysis Journal shows a healthy discussion around a
wide range of topics. Bayesian models, multiattribute utility theory, additive value model,
influence diagrams, decision trees, game theory, probability assessment, and risk analysis
all seem to enjoy a strong portion of the printed literature in this journal over the last few
years.
Figure 2-5. Keyword Cloud for Decision Analysis Journal 2009 - 2013
Operations research traces its heritage through World War II and applies the scientific
Theory, Decision Analysis, Markov Chains, Queuing Theory, Inventory Theory, Markov
Once again, turning to the word cloud of keywords to get a sense of the language used within
the discipline, it is clear that certain keywords tend to be more apparent than
others. In this case, Figure 2-6 shows that the Operations Research community writes
extensively about optimization. Notions of sales, costs, economics, commerce, queuing theory,
Figure 2-6. Keyword Cloud for Operations Research Journal 2009 - 2013
The Management Science Journal was mined for keywords to aid the dissertation literature
search. The INFORMS webpage for this journal summarizes the publication’s purpose
practice of management. Within our scope are all aspects of management related to
well as all functional areas of business, such as accounting, finance, marketing, and
Notice marketing is identified as a functional area of business and thus captured under the
Management Science umbrella. Wilkie and Moore (2003) trace the beginnings of marketing
research to the early 1900s (Era I), through a thirty-year era of formalized principle
development beginning in 1920 (Era II) that formed the foundation for a boom era for
marketing research initiated around 1950 (Era III) giving way to an era of fragmentation
and specialization in the 1980s (Era IV). The ties between management science and
Management science and behavioral science emerged into the marketing mainstream at
roughly the same time. Their progress into the field was assisted by the offering of some
mutual support by academics in each area: Although well separated in terms of projects,
Footnote 3: http://pubsonline.informs.org/journal/mnsc#
specialists in the two approaches agreed with each other’s beliefs in the scientific
method, in underlying disciplines (science and social sciences), and in the body of
marketing thought needing to be improved through new forms of knowledge and reliance
on advanced empirical research methods. The sciences arrived in stages, slowly during
the 1950s (Management Science was started in 1954), increasingly during the 1960s,
the 1970s. By the end of Era III, there was no question that the future of the mainstream
of marketing thought would be governed by people who had these forms of training and
Wilkie and Moore add that it was during Era III that data analysis research methods such
as Bayesian analysis, sensitivity analysis, trade-off analysis, and conjoint analysis first
appeared on the marketing research scene. The first three terms appear in the keyword
clouds of journals already discussed in this chapter, but conjoint analysis does not. Review
of the keywords within the Management Science Journal shows a focus on Commerce,
Profitability, Investments, Sales, Economics, Costs, & Competition as seen in Figure 2-7.
The Journal of Information Technology (JIT) was scoured for keywords associated with
the information technology discipline to aid the dissertation literature search. The Palgrave
webpage for this journal summarizes the publication’s purpose and scope as follows:
The aim of the Journal of Information Technology (JIT) is to provide academically robust
papers, research, critical reviews and opinions on the organisational, social and
Footnote 4: http://www.palgrave-journals.com/jit/about.html#Impact-factor
and those seeking an update on current experience and future prospects in relation to
JIT focuses on new research addressing technology and the management of IT, including
policies and standards. It also publishes articles that advance our understanding and
The keyword cloud depicted in Figure 2-8 shows that Enterprise Resource Planning (ERP)
Systems occupies a good portion of the literature in the discipline. Papers pertaining to
ERP systems should be sought out and included in this dissertation’s master reference
library.
To help prime the dissertation literature search, five years of keywords were extracted from
the Information Visualization Journal. The Sage webpage for this journal summarizes
The journal is a core vehicle for developing a generic research agenda for the field by
identifying and developing the unique and significant aspects of information
visualization. Emphasis is placed on interdisciplinary material and on the close
connection between theory and practice.
Footnote 5: https://us.sagepub.com/en-us/nam/journal/information-visualization#aims-and-scope
In addition to the keywords described within the Journal’s purpose and scope statement
above, the keyword cloud depicted in Figure 2-9 shows that visual analytics, data
visualization, graph drawing and visualization, and cognitive load processing and
Figure 2-9. Word Cloud of Information Visualization Journal Keywords 2009 to 2013
As shown in Figure 2-10, the references generated through the keyword search described
in the previous section were added to the handful of references from graduate course work
and several references provided by the PhD advisory committee members to build an initial
library, using Google Scholar linked with the Samuel C. Williams Library to search for and
retrieve the referenced papers. Papers were saved in Portable Document Format (PDF) and
stored in the PDF management tool, Qiqqa, discussed in the previous section.
Papers within this initial PDF library were skimmed for relevancy. The references cited in
the relevant papers were used to search for additional papers. This process was repeated
many times and augmented with electronic keyword searches to build the master PDF library.
Figure 2-10. Building Master Library Section Focus Within Overall Literature Review Process Map
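The skim-and-follow-citations loop described above (often called backward snowballing) can be sketched as follows. Here `references_of` and `is_relevant` are hypothetical stand-ins for the citation lookup and the manual relevancy skim, and the toy citation graph is invented:

```python
def snowball(seeds, references_of, is_relevant, max_rounds=10):
    """Grow a library from seed papers by repeatedly following the
    references cited in every relevant paper found so far."""
    library, frontier = set(), set(seeds)
    for _ in range(max_rounds):
        # Keep only frontier papers that are new and pass the relevancy skim.
        new = {p for p in frontier if p not in library and is_relevant(p)}
        if not new:
            break
        library |= new
        # Next frontier: everything the newly accepted papers cite.
        frontier = {ref for p in new for ref in references_of(p)}
    return library

# Toy citation graph: paper -> list of papers it cites.
graph = {"A": ["B", "C"], "B": ["D"], "C": [], "D": [], "E": ["A"]}
found = snowball(["A"], lambda p: graph.get(p, []), lambda p: p != "C")
print(sorted(found))  # ['A', 'B', 'D']
```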
One of the early discoveries that jump-started the literature search began with a seed
from one of the textbooks used in graduate-level course work, Product Design and
Development (Ulrich, 2003). Searching for other works written by Ulrich, Google Scholar
this paper, Krishnan and Ulrich retrieved 400 papers and conducted a thorough review of
...look inside the 'black box' of product development at the fundamental decisions that are
Krishnan and Ulrich grouped the decisions encountered during product development and project
planning endeavors into eight categories. Their eight categories and general decisions
Table 2-1. Krishnan & Ulrich’s List of Decisions Associated with Product Development
Product strategy and planning:
- What is the market and product strategy to maximize probability of economic success?
- What portfolio of product opportunities will be pursued?
- What is the timing of product development projects?
- What, if any, assets (e.g., platforms) will be shared across which products?
- Which technologies will be employed in the product(s)?

Product development organization:
- Will a functional, project, or matrix organization be used?
- How will the team be staffed?
- How will project performance be measured?
- What will be the physical arrangement and location of the team?
- What investments in infrastructure, tools, and training will be made?
- What type of development process will be employed (e.g., stage-gate)?

Project management:
- What is the relative priority of development objectives?
- What is the planned timing and sequence of development activities? Milestones?
- What will be the communication mechanisms among team members?
- How will the project be monitored and controlled?

Concept development:
- What are the target values of the product attributes, including price?
- What is the core product concept?
- What is the product architecture?
- What variants of the product will be offered?
- Which components will be shared across which variants of the product?
- What will be the overall physical form and industrial design of the product?

Supply chain design:
- Which components will be designed and which will be selected?
- Who will design the components (if they are to be designed)?
- Who will produce the components and assemble the product?
- What is the configuration of the physical supply chain? Location? Decouple point?
- What type of process will be used to assemble the product?
- Who will develop and supply process technology and equipment?

Product design:
- What are the values of the key design parameters?
- What is the configuration of the components and assembly precedence relations?
- What is the detailed design of the components, including material and process selection?

Performance testing and validation:
- What is the prototyping plan?
- What technologies should be used for prototyping?

Production ramp-up and launch:
- What is the plan for market testing and launch?
- What is the plan for production ramp-up?
Ulrich used this taxonomy of new product development decisions to organize the 200
selected references generated by his literature review. Of the eight categories used in
Ulrich's new product development decision taxonomy, two are especially applicable to the
subject of this dissertation (systems engineering trade-off analysis methods). The two
categories are the concept
development category and the product design category. Of the 200 selected references
provided by Ulrich's literature survey, 66 were associated with either one of these two
categories. These 66 papers served as important building blocks to the master pdf library.
In his paper's concluding remarks, Ulrich identifies the link between marketing models and
Several areas for future research seem promising. Research in the marketing community
has flourished on methods for modeling consumer preferences and for optimally
models of the product as a bundle of attributes tend to ignore the constraints of the
We see an opportunity for these communities to work together to apply the product-
technological constraints.
Ulrich identified the gap between marketing models and engineering models to be
significant and worth filling in 2001. The literature review of this dissertation uses Ulrich's
literature review as a starting point and through a new multi-disciplined literature search,
seeks to discover the degree to which the gap still exists in 2015 and identify opportunities
disparate methods to resolve the gap between marketing models and engineering models
acquisition outcomes.
The three steps required to transform the master PDF library to a working library of
selected publications are illustrated in Figure 2-11. As discussed above, Ulrich surveys
research from multiple disciplines, finds that many product development decisions can be
aided by tools, but discovers a gap between marketing models and engineering models.
The literature review of this dissertation builds upon Ulrich’s findings and searches for
ways to fill the gap that he identified between marketing models and engineering models.
To conduct this search, this dissertation explores the literature across many disciplines
through a “decision management lens” rather than the “decision lens” of Ulrich’s original
work. This new lens makes use of Ulrich’s finding that many of the product development
decisions can be aided by tools; it is applied to provide a sharper focus on
the elements needed for a useful tool and organizes information gathered across disciplines
The activities and tasks of the decision management process as identified in Section 6.3.3
of the International Standard for Systems and Software Engineering – System Life Cycle
Processes, ISO/IEC/IEEE 15288:2015(E) are shown in the bullet list below and were used
o Select and declare the decision management strategy for each decision
These activities and tasks form the basis of the decision management lens and associated
taxonomy presented in Table 2-3, Table 2-4, and Table 2-5. The three tables identify the
The selected references were identified through a three step process applied to the
dissertation’s master PDF library, the three steps being (1) tag, (2) score, and (3) filter.
Using the tagging feature of the Qiqqa PDF management tool, each of the approximately
650 references within the master PDF library was labeled with at least one of the tasks
associated with an ISO/IEC/IEEE 15288:2015 decision management activity. The task
tags are shown in the second column of Table 2-3, Table 2-4, and Table 2-5. With the
master PDF library categorized through the decision management lens, each reference was
then scored on a scale of 1 to 5 per the criteria described in Table 2-2 below.
Table 2-2. Scoring Criteria for Documents Within Dissertation’s Master Reference Library
Score Criteria
5 Well-cited reference that directly and effectively addresses a topic of interest (a
“tag” per the decision management lens)
4 Reference directly and effectively addresses a topic of interest (a “tag” per the
decision management lens)
3 Reference directly addresses a topic of interest (a “tag” per the decision
management lens)
2 BibTex information not obtainable, document represents a fragment of a more
complete thought, or unprofessional writing style.
1 Document written in a language other than English or poor copy quality renders
document almost unreadable.
The categorized and rated library was then screened so that only references that scored
at least a three were identified as selected references. The selected references for each
decision management lens category are presented in Table 2-3, Table 2-4, and Table 2-5.
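The tag/score/filter step just described can be sketched as a small grouping routine; the sample records below are invented placeholders, not entries from the actual master library:

```python
# Each library entry carries its decision-management task tags and a 1-5
# score per the Table 2-2 criteria; entries scoring at least 3 become
# "selected" references. Sample records are illustrative only.
library = [
    {"cite": "Parnell et al. 2011", "tags": ["understand decision analysis"], "score": 5},
    {"cite": "Fragmentary memo",    "tags": ["frame decision"],               "score": 2},
    {"cite": "Keeney 1982",         "tags": ["understand processes"],         "score": 4},
]

def selected_by_tag(entries, threshold=3):
    """Group references scoring >= threshold by their task tags."""
    out = {}
    for ref in entries:
        if ref["score"] >= threshold:
            for tag in ref["tags"]:
                out.setdefault(tag, []).append(ref["cite"])
    return out

print(selected_by_tag(library))
```

The low-scoring record is filtered out, mirroring how only references scoring at least a three were carried into Tables 2-3 through 2-5.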
Table 2-3. Selected References for the ISO 15288 Decision Management Activity “Prepare for Decisions,” by Associated Task as Interpreted by the Dissertation Author

Task: Understand general product development processes, general decision processes, and decision points.
References: (Kossiakoff et al. 2011; Edwards et al. 2007; Ross & Hastings 2005; Group & others 2011; Adam M. Ross et al. 2010; Osama 2006; Ulrich 2003; Pugh 1991; Griffin & Hauser 1996; Ullman 1992; Ullman & Spiegel 2006; Ulrich 2011; Lorenz 1990; Suh 1990; Dasu & Eastman 2012; Simon 1996; Brown & others 2008; Thomke & Reinertsen 2012; Schneider & Hall 2011; Urban et al. 1993; Crawford & Di Benedetto 2008; Sillitto 2005; Edson 2008; Wunderlich et al. 2006; Ulrich & Ellison 1999; Elm et al. 2007; Finger & Dixon 1989; Finger & Dixon 1989; Papalambros & Wilde 2000; Honour 2004; Sage & Rouse 2011; Honour et al. 2006; Hopkins & McAfee 2010; Buede 2011; Maier 2009; Davenport et al. 2010; Keefer et al. 2004; Parnell et al. 2011; Buede 1994; Voss et al. 1996; Ravasi & Stigliani 2012; Maier & Rechtin 2000; Hallam 2001; Parnell et al. 2013; Powell & Buede 2006; Parnell et al. 2011; Mayda & Borklu 2014; Keeney 1982)

Task: Design organization to execute product development decision process.
References: (Keeney 2013; Davenport & Harris 2007; Roberto 2013; Keeney 2004b; Snowden & Boone 2007; Harris et al. 2010; Harris et al. 2011; Spetzler 2007; Katz & Te’eni 2007; Davenport et al. 2010; Pyster et al. 2013; Kogut 1985; Hopkins & McAfee 2010)

Task: Create/integrate tools; utilize computer information systems.
References: (West & Pyster 2015; Kyd 2007; Rangaswamy & Lilien 1997; Bussen & Myers 1997; Wallace & Jakiela 1993; Tenenbaum & Brown 1992; McManus et al. 1997; Chen & Lee 2003; Andradóttir & Prudius 2009; Fischer 2006; Sage 1981; Bhattacharjee et al. 2007; Shim et al. 2002; Buede 1996; Carty 2002; Berente & Yoo 2012; Wand & Weber 2002; Kaniclides & Kimble 1995; Katz & Te’eni 2007; Todd & Benbasat 1991; Maxwell 2008)

Task: Frame decision.
References: (Boardman & Sauser 2013; Davis & Blumenthal 1991; Wunderlich 2006; Nagamachi 2002; Boardman et al. 2009; Boardman 2010; Boardman & Sauser 2008; Griffin & Hauser 1993; French et al. 1998; Madni 2015; Korfiatis et al. n.d.)

Task: Understand human cognitive limits, biases, and the psychology of decisions.
References: (Pappas 1956; Brody 1993; Stolz 2010; Kane et al. 2004; Amir 2008; Baucells & Rata 2006; Gaissmaier et al. 2008; Bernard et al. n.d.; Sanfey & Chang 2008; Miller 1956)
Table 2-4. Selected References for the ISO 15288 Decision Management Activity “Analyze the Decision Information,” by Associated Task as Interpreted by the Dissertation Author

Task: Understand the various approaches to decision analysis and select the most appropriate.
References: (Adam M. Ross et al. 2010; Buede & Maxwell 1995; Hobbs 1986; von Winterfeldt 1980; Ross & Hastings 2005; Davis et al. 2005; Ullman & D’Ambrosio 1995; A. M. Ross, O’Neill, et al. 2010; A. M. Ross, McManus, et al. 2010; Howard 1988; Locascio & Thurston 1998; Keeney & von Winterfeldt 2009; Mingers 2003; Turskis & Zavadskas 2011; Buede & Choisser 1992; Shocker & Srinivasan 1979; Keeney 1994; Stewart 1996; Howard & Matheson 2005; Wunderlich et al. 2006; Gardener & Moffat 2008; Figueira et al. 2005; Georgiadis et al. 2013; Ishizaka & Labib 2009; Cho 2003; Lennon et al. 2010; Keeney 1993; Steuer et al. 2006; Winkler 1990; Wang & Yang 1998; Da Costa & Buede 2000; Saaty 1998; Dyer et al. 1992; Kou et al. 2011; Collopy & Horton 2002; Svenson 1998; Platts 1996; Geldermann & Schöbel 2011; O’Neill et al. 2010; Chen & Ko 2009; Hazelrigg 2010; Frey et al. 2009; Madni & Jackson 2009; Zhai et al. 2009; Verma et al. 1999; Hazelrigg 1999; Parnell et al. 2011; Stump et al. 2009; Parnell 2007; Dees et al. 2010; Wallenius et al. 2008; Buede 2005; Smith & Von Winterfeldt 2004; Raiffa 1969; Phillips 2006; Hammond et al. 1998; Dyer 1990; Kirkwood 1992; Schoner et al. 1997; Felix 2004; Ross et al. 2004; Zapatero et al. 1997; Cristiano et al. 2001; Chou & Chang 2008)

Task: Determine desired outcomes; develop criteria (objectives and measures).
References: (Dahan & Srinivasan 2000; Griffin & Hauser 1993; Mankins 1995; Randall et al. 2007; Yamamoto & Lambert 1994; Yoo et al. 2004; Shah & Madden 2004; Lancaster 1990; Mela et al. 2013; McDowell 2006; Harvey & Østerdal 2010; Sauser et al. 2010; Buede 1997b; Sauser et al. 2009; Felfernig et al. 2010; McManus et al. 2007; Juan et al. 2010; Markov et al. 2011; Kälviäinen 2010; Buede 1997a; Sproles 2002; Hauser et al. 2006; Griffin & Hauser 1993)

Task: Generate alternatives.
References: (Keeney 2012; Ulrich 1995; Strawbridge et al. 2002; Girotra et al. 2010; Kornish & Ulrich 2011; Madni 2014)

Task: Evaluate alternatives against criteria.
References: (Hulett 1995; Kenley & El-Khoury 2012; Neches & Madni 2013; Johnstone et al. 2011; Mankins 2009; Hicks et al. 2009; Adler et al. 1995; Parkinson 1995; Tinnirello 2001; Ulrich et al. 1993; Ong et al. 2003; Smith 2010; Patenaude 1996; Cox Jr et al. 2008; Grossman 2006; Banks & Chwif 2010; Keller 1998; Hollis & Patenaude 1999; Browning 1999; Fernandez 2010; Bilbro 2007; Fox & Clemen 2005; Simon et al. 1997; Kwak & Ingall 2007; Grey 1995; Torczon & Trosset 1998; Dillon et al. 2002; Ross et al. 2008; Schwindt n.d.; Ramaswamy & Ulrich 1997; Ramaswamy & Ulrich 1993; Law 1991; Palmer & Larson n.d.)
Table 2-5. Selected References for the ISO 15288 Decision Management Activity “Make and Manage Decisions,” by Associated Task as Interpreted by the Dissertation Author

Task: Determine preferred alternative.
References: (Masi 2006; Savage 2002; Ríos-Insua et al. 2003; Chambal et al. 2011; Kolko 2010; Borcherding et al. 1991; Dahan & Hauser 2002; Hauser & Toubia 2005; Gilbride et al. 2008; Toubia et al. 2007; Paul E. Green & Srinivasan 1990; Anon n.d.; Paul E. Green & Srinivasan 1990; Michalek et al. 2005; Stewart 1996; Scholl et al. 2005; Tufte 2005; Few 2008; O’Hara et al. 2007; Tufte & Graves-Morris 1983; Tan & Fraser 1998; Gibson et al. 2012; Card et al. 1999; Tufte & Graves-Morris 1983; van Wijk 2006; Few 2006; Song et al. 2010; Chou & Ma n.d.; Scagnetti & Kim 2012; Skeels et al. 2010; Lee et al. 2010; MacEachren et al. 2012; Few 2009; Few 2004; Feather et al. 2006; Horvitz & Barry 1995; Stump et al. 2004; Kyd 2007; Smith 2013)

Task: Record resolution, rationale, and assumptions; track and report.
References: (Keisler & Noonan 2012; Keeney 2004a; Estefan & others 2007; Dickerson & Mavris 2013; Kirby et al. 2001; London 2012)

Task: Capture lessons learned from execution of the decision management process (case study examples).
References: (Buede & Bresnick 1992; Feng et al. 2008; Whitney 1993; Girard 2006; French 1998; Cheung et al. 2012; Crocker et al. 2001; Keller et al. 2009; Sahraoui et al. 2008; Virtanen et al. 2006; Sridhar et al. 2008; Geis et al. 2011; Sankar et al. 2002; Brown & Eremenko 2008; Roark III 2012; Spaulding 2003; Nickel & others 2010; Ewing et al. 2006; Van Leer & Roe 1987; Collopy & Consulting 2006; Jilla 2002; Ulrich 2005; Dees et al. 2010; Merrick et al. 2005; Sutcliffe & Hollingsworth 2010; Killen et al. 2005; Viscito et al. 2009)
Reviewing the literature through a decision management lens sorted the information
gathered across disciplines into categories aligned with the elements of a decision
management process. While a comprehensive review of all the selected references is beyond
the scope of this dissertation,
a summary of the key threads that run through the selected references is provided in this
section to provide a sense of the state of previous research from which potential research
gaps will be identified in the next section of this literature review chapter.
The literature search found a recent international standard, many textbooks, and papers
devoted to general product development or systems engineering (Hallam 2001; Maier &
Rechtin 2000; Sage & Rouse 2011; Kossiakoff et al. 2011; Ulrich 2003; Crawford & Di
Benedetto 2008; Urban et al. 1993; Pugh 1991; Ullman 1992; Dasu & Eastman 2012; Suh
1990; Standard 2015). Many of these publications, and others like them, provide only a cursory
treatment of decision making. Some notable exceptions (Buede 1994; Buede 2011; Parnell
et al. 2011) take great care to address decision making as an integral part of systems
engineering and product development and spend a good portion of the publication covering
As discussed earlier in this chapter, the paper Product Development Decisions: A Review
of the Literature (Krishnan & Ulrich, 2001) takes the position that product development can be viewed
as a long list of decisions. Towards the end of their extensive cross-functional literature
"We observe that research seems to flourish in problem areas with powerful
representations by the marketing community led to the large body of work on conjoint
analysis. The parametric representation of the engineering design problem led to hundreds
of papers on design optimization. More recently, the Design Structure Matrix spawned
infer that the development of representation schemes should be a high priority in the
Green and Srinivasan penned a well-read paper on conjoint analysis (Green &
Srinivasan 1990), and Browning penned a highly cited paper regarding the application of
the Design Structure Matrix (DSM) technique (Browning 2001) to integration problems.
Krishnan and Ulrich continue in their literature review paper, highlighting the limitations
of conjoint analysis.
Much of the research on setting attribute values is also aimed at maximizing customer
satisfaction or market share, and does not explicitly consider design and production costs
or overall profitability. In addition, the research on setting attribute values (done in the
context of packaged goods) often assumes that arbitrary combinations of specifications are
Krishnan and Ulrich conclude their extensive literature review with a recommendation for
developing tools that facilitate the link between marketing models and engineering models.
Several areas for future research seem promising. Research in the marketing community
has flourished on methods for modeling consumer preferences and for optimally
establishing the values of product attributes. Yet, a weakness identified (earlier in the
paper) is that models of the product as a bundle of attributes tend to ignore the constraints
community. We see an opportunity for these communities to work together to apply the
technological constraints.
Many papers have been written on the notion of capturing customer requirements and then
attempting to carry them through the product design process – generally referred to as a
Multiple Domain Matrix (MDM) (Elezi et al. 2010) or a Domain Mapping Matrix (DMM)
(Danilovic & Browning 2007). A specific and popular example of an MDM technique is
Voice of the Customer and QFD addressed most notably by Griffin and Hauser (Griffin &
Hauser 1993; Griffin & Hauser 1996; Hauser & Clausing 1988; Hauser et al. 2006; Hauser
In a separate paper, Ramaswamy and Ulrich point out the limitations of QFD for use in
design decision making (Ramaswamy & Ulrich 1993), as do Delano et al. (2000). Both
papers highlight their thoughts on how one might overcome these limitations. The
presenting customer information have been developed. One such methodology is the House
of Quality (HOQ). The HOQ is most often used to set targets for the engineering performance of a product. In a
typical situation, marketing staff collect data about customers and competing products
and, with some input from engineering, decide a set of performance targets which are then
methodology,
2. The roof of the HOQ alone cannot adequately capture the complex
We believe that engineering models, if used in conjunction with the HOQ, can help address
these problems. Designers often have engineering models which they can use to test the
limits of product performance. The inputs to these models, the design variables, are the
actual quantities that the designer can control and the outputs are the important
performance metrics of the product. Engineering models can therefore be a valuable tool
for exploring design tradeoffs and product performance without building extensive
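The kind of engineering model Ramaswamy and Ulrich describe, design variables in and performance metrics out, can be sketched with a toy example; the device, its power-draw model, and all coefficients are invented for illustration:

```python
def performance(battery_wh, mass_kg):
    """Map design variables to performance metrics for an idealized
    battery-powered device (toy physics, illustrative coefficients)."""
    runtime_h = battery_wh / (5.0 + 2.0 * mass_kg)  # assumed power-draw model
    portability = 10.0 - 3.0 * mass_kg              # assumed preference proxy
    return {"runtime_h": runtime_h, "portability": portability}

# Sweep a design variable to explore the trade-off an HOQ target implies,
# without building prototypes.
for wh in (20, 40, 60):
    m = performance(battery_wh=wh, mass_kg=1.0)
    print(wh, round(m["runtime_h"], 2), m["portability"])
```

Sweeping the design variables this way is exactly the "testing the limits of product performance" role the quote assigns to engineering models.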
The likelihood of design engineers applying a flawed decision method is high, and the
Design Alternative Selection Methods, Hazelrigg warns about the shortcomings of using
QFD as a decision tool, as did Ramaswamy and Ulrich, but Hazelrigg does not hold out
Hazelrigg provides strong warnings about seven other popular design selection methods
and about the catastrophic consequences of design engineers not using classical decision
Decision theory and optimization are closely linked in that decision making and
optimization contain exactly the same elements, and all decisions involve some amount of
optimization. The idea that the optimal choice is that alternative whose outcome is most
83
almost universally presented in the context, maximize f(x) subject to constraints g(x)<0,
and it deals exclusively with search techniques. Optimization theory does not deal with
the questions of where does f(x) come from and what conditions must it satisfy? These
questions are the domain of decision theory. Yet, they are equally important as the most
efficient search toward the wrong objective is no more useful than no search at all.
(Hazelrigg 2003)
Decision theory places great emphasis on the proper formulation of decision objectives: …the optimal choice is that alternative whose outcome is most preferred by the decision maker; that if the decision maker is indifferent to the outcomes, then a random choice is acceptable; and that all decisions result in outcomes. By and large, this is what modern decision theory prescribes.
He goes on to lament how design engineers, despite the availability of decision techniques that are well founded in classical decision theory, are prone to use inferior methods:
Yet, despite its long history and wide acceptance in other communities, design engineers have not adopted the formalisms of decision theory, and there exists considerable disagreement among members of the engineering design community as to the extent that engineering design involves decision making and to which classical decision theory applies. Instead, engineers have developed and promoted a broad range of decision tools that have been widely taught and that are in widespread use for design selection. These methods have, almost entirely, failed to reference or utilize the extensive literature and research results in the decision sciences, and many deviate from the simple basic tenets given to Alice by the Cheshire Cat. Unfortunately, this failure has led to methods that are inconsistent with the rigorous principles of decision theory, and these inconsistencies lead to several undesirable behaviors. (Hazelrigg 2003)
Hazelrigg references (Howard et al. 1968) and suggests ten properties be used to validate an engineering design alternative selection method. The ten properties are as follows:
1. The method should provide a rank ordering of candidate designs. At the very least, the method must identify the preferred design, but identifying a single winner alone does not do this. A rank ordering of all candidate design alternatives provides additional insight. It tells which is ``best,'' and gives insights as to the ordering of the remaining alternatives.
2. The method should not impose preferences on the designer; that is, the preferences applied should be those of the designer. A basic tenet of all decision theory is that each decision maker has a right to his/her own preferences (note the Cat's advice to Alice, ``That depends...on where you want...''). Hence, any method that imposes preferences is suspect; one cannot be assured of getting what is desired out of a design unless the method leaves preferences unconstrained.
3. The method should accommodate uncertainty and risk. In their absence, the outcome of the selection would be known completely and with perfect precision; in short, there would be no excuse for one to make a bad engineering design decision. The fact is that uncertainty and risk pervade all engineering design, and they are a major reason why a formal selection method is needed.
4. The method should identify the best alternative for the organization performing the design and manufacture for the product or system in question. In other words, the result should be given by the method; the method should not be selected to obtain the desired result. Since the purpose of the method is to find the result, it should be clear that the method should not depend on the result. Although this statement sounds facetious, (Saari 1995) shows that, for many selection cases, any result can be obtained through an appropriate choice of method.
5. The rank ordering produced by the method on a set of alternatives S = {A, B, C, D, ...} should not change when the method is applied to any reduced set SR, such as {C, D, ...} or {B, D, ...} or {D, ...}, etc. When this property is violated, the preferred material can flip between A and B based on which other alternatives remain under consideration.
6. The method should make the same recommendation regardless of the order in which the alternatives are considered. If the recommendation depends on the order in which the alternatives are considered, then it would be necessary to know the ``correct'' result in order for the method to get the ``correct'' result. Said another way, the method would produce random results.
7. The method itself should not impose constraints on the design or the design process. Added constraints assure that the payoff cannot be greater than it would be without the constraints imposed, and it is generally less. Some constraints, such as the laws of nature, are immutable. They must be accepted. On the other hand, some constraints are arbitrary, and one would do better to reject their imposition. Any method that arbitrarily imposes constraints on the design or the design process can impose a loss in the value of the outcome.
8. The method should be such that the addition of a new design alternative does not make existing alternatives appear less favorable. The valuation of existing alternatives with respect to any new alternative must not make the decision situation less favorable.
9. Better decision making follows from having better information regarding the decision. Thus, it should be the case that a design selection method always assigns a nonnegative value to information.
10. The method should be self-consistent and logical; that is, it should not contradict itself, and it should make maximum use of available information. A method that lacks these qualities cannot make effective use of information and may frequently make recommendations that are simply wrong.
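The failure modes behind properties 5 and 8 can be made concrete with a small numerical sketch. The following hypothetical example (all alternatives, criterion scores, and weights are invented for illustration) shows a weighted-sum method with column-maximum normalization reversing the rank of two unchanged alternatives when a third, clearly inferior alternative is added:

```python
def weighted_sum_ranking(alternatives, weights):
    """Rank alternatives by weighted sum after normalizing each
    criterion by its column maximum (a common but flawed practice)."""
    maxima = [max(col) for col in zip(*alternatives.values())]
    scores = {name: sum(w * v / m for w, v, m in zip(weights, vals, maxima))
              for name, vals in alternatives.items()}
    return sorted(scores, key=scores.get, reverse=True)

weights = (0.6, 0.4)                         # hypothetical criterion weights
alts = {"A": (10, 1), "B": (6, 5)}           # (performance, reliability) scores
print(weighted_sum_ranking(alts, weights))   # B outranks A

alts["C"] = (2, 10)                          # add a clearly non-preferred option
print(weighted_sum_ranking(alts, weights))   # now A outranks B
```

Because the normalization constants change when C enters the candidate set, the relative standing of A and B flips even though neither alternative changed, which is precisely the pathological behavior Hazelrigg warns about.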
Hazelrigg assesses eight design alternative selection methods against the ten required properties (Hazelrigg 2003). A summary of the results is shown in Table 2-6. Notice that none of the eight methods satisfies all ten properties.
Table 2-6. Design Alternative Selection Methods and the Properties They Fail to Meet (Hazelrigg 2003)

Design Alternative Selection Method | Properties Not Met
Weighted Sum of Attributes   | 2, 3, 9, and, except under restrictive conditions, 1 & 10
Analytical Hierarchy Process | 2, 3, 5, 8, 9, 10, and, except under restrictive conditions, 1
Physical Programming         | 3, 9, and, except under restrictive conditions, 1, 2, & 10
Pugh Matrix                  | 2, 3, 5, 6, 8, 9, 10, and, except under restrictive conditions, 1
Quality Function Deployment  | 2, 3, 9, 10, and, except under restrictive conditions, 1
Taguchi Loss Function        | 2, 9, and, except under restrictive conditions, 1, 3, & 10
Suh's Axiomatic Design       | 2, 3, 7, 9, 10, and, except under restrictive conditions, 1
Six Sigma                    | 2, 3, 7, 9, and, except under restrictive conditions, 1, 10
The opportunities for fixing any one of these eight methods are non-existent in Hazelrigg’s
view.
By and large, these methods cannot be salvaged. The logical errors that lead to their misbehavior are generally central to the construct of the methods themselves, and the changes needed to rectify them are so basic as to alter the methods totally. (Hazelrigg 2003)
Hazelrigg suggests that poor design alternative selection methods are used despite their severe flaws because the shortcomings of these methods are hidden by the complexity of the problems to which they are applied:
Often, such undesirable behaviors are obscured by the complexity of the engineering design process, and lead to poor engineering design decision making in a way that goes undetected. (Hazelrigg 2003)
Additionally, it is argued that these methods are often less taxing to implement than methods involving classical decision theory. Hazelrigg puts this claim into proper perspective:
Often, extant methods are touted for their ``ease of use.'' However, one should note that it is possible to create methods that are quite easy to use so long as it is not important that they provide valid results. Demanding that a method provide valid results often adds to the difficulty of its use. (Hazelrigg 2003)
In summary, Hazelrigg seeks to highlight very clearly the treacherous nature of the path taken when design decisions are made with flawed methods:
The decision sciences comprise a rigorous branch of study that has emerged over the past 300 years, with contributions from brilliant individuals of great fame. Yet, many decision methods developed for engineering design have neglected this body of knowledge, and many elements of the engineering methods are in direct contradiction with well-established principles of the decision sciences. Using these contradictions, and given the rigor of the decision sciences, it is not surprising to find pathological cases for the engineering methods, and to show that many of these methods are quite capable of recommending even extremely poor design alternatives. Indeed, it is even suggested that such behaviors are more the norm than the exception. The fact that any such pathological cases exist should be extreme cause for alarm, as they give clear indication of the failure of the methods themselves. (Hazelrigg 2003)
Multiple Objective Decision Analysis (MODA) offers an alternative grounded in classical decision theory. Parnell et al. provide a thorough treatment of the MODA process (Parnell et al. 2013), while Neches and Madni write about the opportunity to use MODA for enhanced upfront engineering (Neches & Madni 2013), and Buede specifically addresses the use of MODA to inform early requirements (Buede 1994; Buede 1997a; Buede 1997b). Foundational works such as (Keeney 1982; von Winterfeldt 1980; Howard 1988; Keeney & Raiffa 1976; Raiffa 1968; Kirkwood 1992; Howard & Matheson 2005; Howard et al. 1968; Hammond et al. 1998; Bond et al. 2010; Keeney 2012; Raiffa 1969; Keeney & von Winterfeldt 2009; Keeney & Gregory 2005; Kirkwood 1996; Keeney 2002; Keeney & Keeney 2009; Keeney 1974) are still frequently cited, while recent work from Ross, McManus, Rhodes, Hastings, Richards, and O'Neill investigates structured processes for exploring complex tradespaces using value-focused principles (A. M. Ross, O'Neill, et al. 2010; A. M. Ross, McManus, et al. 2010; Adam M. Ross et al. 2010; Ross et al. 2008). Kenley's work on influence diagrams (Shachter & Kenley 1989) and work on cost/schedule models (Kenley & El-Khoury 2012) also have potentially strong ties to MODA design alternative selection methods.
The literature contains discussions of how best to use MODA for multiparty decisions. In the case of weapon system acquisition, for instance, the number of stakeholders is high and access is limited, as the stakeholders are often high-level decision makers sprawled across the DoD enterprise; stakeholders often do not speak with one voice, yet only one decision will be made. In his PhD dissertation, Ross identifies methods for measuring robustness when the decision authority behaves as a single Demander with multiple personalities. Such individuals may have multiple or conflicting preferences, with the expression of "dominant" values changing over time. Congress as a single entity can be considered of this class, since each individual member may have differing and conflicting preferences, yet the decisions articulated by Congress may or may not remain stable over time as "dominant" Congressmen become the voice for the body. Treatment of Congress as a multiple-Demander case may provide insight into the problem as well, but reality lies somewhere in between (single-demander effects, but multiple-demander preferences). Carl Spetzler pushes a similar idea forward in "The Biggest Challenge," arguing that such multiparty situations must become a part of the decision professional's domain of expertise. Most recently, Ralph Keeney addresses this topic in a paper for the journal Decision Analysis.
2.3.9 The Digital Thread & Lessons Learned From Executive Information Systems
Applicable literature within the Information Technology discipline tends to center around lessons learned from Executive Information System (EIS) implementation experiences. Such lessons may prove very applicable to those creating and implementing a decision analysis process and associated support tool for use with systems engineering trade-off analyses, such as the vision described as the "Digital Thread" (West & Pyster). Executive Information Systems provide fast and easy access to information from a variety of sources, both internal and external to the organisation, and are easily customisable. EIS implementation in the 1990s received mixed reviews, and because of the high costs associated with such endeavors, "EIS development (was) a high-risk undertaking for many organizations" (Kaniclides & Kimble 1995). The literature (Bussen & Myers 1997; Kaniclides & Kimble 1995) suggests a set of leading factors behind EIS failures.
This list of failure factors is striking in that it could belong to almost any failed development program in any industry. Generally speaking, these failure modes are not unique to EIS development efforts. The first three have to do with poor market research and systems engineering. The fourth may be an issue of too strictly framing the EIS vision to require that executives access information without intermediaries, when some senior executives clearly did not want to work with the product directly. This could have indirectly led to the fifth root cause: a support staff would be sized differently absent the assumption that the executive would interact directly with the EIS. An evolutionary development plan with early prototypes would have helped surface such mismatches sooner, and would also help mitigate the risks associated with the failure mode of the final bullet, as incremental capability improvements could be rolled out within a shorter delivery cycle than the all-at-once development strategy.
Stephen Few is a leading information visualization teacher, writer, and consultant, and he summarizes the purpose and potential benefits of the information visualization field as follows:
Infovis (information visualization), when applied appropriately and done well, helps people think more thoroughly and with greater insight... think more productively...
The goal of helping people "understand information so they can make better decisions" is shared by the research addressed in this dissertation. But beyond merely helping people understand information, this research seeks techniques to generate, collect, store, and combine data using sound logic to produce information meaningful to senior defense decision makers as they work to implement the Directive for Better Buying Power. This research operates at the intersection of defense acquisition, decision analysis, and information visualization to discover and engineer novel information products that are useful to senior defense acquisition decision makers.
Within that context, the intent of this research is to follow the principles and techniques
associated with high quality information visualization as published by the thought leaders
of the information visualization community and apply those principles and techniques to
the decision analysis products to create something new and useful to the defense leadership.
Once again, the words of Stephen Few help reinforce this thought:
How can we guarantee that the tools and techniques that we develop are effective? They
must be designed to reflect what we already know about what works. Once they’re ready,
they must be tested in real-world situations with real world users to confirm that they
produce the desired outcomes and do so better than what already exists. (Few 2008)
Regarding the guiding principles of information visualization, consider Tufte, who lists the attributes possessed by high-quality graphical displays. At the very least, the intent is to assess the quality of the tradespace visualization products against attributes such as these.
This literature review broadly examined academic works across disciplines involved with new product development, each pertaining to at least one aspect of systems engineering trade-off decision practices and theories. The literature review produced a list of citations organized into a taxonomy, and the findings regarding the state of current research were summarized and presented in a conceptual format. This final section of the literature review chapter is subdivided into two sub-sections: one dedicated to the identification of remaining research gaps, and another making the link between these research gaps and the dissertation's research questions and hypotheses.
The literature review revealed that design engineers apply a wide variety of decision methods during design alternative selection activities such as systems engineering trade-off analyses. Few of the methods employed are founded on classical decision theory, and thus many are prone to misguiding the decision maker. To make things worse, even decision methods that are based on classical decision theory can be misused or poorly executed and lead to poor choices. Consequently, many academic critics in the current literature warn of the potential hazards associated with systems engineering trade-off analyses. What was missing, however, was any measure of perceived level of difficulty from the practitioners' point of view. Are the warnings from academics purely "academic," in that practitioners have this figured out and there is no cause for concern? Do all the criticisms in the literature reflect difficulties actually experienced in practice?
Although warnings of potential errors associated with the execution of systems engineering trade-off analyses are scattered throughout the literature, there does not seem to be a comprehensive list of pitfalls tied to decision process steps. Additionally, it appears that there has been no attempt to measure the frequency at which particular errors are encountered, nor to measure the severity of the consequences given that an error does occur.
The literature review revealed that most systems engineering textbooks, handbooks, and standards treat decision methods in only a cursory manner. A couple of publications were the exception in that they made moves toward including decision analysis as a key element throughout the systems engineering process. What was missing, however, was a process map that explicitly integrated elements of the decision analysis process with elements of the systems engineering process. Of course, also missing from the literature was any measure indicating that such a process map would be perceived as beneficial.
There was very little discussion in the literature regarding the most appropriate level of detail for such analyses. There is some discussion in the literature regarding the variety of information visualization techniques available to graphically portray uncertainty; what was missing was any measure of perceived usefulness for such methods applied to systems engineering trade-off analyses.
Table 2-7. Restatement of Research Questions & Hypotheses Cross-walked with Research Gaps

Gap | Qualitative Research Question | Quantitative Hypothesis
1 | Question 1: What is the level of difficulty associated with systems engineering trade-off analyses within the U.S. Defense Acquisition System? | Hypothesis A: A significant percentage of respondents find the number of variables involved with the exploration of emerging requirements and their combined relationship to conceptual materiel systems to be difficult (where a significant percentage is defined as > 70%).
2 | Question 2 | Hypothesis B: ... associated with systems engineering ...
3.1 Overview
Rather than relying solely on quantitative or qualitative research approaches, this research effort employed a two-phase, exploratory sequential and embedded mixed methods approach (Creswell, 2014). In the first phase, the needs of requirements writers and product development decision makers were explored through field observations and direct interviews as they sought to understand requirements. From these case studies, theories for improved processes were generated and a description of the emerging process was documented. In the second phase of this mixed methods study, a larger sample of practitioners was surveyed to help interpret the qualitative findings of the first phase. The survey was largely quantitative, with an open-ended comment question at the end of each grouping.
Survey Monkey (an online survey development and administration tool) was used to create the survey instrument. Subjects were asked to respond to 29 survey questions, some of which had multiple parts. There was one question regarding consent and two questions asking for demographic information regarding discipline and years of experience. Four questions asked the subject to assess the degree to which an observation reflects their own experience. Four questions asked the subject to rate the potential usefulness of contemplated innovations. The survey concluded with eighteen questions regarding potential pitfalls within trade-off analyses: nine asked the subject to indicate how often they have observed various pitfalls, and nine asked the subject to assess the impact to overall study quality if such a pitfall occurs.
Most of the survey questions asked the subject to select the most appropriate rating on a Likert-type scale, with an open-ended question at the end of each grouping to collect qualitative information. The survey was designed so that the total time commitment required of each subject would be on the order of thirty minutes.
Before deploying the survey instrument, the author completed the Stevens Institute of Technology Committees for the Protection of Human Subjects (SIT CPHS) application process, which is required for any research proposing to use human subjects. As part of this process, investigators must discuss how they plan to ensure that each subject's participation is voluntary and free of direct or implied coercion. For this study, the first page of the online survey was a consent form. The consent form explicitly stated that participation in this project is voluntary, that the subject may refuse to participate or withdraw at any time without penalty or loss of benefits to which the subject is otherwise entitled, and that the subject may refuse to answer any question within the survey. To enter the survey, however, the subject had to explicitly provide consent at the end of the consent form page.
If consent was denied, the survey page logic brought the subject to the thank-you page and allowed the subject to exit the survey. Note that the page logic required an answer to this question; the subject was not allowed to proceed into the survey without explicitly granting consent.
The subjects' privacy and confidentiality were maintained by administering the survey online, not collecting names, and not using embedded event-capturing software, video, or audio recording of any kind. The survey did not pose any risk to the subjects: no risk of physical discomfort or harm; no risk of psychological harm, stress, or discomfort; no risk of legal action such as criminal prosecution or civil sanctions; no risk of harm to social status such as loss of friendship; and no risk of harm to employment status. The survey did not use deceptive techniques, did not veil the actual purpose of the study, did not use private records, did not manipulate psychological or social variables, did not probe for personal information, and did not present materials that might be considered offensive.
While the subjects did not directly benefit from participation, their participation helped with the development of a decision support process tailored to the needs of the defense acquisition community seeking to understand the range of potential system-level cost, schedule, and performance outcomes.
The Stevens Institute of Technology Committees for the Protection of Human Subjects (SIT CPHS) approved the application to proceed with the survey.
The online survey was deployed to systems engineers and operations researchers involved in defense acquisition through a link posted on five groups within the professional social media site LinkedIn: the INCOSE Group, MBSE Group, DAU Alumni Group, MORS Group, and INFORMS Group. The link to the online survey was also sent directly via email to members of the Military Operations Research Society (MORS) with known activity in this research area.
The survey was deployed on February 1, 2015 and was closed on March 31, 2015. Over this two-month period, 184 participants entered the disclosure page, of which 181 completed the survey. Because survey participants were allowed to skip questions, confidence intervals associated with response statistics are developed for each specific question in Chapter 4. The subject population comprises systems engineers and operations researchers engaged in the defense acquisition domain. The size of the subject population is estimated to be on the order of 4,000 people, given that INCOSE membership is approximately 10,000 people (www.incose.org/about/membership) working across at least ten domains including automotive,
mass transit, urban planning, power & energy, communication systems, information systems, healthcare, hospitality, aerospace systems, and defense systems, with aerospace systems and defense systems identified as core domains (www.incose.org/docs/default-source/wgcharters/infrastructure.pdf?sfvrsn=8). Therefore, of the 10,000 systems engineers within INCOSE, it is reasonable to believe that between 2,000 and 3,000 focus most of their efforts on defense systems. Combining this estimate with the 2,122 members of the Military Operations Research Society (MORS) group within LinkedIn, and adjusting the sum to account for the fact that some members of MORS may also be members of INCOSE, yields the estimate of 4,000 systems engineers and operations researchers engaged in the defense acquisition domain.
It should be mentioned that the U.S. Department of Defense has 39,000 positions coded as systems engineers; the number of DoD employees that perform true systems engineering tasks is unknown but likely far fewer than 39,000. As seen in Table 3-1, the 181 respondents represent a meaningful sample of the true population of systems engineers and operations researchers engaged in the defense acquisition domain across a range of population estimates.
Table 3-1. Confidence Interval at 95% Confidence Level for Sample Size of 181 Across Estimates of Population

Population (thousands):       2     4     6     8     10    12    14    16    18    20
Confidence Interval (+/- %):  6.95  7.12  7.17  7.20  7.22  7.23  7.24  7.24  7.25  7.25
The confidence intervals were generated using the confidence interval calculator found on the Creative Research Systems site (www.surveysystem.com/sscalc.htm), using inputs of a sample size of 181, a population ranging from 2,000 through 20,000, and an assumption that 50% of the sample will select a particular answer. This 50% is considered worst case and most appropriate for generating a general confidence interval associated with a multi-question survey for a given sample size.
As shown in Table 3-1, if the original estimate of the true population of 4,000 is correct, a sample size of 181 allows one to be 95% confident that the results reflect the perceptions of the true population within a margin of error of about ±7.12%. Chapter 4 will present confidence intervals associated with the response statistics for each specific question, but this general calculation shows that the sample obtained adequately represents the population of interest.
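The values in Table 3-1 can be reproduced with the standard margin-of-error formula for a proportion with a finite population correction. The sketch below is illustrative (the function name and structure are the author's, not the online calculator's implementation), but it matches the tabulated figures:

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """Margin of error (in percent) at 95% confidence for a proportion,
    with the finite population correction applied."""
    standard_error = math.sqrt(p * (1 - p) / n)           # worst case at p = 0.5
    fpc = math.sqrt((population - n) / (population - 1))  # finite population correction
    return 100 * z * standard_error * fpc

# sample of 181 drawn from an estimated population of 4,000
print(round(margin_of_error(181, 4000), 2))  # 7.12, matching Table 3-1
print(round(margin_of_error(181, 2000), 2))  # 6.95
```

Note that the margin of error grows only slightly as the assumed population rises from 2,000 to 20,000, which is why the population estimate does not need to be precise for the sample to be adequate.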
169 participants answered the question regarding discipline. As shown in Figure 3-2, all but 22 of the 169 identified their discipline as systems engineering, operations research, or some combination of the two, with a majority of this sample associating more strongly with systems engineering. Twenty-two of the respondents selected "other" and identified their disciplines in a write-in field; even within these twenty-two respondents, the reported disciplines are closely related to the disciplines of interest.
Of the 181 respondents, 15 skipped the question regarding years of experience. Figure 3-3 shows that of the ten experience categories, the most common was 25-29 years, with 88% of respondents having at least 5 years of experience and 76% having at least 10 years of experience.
Survey question number four provided the participants with an observation informed by the case study research conducted in the first phase of this two-phase research effort, as summarized in the previous chapter. To help interpret the observation and perhaps generalize results to the population of all early defense acquisition activity, survey question number four asked respondents to indicate the degree to which the following observation matched their own experience.
Observation: Success rates of defense acquisition programs are lower than desired. Many view the requirements writing process as a root cause of some undesired acquisition outcomes. The Department of Defense has recently revised its acquisition instructions to call for trade-off analyses that compare system level alternatives across a thorough set of stakeholder objectives, to include life-cycle costs.
Figure 4-1 shows that 86% of the 139 participants that responded to question number four indicated that their observations were somewhat, very, or extremely similar to the survey author's observations. With the sample size of 139, one can be 95% confident that 86%, within the question-specific margin of error, of the true population would respond that their observations were at least somewhat similar to the survey author's observations.
Figure 4-1. Degree to which respondents’ observations matched survey author’s observations
regarding recent changes in defense acquisition process (survey question 4).
Qualitative comments were provided by 41 of the 140 participants that responded to this question. The qualitative statements provided in the comment section generally amplified the quantitative results by giving anecdotal information such as, "I have witnessed several acquisition programs where the initial requirements were impossible to achieve at the current state of the art, a fact that would have been revealed [earlier by trade-off analysis]." Another respondent noted that requirements "documents, in general, do a poor job of justifying the rationale for a requirement and look more like technical specifications vice operational requirements. The net result is removal of almost all trade space." One respondent laments, "In my experience, few engineers understand the difference between performance and effectiveness. And fewer still know when to apply performance, mission, or campaign M&S." About a dozen respondents used the comment section to caution against concluding that the requirements writing process was the only root cause for poor acquisition outcomes, and offered other potential root causes. A few respondents expressed a desire for more clarity within the survey question itself and called for a more directly worded question.
As with the previous survey question, survey question number seven provided the participants with an observation informed by the case study research conducted in the first phase of this two-phase research effort, as summarized in the previous chapter. To help interpret the observation and perhaps generalize results to the population of all early defense acquisition activity, survey respondents were asked to indicate the degree to which the following observation matched their own experience.
Observation: Defense acquisition professionals are eager to comply with the new defense acquisition instructions, but the large number of variables involved with the exploration of emerging requirements and their combined relationship to conceptual materiel systems causes compliance with the new instructions to be difficult. Consequently, requests are growing for tools and techniques that can help model the relationship between physical design decisions and the consequences of those decisions at the system level, as measured across all stakeholder objectives.
Figure 4-2 shows that approximately 81% of the respondents found their own experience to be somewhat similar, very similar, or extremely similar to the observation that defense acquisition professionals found the large number of variables involved with the exploration of emerging requirements and their combined relationship to conceptual materiel systems causes compliance with the new instructions to be difficult. With the sample size of 135, one can be 95% confident that 81% ± 6.5% of the true population (estimated to be 4,000 as discussed in Chapter 3) would respond that their observations were at least somewhat similar to the survey author's.
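The ±6.5% figure quoted above can be checked with the same finite-population formula used in Chapter 3, this time evaluated at the observed proportion rather than the worst-case 50%. The sketch below (function name is the author's, for illustration) assumes that is how the margin was computed:

```python
import math

def proportion_margin(p, n, population, z=1.96):
    """Margin of error (in percent) at 95% confidence for an observed
    proportion p, with the finite population correction applied."""
    fpc = math.sqrt((population - n) / (population - 1))
    return 100 * z * math.sqrt(p * (1 - p) / n) * fpc

# 81% of 135 respondents, from an estimated population of 4,000
print(round(proportion_margin(0.81, 135, 4000), 1))  # 6.5
```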
Figure 4-2. The degree to which survey respondents’ experience matched the survey author’s
observation that defense acquisition professionals found compliance with the new instructions to be
difficult and are consequently seeking new tools and processes (survey question 7)
Qualitative comments were provided by 33 of the 136 participants that responded to this question. Many of the qualitative statements supported the quantitative results, suggesting that a strong majority of those surveyed sense that acquisition professionals are indeed struggling with the complexities of informing requirements. For example, one respondent writes, "program / project managers are constantly asked to understand the relationship between requirements and cost. It's not a simple measure to make, if at all even possible."
Qualitative comments show that the question of whether such struggles are leading to an increase in requests for tools and processes drew a more varied response. One respondent writes, "I believe the first part of the observation. I haven't seen the second half." Another survey taker seems to agree that the demand for such tools is not necessarily on the rise, not because there is no need but because affected parties are slow to react, stating, "Requests lag actual need. Very few Program Managers are consciously concerned with the complexity, intricacy, and rigor of true and honest trade analysis to guide program decisions. But the availability of good decision analysis tools and techniques can help change the practice of looking to solve design implementation problems with more money and more time." A different respondent sees increased interest among acquisition professionals: "I see program managers being more and more interested in tools that support integrated system level architectures (integrated behavior allocated to physical system level architecture, with a value model against user requirements)."
Others took the time to acknowledge an observed upward trend in chatter surrounding tools
of this nature. One survey participant writes, “(the) trend in the last five years has been to increase reliance
on tools of this nature. The problem with getting tools prepared to assist in the process is
that tools take time and resources to develop. These are often not within the purview of
the organization dealing with the acquisition.” Another states, “I work directly with
[…] someone brings a new ‘tool’ to the director for portfolio management, decision analysis,
dashboarding, etc. Also, I see new policies coming out that require more from systems
engineering, but there are never enough resources to tackle all of it.” A separate survey
taker sees it this way, “In my experience acquisition professionals want compliance with
the instructions to occur by magic – it isn’t anyone’s job (which is why it wasn’t happening)
and it still isn’t anyone’s job, despite instructions and exhortations.” Bureaucratic and
security concerns sometimes also present hurdles to the introduction of new tools as
discussed by one of the survey takers, “Modeling and simulation is one of the least
understood and most under-utilized tool sets in the acquisition tool box (actually most of
those tool boxes include almost no real tools in that domain). Better tools are needed
however bureaucratic and legitimate security requirements make this very difficult in
practice.”
One respondent warns that even the best of tools would have limited impact on decision
quality. As this respondent sees it, “Several other things would also have to happen even
if the tools were available. (1) Sufficient time & resources need to be allocated to the
analyses – they usually aren’t; (2) Leadership would need to spend adequate time and
become more deeply involved in the decision making process – they probably won’t / can’t
because of constraints on their time; and (3) Political pressures would need to be mitigated.”
A list of 40 potential pitfalls was compiled by combining findings from the literature
(… 1997; Edwards, Miles, & von Winterfeldt, 2007; Parnell et al., 2013; Parnell, Cilli, & Buede,
2014) with personal experience gained in the first phase of this research effort. Survey
questions 12 – 29 asked the subject to assess the likelihood and consequence of these 40
potential pitfalls associated with systems engineering trade-off analyses. The risk of the
40 potential pitfalls was compiled by using the mode of the response distribution for each
question for each potential pitfall. Figure 4-3 depicts the cluster of perceived risk that these
potential pitfalls present; respondents judged some of the potential pitfalls to pose a higher
risk than others but found none of them to be low risk. Table 4-1 lists each potential pitfall
and identifies the perceived risk it poses to systems engineering trade-off analyses.
Full questions and response distributions for each are provided in the appendix.
The number of subjects that responded to these questions regarding likelihood and
consequence of the potential pitfalls ranged from 124 to 132 respondents. Using 124 as a
worst case sample size, the population estimate of 4,000, and the approximation that 50%
of the sample selected the answer identified as the mode, one can be 95% confident that
the results reflect the perceptions of the true population within a margin of error of about
± 8.7%. In some cases, this margin of error could be enough to cause the mode of the true
population to differ from the mode of the sample. However, upon examination of the
response distributions for each question, one sees that this margin of error will at most
cause the mode to change by only one category left or right on the likelihood or severity
scale. Examination of Figure 4-3 reveals that such movement will not perturb the
overarching observation that some of the potential pitfalls were thought to be higher risk
than others but none were low risk. This result indicates that the community has a long
way to go in reducing the risk these pitfalls pose to trade-off study quality.
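The confidence intervals reported throughout this chapter follow the standard formula for the margin of error of a sample proportion with a finite population correction; the sketch below is illustrative, assuming the survey calculator cited in the footnote uses the normal approximation, and it reproduces the ±8.7%, ±6.5%, and ±4.5% figures reported above:

```python
import math

def margin_of_error(n, big_n, p=0.5, z=1.96):
    """Margin of error for a sample proportion at 95% confidence (z = 1.96),
    with a finite population correction for population size big_n."""
    standard = z * math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((big_n - n) / (big_n - 1))  # finite population correction
    return standard * fpc

# Worst-case pitfall questions: n = 124, conservative p = 0.5, population ~4,000
print(round(margin_of_error(124, 4000) * 100, 1))          # 8.7
# Question 7: n = 136 respondents, observed p = 0.81
print(round(margin_of_error(136, 4000, p=0.81) * 100, 1))  # 6.5
```

Note the worst-case assumption p = 0.5 maximizes p(1 − p), which is why the pitfall questions carry the widest interval.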
Figure 4-3. Cluster of perceived risk that potential pitfalls present to a systems engineering trade-off
analysis (survey questions 12-29)
Table 4-1. 40 Potential Pitfalls and Their Perceived Risk to Systems Engineering Trade-Off Analyses
Similar to some of the other survey questions discussed above, survey question number
eight provided the participants with a research focus statement informed by the literature
review and case study research conducted in the first phase of this two phase research effort
as summarized in the Research Method section of this dissertation. To help gauge the
potential usefulness of contemplated analytical processes and methods that may help
reduce the risk to decision quality posed by these potential pitfalls, survey respondents
were presented with the following statement:
A literature review reveals that Multiple Objective Decision Analysis (MODA) techniques
can be applied in a way that provides a conceptual link between the voice of the customer,
the voice of the engineer, the voice of the cost analyst, etc. The literature also reveals,
however, that a wide variety of MODA based models have been created and applied to
product design problems over the past decade with varying degrees of success. This
research contemplates an approach tailored to support those seeking to understand the range
of potential system level cost, schedule, and performance outcomes of contemplated
requirements. The research aims to identify potential pitfalls and best practices associated with the execution
of a trade study process, introduce innovations as needed, and apply all findings to the
integrated SE/MODA process. The research focus described here was identified while
working with several acquisition projects. In order to help interpret the potential
usefulness of the research product if successful, please consider the following: If faced
with the task of facilitating fully informed requirements writing, how useful would it be to
have an integrated SE/MODA process supported by tools and techniques that can help model
the relationship between physical design decisions and the consequences of those decisions
across system level cost, schedule, and performance elements of stakeholder value?
Figure 4-4 shows that about 92% of the respondents to question eight indicated that an
integrated SE / MODA process supported by tools and techniques that can help model the
relationship between physical design decisions and the consequences of those decisions
across system level cost, schedule, and performance elements of stakeholder value would
be somewhat, very, or extremely helpful. With the sample size of 133 one can be 95%
confident that 92% ± 4.5% of the true population (estimated to be 4,000 as discussed in
Chapter 3) would respond that contemplated tools and techniques would be helpful.9
9
<http://www.surveysystem.com/sscalc.htm>
Figure 4-4. Perceived usefulness of an ISEDM process supported by tools and techniques to help model
the relationship between physical design decisions and the consequences of those decisions across
system level cost, schedule, and performance elements of stakeholder value (survey question 8)
Qualitative comments were provided by 39 of the 133 participants. These remarks suggest
that although a process and tools to help model these relationships could potentially be
very helpful, a healthy skepticism regarding the likelihood of proper execution prevails.
For example, one respondent writes, “It would be a God-send” and another echoes the
sentiment with the statement, “I actually taught the need for this very thing at the Air Force
Institute of Technology in the mid-1980’s.” Some comments voiced the fear that pursuing
such tools would run the risk of replacing the voice of engineering talent and experience.
The concern is that, as with any model, the output is only as good as the input provided
and sometimes a slick model masks poor quality of input data. As one respondent puts it,
“…until the base level practitioners actually KNOW what physical equations govern
relationships and have a good cost basis, then there is little a meta process can do to correct ….”
Survey question number nine provided the participants with a process map of an ISEDM
process tailored for informing requirements similar to Figure 4-5 shown below.
Figure 4-5: Integrated Systems Engineering Decision Management (ISEDM) Process Map
An earlier version of this process map appears in the SE Body of Knowledge (SEBoK,
2015). The description of the process map was provided as part of survey question number
nine shown below. The development of this process map was informed by the literature
review and case study research conducted in the first phase of this two phase research effort
as summarized in the previous chapter. In order to help interpret the potential usefulness
of the research product, the survey provided the following description:
The ISEDM process illustrated in Figure 4-5 combines the structured, holistic view
fundamental to the systems engineering discipline with the analytical rigor of the operations
research discipline. The white text within the outer green ring identifies elements of a systems
engineering process while the ten blue arrows represent the ten steps of the Decision
Management Process. Interaction between the systems engineering process and the Decision
Management Process are represented by the small, dotted green or blue arrows. The circular
shape of the process map is meant to convey the notion of an iterative process with significant
interaction between the process steps. The feedback loops seek to capture new information
regarding the decision task at any point in the decision process and make appropriate
adjustments. The ISEDM process tailored for informing requirements described here was
drafted while working with several acquisition projects. In order to help interpret the potential
usefulness of the research product to the population of SE and OR military professionals, please
consider the following: If faced with the task of facilitating fully informed requirements writing,
how useful would it be to have a process map such as the one illustrated in Figure 4-5?
Figure 4-6 shows that 91% of the respondents perceived the process map of the integrated
SE/MODA process as having the potential to be somewhat, very, or extremely helpful.
With the sample size of 132 one can be 95%
confident that 91% ± 4.8% of the true population (estimated to be 4,000 as discussed in
Chapter 3) would respond that the contemplated ISEDM process tailored for informing
requirements would be helpful. This survey finding supports the INCOSE effort to include this
integrated SE/MODA process in the INCOSE SE Handbook (2015) and the SE Body of
Knowledge (2015).
Figure 4-6. Perceived usefulness of a process map of the integrated systems engineering decision
management (ISEDM) process tailored for informing requirements (survey question 9)
Qualitative comments were provided by 37 of the 132 participants that answered this
question. The statements reveal that although the quantitative data show that a majority of
the respondents found the process map to be very or extremely helpful, some respondents
found the map to be too simplistic while others found it too complex. Others cautioned
that the systems engineering discipline is generally perceived as too process oriented and
such maps run the risk of reinforcing such beliefs if not skillfully presented.
For example, one of the survey takers who thought it a bit too simplistic wrote, “To
be more helpful to a requirements writer, this needs to be decomposed further into working
level tasks, understand inputs/outputs and roles/responsibilities for each task, etc. But
would need to be careful to remember the ultimate goal is to develop requirements, not
comply with a model/process.” On the other side of the spectrum, one of the respondents
that thought it was too complex commented, “This diagram has more than 30 pieces of
information on it, ignoring the dotted arrows and other contextual information. I believe
that no-one can sufficiently process this, unless they’ve developed the model themselves.”
Another wrote, “A map is very helpful. But I think there are too many steps ….”
All in all, many of the qualitative statements echoed the quantitative data. As one
participant said, “Many organizations say they perform trade studies, but they are
inconsistently performed both within and across organizations. A process would help.”
Similarly, another participant stated, “If this were to be implemented I see great value, but
the current reality is much different.” Finally, consider the comment from one of the
participants “This is a good, high level process map. As with most high level pictorial
constructs, it doesn’t show the feedback loops between the steps – but it does a good job
of showing the general process and helps to level set expectations for decision makers as
to what steps are involved and how long the process may take.”
Survey question number ten provided the participants with an illustration of a trade-space
visualization called a stakeholder value scatterplot.
The description of this particular trade-space visualization approach was provided as part
of survey question number ten shown below and covered in more detail in chapter 6. The
development of this trade-space visualization was informed by the literature review and
case study research conducted in the first phase of this two phase research effort as
summarized in the Research Method section of this dissertation. In order to help interpret
the potential usefulness of the research product to the population of SE and OR military
The stakeholder value scatterplot shows in one chart how all system level alternatives
respond with regards to lifecycle costs, performance, development schedule, and long term
viability. Seeing the entire picture all at once is potentially important because finding a
solution that meets one or two of the stakeholder values is trivial but the real world
challenge of selecting alternatives in a way that best balances the competing trades is often
quite difficult. An alternative’s lifecycle cost and performance value are indicated by a marker’s
position, an alternative’s development schedule is indicated by the color of the marker, while
the long term viability for a particular alternative is indicated by the shape of the marker as
described in the legend of the illustration of Figure 2. How useful do you think a scatterplot
like the one illustrated in Figure 2 could potentially be to those seeking to inform
contemplated requirements?
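As a sketch of how such a four-dimensional encoding might be prototyped, the snippet below plots marker position for lifecycle cost and performance value, marker color for development schedule, and marker shape for long term viability. The alternatives, scores, and exact encoding choices are illustrative assumptions, not data or figures from this study:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

# Hypothetical alternatives: (name, lifecycle cost $M, performance value 0-1,
#                             development schedule in months, long term viability)
alternatives = [
    ("Alt A", 110, 0.72, 30, "high"),
    ("Alt B", 145, 0.85, 48, "medium"),
    ("Alt C", 90, 0.55, 24, "low"),
]
shape_for = {"high": "o", "medium": "^", "low": "s"}  # viability -> marker shape

fig, ax = plt.subplots()
for name, cost, value, months, viability in alternatives:
    # Position encodes cost and performance; color encodes schedule; shape encodes viability
    sc = ax.scatter(cost, value, c=[months], cmap="viridis",
                    vmin=24, vmax=48, marker=shape_for[viability], s=120)
    ax.annotate(name, (cost, value), textcoords="offset points", xytext=(6, 6))
fig.colorbar(sc, ax=ax, label="Development schedule (months)")
ax.set_xlabel("Lifecycle cost ($M)")
ax.set_ylabel("Performance value")
fig.savefig("stakeholder_value_scatterplot.png")
```

Separate scatter calls per alternative are used here only so each point can carry its own marker shape; a legend mapping shapes to viability categories would complete the chart.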
Figure 4-8 shows that 91% of the respondents perceived the stakeholder value scatterplot
as having the potential to be at least somewhat helpful. With the sample size of 132 one
can be 95% confident that 91% ± 4.8% of the true population (estimated to be 4,000 as
discussed in Chapter 3) would respond that the stakeholder value scatterplot would be helpful.
Figure 4-8. Perceived usefulness of a stakeholder value scatterplot (survey question 10)
Qualitative comments were provided by 41 of the 132 participants that answered this
question. The statements regarding this visualization were generally very positive, but some
cautioned that if the underlying value model and associated scoring criteria were not present
then such a visualization would be unhelpful. Others called for a robust treatment of
uncertainty on the chart, while others called for the ability to drill down into the underlying data.
Some of the negative comments included statements like “Too much information is
presented in a format that is not intuitively obvious.” Another comment of this type is,
“Must it be this complex? Often simply interviewing stakeholders will give you great
insight.” One survey taker complained, “I have serious doubt that input values for this type ….”
For the most part, however, the comments were very positive. A respondent wrote, “This
diagram presents three or four concepts for the audience to grasp. Easy, simple and very
[…] enter Engineering & Manufacturing Development.” Someone else commented, “This type
of chart can be very helpful – properly done, it can be briefed at high level.”
Survey question number eleven provided the participants with an illustration of a trade-
space visualization called a stakeholder value scatterplot with uncertainty. This illustration
is shown in Figure 4-9. The description of this particular trade-space visualization approach
was provided as part of survey question number eleven shown below and described in more
detail in Chapter 6. The
development of this trade-space visualization was informed by the literature review and
case study research conducted in the first phase of this two phase research effort as
summarized in the Research Method section of this dissertation. In order to help interpret
the potential usefulness of the research product to the population of SE and OR military
professionals, the following description accompanied the question: Figure 4-9 shows an
example of how one might visualize the degree to which uncertainty
impacts performance value. Uncertainty present in a defense acquisition trade study often
stems from the fidelity of single dimensional consequence estimates and from the
understanding that stakeholders may not all share a common view of weightings associated
with each single dimensional consequence used as part of the aggregating function. A
visualization like the one in Figure 4-9 shows decision makers the degree of volatility in
the trade space. How useful do you think a scatterplot like the one illustrated in Figure 4-9
could potentially be to those seeking to inform contemplated requirements?
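One simple way to generate the performance-value uncertainty described above is to propagate weight disagreement through an additive value model by sampling: draw many plausible weight vectors around a baseline and record the spread of each alternative's aggregate value. This is a hedged sketch; the alternatives, scores, baseline weights, and the choice of a Dirichlet distribution to model stakeholder disagreement are all assumptions, not the method used in this dissertation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical single-dimensional value scores per alternative (0-1 scale)
scores = {
    "Alt A": np.array([0.9, 0.4, 0.7]),
    "Alt B": np.array([0.6, 0.8, 0.5]),
}
baseline_weights = np.array([0.5, 0.3, 0.2])  # must sum to 1

# Model stakeholder disagreement as a Dirichlet centered on the baseline;
# a larger concentration means stakeholders agree more closely.
concentration = 50
weight_samples = rng.dirichlet(baseline_weights * concentration, size=5000)

for name, s in scores.items():
    values = weight_samples @ s  # aggregate value under each sampled weighting
    p10, p50, p90 = np.percentile(values, [10, 50, 90])
    print(f"{name}: median {p50:.3f}, 10th-90th percentile [{p10:.3f}, {p90:.3f}]")
```

The resulting percentile bands are exactly what an uncertainty scatterplot would draw on the performance-value axis for each alternative; the same sampling idea extends to cost and schedule if those estimates also carry distributions.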
Figure 4-10 shows that 92% of the respondents perceived the stakeholder value scatterplot
with uncertainty as having the potential to be at least somewhat helpful. With the sample
size of 131 one can be 95% confident that 92% ± 4.6% of the true population (estimated to
be 4,000 as discussed in Chapter 3) would respond that the value scatterplot with
uncertainty would be helpful.
Figure 4-10. Perceived usefulness of the stakeholder value scatterplot with uncertainty
(survey question 11)
Qualitative comments were provided by 34 of 131 participants that answered this question.
Statements showed a strong appreciation for the display of uncertainty on the performance
value axis and called for uncertainty to be shown in at least the cost dimension and if
possible the schedule dimension. Some outliers sensed that this type of visualization has
the potential to do more harm than good, “Seems to me that this makes it more complicated
– more uncertainty not less; … an opportunity to those with a vested interest to bias the
data.” Most of the qualitative comments had to do with visualizing uncertainty in all the
dimensions, not just the performance dimension. One commenter wrote, “Even better. But
the uncertainty is likely to extend along all the axes, not just one.”
Chapter 4 provided the results of a survey administered to systems engineers and operations
researchers within the defense acquisition domain. The survey was developed to discover
the perceived level of difficulty associated with compliance to the revised defense
acquisition system mandate for early systems engineering trade-off analyses, to measure
perceived likelihood and impact of potential pitfalls within systems engineering trade-off
studies, and to gauge the potential usefulness of Decision Management processes and
methods that may help reduce the risk to decision quality posed by these potential pitfalls.
The survey instrument was designed using Survey Monkey and was deployed through a
link posted on several groups within LinkedIn and was also sent directly via email to those
with known experience in this research area. The survey was open for a two month period
and collected responses from 181 participants. The data showed that approximately 81%
of the respondents found the large number of variables involved with the exploration of
emerging requirements and their associated lifecycle costs difficult to manage, and
characterized each of the 40 potential pitfalls as a medium or high risk to
systems engineering trade-off study quality. 91% of the respondents indicated that an
integrated systems engineering decision management process would be helpful and 92%
of the respondents found the stakeholder value scatterplot and the stakeholder value
scatterplot with uncertainty helpful. Several conclusions follow from the perceptions of
those who participated in the survey. The new defense acquisition changes will require more trade-
off analyses. The systems engineers and analysts believe that trade-off studies are currently
not well planned or well executed. All 40 pitfalls were thought to occur often and have
negative consequences. The integration of SE and MODA offers potential to help improve
trade-off analyses. The process in the INCOSE SE Handbook and SEBoK is a step in the
right direction and has the potential to help avoid many of the identified pitfalls as does a
forthcoming textbook. The plots of performance value vs. life cycle cost, development
schedule, and long term viability are useful. Incorporating uncertainty into those same
charts provides an even more useful trade-space visualization. Table 5-1 provides a
crosswalk between potential pitfalls and best practices identified in Chapter 6 that may help
avoid each pitfall. Trade-off analyses are complex. The systems engineers and analysts
performing the analysis may need education and training in MODA theory and best SE
practices to develop the expertise to implement the best practices in SE trade-off analyses.
Table 5-1. Crosswalk Between Potential Pitfalls & Pitfall Avoidance Measures of Chapter 6
6.1 Introduction
Many systems engineering decisions are difficult in that they include multiple competing
objectives, significant uncertainty, and high accountability. In these cases, good decision
making requires a formal decision management process for “… characterizing and evaluating
a set of alternatives for a decision at any point in the life cycle and select[ing] the most
beneficial course of action.” This chapter aligns with the structure
and principles of the Decision Management Process Section of the INCOSE Systems
Engineering Handbook v4.0 (INCOSE SE Handbook Working Group, 2015) and presents
the decision management process steps as described therein, and it expands on the SEBoK
10
This chapter is an extended version of material submitted to SEBoK.
SEBoK authors. 2015. "Decision Management," in BKCASE Editorial Board. 2015. The Guide to the Systems
Engineering Body of Knowledge (SEBoK), v. 1.4. R.D. Adcock (EIC). Hoboken, NJ: The Trustees of the
Stevens Institute of Technology ©2015, Released 29 June 2015,
http://sebokwiki.org/w/index.php?title=Decision_Management&oldid=50860 (accessed June 16, 2015).
BKCASE is managed and maintained by the Stevens Institute of Technology Systems Engineering Research
Center, the International Council on Systems Engineering, and the Institute of Electrical and Electronics
Engineers Computer Society
11
This chapter is an extended version of material submitted to Chapter 5.3 of the INCOSE SE Handbook
INCOSE (2015). Systems Engineering Handbook: A Guide for System Life Cycle Process and Activities (4th
ed.). D.D. Walden, G.J. Roedler, K.J. Forsberg, R.D. Hamelin, and T.M. Shortell (Eds). San Diego, CA:
International Council on Systems Engineering, Published by John Wiley & Sons, Inc.
section on Decision Management (SEBoK authors, 2015). Building upon that foundation,
this chapter adds a significant amount of text and introduces many illustrations to provide
a fuller description of the decision management process, which transforms a decision
situation into a recommended course of action and associated implementation plan. The
process is executed by a resourced decision team that consists of a decision maker with full
responsibility, authority, and accountability for the decision at hand, a decision analyst with
a suite of reasoning tools, subject matter experts with performance models, and a
representative set of end users and other stakeholders (Parnell, Bresnick, Tani, & Johnson,
2013). The decision process is executed within the policy and guidelines established by
the sponsoring agent. The formal decision management process realizes this transformation
through a structured set of activities described in the balance of this chapter. Note the
process presented here does not replace the engineering models, performance models,
operational models, cost models, and expert opinion prevalent in many enterprises but
rather complements such tools by synthesizing their outputs in a way that helps decision
makers thoroughly compare relative merits of each alternative in the presence of competing
objectives and uncertainty (Buede, 2009; Parnell, Driscoll, & Henderson, 2011).
Models are central to systems analysis and trade-off analysis. A decision support model is
a composite model that integrates outputs of otherwise separate models into a holistic
view of the trade space. The decision support model helps decision maker(s) overcome
cognitive limits without oversimplifying the decision problem.
Early in the life cycle, inputs to the decision management process are often little more than
broad statements of the decision situation. As such, systems engineers should not expect
highly refined inputs early in the process. In later stages of the system life cycle, the inputs
usually include models and
simulations, test results, and operational data. Opportunities to use a decision management
process as part of a systems engineering trade-off analysis arise throughout the system life cycle.
The ultimate output of the decision management process should be a recommended course
of action and associated implementation plan provided in the form of a high quality
decision report. The decision report should communicate key findings through effective
trade-space visualizations and through analyses that are repeatable and traceable. As
decision makers seek to understand root causes of top level observations and build their
own understanding of the tradeoffs, the ability to rapidly drill down from top level
trade-space visualizations into lower level analyses and data becomes invaluable.
The decision analysis process as described in Parnell, Bresnick, Tani, and Johnson (2013)
and Parnell, Driscoll, and Henderson (2011) can be summarized in ten process steps: (i) frame
decision and tailor process, (ii) develop objectives and measures, (iii) generate creative
alternatives, (iv) assess alternatives via deterministic analysis, (v) synthesize results,
(vi) identify uncertainty and conduct probabilistic analysis, (vii) assess impact of uncertainty,
(viii) improve alternatives, (ix) communicate tradeoffs, and (x) present recommendation and
implementation plan.
Applying this decision process to a new product development context calls for the
integration of this process with the systems engineering process. The systems engineering
process provides the holistic, structured thinking perspective required for the design and
development of complex systems while the analytics based decision process provides the
mathematical rigor needed to properly represent and communicate reasoning and produce
meaningful visualization of the trade-space. Figure 6-2 provides a process map of this
analytical decision process integrated with six of the systems engineering technical
processes described in the INCOSE Systems Engineering Handbook V4. This integrated
process is described in the remainder of this chapter.
Figure 6-2: Integrated Systems Engineering Decision Management (ISEDM) Process Map
The white text within the outer green ring of Figure 6-2 identifies systems engineering
processes & methods while the ten blue arrows represent the ten steps of the analytical
decision process. Interaction between the systems engineering processes and the decision
process are represented by the small, dotted green or blue arrows. These interactions are
described below.
The focus of the process is to find system solutions that best balance competing objectives
in the presence of uncertainty as shown in the center of Figure 6-2. This single focus is
important as it can be argued that all systems engineering activities should be conducted
within the context of supporting good decision making. If a systems engineering activity
cannot point to at least one of the many decisions embedded in a system’s lifecycle, one
must wonder why the activity is being conducted at all. Positioning decision management
as central to systems engineering activity will ensure the efforts are rightfully interpreted
as relevant and meaningful and thus maximize the discipline’s value proposition to new
product development efforts.
The decision analysis process is an iterative process with an openness to change; it adapts
as understanding of the decision and the trade-space emerges with each activity. The
circular shape of the process map is meant to convey the notion of an iterative process with
significant interaction between the process steps. The feedback loops seek to capture new
information regarding the decision task at any point in the decision process and make
appropriate adjustments.
Table 6-1 provides a crosswalk between the systems engineering terms used in Figure 6-2,
the corresponding systems engineering technical processes, and the section of the INCOSE
Systems Engineering Handbook V4 devoted to each term.
The ISEDM process can be used for trade-off analyses encountered across the systems
development life cycle, tailored to the particulars of the decision situation. Figure 6-3
adds the ISEDM process icon several times to the generic life-cycle model put forth in the
INCOSE Systems Engineering Handbook.
Figure 6-3. Tradeoff studies throughout the systems development life cycle
The first step of the decision management process is to frame the decision and to tailor the
decision process. To help ensure the decision makers and stakeholders fully understand
the decision context and to enhance the overall traceability of the decision, the systems
engineer should capture a description of the system baseline as well as a notion for how
the envisioned system will be used along with system boundaries and anticipated
interfaces. Decision context includes such details as the timeframe allotted for the
decisions, an explicit list of decision makers and stakeholders, available resources, and
expectations regarding the type of action to be taken as a result of the decision at hand as
well as decisions anticipated in the future. (Edwards et al. 2007) The best practice is to
identify a decision problem statement that defines the decision in terms of the system life
cycle. Next, three categories of decisions should be listed: decisions that have been made,
decisions to be made now, and subsequent decisions that can be made later in the life cycle.
Once the decision at hand is sufficiently framed, systems engineers must select the
analytical approach that best fits the frame and structure of the decision problem at hand.
For deterministic problems, optimization models can explore the decision space. However,
when there are “… clear, important, and discrete events that stand between the
implementation of the alternatives and the eventual consequences…” (Edwards, Miles Jr.,
& Von Winterfeldt, 2007), a decision tree is a well suited analytical approach, especially
when the decision structure has only a few decision nodes and chance nodes. As the
number of decision nodes and chance nodes grow, the decision tree quickly becomes
unwieldy and loses some of its communicative power. However, decision trees remain well
suited when the consequences of alternatives can be readily monetized and end state
consequences can be expressed in dollars, euros, yen, etc. When the potential consequences
of alternatives within a decision span multiple competing objectives that cannot readily be
monetized, the best practice for this type of problem is the multiple objective decision
analysis (MODA) approach.
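For the monetized case, the standard analysis is expected-value rollback: average over chance nodes, maximize over decision nodes. A minimal sketch follows; the tree structure, alternative names, probabilities, and payoffs are hypothetical, not drawn from the case study in this dissertation:

```python
def rollback(node):
    """Evaluate a decision tree node.
    A leaf is a number (a monetized end state).
    ("decision", {name: child}) takes the best child;
    ("chance", [(prob, child), ...]) takes the probability-weighted average."""
    if isinstance(node, (int, float)):
        return node
    kind, body = node
    if kind == "decision":
        return max(rollback(child) for child in body.values())
    if kind == "chance":
        return sum(p * rollback(child) for p, child in body)
    raise ValueError(f"unknown node kind: {kind}")

# Hypothetical choice: develop in-house vs. buy commercial off-the-shelf ($M net benefit)
tree = ("decision", {
    "develop in-house": ("chance", [(0.6, 120), (0.4, -30)]),  # success / failure
    "buy COTS":         ("chance", [(0.9, 60), (0.1, 10)]),
})
print(rollback(tree))  # prints 60.0: develop in-house (0.6*120 + 0.4*(-30)) beats 55.0
```

As the text notes, this stays readable only while the node count is small; the same rollback logic still works on larger trees, but its communicative power fades.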
The decision management method most commonly employed by systems engineers is the
trade study, which often employs some form of MODA approach. The aim is to define,
measure, and assess shareholder and stakeholder value and then synthesize this information
to facilitate the decision maker’s search for an alternative that represents the optimally
balanced response to often competing objectives. Major system projects often generate
large amounts of data from many separate analyses performed at the system, subsystem,
and component levels. Each analysis delivers one dimension of the decision at hand, one
piece of the puzzle that the decision makers are trying to assemble. These analyses may
have varying assumptions, and may be difficult to combine. Absent a formal process, a
decision maker might attempt to aggregate system level data for all alternatives across all
dimensions of the trade space in his or her head. This would prove to be an ill-fated task,
as all decision makers and
stakeholders have cognitive limits that preclude them from successfully processing this
amount of information in their short term memory (Miller 1956). When faced with a deluge
of information that exceeds human cognitive limits, decision makers may be tempted to
oversimplify the trade space by drastically truncating objectives and/or reducing the set of
alternatives under consideration, but such oversimplification runs a high risk of generating
a poor decision.
By providing techniques to decompose a trade decision into logical segments and then
synthesize the parts into a coherent whole, a formal decision management process offers
an approach that allows the decision makers to work within human cognitive limits without
oversimplifying the trade space. By decomposing the decision into smaller elements,
experts can provide assessments of how alternatives perform
within the objective associated with their area of expertise. Buede and Choisser put it this
way:
These component parts can be subdivided as finely as needed so that the total expertise of
the system design team can be focused, in turn, on specific, well-defined issues. The
analyses on the component parts can then be combined appropriately to achieve overall
results that the decision makers can use confidently. The benefits to the decision maker of
using this approach include increased objectivity, less risk of overlooking significant
factors and, perhaps most importantly, the ability to reconstruct the selection process in
detail.
MODA approaches generally differ in the techniques used to elicit values from
stakeholders, the use of screening techniques, the degree to which an alternative's responses
to objectives (and sub-objectives) are aggregated, the mathematics used to aggregate such
responses, the treatment of uncertainty, the robustness of sensitivity analyses, the search
for improved alternatives, and the versatility and quality of trade space visualization
outputs. If time and funding allow, systems engineers may want to conduct tradeoff studies
using several techniques, compare and contrast results, and reconcile any differences to
ensure findings are robust. Although there are many possible ways to specifically
implement MODA, the discussion contained in the balance of this chapter represents a
short summary of best practices. An anticipated future paper will apply various MODA
techniques to the small unmanned aerial system (sUAS) case study introduced in the following paragraphs. Note
that the author of this dissertation has no first-hand experience in aircraft design but was
able to create an unclassified yet plausible and sufficiently rich example by distilling the
technical ideas presented in the 805 page textbook by Dr. Jay Gundlach, Designing
Unmanned Aircraft Systems. The textbook was used to inform physical architecture descriptions of the notional sUAVs and the stakeholder
requirements created for the case study that follows, but no attempt was made to rigorously derive the
performance estimates for the sUAV concepts within the case study of this chapter.
For this thought experiment, assume that the military is contemplating the start of a new
effort to develop the next generation sUAS and a lead systems engineer has been tasked
with conducting a systems engineering trade-off analysis to inform requirements generation. The lead systems engineer is told that the future system
will be employed in the same operational context as current systems are now, but instead of operating at altitudes of 500 – 1000 feet, the sUAS
will be expected to operate at an altitude of 3,000 feet in order to avoid airspace conflicts
with military helicopters and to reduce the likelihood of being detected by enemy forces.
The lead engineer was also told the new capability should be operational within seven years.
The lead systems engineer is excited about the opportunity but is a bit anxious about the
ambiguity surrounding the problem statement. To curb some of the anxiety, he begins by
asking some clarification questions regarding decision timeframe, system boundaries, and
expectations regarding affordability. He learns that the final report is due in twelve months
with executive level reviews scheduled every quarter and preliminary study findings
expected by the third review. He also learns that the system boundaries include the air
vehicle, the ground elements, and the communication links between them. With regards
to affordability, he was told not to initially discard concepts on a cost basis but rather collect
rough order of magnitude life cycle cost estimates for each concept and show the cost vs.
performance vs. schedule relationship for each. With this information and the information
from similar trades being conducted elsewhere in the portfolio, the executive decision
board will determine appropriate affordability goals for the next phase of the future sUAS
development effort.
Armed with the initial framing of the trade at hand, the lead systems engineer begins
wondering how the goodness of system alternatives should be defined. The next section
of this chapter addresses the best practices associated with developing objectives and
measures and is immediately followed by the continuation of the sUAV case study.
Defining how a decision will be made may seem straightforward, but often becomes an
arduous task of seeking clarity amidst a large number of ambiguous stakeholder need
statements. The first step is to use the information obtained from the Stakeholder
Management Processes to develop objectives and measures. If these processes have not
been started, then stakeholder analysis is required. Often this begins with reading
documentation on the decision topic followed by a visit to as many decision makers and
stakeholders as reasonable and facilitating discussion about the decision problem. This is
best done with interviews and focus groups with subject matter experts and stakeholders.
For systems engineering trade-off analyses, top-level stakeholder value often includes
competing objectives of performance, development schedule, life cycle costs, and long
term viability. For corporate decisions, shareholder value would be added to this list. With
the top level objectives set, lower levels of the objectives hierarchy should be discovered. For
the functional performance objectives, a functional decomposition (usually done as part of the requirements and architectural design processes)
is a natural starting point: identify inputs and outputs of the system of interest and craft a succinct top level
functional statement about what the system of interest does, identifying the action
performed by the system of interest to transform the inputs into outputs. As portrayed in
Figure 6-4, test this initial list of fundamental objectives for key properties by checking
that each fundamental objective is essential and controllable and that the set of fundamental
objectives is complete, nonredundant, concise, specific, and understandable (Edwards et
al. 2007).
Beyond these best practices, the creation of fundamental objectives is as much an art as it
is a science. This part of the decision analysis process clearly involves subjectivity. It is
important to note, however, that a subjective process is not synonymous with an arbitrary
one:
Subjective aspects are a critical part of decisions. Defining what the decision is and
coming up with a list of objectives, based on one's values, and a set of alternatives are by
nature subjective processes. You cannot think about a decision, let alone analyze one,
without addressing these elements. Hence, one cannot even think about a decision without
being subjective.
The output of this process step takes on the form of a fundamental objectives hierarchy as
illustrated in Figure 6-5.

Figure 6-5. Fundamental Objectives Hierarchy (Stakeholder Value decomposed into Objective 1: Performance, Objective 2: Development Duration, Objective 3: Life Cycle Cost, and Objective 4: Long Term Viability, each further decomposed into sub-objectives such as Objective 1.1.3 and Objective 1.3.3)
For completeness it is often helpful to build a crosswalk between the objectives hierarchy
and any stakeholder need statements or capability gap lists as illustrated in Figure 6-6. This
activity helps ensure that all stakeholder need statements or capability gaps have been
covered by at least one objective. It also aids in identifying objectives that do not directly
trace to an expressed need. Such objectives will need additional explanation to justify their
inclusion in the hierarchy. It should be noted, however, that it is common to have such
objectives included in a hierarchy because stakeholders are often silent about needs that
are currently satisfied by the incumbent system but would not be happy if, in an effort to
fill a perceived need, the new system created a new gap. For example, up-armored vehicles
closed a protection gap but introduced new gaps in mobility and transportability that
stakeholders had previously taken for granted.
[Crosswalk matrix: objectives OBJ 1.1 through OBJ 3.3 (columns) mapped against Capability Gaps 1 through 5 and Business Needs 1 through 3 (rows), with an x marking each objective that covers a given need.]
Figure 6-6. Crosswalk Between Fundamental Objectives and Stakeholder Need Statements
For each fundamental objective, a measure (also known as an attribute, criterion, or metric)
must be established so that alternatives that more fully satisfy the objective receive a better
score on the measure than those alternatives that satisfy the objective to a lesser degree.
A high quality measure is unambiguous, comprehensive, direct, operational, and
understandable (Keeney & Gregory 2005). Table 6-3 defines these properties of a high
quality measure.
Property - Definition
Operational - In practice, information to describe consequences can be obtained and value trade-offs can reasonably be made.
Understandable - Consequences and value trade-offs made using the measure can readily be understood and clearly communicated.
Keeney has identified three types of measures - natural measures (kilometers, degrees,
probability, seconds, etc.), constructed measures (Dow Jones Industrial Average, Heat
Index, Consumer Price Index, etc.), and proxy measures (usually a natural measure of an
associated means objective). Keeney recommends a natural measure whenever possible since they tend to be commonly used
and widely understood. When no natural measure exists and constructing one is impractical,
then a proxy measure using a natural scale is often workable, although by definition it is an
indirect measure of the fundamental objective. A central concept of MODA (drawn from value
theory) is the transformation from measure space to value space that enables mathematical
aggregation across unlike measures; this transformation
is performed through the use of a value function. Value functions describe returns to scale
on the measure. In other words, value functions describe the degree of satisfaction
stakeholders perceive at each point along the measure scale.
There are several techniques available to elicit value functions and priority weightings from
stakeholders. One of the more popular techniques used in marketing circles is Conjoint
Analysis (Green, Krieger, Wind, 2001) where stakeholders are asked to make a series of
pair-wise comparisons between hypothetical products. For decisions that involve fewer
objectives, the number of pair-wise comparisons required to formulate the representative
value scheme is manageable, but it grows rapidly as objectives are added. Given the number of
objectives typical of systems engineering trade studies, the value scheme elicitation approach described in this chapter is a direct
one.
When creating a value function, one first ascertains whether stakeholders believe there is a
walk-away point on the objective measure scale (x-axis) and maps it to 0 value on the value
scale (y-axis). A walk-away point is defined as the measure score at which, regardless of how
well an alternative performs on other measures, the decision maker will walk away from
the alternative. Working with the stakeholder, find the measure score beyond which an
alternative provides no additional value, label it "meaningful limit" (also called ideal), and
map it to 100 (1 and 10 are also common scales) on the value scale (y-axis). If the returns
to scale are linear, connect the walk-away value point to the meaningful value point with a
straight line. If there is reason to believe stakeholder value behaves with non-linear returns
to scale, pick appropriate inflection points and draw the curve. The rationale for the shape
of the value functions should be documented for traceability and defensibility (Parnell et
al, 2011). Figure 6-7 provides examples of some common value function shapes.
Practice suggests that eliciting two end points and three intermediate points provides
sufficient fidelity:
Walk Away: stakeholder will dismiss an alternative if it fails to meet at least this level, regardless
of how it performs on other value measures (0 points).
Marginally Acceptable: stakeholder begins to become interested; beyond this point the
perceived value increases rapidly (10 points).
Target: desired level (50 points).
Stretch Goal: improving beyond this point is considered gold plating, so there is very little available
value between this point and the meaningful limit (90 points).
Meaningful Limit: theoretical limit or known practical limit beyond which scores would be considered
nonsense (100 points).
In an effort to capture the voice of the customer, systems engineers will often ask
stakeholders to rank the objectives by importance. Importance alone, however, is not
sufficient to determine weights:
Most important decisions involve multiple objectives, and usually with multiple-objective
decisions, you can't have it all. You will have to accept less achievement in terms of some
objectives in order to achieve more on other objectives. But how much less would you accept
to achieve how much more?
The mathematics of multiple objective decision analysis requires that the weights
depend on both the importance of the (preferentially independent) measure and the range of the
measure (walk-away to stretch goal or ideal). A useful tool for determining weightings is
the swing weight matrix. For each measure, consider its importance by determining whether the
measure relates to a defining, critical, or enabling capability,
and also consider the variation in the measure range by considering the gap between the current
capability and the desired capability, and put the name of the measure in the appropriate
cell of the matrix. Swing weights are then assigned to each measure according to the
required relationship rules described in Figure 6-8. Swing weights are then converted to
measure weights by normalizing such that the set sums to one (Parnell et al, 2011). For
the purposes of swing weight matrix use, consider a defining capability to be one that
directly traces to a verb/noun pair identified in the top level (level 0) functional definition
of the system of interest - the reason the system exists. Consider enabling capabilities to
trace to functions that are clearly not the reason the system exists but somehow allow the
core functions to be executed more fully. Let critical capabilities be those that fall in
between: more essential than enabling capabilities but not themselves the reason the system
exists.
All decisions involve elements of subjectivity; the distinctive feature of a formal decision
management process is that these subjective elements are rigorously documented so that
the consequences can be identified and assessed. Toward this end, it is considered good
practice to document, for each measure, how it is measured, its priority weighting, and its
value function.
Returning to the sUAV case study example, we find that after some quality time with many
of the stakeholders and hours digging through white papers, memos, and presentations
relevant to the problem statement, the lead systems engineer spearheaded a requirements
analysis effort that included a functional decomposition. This exercise helped the systems
engineering trade-off study team understand and articulate what the system of
interest is expected to "do," which in turn enabled them to construct the functional
performance objectives shown in the objectives hierarchy of Figure 6-9. Note that
stakeholder value is measured not only in terms of functional performance, but also life
cycle costs, and development schedule. (Although not addressed here due to space
considerations, the notion of long term viability fits well within an objectives hierarchy
such as this).
Figure 6-9. sUAS Objectives Hierarchy (Stakeholder Value decomposed into 1 Functional Performance, 2 Development Duration, and 3 Life Cycle Costs; functional performance sub-objectives include 1.1.1 Avoid Impeding Soldier Endurance, 1.1.2 Avoid Impeding Soldier Sprint, 1.2.1 Reach Areas of Interest Quickly, 1.2.2 Reach Distant Areas of Interest, 1.3.1 Be Responsive to a Variety of ISR Data Requests, 1.3.2 Collect High Quality Imagery During Daytime, 1.3.4 Collect High Quality Imagery in Obscured Environments, 1.4.1 Exchange Info Across Various Terrains & Geometries, 1.4.2 Send ISR Data Quickly & Reliably, 1.5.1 Enable High Probability of Recovery, and 1.5.2 Render System Useless Upon Enemy Capture)
With the objectives hierarchy in hand, the study team identified measures for each of the
lowest level objectives, as shown in Table 6-4.
Table 6-4. sUAS Functional Performance Objectives and Measures

1.1 Be Soldier Transportable
  1.1.1 Avoid Impeding Soldier Endurance - Measure: % decrease in sustainable march speed
  1.1.2 Avoid Impeding Soldier Sprint - Measure: % increase in soldier sprint time
  1.1.3 Avoid Impeding Soldier Jump - Measure: % degradation in soldier jump height
1.2 Maneuver to and Dwell at Area of Interest
  1.2.1 Reach Areas of Interest Quickly - Measure: Max flight speed (km/hour)
  1.2.2 Reach Distant Areas of Interest - Measure: Maximum operational range (km)
  1.2.3 Dwell at Area of Interest for Extended Periods - Measure: Operational endurance (hours)
1.3 Collect ISR Info
  1.3.1 Be Responsive to a Variety of ISR Data Requests - Measure: ISR Data Request Responsiveness Index
  1.3.2 Collect High Quality Imagery During Daytime - Measure: TTP rating per NV-IPM @ 3000 m, full light
  1.3.3 Collect High Quality Imagery at Night - Measure: TTP rating per NV-IPM @ 3000 m, low light
  1.3.4 Collect High Quality Imagery in Obscured Environments - Measure: TTP rating per NV-IPM @ 3000 m with smoke
1.4 Securely Exchange Info with Command Station
  1.4.1 Exchange Info Across Terrains & Geometries - Measure: BLOS comms capable? (Y/N)
  1.4.2 Send Large Volumes of Data Quickly & Reliably - Measure: High data rate payload comm link? (Y/N)
  1.4.3 Avoid Spoofing, Jamming, Intercept - Measure: Digital C2 link? (Y/N); Digital payload comm link? (Y/N)
1.5 Be Recoverable & Tamper Resistant
  1.5.1 Enable High Probability of Recovery - Measure: Subjective assessment of landing scheme
  1.5.2 Render System Useless Upon Enemy Capture - Measure: Command self-destruct feature? (Y/N)
Not shown in Table 6-4 are the measures for the Life Cycle Cost objective and the
Development Schedule objective; these two measures are discussed here. The life cycle
cost measure for this hypothetical case study is the sum of the rough order of magnitude
estimates for development costs, procurement costs, training costs, maintenance costs, and
wartime costs. Schedule duration for this exercise is measured as the number of years that
the development effort requires, estimated at the 80% confidence level after considering the
uncertainty associated with the duration estimate for each configuration item to mature
from its current state to form, fit, and function tested across temperatures, plus the estimate
for system level integration and testing.
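An 80%-confidence duration of this kind can be sketched as a simple Monte Carlo simulation. The configuration items, the triangular duration distributions, and the assumption that items mature in parallel followed by serial integration are all illustrative assumptions, not the dissertation's actual estimating approach.

```python
import random

# Monte Carlo sketch of the schedule measure described above: total
# development duration at the 80% confidence level. Each configuration
# item's maturation time is modeled with a triangular distribution
# (low, mode, high, in years); all numbers below are hypothetical.

random.seed(0)

config_items = {                    # (low, mode, high) in years
    "propulsion_system": (0.5, 1.0, 2.0),
    "ir_imager":         (1.0, 2.0, 4.0),
    "payload_data_link": (0.5, 1.5, 3.0),
    "autopilot":         (0.5, 1.0, 2.5),
}
integration_and_test = (1.0, 1.5, 3.0)  # system level integration estimate

def total_duration():
    # Assume items mature in parallel, then integration runs serially,
    # so total = slowest item + integration time.
    longest = max(random.triangular(lo, hi, mode)
                  for lo, mode, hi in config_items.values())
    lo, mode, hi = integration_and_test
    return longest + random.triangular(lo, hi, mode)

samples = sorted(total_duration() for _ in range(10_000))
p80 = samples[int(0.8 * len(samples))]   # 80th percentile of total duration
print(f"80% confidence development duration: {p80:.1f} years")
```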
With a good understanding of how each objective is to be measured, the lead systems
engineer worked to understand the degree of satisfaction that each stakeholder perceives at
each point along a particular measure scale and then expressed these relationships as a set
of value functions. To accomplish this, the lead systems engineer of the sUAV effort
worked with stakeholders to identify, for each measure, a walk-away point, a marginally
acceptable point, a target point, a stretch goal, and a meaningful limit. The lead systems
engineer repeated this process for several stakeholder groups in order to capture any
differences among value schemes. The lead systems engineer maintained a record of each
set of value functions so that he could use them as part of the sensitivity analysis later in the
process. Table 6-5 and Figure 6-10 describe the value functions associated with the fifteen
functional performance measures as elicited from one of the stakeholder groups. Life cycle cost and schedule duration
measures are not included in Table 6-5 due to space considerations.
Table 6-5. Value Function Points for the Functional Performance Measures (one stakeholder group). For each measure, the elicited Walk-Away, Marginally Acceptable, Target, Stretch Goal, and Meaningful Limit scores map to 0, 10, 50, 90, and 100 value points respectively; for example, Dwell at Area of Interest (hours) was elicited at 1, 2, 4, 8, and 16 hours, and Collect High Quality Imagery During Day (probability of detection) at 0.1, 0.2, 0.7, 0.9, and 1.0.

Figure 6-10. Value Functions for the Functional Performance Measures
To complete his understanding of stakeholder value, the lead systems engineer set out
to identify the objectives on which the stakeholders were willing to accept marginal returns
in order to achieve high returns on others. By working through a swing weight matrix with
each stakeholder group, the lead systems engineer identified weightings for each functional
performance value measure. The weightings for one of the stakeholder groups are depicted
in Figure 6-11. Weightings developed by other stakeholder groups were also documented
for later use in sensitivity analysis.
With the value schemes of stakeholders captured, the lead systems engineer knows how
goodness will be measured for each alternative considered within the systems engineering
trade-off analysis. The lead systems engineer can now turn his attention to generating
sUAV system alternatives. The next section of this chapter describes best practices for
generating creative alternatives and is followed by the continuation of the sUAV case
study.
For many trade studies the alternatives will be systems composed of many interrelated
subsystems. A best practice is to define a generic product structure for the system of
interest and to apply this product structure consistently throughout the decision analysis.
The product structure should be a useful decomposition of the physical elements of the
system of interest.
Each alternative is composed of specific design choices for each generic product structure
element. These subsystem design choices have system level consequences across the
objectives hierarchy: every subsystem design choice will impact system level cost, system
level development schedule, and system level performance. It is important to emphasize
that these design choices are not fundamental objectives; they are means objectives,
important only to the degree that they influence the fundamental objectives. It may be
useful to think of design choices as the levers used by the system architect to steer the
system design toward a solution that best satisfies all elements of shareholder and
stakeholder value - the full fundamental objectives hierarchy. These levers are very
important, and care should be given in this step of the process to clearly and completely
identify the specific design choices for each generic product structure element for every
alternative, lest incomplete or inconsistent alternative assessments result in the process
described in the next section. The ability to quickly and accurately communicate the
differentiating design features of given alternatives is a core element of the decision
making exercise.
Note also that some attributes emerge at the system level but are not themselves
fundamental objectives. For example, the weight of the air vehicle is neither a
design decision nor a fundamental objective. Weight is not a design decision but rather a
consequence of all the subsystem design choices made throughout the product structure.
Weight is not a fundamental objective because it is not inherently good or bad, although it
strongly influences outcomes that stakeholders do value, such as endurance and soldier
transportability.
Returning to our sUAV example, the lead systems engineer has identified the following
four top-level elements of the sUAV physical architecture: the air vehicle, the ISR
Collecting Payload, the Communication Links, and the Ground Elements. The sUAV
Physical Architecture Description of Table 6-6 decomposes the four top level physical
elements into generic sub-elements and also provides a list of the specific design choices
considered for each.
Table 6-6. sUAV Physical Architecture Description. For each generic element of the physical architecture - Air Vehicle (propulsion system, energy source, prop size & location, wing span, wing configuration, fin configuration, actuators, airframe material, autopilot, launch mechanism), ISR Collecting Payload (sensor actuation, EO imager, IR imager), Communication Links (command & control link, payload data link), and Ground Elements (antenna, computer, user input device, power) - the table enumerates the specific design choices under consideration: for example, energy sources from Li-Ion batteries to JP-8 fuel; payload data links from a small fixed antenna transmitting analog data directly to the GCS (VHF or UHF) up to a mechanically steerable parabolic dish or electronically steered phased array antenna transmitting digital data to a GEO satellite (Ka or Ku band); IR imagers from 320 x 240 cooled MWIR to 1024 x 768 uncooled MWIR & LWIR; ground computers ranging from a ruggedized laptop to a wearable computer or smartphone; input devices including keyboard, joystick, touchscreen, and stylus; and ground power options including generator, battery plus generator, and battery plus backup batteries.
Using the Physical Architecture Description of Table 6-6, the lead systems engineer and
his study team developed the 12 system level sUAV concepts in Tables 6-7, 6-8, and 6-9.
The team used the table format for describing the alternatives to ensure each system would
be defined completely and consistently across every product structure element.
Table 6-7. sUAV Alternatives: 1 Buzzard I, 2 Buzzard II, 3 Cardinal I, 4 Cardinal II
Subsystem / Component Design Choice Design Choice Design Choice Design Choice
Air Vehicle
Propulsion System Electric 300W Electric 300W Electric 300W Electric 300W
Energy Source Li-Ion Battery Li-Ion Battery Li-S Battery Li-S Battery
Prop Size & Location 18” Rear 18” Rear 20” Rear 20” Rear
Wing Span 5’ 5’ 6’ 6’
Wing Configuration Canard Canard Conventional Conventional
Fin Configuration Inverted V Inverted V Twin Boom Twin Boom
Actuators Electromagnetic Electromagnetic Electromagnetic Electromagnetic
Airframe Material Graphite Epoxy Graphite Epoxy Graphite Epoxy Graphite Epoxy
Autopilot Semi-Auto Semi-Auto Remotely Piloted Remotely Piloted
Launch Mechanism Hand Hand Hand Hand
ISR Collecting
Payload
Sensor Actuation Fixed Fixed Fixed Fixed
EO Imager 4 MP 4 MP 4 MP 4 MP
IR Imager 320 x 240 MWIR 640 x 480 MWIR 320 x 240 MWIR 640 x 480 MWIR
Communication Links
Command & Control Link Fixed VHF Fixed VHF Fixed VHF Fixed VHF
Payload Data Link Fixed VHF Fixed VHF Fixed VHF Fixed VHF
Ground Elements
Antenna Dipole Dipole Dipole Dipole
Computer Laptop Laptop Smartphone Smartphone
User Input Device Keyboard Keyboard Joystick Joystick
Power Battery + Spare Battery + Spare Battery + Spare Battery + Spare
Table 6-8. sUAV Alternatives: 5 Crow I, 6 Crow II, 7 Pigeon I, 8 Pigeon II
Subsystem / Component Design Choice Design Choice Design Choice Design Choice
Air Vehicle
Propulsion System Electric 600W Electric 600W Electric 600W Electric 600W
Energy Source Li-Ion Battery Li-Ion Battery Li-S Battery Li-S Battery
Prop Size & Location 22” Rear 22” Rear 20” Rear 20” Rear
Wing Span 6’ 6’ 6’ 6’
Wing Configuration Tandem Wing Tandem Wing Conventional Conventional
Fin Configuration V Tail V Tail Twin Boom Twin Boom
Actuators MEMS MEMS Electromagnetic Electromagnetic
Airframe Material Graphite Epoxy Graphite Epoxy Graphite Epoxy Graphite Epoxy
Autopilot Semi-Auto Semi-Auto Remotely Piloted Remotely Piloted
Launch Mechanism Hand Hand Hand Hand
ISR Collecting Payload
Sensor Actuation Pan-tilt Pan-tilt Pan-tilt Pan-tilt
EO Imager 8 MP 8 MP 8 MP 8 MP
IR Imager 1280 x 720 MWIR & LWIR cooled 1280 x 720 MWIR & LWIR uncooled 1280 x 720 MWIR & LWIR cooled 1280 x 720 MWIR & LWIR uncooled
Communication Links
Command & Control Link Fixed VHF Fixed VHF Fixed VHF Fixed VHF
Payload Data Link Fixed VHF Fixed VHF Phased Array Ka Phased Array Ka
Ground Elements
Antenna Dipole Dipole Dipole & Dish Dipole & Dish
Computer Laptop Laptop Laptop Laptop
User Input Device Keyboard Keyboard Joystick Joystick
Power Battery + Spare Battery + Spare Battery + Spare Battery + Spare
Table 6-9. sUAV Alternatives: 9 Robin I, 10 Robin II, 11 Dove I, 12 Dove II
Subsystem / Component Design Choice Design Choice Design Choice Design Choice
Air Vehicle
Propulsion System Piston 2.5 HP Piston 2.5 HP Piston 4.0 HP Piston 4.0 HP
Energy Source JP-8 JP-8 JP-8 JP-8
Prop Size & Location 26” Front 26” Front 28” Front 28” Front
Wing Span 8’ 8’ 9’ 9’
Wing Configuration Conventional Conventional Conventional Conventional
Fin Configuration H Tail H Tail Cruciform Cruciform
Actuators Hydraulic Hydraulic Hydraulic Hydraulic
Airframe Material Fiberglass Epoxy Fiberglass Epoxy Fiberglass Epoxy Fiberglass Epoxy
Autopilot Remotely Piloted Remotely Piloted Remotely Piloted Remotely Piloted
Launch Mechanism Tensioned Line Tensioned Line Tensioned Line Tensioned Line
ISR Collecting Payload
Sensor Actuation Pan-tilt Pan-tilt Pan-tilt Pan-tilt
EO Imager 8 MP 8 MP 8 MP 8 MP
IR Imager 1280 x 720 MWIR & LWIR cooled 1280 x 720 MWIR & LWIR uncooled 1280 x 720 MWIR & LWIR cooled 1280 x 720 MWIR & LWIR uncooled
Communication Links
Command & Control Link Fixed VHF Fixed VHF Fixed VHF Fixed VHF
Payload Data Link Elect. Steered Phased Array Ka Elect. Steered Phased Array Ka Mech. Steered Dish Ka Mech. Steered Dish Ka
Ground Elements
Antenna Dipole Dipole Dipole & Dish Dipole & Dish
Computer Laptop Laptop Laptop Laptop
User Input Device Keyboard Keyboard Joystick Joystick
Power Battery + Gen. Battery + Gen. Battery + Gen. Battery + Gen.
The lead systems engineer knows that with the alternatives defined he is ready to start
collecting data regarding each system's response to the measures established earlier in the
process. The next section of this chapter walks through some best practices for assessing
alternatives and is followed by the continuation of the sUAV example.
With objectives and measures established and alternatives identified and defined, the
decision team should engage subject matter experts, ideally equipped with operational data,
test data, models, simulations and expert knowledge. Often a mapping between physical
architecture elements to fundamental objectives may help identify the types of subject
matter expertise needed to fully assess each alternative against a particular objective.
[Mapping matrix: Subsystems A through J (rows) against fundamental objectives OBJ 1.1.1 through OBJ 1.3.3 under Objectives 1 through 4 (columns); an x marks each objective whose assessment a given subsystem informs.]
It may be helpful to grow these simple maps into more informative Assessment Flow
Diagrams (AFDs) that trace the relationships between physical means, intermediate
analyses, and fundamental objectives, as in the sample provided in Figure 6-12 (author's
original graphic). An Assessment Flow Diagram
helps individual subject matter experts understand how their area of expertise fits into the
larger assessment picture, from where inputs to feed their particular model will be coming
and to where their outputs will be consumed. An AFD can be used by the lead Systems
Engineer to organize, manage, and track assessment activities especially when used in
conjunction with the consequence scorecard shown in Table 6-11 and Table 6-12.
In addition to the organization and communication benefits, an AFD seems to provide some
psychological benefits to the subject matter experts (SMEs) conducting the assessments
and to the stakeholders hoping to make use of the results. An AFD gives an SME
confidence that their analysis will not be ignored and sends the message that their expertise
is important and needed and their results will find their way to the decision table in proper
context, as a piece of the whole assessed in terms meaningful to the stakeholder. Likewise,
by showing the pedigree of the data feeding the decision support model, an AFD gives the
stakeholder confidence that the trade-off analysis rests on a solid foundation and not a
patchwork of unsupported guesses.
Figure 6-12. Assessment Flow Diagram (AFD) for a Hypothetical Gun Design Choice Activity
The decision team can prepare for subject matter expert engagement by creating structured
scoring sheets. Assessments of each concept against each criterion can be captured on
separate structured scoring sheets for each alternative/measure combination. Each score
sheet contains a summary description of the alternative under examination and a summary
of the scoring criteria to which it is being measured. The structured scoring sheet should
contain ample room for the evaluator to document the assessed score for the particular
concept against the measure, followed by clear discussion providing the rationale for the
score, noting how design features of the concept under evaluation led to the score as
described in the rating criteria. Whenever possible, references to operational data, test
data, models, and simulations should be cited in support of the assessed score.
After all the structured scoring sheets have been completed for each alternative/measure
combination, it is useful to summarize all the data in tabular form. Each column in such a
table would represent a measure and each row would represent a particular alternative.
Figure 6-11 provides a sample structure of such a table, identified here as a consequences
scorecard.
[Figure 6-11. Sample structure of a consequences scorecard. Columns list the leaf-level objectives (OBJ 1.1.1 through OBJ 1.3.3 under Objective 1, plus Objectives 2, 3, and 4); rows list Alternatives 1 through 6, each with an ID, descriptive name, and illustration. Each cell x_i,j records the assessed consequence of alternative i on measure j.]
Continuing the sUAV case study example, the lead systems engineer recruited a team of
subject matter experts to assess each alternative against a measure that aligns with their
skillset. For instance, human factors experts assessed the impact to soldier mobility,
communication engineers scored the information transmit and receive measures, and
mechanical engineers rated each alternative against the recover measures. Each subject
matter expert was provided with a scoring sheet for recording their findings. The lead
systems engineer took the findings from the scoring sheets and created the consequence
scorecard.
Table 6-12. sUAV Consequence Scorecard (measures grouped under the functions Fly, Collect, Communicate, Be Transported, and End)

ID  Alternative   Consequence scores across the 17 measures
 1  Buzzard I     1  2  3  15 10 1   2   0.3 0.2 0.1 L 30 A 90 Y 1   2
 2  Buzzard II    1  2  3  15 10 1   4   0.4 0.3 0.2 L 30 A 90 Y 1.5 2
 3  Cardinal I    2  4  5  12 12 1.5 5   0.5 0.4 0.3 L 90 A 90 Y 2.3 3
 4  Cardinal II   2  4  5  12 12 1.5 10  0.5 0.4 0.3 L 90 A 90 Y 4   3
 5  Crow I        4  5  8  10 15 5   10  0.9 0.8 0.7 B 20 D 95 Y 5   9
 6  Crow II       3  4  7  9  15 6   10  0.8 0.7 0.6 B 30 D 95 Y 6   9
 7  Pigeon I      5  7  10 8  18 8   10  0.9 0.8 0.7 B 20 D 95 Y 6.5 7
 8  Pigeon II     4  6  9  7  18 9   10  0.8 0.7 0.6 B 30 D 95 Y 7.5 7
 9  Robin I       7  12 17 6  22 10  10  0.9 0.8 0.7 B 20 D 98 Y 8.3 6
10  Robin II      6  11 16 6  22 11  10  0.8 0.7 0.6 B 30 D 98 Y 8.8 6
11  Dove I        10 15 20 4  30 23  10  0.9 0.8 0.7 B 20 D 98 Y 9.3 5
12  Dove II       9  14 19 4  30 24  10  0.8 0.7 0.6 B 30 D 98 Y 9.9 5
With 204 measurements (12 alternatives scored against 17 measures) taken, some would
be tempted to claim success and call it a day, but the lead systems engineer for the sUAV
trade-off analysis knew there was much to be done in order to fully mine this data for
understanding and to communicate the findings and recommendations to the study sponsor
in a way that would lead to action. Of course, making 204 measurements and recording
them in a well-structured data store is no small task and it is better than some of the trade-
study products he’s seen over his career, but the consequence scorecard alone is certainly
not conducive to confident decision making. The next section of this chapter discusses
some of the best practices for synthesizing results for rapid and thorough understanding of
the trade at hand followed by an application of these best practices to the sUAV case study.
At this point in the process the decision team has generated a large amount of data as
summarized in the consequences scorecard. Now it is time to explore the data and display
results in a way that facilitates understanding. Transforming the data in the consequences
scorecard into a value scorecard is accomplished through the use of the value functions
developed in the decision analysis process step described above. Table 6-13 shows the
structure of the value scorecard. To enhance the readability of the value scorecard,
consider associating increments on the value scale with a color
according to heat map conventions. This view can be useful when trying to determine
which objectives are causing a particular alternative trouble. In addition, one can use this
view to quickly see if there are objectives for which no alternative scores well. From this
view, the systems engineer can also see if there is at least one alternative that scores above
the walk-away point for all objectives. If not, the solution set is empty and the decision
team must return to generating additional alternatives.
[Table 6-13. Sample structure of a value scorecard. The layout mirrors the consequences scorecard: rows list the alternatives with ID, descriptive name, and illustration; columns list the leaf-level objectives (OBJ 1.1.1 through OBJ 1.3.3 under Objective 1). Each cell holds v_j(x_i,j), the value-function transform of alternative i's consequence on measure j.]
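The consequence-to-value transformation described above can be sketched in code. The example below builds a piecewise-linear value function from breakpoints; the measure name, breakpoints, and sample score are hypothetical illustrations, not values from the case study.

```python
# Sketch: build a piecewise-linear value function (consequence -> 0-100 value)
# from a list of (consequence, value) breakpoints. Breakpoints are hypothetical.

def make_value_function(breakpoints):
    """breakpoints: (consequence, value) pairs sorted by consequence."""
    def v(x):
        xs = [c for c, _ in breakpoints]
        ys = [val for _, val in breakpoints]
        if x <= xs[0]:
            return ys[0]                      # clamp below the first breakpoint
        if x >= xs[-1]:
            return ys[-1]                     # clamp above the last breakpoint
        for (x0, y0), (x1, y1) in zip(breakpoints, breakpoints[1:]):
            if x0 <= x <= x1:                 # linear interpolation in between
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return v

# Hypothetical value function for an endurance measure (hours; more is better).
endurance_value = make_value_function([(0.5, 0), (1.0, 30), (2.0, 80), (4.0, 100)])

print(endurance_value(1.5))  # 55.0: halfway between the 30 and 80 breakpoints
```

Applying one such function per measure to every cell of the consequences scorecard yields the value scorecard.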
Radar graphs and tornado graphs (Figures 6-13 and 6-14) are popular visualization
techniques for displaying the same value data captured in a heat-indexed value scorecard
in graphical form.
Returning to the sUAV example, the lead systems engineer has transformed the
consequence table into a value scorecard through the use of the value functions created in
the second step of this process. Notice how the conditional formatting of the scorecard
makes the strengths and weaknesses of each alternative very apparent. It also quickly
highlights objectives that are non-discriminating and objectives that are difficult to achieve.
Table 6-14. sUAV Value Scorecard (measures grouped under the functions Fly, Collect, Communicate, Be Transported, and End)

ID  Alternative   Value scores (0-100) across the 17 measures
 1  Buzzard I     95 93 92 1  1  1   10  18 10 1  10 10 10 1  90 100 90
 2  Buzzard II    95 93 92 1  1  1   37  26 18 10 10 10 10 1  90 95  90
 3  Cardinal I    90 81 83 10 5  6   50  34 26 18 10 1  10 1  90 86  77
 4  Cardinal II   90 81 83 10 5  6   100 34 26 18 10 1  10 1  90 63  77
 5  Crow I        63 72 63 30 10 60  100 90 70 50 90 30 90 50 90 50  6
 6  Crow II       77 81 70 40 10 70  100 70 50 42 90 10 90 50 90 37  6
 7  Pigeon I      50 54 50 50 18 90  100 90 70 50 90 30 90 50 90 30  23
 8  Pigeon II     63 63 57 60 18 91  100 70 50 42 90 10 90 50 90 17  23
 9  Robin I       23 10 8  70 29 92  100 90 70 50 90 30 90 90 90 7   37
10  Robin II      37 19 10 70 29 94  100 70 50 42 90 10 90 90 90 3   37
11  Dove I        1  1  1  90 50 100 100 90 70 50 90 30 90 90 90 1   50
12  Dove II       6  4  3  90 50 100 100 70 50 42 90 10 90 90 90 1   50
The lead systems engineer was pleased with the value scorecard and rushed to show the
emerging results to the study sponsor and several other stakeholders. The feedback he
received was very positive and encouraging but all asked for the cost/schedule/performance
trade to be more explicitly shown. The next section in this chapter covers the ins and outs
of aggregating value across objectives.
Beyond the consequence scores for each alternative on each measure, all that was needed
to construct the visualizations covered thus far were the value functions associated with
each objective. Introducing the weighting scheme, the systems engineer can create
aggregated value scores. The first step in calculating aggregated value is a prescreen:
any alternative that fails to meet a walk-away point on any objective measure has its
aggregated value set to zero regardless of how it performs on other objective measures.
For those alternatives that pass the walk-away prescreen, the additive value model12 uses
the following equation to calculate each alternative's total value:
12. The additive model assumes preferential independence. See Keeney & Raiffa (1976) and Kirkwood (1997) for additional models.
v(x) = \sum_{i=1}^{n} w_i v_i(x_i)

where the weights are normalized such that \sum_{i=1}^{n} w_i = 1.
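A minimal sketch of this aggregation, including the walk-away prescreen described above, is shown below; the weights and marginal value scores are hypothetical.

```python
# Sketch: additive value model v(x) = sum_i w_i * v_i(x_i), with a walk-away
# prescreen that zeroes any alternative scoring below the walk-away point
# on any measure. Weights and marginal values below are hypothetical.

def total_value(weights, marginal_values, walk_away=0.0):
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    if any(v < walk_away for v in marginal_values):
        return 0.0  # fails the prescreen regardless of other scores
    return sum(w * v for w, v in zip(weights, marginal_values))

weights = [0.5, 0.3, 0.2]                      # one weight per objective measure
print(total_value(weights, [80, 60, 90]))      # weighted sum of marginal values
print(total_value(weights, [95, 5, 95], walk_away=10))  # 0.0: one score below walk-away
```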
This chapter is devoted to the pragmatic application of the aggregation technique, but a
thorough treatment of the mathematical foundation for the additive value model is
provided by Keeney (1981), Stewart (1996), and Von Winterfeldt et al. (1986). Before moving
on, however, it is important to understand the properties that must be satisfied by the
components of the additive value function for proper use. Stewart (1996) provides a
summary of these properties:
(1) The marginal values v_i(x_i) must lie on an interval scale of preferences, i.e., equal
increments in v_i(x_i) must correspond to equally preferred changes in criterion i.
(2) The marginal values must also satisfy 'additive independence', i.e., the value
gained by a fixed increment in v_i(x_i) for some criterion, when performance levels
of all other criteria are unchanged, should not depend on the fixed levels of the
other criteria.
(3) Irrespective of how the weights are actually assessed, they must be consistent
with the scaling of the marginal value functions, i.e.,

w_i / w_k = \Delta_k / \Delta_i

where a gain of \Delta_i in marginal value on criterion i is equally preferred to a gain of \Delta_k on criterion k.
With the weights in hand, one can construct aggregated visualizations such as the value
component graph as shown in Figure 6-14. In a value component graph, each alternative's
total value is represented by the total length of a segmented bar. Each bar segment
represents the contribution of the value earned by the alternative of interest within a given
objective.
The heart of a decision support process for systems engineering trade analysis is the ability
to integrate otherwise separate analyses into a coherent, system level view that traces
consequences of design decisions across all dimensions of stakeholder value. Figure 6-16
presents such a view, relating each alternative's functional performance value to its life
cycle cost, development schedule, and long term viability. Each system alternative is
plotted as a marker: development duration is indicated by the color of the marker per heat
map conventions shown in the legend, while the long term viability of a particular
alternative is indicated by the shape of the marker.
Resuming the sUAV case study example, the lead systems engineer is anxious to respond
to the study sponsor feedback and provide a more explicit representation of the cost,
schedule, and performance trade between the alternatives under consideration. Making use
of the additive value model and the weighting scheme described in Figure 6-11 along with
the value scorecard of Table 6-14, the lead systems engineer is able to create the aggregated
visualizations that follow.
[Figure. Value component (stacked bar) graph of functional performance value for the sUAV alternatives. Each alternative's bar (Buzzard I through Dove II) is segmented by objective; labeled segments include "Communicate", "Avoid Spoofing, Jamming, or Intercept", "Enable High Probability of Recovery", and "Render System Useless upon Enemy Capture".]
[Figure. Stakeholder value scatterplot: functional performance value (0 to 100) versus life cycle cost ($B, 0 to 12) for the twelve sUAV alternatives, Buzzard I/II, Cardinal I/II, Crow I/II, Pigeon I/II, Robin I/II, and Dove I/II.]
The lead systems engineer was excited about the value scatterplot and once again hurried
to show this visualization to the study sponsor and several other stakeholders. The
feedback he received was again glowing, with agreement that this particular visualization
clearly showed the cost/schedule/performance trade. However, this time the study sponsor
asked to understand how variations in priority weightings would impact the results. The
next section in this chapter covers the best practices associated with identifying
uncertainty and assessing its impact on the decision at hand.
As part of the assessment, it is important for the subject matter expert to explicitly discuss
potential uncertainty surrounding the assessed score and variables that could impact one or
more scores. One source of uncertainty that is common within system engineering trade
off analyses that explore various system architectures is technology immaturity. System
design concepts are generally described as a collection of subsystem design choices but if
some of the design choices include technologies that are immature, there may be lack of
detail associated with component level design decisions that will eventually be made
downstream during detailed design. Many times the subject matter expert can assess an
upper, nominal, and lower bound measure response by making three separate assessments
of the concept's eventual performance.
Another source of uncertainty has to do with the subjective nature of the value schemes
elicited from the stakeholders. Considering that a stakeholder’s value scheme is often tied
to their forecast of future scenarios and acknowledging that the stakeholder is probably not
clairvoyant leads to the conclusion that thinking people can reasonably disagree about
things like priority weightings. One of the common pitfalls of systems engineering trade-
off analyses is to collect value scheme information from a small set of like-minded
stakeholders and ignore the fact that value schemes likely vary across the full population
of stakeholders. The best practice that should be employed here is to collect value scheme
information from many different stakeholders and then run a battery of sensitivity analyses
to determine whether the findings hold in the presence of such uncertainty. The next
section of this chapter discusses some techniques for doing so.
Recall the sUAV case study and how the study sponsor asked the lead systems engineer to
show how variations in priority weightings would impact the results. Towards this end,
the lead systems engineer discussed the composition of the focus group he used to develop
the priority weightings with the study sponsor and asked him to suggest other perspectives
on how weightings for this trade should be set. The study sponsor provided the systems
engineering lead with a list of contacts that he suspects would offer a somewhat different
take on weights associated with this trade. Focus group number two was formed from this
list and the lead systems engineer developed Figure 6-19 to highlight the differences in the
two groups' priority weightings.
The lead systems engineer noticed the clear differences between the two groups' areas of
emphasis: group 1 on ISR data collection quality and group 2 on soldier mobility while
transporting the sUAV system. He realized that capturing this source of uncertainty is the
first step to assessing its impact on the overall decision. The next section of this chapter
describes some sensitivity analyses that can be used to gain this understanding, followed
by their application to the sUAV case study.
Decision analysis uses many forms of sensitivity analysis, including line diagrams, tornado
diagrams, and waterfall diagrams, and several uncertainty analyses, including Monte Carlo
simulation.
Many decision makers will want to understand how sensitive a particular recommendation
is to weightings and will ask questions regarding the degree to which a particular weighting
drives the overall result. One common technique is to sweep each measure's weighting
from absolute minimum to absolute maximum while holding the relative relationship
between the other measure weightings constant and noting changes to overall score. The
output of this type of sensitivity analysis is in the form of a line graph (Parnell et al. 2013).
An example of such a graph is provided in Figure 6-19. Note this particular example shows
how sweeping the weight associated with Objective 1.1.2 impacts performance value. The
graph in this example shows that the alternative with the highest performance value was
Alternative 2 for all cases where the priority weighting associated with Objective 1.1.2 was
somewhat low, with another alternative taking the lead as the weight of Objective 1.1.2 is
increased. Line graphs such as these show the impact of one particular uncertainty at a
time. For a broader view, consider the trade-space visualization in Figure 6-20. Once all
the uncertainties have been assessed, Monte Carlo simulations can be executed to identify
the uncertainties that impact the decision findings and the uncertainties that are
inconsequential. For example, Figure 6-21 shows that after considering all sources of
uncertainty, alternative 1 is less susceptible to changes in stakeholder value than
alternative 3. Note however that although the stakeholder value of alternative 3 is a bit
more volatile in the presence of uncertainty, its stakeholder value never falls below the
highest level of alternative 1's stakeholder value. This graph also indicates alternative 3
has the edge on long term viability, whereas development duration favors alternative 1. A
decision maker might still prefer alternative 3 over alternative 1 if the difference in life
cycle costs were deemed affordable.
The takeaway of this section may be that good decisions can often be made in the presence
of high uncertainty. Systems engineers should not let what is not known sabotage a
decision that is otherwise well supported.
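One way to implement such an uncertainty analysis is a Monte Carlo sweep over the priority weights, drawing candidate weight vectors from a Dirichlet distribution centered on the elicited weights and tallying how often each alternative ranks first. The sketch below is illustrative only: the alternative names, scores, weights, and concentration parameter are all hypothetical.

```python
import random

# Sketch: Monte Carlo over priority-weight uncertainty. Each trial draws a
# weight vector from a Dirichlet distribution centered on the elicited
# weights, then records which alternative has the highest additive value.
# All names, scores, and parameters below are hypothetical.

random.seed(0)

scores = {                      # marginal values (0-100) on three measures
    "Alt A": [90, 40, 50],
    "Alt B": [60, 70, 65],
    "Alt C": [30, 95, 80],
}
elicited = [0.5, 0.3, 0.2]      # nominal weights from the focus group
concentration = 50              # higher = tighter spread around the nominal weights

trials = 5000
wins = {name: 0 for name in scores}
for _ in range(trials):
    gammas = [random.gammavariate(concentration * w, 1.0) for w in elicited]
    total = sum(gammas)
    weights = [g / total for g in gammas]               # a Dirichlet sample
    value = {name: sum(w * v for w, v in zip(weights, s))
             for name, s in scores.items()}
    wins[max(value, key=value.get)] += 1

for name, count in sorted(wins.items()):
    print(f"{name} ranked first in {100 * count / trials:.1f}% of trials")
```

An alternative that ranks first across most sampled weight vectors is robust to the weighting disagreement between the two focus groups; one that wins only in a narrow region of weight space is not.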
Picking up the sUAV case study, the lead systems engineer applied two sensitivity analysis
techniques. As a first step in his attempt to get his arms around the degree of decision
volatility introduced by weighting uncertainty, he generated a set of line graphs, one graph
per measure. Figure 6-22 shows one of these line graphs: the line graph that shows changes
in functional performance value as the swing weight associated with the "avoid impeding
soldier sprint" objective is swept from zero to one. Notice that the top performing
alternative only changes twice throughout the entire sweep.
Figure 6-22. sUAV Performance Value Sensitivity to Changes in Swing Weight of "Avoid Impeding
Soldier Sprint" Objectives
Although the set of line graphs was interesting and shed some light on the degree of
volatility pertaining to this specific decision, the lead systems engineer feared that such
graphs would not directly address the question raised by the study sponsor. For this he
decided to generate a stakeholder value scatterplot with uncertainty. Figure 6-23 shows
this scatterplot. Notice how this graph clearly shows that Crow I maintains the highest
functional performance value across the assessed range of uncertainty.
[Figure 6-23. Stakeholder value scatterplot with uncertainty: functional performance value (0 to 100) versus life cycle cost ($B, 0 to 12) for the twelve sUAV alternatives.]
The study sponsor was thrilled when the lead systems engineer presented the visualization
of Figure 6-23 and the graph formed the focal point for many thoughtful negotiations
among the stakeholders.
One could be tempted to end the decision analysis here, highlight the alternative that has
the highest total value and claim success. Such a premature ending however, would not be
considered best practice. Mining the data generated for the first set of alternatives will
likely reveal opportunities to modify some subsystem design choices to claim untapped
value and reduce risk. Recall the cyclic decision analysis process map and the implied
feedback. Taking advantage of this feedback loop and using initial findings to generate
new and creative alternatives starts the process of transforming the decision process from
a single pass into an iterative search for value. The decision team should also
consider taking additional steps to spark focused creativity to overcome anchoring biases.
As Keeney warns,

Once a few alternatives are stated, they serve to anchor thinking about others. Assumptions
implicit in the identified alternatives are accepted, and the generation of new alternatives …
Truly creative or different alternatives remain hidden in another part of the mind,
unreachable by mere tweaking. Deep and persistent thought is required to jar them into …

One technique for sparking such thought is the alternative generation table (also called a
morphological box) (Parnell et al. 2013).
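Enumerating a morphological box is mechanical: each alternative is one combination of options across the design dimensions. The sketch below uses hypothetical sUAV design dimensions and options.

```python
from itertools import product

# Sketch: enumerate an alternative generation table (morphological box).
# Dimensions and options are hypothetical illustrations.

design_table = {
    "airframe": ["fixed-wing", "quad-rotor"],
    "sensor":   ["EO camera", "EO/IR camera"],
    "datalink": ["line-of-sight", "SATCOM"],
    "power":    ["battery", "hybrid"],
}

# Every combination of one option per dimension is a candidate alternative.
alternatives = [dict(zip(design_table, combo))
                for combo in product(*design_table.values())]

print(len(alternatives))   # 2 * 2 * 2 * 2 = 16 candidate concepts
print(alternatives[0])     # first combination: all first-listed options
```

In practice the enumerated set is filtered for infeasible combinations before scoring; the value of the table is that it surfaces combinations no one had anchored on.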
Within the sUAV example, the lead systems engineer worked with the pool of subject
matter experts and sUAV design engineers to explore ways to potentially reduce time
required for the development of Crow I from about eight years to five or six without
sacrificing significant performance value.
This is the point in the process where the decision team identifies key observations
regarding what stakeholders seem to want and what they must be willing to give up in order
to achieve it. It is here where the decision team can highlight the design decisions that most
influence shareholder and stakeholder value and which are inconsequential. In addition,
the important uncertainties and risks should also be identified. Observations regarding
combination effects of various design decisions are also important products of this process
step. Competing objectives that are driving the trade should be explicitly highlighted as
well.
Beyond the top level tradespace visualization products, the systems engineer / decision
analyst must be able to rapidly drill down to supporting rationale. The decision support
tool construct represented in Figure 6-24 below will allow the decision team to navigate
seamlessly from top level stakeholder value scatterplot all the way down to any structured
scoring sheet so that rationale for any given score is only a click away. Rapid access to
rationale associated with the derivation of the value functions or priority weightings is also
important.
The integration pattern of the decision support model, performance models, cost models,
and actors of the ISEDM process is provided in Figure 6-25. Notice the central role
entrusted to the systems engineer / decision analyst. As the single interface between the
data store of the decision support model, the technical subject matter experts, and the
stakeholders, the systems engineer / decision analyst is in the best position to present the
aggregated data in terms relevant to the stakeholders but then drill down into the data store
or even, to a limited degree, the lower level models that generated the data.
Figure 6-25. Integration Pattern of the Decision Support Model, Performance Models, Cost Models,
and Actors of the ISEDM Process.
Concluding the sUAV case study, the lead systems engineer presented the sUAV systems
engineering trade-off analysis to the study sponsor and supporting stakeholder senior
advisory group. He used Figure 6-23 to summarize the study findings and Figures 6-24 and
6-25 to provide an appreciation for the heritage of the underpinning data. He ended the talk
with a crisp summary of the decision at hand – if an eight year development time is
acceptable, then Crow I offers superior performance at a very attractive life cycle cost
point. If the capability is somehow urgent, Dove I can be fully developed within four years
but comes with about a 15% drop in performance value and about a 90% increase in life
cycle cost.
It is often helpful to describe the recommendation in the form of a clearly worded, actionable
task list to increase the likelihood of the decision analysis leading to some form of action,
thus delivering some tangible value to the sponsor. Reports are important for historical
traceability and future decisions. Take the time and effort to create a comprehensive, high
quality report detailing study findings and supporting rationale. Consider supplementing
static paper reports with interactive electronic versions of the decision support model.
As covered in Chapter 1, the U.S. Department of Defense (DoD) has recently revised the
trade-offs made between capability requirements and lifecycle costs early in the acquisition
process in order to ensure realistic program baselines are established such that associated
lifecycle costs of a contemplated system are affordable within future budgets.
Five qualitative research questions and associated quantitative hypotheses were developed.
These questions and hypotheses guided both phases of this two phase, exploratory
sequential and embedded mixed methods study and are summarized in a table first
introduced in Chapter 2 and re-introduced here for convenience as Table 7-1.
Table 7-1. Research Questions, Hypotheses, and Results

Question 1: What is the level of difficulty associated with systems engineering trade-off analyses within the U.S. Defense Acquisition System?
Hypothesis A: A significant percentage (defined as > 70%) of those involved with the defense acquisition process find the large number of variables involved with the exploration of emerging requirements and their combined relationship to conceptual materiel systems to be difficult.
Result: 81% ± 6.5% (@ 95% confidence)

Question 2: Can potential pitfalls be identified? What are the potential pitfalls associated with such analyses? How likely are the pitfalls to occur? How severe are the consequences of each pitfall?
Hypothesis B: There are many (defined as > 10) potential pitfalls associated with systems engineering tradeoff studies that can be characterized as a medium or high risk to systems engineering trade-off study quality.
Result: Low: 0; Medium: 14; High: 26

Question 3: … systems engineering trade-off analyses? Would an integrated systems engineering decision management process map help avoid potential pitfalls?
Hypothesis C: A significant percentage (defined as > 70%) of those involved with the U.S. defense acquisition process would find an integrated systems engineering decision management process helpful.
Result: 92% ± 4.5% (@ 95% confidence)

Question 4: … influence decision quality? How does the variety and quality of visualization techniques used to display the aggregated data influence decision quality?
Hypothesis D: A significant percentage (defined as > 70%) of those involved with the U.S. defense acquisition process would find the stakeholder value scatterplot to be a useful trade space visualization.
Result: 91% ± 4.8% (@ 95% confidence)

Question 5: How does the extent to which uncertainty is captured, and the way in which the impact of the uncertainty on the overall decision is visualized, impact decision quality?
Hypothesis E: A significant percentage (defined as > 70%) of operations researchers involved with the U.S. defense acquisition process would find the stakeholder value scatterplot with uncertainty to be a useful trade space visualization.
Result: 92% ± 4.6% (@ 95% confidence)
7.1 Conclusions
Hypothesis A held that a significant percentage of those involved with the U.S. defense
acquisition process find the large number of variables involved with systems engineering
trade-off analyses difficult. The data show that with the sample size of 135 one can be 95%
confident that 81% ± 6.5% of the true population (estimated to be 4,000 as discussed in
Chapter 3) would respond that their observations were at least somewhat similar to the
survey author's observations that acquisition professionals found the large number of
variables involved with the exploration of emerging requirements to be difficult.
Hypothesis B: There are many potential pitfalls associated with systems engineering
tradeoff studies that can be characterized as a medium or high risk to systems engineering
trade-off study quality.

Survey questions 12 – 29 asked subjects to assess the likelihood and consequence of
40 potential pitfalls associated with systems engineering trade-off analyses. The risk
of the 40 potential pitfalls was compiled by using the mode of the response distribution for
each question for each potential pitfall. Respondents found some of the 40 potential pitfalls
to pose a higher risk than others but found none of them to be low risk. The number of
subjects that responded to these questions regarding likelihood and consequence of the
potential pitfalls ranged from 124 to 132 respondents. Using 124 as a worst case sample
size, the population estimate of 4,000, and the approximation that 50% of the sample
selected the answer identified as the mode, one can be 95% confident that the results reflect
the perceptions of the true population within a margin of error of about ± 8.7%. In some
cases, this margin of error could be enough to cause the mode of the true population to
differ from the mode of the sample. However, upon examination of the response
distributions for each question, one sees that this margin of error will at most cause the
mode to change by only one category left or right on the likelihood or severity scale.
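The margins of error quoted in this section follow from the standard formula for the margin of error of a sample proportion, scaled by a finite population correction; the sketch below reproduces the ±6.5% and ±8.7% figures.

```python
import math

# Sketch: margin of error for a sample proportion with a finite population
# correction (FPC), at 95% confidence (z = 1.96).

def margin_of_error(p, n, N, z=1.96):
    se = math.sqrt(p * (1 - p) / n)        # standard error of the proportion
    fpc = math.sqrt((N - n) / (N - 1))     # finite population correction
    return z * se * fpc

# Population estimated at 4,000 (Chapter 3).
print(f"{margin_of_error(0.81, 135, 4000):.3f}")  # 0.065 -> the +/- 6.5% for Hypothesis A
print(f"{margin_of_error(0.50, 124, 4000):.3f}")  # 0.087 -> the +/- 8.7% worst case
```

Note that p = 0.5 maximizes p(1 - p), which is why it serves as the worst-case assumption when the mode's share of responses is unknown.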
Examination of Figure 4-3 reveals that such movement will not perturb the overarching
observation that some of the potential pitfalls were thought to be higher risk than others.

Hypothesis C: A significant percentage of those involved with the U.S. defense acquisition
process would find an integrated systems engineering decision management process helpful.

About 92% of the respondents to question eight indicated that an ISEDM process supported
by tools and techniques that can help model the relationship between physical design
decisions and the consequences of those decisions across system level cost, schedule, and
performance would be at least somewhat helpful. With the sample size of 133 one can be
95% confident that 92% ± 4.5% of the true population would respond this way.
Hypothesis D: A significant percentage of those involved with the U.S. defense acquisition
process would find the stakeholder value scatterplot to be a useful trade space visualization.

91% of the respondents perceived the process map of the ISEDM process tailored for
defense acquisition as having the potential to be at least somewhat helpful. With the
sample size of 132 one can be 95% confident that 91% ± 4.8% of the true population
would respond this way.

Hypothesis E: A significant percentage of operations researchers involved with the U.S.
defense acquisition process would find the stakeholder value scatterplot with uncertainty
to be a useful trade space visualization.

92% of the respondents perceived the stakeholder value scatterplot with uncertainty as
having the potential to be at least somewhat helpful. With the sample size of 131 one can
be 95% confident that 92% ± 4.6% of the true population (estimated to be 4,000 as
discussed in Chapter 3) would respond that the value scatterplot with uncertainty would be
helpful.
The integrated SEDA process map, the stakeholder value scatterplot, and the stakeholder
value scatterplot with uncertainty combine decision analysis best practices with systems
engineering activities to create a baseline from which future papers can explore possible
innovations to further enhance tradeoff study quality. The relationship between
requirements, the design choices made to address each requirement, and the system level
consequences of the sum of design choices across the full set of stakeholder objectives can
be modeled in terms of performance, cost, and schedule. Through data visualization
techniques, decision makers can quickly develop an understanding of the trade space and
identify recommendations that are robust in the presence of uncertainty. Figure 7-1
summarizes the publications associated with this research.
Figure 7-1. Summary of Related Publications

Status / Date | Venue | Topic | Title
Accepted May 2014 | INCOSE SE Handbook, Chapter 5.3 | Decision Management | Chapter 5.3, Decision Management Process
Presented July 2014 | INCOSE 2014 Proceedings | Integrated Systems Engineering Decision Management Process | Paper #70, Systems Engineering Tradeoff Study Process Framework
Presented July 2014 | INCOSE 2014 Proceedings | Potential Pitfalls of Trade-off Studies | Paper #77, Tradeoff Study Cascading Mistakes of Omission and Commission
Accepted June 2015 | Guide to the Systems Engineering Body of Knowledge (SEBoK), version 1.4 | Integrated Systems Engineering Decision Management Process | Decision Management
Accepted for Publication November 2015 | Journal of Systems Engineering (SYS-15-003, publication date TBD) | Defense Acquisition Outcomes & Process Improvements | A Systems Engineering Perspective of the Revised Defense Acquisition System
Submitted, In 3rd Peer Review | Textbook, Chapter 5 | Integrated Systems Engineering Decision Management Process | Decision Management
Submitted 6 NOV 2015 | IEEE Systems | Survey Results | Perceptions of Defense Acquisition Trade-Off Study Quality
An opportunity for additional research became apparent while conducting this research.
Computerized force on force combat models are important tools for estimating the degree
to which a contemplated system contributes to the effectiveness of the combined arms
team, as these models incorporate the interactions of the system of interest with other
elements of the friendly force, friendly tactics, enemy systems, enemy tactics, terrain, and
weather. Ideally, such models would be used to assess marginal impact on combat
effectiveness while searching for an optimized solution across stakeholder value space.
These dynamic operational models however are often large and complex, requiring a
significant investment of time and human resources to design and execute a simulated
battle. Consequently, running a large number of alternatives through a battery of synthetic
fights to assess marginal impact is often impractical. Multiple objective value models, on
the other hand, are well suited for efficient exploration of a large trade space but rely on
a set of priority weightings and value curves that are usually derived from interactions with
stakeholders. In practice, many complex decisions have numerous stakeholders and often
these stakeholders have divergent views regarding priorities and values. Future research
will investigate a methodology for eliciting priority weightings and value functions through
thoughtfully designed combat simulations and show how these objectively derived
priorities and values, when used as inputs to a well-structured multiple objective value
model, can help facilitate negotiations between stakeholders with conflicting views and
move them toward consensus.
This appendix is dedicated to the clear presentation of quantitative data collected via the
survey instrument. Interpretation of the results was provided in the discussion section of
the main body through graphs and descriptive statistics of the quantitative data.
Question 1
Your participation is voluntary and you may refuse to participate or withdraw at any time without penalty or loss of benefits to which you are otherwise
entitled. You may also refuse to answer any question. While you will not directly benefit from participation, your participation may help investigators
develop a decision support process tailored to the needs of defense acquisition professionals seeking to understand the range of potential system level
cost, schedule, and performance consequences of a set of contemplated requirements. Subject’s name will not be collected. Subject’s identity will
remain anonymous. There are no foreseeable risks associated with participation in proposed survey. Participation in this project is voluntary and the
only alternative to this project is non-participation. The results of this study may be published in professional and/or scientific journals. It may also be
used for educational purposes or for professional presentations. However, no individual subject will be identified. How would you like to proceed?
Consent granted, I wish to proceed with the survey: 81
Consent denied, I wish to exit the survey: 3
Question 2
With which of the following discipline descriptions do you most closely identify?
Exclusively Systems Engineering: 44
Primarily Systems Engineering w/ some Operations Research: 8
Primarily Operations Research w/ some Systems Engineering: 45
Equal Parts Systems Engineering and Operations Research: 16
Exclusively Operations Research: 34
Other: 22
Question 3
About how many years of experience do you have in military operations research, defense systems engineering, or general defense acquisition?
<5: 20
5 - 9: 20
10 - 14: 26
15 - 19: 17
20 - 24: 13
25 - 29: 29
30 - 34: 23
35 - 39: 11
40 - 44: 6
>44: 5
Question 4
Success rates of defense acquisition programs are lower than desired. Many reports point to an open-loop capability requirements writing process as a
potential root cause of some undesired acquisition outcomes. The Department of Defense has recently revised the Defense Acquisition System to
include a move to a closed-loop capability requirements writing process informed by rigorous assessments of a broad range of system level
alternatives across a thorough set of stakeholder objectives to include life-cycle costs, schedule, and performance. This observation was made while
working with several acquisition projects. In order to help interpret this observation and perhaps generalize results to the population of all early
defense acquisition activity, please indicate the degree to which this observation reflects your own experience.
Extremely Similar: 23
Very Similar: 54
Somewhat Similar: 44
Somewhat Different: 7
Very Different: 2
Extremely Different: 3
No Basis to Assess: 7
Question 5
In your experience with Defense Acquisition, how often are the requirements within Capability Development Documents (CDDs) crafted with the
benefit of feedback from the cost analysis and engineering community regarding system level consequences across each of the following elements of
stakeholder value? (Life Cycle Costs, Development Schedule, Performance)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Life Cycle Costs: 13 / 18 / 35 / 27 / 25 / 8 / 8 / 8
Development Schedule: 5 / 13 / 22 / 44 / 24 / 10 / 13 / 10
Performance: 6 / 7 / 11 / 32 / 39 / 20 / 18 / 9
Question 6
How would you assess the impact on the defense acquisition outcome given requirements were crafted without a full understanding of system level
consequences in the following elements of stakeholder value? (Life Cycle Costs, Development Schedule, Performance)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Life Cycle Costs: 0 / 1 / 3 / 14 / 46 / 63 / 12 / 3
Development Schedule: 0 / 0 / 2 / 24 / 50 / 51 / 11 / 3
Performance: 0 / 0 / 8 / 17 / 50 / 51 / 12 / 4
Question 7
Defense acquisition professionals are eager to comply with the new defense acquisition instructions but the large number of variables involved with
the exploration of emerging requirements and their combined relationship to notional materiel systems causes compliance with the new instructions
to be difficult. Consequently, requests for tools and techniques that can help model the relationship between physical design decisions and
consequences of those decisions at the system level as measured across all elements of stakeholder value are growing more frequent and intense.
This observation was made while working with several acquisition projects. In order to help interpret this observation and perhaps generalize results
to the population of all early defense acquisition activity, please indicate the degree to which this observation reflects your own experience.
Extremely Similar: 19
Very Similar: 51
Somewhat Similar: 40
Somewhat Different: 11
Very Different: 8
Extremely Different: 0
No Basis to Assess: 7
Question 8
A literature review reveals that Multiple Objective Decision Analysis (MODA) techniques can be applied in a way that provides a conceptual link
between the voice of the customer, the voice of the engineer, the voice of the cost analyst, etc. The literature also reveals, however, that a wide
variety of MODA based models have been created and applied to product design problems over the past decade with varying degrees of success.
This research seeks to develop an integrated Systems Engineering / Multiple Objectives Decision Analysis (SE/MODA) Process tailored to support
those seeking to understand the range of potential system level cost, schedule, and performance consequences of a set of contemplated requirements.
The research aims to identify potential pitfalls and best practices associated with the execution of a trade study process, introduce innovations as
needed, and apply all findings to the integrated SE/MODA process. The research focus described here was identified while working with several
acquisition projects. In order to help interpret the potential usefulness of the research product to the population of SE and OR military professionals
if successful, please consider the following: If faced with the task of facilitating fully informed requirements writing, how useful would it be to have
an integrated SE/MODA process supported by tools and techniques that can help model the relationship between physical design decisions and the
consequences of those decisions across system level cost, schedule, and performance?
Extremely Helpful: 47
Very Helpful: 48
Somewhat Helpful: 27
Benign: 4
Somewhat Harmful: 2
Very Harmful: 1
Extremely Harmful: 1
No Basis to Assess: 3
Question 9
The integrated SE/MODA process illustrated in Figure A1 combines the structured, holistic view fundamental to the systems engineering discipline
with the analytical rigor of the operations research discipline. The white text within the outer green ring identifies elements of a systems engineering
process while the ten blue arrows represent the ten steps of the Decision Management Process. Interaction between the systems engineering process
and the Decision Management Process is represented by the small, dotted green or blue arrows. The circular shape of the process map is meant to
convey the notion of an iterative process with significant interaction between the process steps. The feedback loops seek to capture new information
regarding the decision task at any point in the decision process and make appropriate adjustments. The integrated SE/MODA process tailored for
informing requirements described here was drafted while working with several acquisition projects. In order to help interpret the potential usefulness
of the research product to the population of SE and OR military professionals, please consider the following: If faced with the task of facilitating
fully informed requirements writing, how useful would it be to have a process map such as the one illustrated in Figure A1?
Extremely Helpful: 26
Very Helpful: 50
Somewhat Helpful: 44
Benign: 7
Somewhat Harmful: 1
Very Harmful: 1
Extremely Harmful: 0
No Basis to Assess: 3
Question 10
The stakeholder value scatterplot shows in one chart how all system level alternatives respond with regard to lifecycle costs, performance,
development schedule, and long term viability. Seeing the entire picture at once is potentially important: finding a solution that satisfies one
or two of the stakeholder values is trivial, but the real-world challenge of selecting the alternative that best balances the competing trades is often
difficult. In this visualization, each system alternative is represented by a scatterplot marker. An alternative's lifecycle cost and performance value are
indicated by a marker's x and y position respectively. An alternative’s development schedule is indicated by the color of the marker while the long
term viability for a particular alternative is indicated by the shape of the marker, as described in the legend of Figure 2. How useful
do you think a scatterplot like the one illustrated in Figure 2 could potentially be to understanding system level cost, schedule, and performance
consequences of a set of contemplated requirements?
Extremely Helpful: 34
Very Helpful: 43
Somewhat Helpful: 43
Benign: 3
Somewhat Harmful: 4
Very Harmful: 3
Extremely Harmful: 1
No Basis to Assess: 1
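The marker encodings described in Question 10 can be sketched as follows. This is a minimal illustration assuming matplotlib is available; the alternatives, costs, values, schedules, and viability ratings are invented for the example and are not data from the study.

```python
# Illustrative stakeholder value scatterplot: x = lifecycle cost,
# y = performance value, color = development schedule, shape = viability.
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

alternatives = [
    # (name, lifecycle cost $M, performance value [0-1], schedule, viability)
    ("Alt A", 120, 0.55, "short", "high"),
    ("Alt B", 180, 0.72, "medium", "high"),
    ("Alt C", 240, 0.90, "long", "low"),
    ("Alt D", 150, 0.60, "medium", "low"),
]
schedule_color = {"short": "green", "medium": "gold", "long": "red"}
viability_marker = {"high": "o", "low": "^"}  # shape encodes long-term viability

fig, ax = plt.subplots()
for name, cost, value, sched, viab in alternatives:
    ax.scatter(cost, value, c=schedule_color[sched],
               marker=viability_marker[viab], s=80)
    ax.annotate(name, (cost, value), textcoords="offset points", xytext=(5, 5))
ax.set_xlabel("Lifecycle cost ($M)")
ax.set_ylabel("Performance value")
ax.set_title("Stakeholder value scatterplot (illustrative)")
fig.savefig("tradespace.png")
```

The point of the single chart is that dominance is visible at a glance: an alternative up and to the left, with a favorable color and shape, balances all four stakeholder values at once.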
Question 11
Figure 3 shows an example of how one might visualize the degree to which uncertainty impacts performance value. Uncertainty present in a defense
acquisition trade study often stems from the fidelity of single dimensional consequence estimates and from the understanding that stakeholders may
not all share a common view of weightings associated with each single dimensional consequence used as part of the aggregating function. A
visualization like the one in Figure 3 shows decision makers the degree of volatility associated with each alternative's performance value if
weightings change and/or as uncertainty surrounding single dimensional objective measures are incorporated. How useful do you think a scatterplot
like the one illustrated in Figure 3 could potentially be to understanding system level cost, schedule, and performance consequences of a set of
contemplated requirements?
Extremely Helpful: 35
Very Helpful: 58
Somewhat Helpful: 28
Benign: 4
Somewhat Harmful: 3
Very Harmful: 1
Extremely Harmful: 0
No Basis to Assess: 2
Table A5. Perceived Frequency and Impact of Potential Pitfalls While Framing a Decision
Question 12
As the first step in a trade-off study, analysts seek to frame the decision opportunity at hand by capturing a description of the system baseline as well
as a notion for how the envisioned system will be used along with system boundaries and anticipated interfaces. Decision context includes such
details as the timeframe allotted for the decisions, an explicit list of decision makers and stakeholders, available resources, and expectations regarding
the type of action to be taken as a result of the decision at hand as well as decisions anticipated in the future. In an effort to better understand
potential pitfalls associated with the execution of a trade study, please indicate how often you have observed the following potential pitfalls while
framing such a study. (Trade Study Structured to Answer Wrong Question,
Trade Study Structured to Collect Information Inconsequential to Decision, Insufficient Access to Decision Maker, Insufficient Access to End User,
Insufficient Access to Affordability Analysis, Viewpoints of Various Stakeholders Not Collected)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Trade Study Structured to Answer Wrong Question: 1 / 5 / 15 / 57 / 24 / 17 / 6 / 7
Trade Study Structured to Collect Information Inconsequential to Decision: 2 / 2 / 13 / 42 / 40 / 24 / 4 / 5
Insufficient Access to Decision Maker: 0 / 1 / 15 / 27 / 31 / 38 / 16 / 3
Insufficient Access to End User: 1 / 4 / 16 / 21 / 37 / 29 / 20 / 3
Insufficient Access to Affordability Analysis: 1 / 2 / 13 / 26 / 36 / 31 / 14 / 5
Viewpoints of Various Stakeholders Not Collected: 0 / 2 / 10 / 30 / 41 / 30 / 15 / 4
Question 13
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (Trade Study Structured to Answer
Wrong Question, Trade Study Structured to Collect Information Inconsequential to Decision, Insufficient Access to Decision Maker, Insufficient
Access to End User, Insufficient Access to Affordability Analysis, Viewpoints of Various Stakeholders Not Collected)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Trade Study Structured to Answer Wrong Question: 0 / 1 / 1 / 12 / 26 / 39 / 50 / 3
Trade Study Structured to Collect Information Inconsequential to Decision: - / - / - / - / - / - / - / -
Insufficient Access to Decision Maker: 0 / 1 / 1 / 16 / 53 / 41 / 18 / 2
Insufficient Access to End User: 0 / 0 / 2 / 15 / 36 / 55 / 22 / 2
Insufficient Access to Affordability Analysis: 0 / 1 / 4 / 25 / 56 / 34 / 9 / 3
Viewpoints of Various Stakeholders Not Collected: 0 / 1 / 3 / 20 / 47 / 47 / 12 / 2
Table A6. Perceived Frequency and Impact of Potential Pitfalls While Developing Objectives
Question 14
Fundamental objectives are "the ends objectives used to describe the consequences that essentially define the basic reasons for being interested in the
decision" (Edwards, W., Miles, R.F. & von Winterfeldt, D., Advances in Decision Analysis: From Foundations to Applications, Cambridge
University Press, New York, NY, 2007, pp. 110, 113). In an effort to better understand potential pitfalls associated with the execution of a trade
study, please indicate how often you have observed the following potential pitfalls while developing objectives. (Objectives Set Contain Means-To-
An-End Objectives, Structured Techniques to Identify Objectives Not Used, Objectives Are Not Logically Organized In A Hierarchy, Objectives Set
Is Incomplete, Objectives Set Contains Objectives That Are Preferentially Dependent, Objectives Set Contains Redundant Objectives)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Objectives Set Contain Means-To-An-End Objectives: 1 / 2 / 8 / 46 / 31 / 21 / 6 / 15
Structured Techniques to Identify Objectives Not Used: 0 / 1 / 11 / 29 / 52 / 26 / 3 / 8
Objectives Are Not Logically Organized In A Hierarchy: 2 / 2 / 2 / 39 / 36 / 37 / 7 / 5
Objectives Set Is Incomplete: 1 / 4 / 8 / 31 / 38 / 32 / 12 / 4
Objectives Set Contains Objectives That Are Preferentially Dependent: 0 / 4 / 5 / 30 / 37 / 39 / 4 / 11
Objectives Set Contains Redundant Objectives: 1 / 8 / 12 / 47 / 32 / 20 / 5 / 5
Question 15
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (Objectives Set Contain Means-To-
An-End Objectives, Structured Techniques to Identify Objectives Not Used, Objectives Are Not Logically Organized In A Hierarchy, Objectives Set
Is Incomplete, Objectives Set Contains Objectives That Are Preferentially Dependent, Objectives Set Contains Redundant Objectives)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Objectives Set Contain Means-To-An-End Objectives: 0 / 1 / 4 / 38 / 38 / 25 / 8 / 16
Structured Techniques to Identify Objectives Not Used: 0 / 5 / 4 / 36 / 42 / 31 / 4 / 9
Objectives Are Not Logically Organized In A Hierarchy: 1 / 3 / 10 / 46 / 34 / 28 / 1 / 7
Objectives Set Is Incomplete: 0 / 0 / 1 / 17 / 54 / 37 / 16 / 6
Objectives Set Contains Objectives That Are Preferentially Dependent: 1 / 0 / 9 / 32 / 49 / 23 / 6 / 10
Objectives Set Contains Redundant Objectives: 1 / 6 / 30 / 39 / 27 / 19 / 1 / 8
Table A7. Perceived Frequency and Impact of Potential Pitfalls While Developing Measures
Question 16
For each fundamental objective, a measure (also known as an attribute, criterion, or metric) must be established so that alternatives that more fully
satisfy the objective receive a better score on the measure than those alternatives that satisfy the objective to a lesser degree. In an effort to better
understand potential pitfalls associated with the execution of a trade study, please indicate how often you have observed the following potential
pitfalls while developing measures. (Measure Is Ambiguous, Measure Does Not Cover the Range of Possible Consequences, Measure Does Not
Directly Describe Consequence of Interest, Measure Is Non-Operational In That Information To Describe Consequences Cannot Be Obtained,
Measure Is Not Easily Understood, Measure Does Not Incorporate Output of Modeling, Simulation, or Test Tools Even Though Such Output Could
Be Generated, Measure Uses Ordinal Scale)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Measure Is Ambiguous: 2 / 2 / 11 / 36 / 41 / 28 / 4 / 4
Measure Does Not Cover the Range of Possible Consequences: 1 / 2 / 11 / 31 / 47 / 26 / 6 / 5
Measure Does Not Directly Describe Consequence of Interest: 0 / 3 / 9 / 34 / 50 / 22 / 3 / 8
Measure Is Non-Operational In That Information To Describe Consequences Cannot Be Obtained: 1 / 3 / 10 / 44 / 34 / 21 / 3 / 13
Measure Is Not Easily Understood: 2 / 5 / 10 / 33 / 44 / 29 / 2 / 4
Measure Does Not Incorporate Output of Modeling, Simulation, or Test Tools Even Though Such Output Could Be Generated: 0 / 4 / 15 / 29 / 37 / 26 / 8 / 9
Measure Uses Ordinal Scale: 4 / 4 / 16 / 30 / 34 / 16 / 3 / 20
Question 17
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (Measure Is Ambiguous, Measure
Does Not Cover the Range of Possible Consequences, Measure Does Not Directly Describe Consequence of Interest, Measure Is Non-Operational In
That Information To Describe Consequences Cannot Be Obtained, Measure Is Not Easily Understood, Measure Does Not Incorporate Output of
Modeling, Simulation, or Test Tools Even Though Such Output Could Be Generated, Measure Uses Ordinal Scale)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Measure Is Ambiguous: 0 / 0 / 2 / 19 / 58 / 33 / 13 / 5
Measure Does Not Cover the Range of Possible Consequences: 0 / 0 / 4 / 19 / 46 / 47 / 8 / 5
Measure Does Not Directly Describe Consequence of Interest: 0 / 0 / 4 / 25 / 37 / 44 / 12 / 7
Measure Is Non-Operational In That Information To Describe Consequences Cannot Be Obtained: 0 / 0 / 4 / 24 / 40 / 39 / 11 / 12
Measure Is Not Easily Understood: 0 / 0 / 6 / 16 / 52 / 38 / 13 / 5
Measure Does Not Incorporate Output of Modeling, Simulation, or Test Tools Even Though Such Output Could Be Generated: 0 / 1 / 9 / 30 / 42 / 32 / 9 / 7
Measure Uses Ordinal Scale: 0 / 1 / 23 / 36 / 31 / 11 / 6 / 20
Question 18
All decisions involve elements of subjectivity. The distinctive feature of a formal decision support process is that these subjective elements are
rigorously documented so that the consequences can be identified and assessed. Towards that end, a defining feature of Multiple Objective Decision
Analysis (MODA) is the transformation from measure space to value space that enables mathematical representation of a composite value score
across multiple measures. This transformation is performed through the use of a value function – a description of returns to scale on the measure. In
an effort to better understand potential pitfalls associated with the execution of a trade study, please indicate how often you have observed the
following potential pitfalls while synthesizing data. (Value Functions Not Used, Value Functions Used But Are Non-Normalized, Weights
Determined By Reflecting On Most Recent Experience Only (Recency Effect & Availability Bias), Weights Determined By Importance Only Rather
Than Importance & Differentiation, Weights Collected From A Small Set of Like-Minded Stakeholders (Representativeness Bias))
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Value Functions Not Used: 3 / 2 / 12 / 35 / 28 / 22 / 13 / 12
Value Functions Used But Are Non-Normalized: 3 / 2 / 15 / 35 / 27 / 17 / 6 / 22
Weights Determined By Reflecting On Most Recent Experience Only (Recency Effect & Availability Bias): 1 / 2 / 6 / 29 / 42 / 23 / 10 / 14
Weights Determined By Importance Only Rather Than Importance & Differentiation: 1 / 1 / 5 / 23 / 40 / 20 / 20 / 17
Weights Collected From A Small Set of Like-Minded Stakeholders (Representativeness Bias): 1 / 1 / 4 / 15 / 42 / 38 / 15 / 10
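The transformation from measure space to value space described in Question 18 can be sketched with a normalized piecewise-linear value function. This is a minimal illustration of the general MODA technique, not the study's elicitation method; the range measure and its breakpoints are invented assumptions.

```python
# Minimal sketch of a single-dimensional value function: a piecewise-linear
# mapping from a raw measure score to a normalized value in [0, 1].
def piecewise_linear_value(x, breakpoints):
    """breakpoints: (measure_level, value) pairs sorted by measure_level,
    spanning value 0.0 (worst considered) to 1.0 (best considered)."""
    xs, vs = zip(*breakpoints)
    if x <= xs[0]:
        return vs[0]
    if x >= xs[-1]:
        return vs[-1]
    # Interpolate linearly within the segment that contains x.
    for (x0, v0), (x1, v1) in zip(breakpoints, breakpoints[1:]):
        if x0 <= x <= x1:
            return v0 + (v1 - v0) * (x - x0) / (x1 - x0)

# Hypothetical measure: effective range in km, with diminishing returns
# above 30 km (value climbs quickly to 0.8, then flattens toward 1.0).
range_value_curve = [(10, 0.0), (30, 0.8), (50, 1.0)]
print(piecewise_linear_value(20, range_value_curve))  # halfway up the steep segment: 0.4
```

Normalizing every value function to the same [0, 1] scale is what makes the later weighted aggregation meaningful, which is why the "non-normalized value functions" pitfall above matters.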
Question 19
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (Value Functions Not Used, Value
Functions Used But Are Non-Normalized, Weights Determined By Reflecting On Most Recent Experience Only (Recency Effect & Availability
Bias), Weights Determined By Importance Only Rather Than Importance & Differentiation, Weights Collected From A Small Set of Like-Minded
Stakeholders (Representativeness Bias))
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Value Functions Not Used: 0 / 1 / 3 / 26 / 45 / 33 / 8 / 10
Value Functions Used But Are Non-Normalized: 0 / 1 / 5 / 36 / 44 / 19 / 5 / 16
Weights Determined By Reflecting On Most Recent Experience Only (Recency Effect & Availability Bias): 0 / 1 / 5 / 28 / 47 / 30 / 4 / 11
Weights Determined By Importance Only Rather Than Importance & Differentiation: 0 / 2 / 10 / 28 / 42 / 29 / 3 / 12
Weights Collected From A Small Set of Like-Minded Stakeholders (Representativeness Bias): 0 / 1 / 1 / 16 / 37 / 46 / 15 / 10
Table A9. Perceived Frequency and Impact of Potential Pitfalls While Generating Alternatives
Question 20
For many trade studies, the alternatives will be systems composed of many interrelated subsystems. It is important to establish a meaningful product
structure for the system of interest and to apply this product structure consistently throughout the decision analysis effort in order to aid effectiveness
and efficiency of communications about alternatives. The product structure should be a useful decomposition of the physical elements of the system
of interest. In an effort to better understand potential pitfalls associated with the execution of a trade study, please indicate how often you have
observed the following potential pitfalls while generating alternatives. (Alternatives Not Well Defined And Open To Various Interpretations,
Structured Techniques For Creative Alternative Generation Not Used, Very Limited Set of Alternatives Considered)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Alternatives Not Well Defined And Open To Various Interpretations: 1 / 3 / 14 / 37 / 42 / 18 / 11 / 2
Structured Techniques For Creative Alternative Generation Not Used: 0 / 2 / 6 / 25 / 48 / 25 / 14 / 7
Very Limited Set of Alternatives Considered: 2 / 1 / 4 / 24 / 49 / 32 / 15 / 1
Question 21
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (Alternatives Not Well Defined
And Open To Various Interpretations, Structured Techniques For Creative Alternative Generation Not Used, Very Limited Set of Alternatives
Considered)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Alternatives Not Well Defined And Open To Various Interpretations: 0 / 1 / 2 / 25 / 41 / 40 / 15 / 3
Structured Techniques For Creative Alternative Generation Not Used: 0 / 2 / 11 / 40 / 36 / 30 / 2 / 6
Very Limited Set of Alternatives Considered: 0 / 0 / 5 / 20 / 48 / 43 / 9 / 2
Table A10. Perceived Frequency and Impact of Potential Pitfalls While Assessing Alternatives
Question 22
With objectives and measures established and alternatives identified and defined, the decision team engages subject matter experts, ideally equipped
with operational data, test data, models, simulation and expert knowledge. In an effort to better understand potential pitfalls associated with the
execution of a trade study, please indicate how often you have observed the following potential pitfalls while assessing alternatives. (Single
Dimensional Scores Are Not Credible - Assessments Formed By Non-Experts, Single Dimensional Scores Are Not Credible - Assessments Are
Conducted Inconsistently Across Alternatives)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Single Dimensional Scores Are Not Credible - Assessments Formed By Non-Experts: 1 / 4 / 15 / 41 / 31 / 18 / 3 / 13
Single Dimensional Scores Are Not Credible - Assessments Are Conducted Inconsistently Across Alternatives: 2 / 3 / 15 / 35 / 31 / 23 / 3 / 13
Question 23
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (Single Dimensional Scores Are Not
Credible - Assessments Formed By Non-Experts, Single Dimensional Scores Are Not Credible - Assessments Are Conducted Inconsistently Across
Alternatives)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Single Dimensional Scores Are Not Credible - Assessments Formed By Non-Experts: 0 / 0 / 2 / 15 / 48 / 36 / 12 / 12
Single Dimensional Scores Are Not Credible - Assessments Are Conducted Inconsistently Across Alternatives: 0 / 0 / 2 / 17 / 42 / 42 / 10 / 12
Table A11. Perceived Frequency and Impact of Potential Pitfalls While Synthesizing Results
Question 24
At this point in the process the decision team has generated a large amount of data. Synthesizing data allows the results to be displayed in a way that
facilitates understanding. In an effort to better understand potential pitfalls associated with the execution of a trade study, please indicate how often
you have observed the following potential pitfalls while synthesizing data. (Individual Analyses Presented Without Synthesis - System Level
Trade-offs Are Not Communicated, Single Dimensional Scores Not Aggregated Logically)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Individual Analyses Presented Without Synthesis - System Level Trade-offs Are Not Communicated: 1 / 2 / 13 / 33 / 40 / 25 / 6 / 7
Single Dimensional Scores Not Aggregated Logically: 1 / 3 / 18 / 31 / 32 / 18 / 8 / 14
Question 25
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (Individual Analyses Presented
Without Synthesis - System Level Trade-offs Are Not Communicated, Single Dimensional Scores Not Aggregated Logically)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Individual Analyses Presented Without Synthesis - System Level Trade-offs Are Not Communicated: 0 / 1 / 2 / 24 / 40 / 43 / 8 / 8
Single Dimensional Scores Not Aggregated Logically: 0 / 1 / 3 / 33 / 34 / 37 / 3 / 13
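The "logical aggregation" of single-dimensional scores discussed in Questions 24 and 25 is typically done with an additive value model: a weighted sum of normalized values. The sketch below illustrates the general technique; the measures, scores, and weights are invented for the example and are not the study's model.

```python
# Sketch of an additive value model: overall value is the weighted sum of
# single-dimensional value scores, with weights normalized to sum to 1 so
# the result stays on the same [0, 1] value scale. Numbers are illustrative.
def additive_value(scores, weights):
    total_weight = sum(weights.values())
    return sum(weights[m] / total_weight * scores[m] for m in scores)

# Hypothetical alternative scored on three normalized measures.
scores = {"range": 0.4, "accuracy": 0.9, "reliability": 0.6}
weights = {"range": 30, "accuracy": 50, "reliability": 20}  # need not be pre-normalized

overall = additive_value(scores, weights)
print(round(overall, 3))  # 0.3*0.4 + 0.5*0.9 + 0.2*0.6 = 0.69
```

Presenting one such composite score per alternative, alongside the underlying single-dimensional scores, is what lets system-level trade-offs be communicated rather than left implicit.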
Table A12. Perceived Frequency and Impact of Potential Pitfalls While Identifying & Assessing
Uncertainty
Question 26
Uncertainty present in a defense acquisition trade study often stems from the fidelity of single dimensional consequence estimates and from the
understanding that stakeholders may not all share a common view of weightings associated with each single dimensional consequence used as part of
the aggregating function. It is important to capture and discuss these uncertainties and assess impact to overall trade-space. Sensitivity analysis allows
decision makers to see how the value of each alternative moves as weightings change and/or as uncertainty surrounding single dimensional objective
measure is incorporated. This allows the decision analyst to identify the uncertainties that impact the decision findings and the uncertainties that are
inconsequential to decision findings. In an effort to better understand potential pitfalls associated with the execution of a trade study, please indicate
how often you have observed the following potential pitfalls while incorporating uncertainty. (Uncertainty Surrounding Expected Value Not Measured,
Differing Value Schemes Among Stakeholders Not Acknowledged, Uncertainty's Impact On Decision Not Assessed Or Misunderstood, No
Relationship Between Risk Assessment & Trade-off Studies)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
Uncertainty Surrounding Expected Value Not Measured: 2 / 3 / 7 / 13 / 42 / 29 / 22 / 7
Differing Value Schemes Among Stakeholders Not Acknowledged: 0 / 1 / 6 / 19 / 47 / 29 / 17 / 6
Uncertainty's Impact On Decision Not Assessed Or Misunderstood: 1 / 2 / 5 / 13 / 41 / 37 / 24 / 2
No Relationship Between Risk Assessment & Trade-off Studies: 0 / 3 / 13 / 18 / 45 / 26 / 16 / 3
Question 27
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (Uncertainty Surrounding Expected
Value Not Measured, Differing Value Schemes Among Stakeholders Not Acknowledged, Uncertainty's Impact On Decision Not Assessed Or
Misunderstood, No Relationship Between Risk Assessment & Trade-off Studies)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
Uncertainty Surrounding Expected Value Not Measured: 0 / 0 / 3 / 20 / 50 / 37 / 9 / 6
Differing Value Schemes Among Stakeholders Not Acknowledged: 0 / 0 / 4 / 31 / 45 / 34 / 7 / 4
Uncertainty's Impact On Decision Not Assessed Or Misunderstood: 0 / 0 / 2 / 21 / 44 / 40 / 16 / 2
No Relationship Between Risk Assessment & Trade-off Studies: 0 / 0 / 2 / 20 / 39 / 51 / 10 / 3
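The weight sensitivity analysis described in Question 26 can be sketched by sweeping one measure's weight from 0 to 1, renormalizing the remaining weights proportionally, and watching whether the preferred alternative changes. The two alternatives and their scores below are invented for illustration.

```python
# Sketch of one-way weight sensitivity analysis: vary one measure's weight,
# scale the remaining weights so all weights still sum to 1, and recompute
# each alternative's overall additive value at every swept point.
def overall_value(scores, weights):
    total = sum(weights.values())
    return sum(weights[m] / total * scores[m] for m in scores)

def sweep_weight(alternatives, weights, measure, steps=5):
    """Return {alternative: [overall value at each swept weight of `measure`]}."""
    others = {m: w for m, w in weights.items() if m != measure}
    other_total = sum(others.values())
    results = {alt: [] for alt in alternatives}
    for i in range(steps + 1):
        w = i / steps  # swept weight assigned to `measure`
        trial = {m: (1 - w) * wt / other_total for m, wt in others.items()}
        trial[measure] = w
        for alt, scores in alternatives.items():
            results[alt].append(round(overall_value(scores, trial), 3))
    return results

# Two hypothetical alternatives with opposing strengths; as the weight on
# performance grows, the preferred alternative flips from A to B.
alts = {"Alt A": {"cost": 0.9, "performance": 0.3},
        "Alt B": {"cost": 0.4, "performance": 0.8}}
print(sweep_weight(alts, {"cost": 0.5, "performance": 0.5}, "performance"))
```

Sweeps whose value lines never cross identify the uncertainties that are inconsequential to the decision; crossings mark the weightings at which the decision finding would change.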
Question 28
After mining the data generated for the first set of alternatives for opportunities to modify some subsystem design choices to claim untapped value
and reduce risk, the decision team identifies key observations regarding what stakeholders seem to want and what they must be willing to give up in
order to achieve it. In an effort to better understand potential pitfalls associated with the execution of a trade study, please indicate how often you have
observed the following potential pitfalls while seeking to improve alternatives, communicate findings, and provide recommendation with
implementation plan. (No Attempt To Improve Original Alternatives, Premature Convergence On A Recommendation, Chronic Divergence -
Recommendation Never Made, Recommendation Unclear, Recommendation Presented Without Implementation Plan)
(Responses: Almost Never / Very Rarely / Rarely / Occasionally / Frequently / Very Frequently / Almost Always / No Basis to Assess)
No Attempt To Improve Original Alternatives: 1 / 3 / 11 / 43 / 37 / 19 / 9 / 2
Premature Convergence On A Recommendation: 0 / 2 / 3 / 16 / 49 / 34 / 18 / 3
Chronic Divergence - Recommendation Never Made: 1 / 5 / 28 / 46 / 21 / 11 / 2 / 11
Recommendation Unclear: 2 / 4 / 28 / 41 / 33 / 12 / 1 / 3
Recommendation Presented Without Implementation Plan: 0 / 2 / 12 / 33 / 41 / 20 / 11 / 6
Question 29
If the following pitfalls should occur, how would you assess the impact on the overall trade study process output? (No Attempt To Improve Original
Alternatives, Premature Convergence On A Recommendation, Chronic Divergence - Recommendation Never Made, Recommendation Unclear,
Recommendation Presented Without Implementation Plan)
(Responses: Insignificant / Slight / Minor / Moderate / Major / Severe / Catastrophic / No Basis to Assess)
No Attempt To Improve Original Alternatives: 0 / 1 / 4 / 39 / 50 / 26 / 2 / 2
Premature Convergence On A Recommendation: 0 / 1 / 1 / 12 / 51 / 42 / 14 / 3
Chronic Divergence - Recommendation Never Made: 0 / 1 / 5 / 21 / 34 / 32 / 22 / 9
Recommendation Unclear: 0 / 1 / 3 / 24 / 48 / 32 / 15 / 1
Recommendation Presented Without Implementation Plan: 0 / 2 / 8 / 32 / 39 / 32 / 7 / 4
Bibliography
Amir, O., 2008. Tough choices: How making decisions tires your brain. Scientific
American: Mind Matters.
Andradóttir, S. & Prudius, A.A., 2009. Balanced explorative and exploitative search with
estimation for simulation optimization. INFORMS Journal on Computing, 21(2),
pp.193–208.
Augustine, N. et al., 2009. Getting to Best: Reforming the Defense Acquisition
Enterprise.
Banks, J. & Chwif, L., 2010. Warnings about simulation. Journal of Simulation, 5(4),
pp.279–291.
Baucells, M. & Rata, C., 2006. A survey study of factors influencing risk-taking behavior
in real-world decisions under uncertainty. Decision Analysis, 3(3), pp.163–176.
Berente, N. & Yoo, Y., 2012. Institutional contradictions and loose coupling:
Postimplementation of NASA’s enterprise information system. Information Systems
Research, 23(2), pp.376–396.
Board, A.F.S. et al., 2008. Pre-Milestone A and Early-Phase Systems Engineering: A
Retrospective Review and Benefits for Future Air Force Acquisition, National
Academies Press.
Boardman, J., 2010. On Systemic Media For Problem Owners. In Systems Research
Forum. pp. 1–32.
Boardman, J. et al., 2009. The conceptagon: A framework for systems thinking and
systems practice. In Systems, Man and Cybernetics, 2009. SMC 2009. IEEE
International Conference on. pp. 3299–3304.
Boardman, J. & Sauser, B., 2013. Systemic thinking: building maps for worlds of systems,
John Wiley & Sons.
Boardman, J. & Sauser, B., 2008. Systems thinking: Coping with 21st century problems,
CRC Press.
Bond, S.D., Carlson, K.A. & Keeney, R.L., 2010. Improving the generation of decision
objectives. Decision Analysis, 7(3), pp.238–255.
Borcherding, K., Eppel, T. & Von Winterfeldt, D., 1991. Comparison of weighting
judgments in multiattribute utility measurement. Management Science, 37(12),
pp.1603–1619.
Brody, H., 1993. Great expectations: why predictions go awry. Journal of Consumer
Marketing, 10(1), pp.23–27.
Brown, T. & others, 2008. Design thinking. Harvard business review, 86(6), p.84.
Browning, T.R., 2001. Applying the design structure matrix to system decomposition and
integration problems: a review and new directions. Engineering Management, IEEE
Transactions on, 48(3), pp.292–306.
Browning, T.R., 1999. Sources of schedule risk in complex system development. Systems
Engineering, 2(3), pp.129–142.
Buede, D., 1996. Second Overview of the MCDA Software Market. Journal of Multi-
Criteria Decision Analysis, 5(4), pp.312–316.
Buede, D.M., 1997a. Developing originating requirements: defining the design decisions.
Aerospace and Electronic Systems, IEEE Transactions on, 33(2), pp.596–609.
Buede, D.M., 1994. Engineering design using decision analysis. In Systems, Man, and
Cybernetics, 1994.’Humans, Information and Technology’., 1994 IEEE International
Buede, D.M., 2011. The engineering design of systems: models and methods, John Wiley
& Sons.
Buede, D.M. & Bresnick, T.A., 1992. Applications of decision analysis to the military
systems acquisition process. Interfaces, 22(6), pp.110–125.
Buede, D.M. & Choisser, R.W., 1992. Providing an analytic structure for key system
design choices. Journal of Multi-Criteria Decision Analysis, 1(1), pp.17–27.
Buede, D.M. & Maxwell, D.T., 1995. Rank disagreement: A comparison of multi-criteria
methodologies. Journal of Multi-Criteria Decision Analysis, 4(1), pp.1–21.
Bussen, W. & Myers, M.D., 1997. Executive information system failure: A New Zealand
case study. Journal of Information Technology, 12(2), pp.145–153.
Card, S.K., Mackinlay, J.D. & Shneiderman, B., 1999. Readings in information
visualization: using vision to think, Morgan Kaufmann Pub.
Carty, A., 2002. An approach to multidisciplinary design, analysis and optimization for
Rapid Conceptual Design. AIAA Paper, 5438.
Chambal, S.P. et al., 2011. A practical procedure for customizable one-way sensitivity
analysis in additive value models. Decision Analysis, 8(4), pp.303–321.
Chaplain, C. et al., 2014. Canceled DOD Programs: DOD Needs to Better Use Available
Guidance and Manage Reusable Assets.
Chen, J.Q. & Lee, S.M., 2003. An exploratory cognitive DSS for strategic decision
making. Decision support systems, 36(2), pp.147–160.
Chen, L.-H. & Ko, W.-C., 2009. Fuzzy approaches to quality function deployment for
new product design. Fuzzy sets and systems, 160(18), pp.2620–2639.
Cho, K.T., 2003. Multicriteria decision methods: an attempt to evaluate and unify.
Mathematical and computer modelling, 37(9), pp.1099–1119.
Chou, P.-N. & Ma, Z., Trends in Information Visualization Research: A Content Analysis
in a Refereed Journal.
Chou, S.-Y. & Chang, Y.-H., 2008. A decision support system for supplier selection
based on a strategy-aligned fuzzy SMART approach. Expert systems with applications,
34(4), pp.2241–2253.
Cilli, M., Parnell, G., Cloutier, R. & Zigh, T., 2016 (expected). A Systems Engineering
Perspective of the Revised Defense Acquisition System. Systems Engineering, XX(X),
XXX-XXX. Accepted for publication 11/29/2015.
Clemen, R.T. & Reilly, T., 2001. Making hard decisions with DecisionTools. ISBN: 0-
534-36597-3.
Collopy, P. & Consulting, D., 2006. Value-driven design and the global positioning
system. AIAA paper, 7213.
Collopy, P. & Horton, R., 2002. Value modeling for technology evaluation. AIAA Paper,
3622.
Da Costa, P.C.G. & Buede, D.M., 2000. Dynamic decision making: A comparison of
approaches. Journal of Multi-Criteria Decision Analysis, 9(6), pp.243–262.
Cox Jr, L.A., Brown, G.G. & Pollock, S.M., 2008. When Is Uncertainty About
Uncertainty Worth Characterizing? Interfaces, 38(6), pp.465–468.
Crawford, C.M. & Di Benedetto, C.A., 2008. New products management, Tata McGraw-
Hill Education.
Creswell, J., 2014. Research design: Qualitative, quantitative and mixed methods
approaches. Sage publications.
Cristiano, J.J., White III, C.C. & Liker, J.K., 2001. Application of multiattribute decision
analysis to quality function deployment for target setting. Systems, Man, and
Cybernetics, Part C: Applications and Reviews, IEEE Transactions on, 31(3), pp.366–
382.
Crocker, A.M., Charania, A. & Olds, J., 2001. An introduction to the rosetta modeling
process for advanced space transportation technology investment. AIAA, 4625, pp.28–
30.
Dahan, E. & Hauser, J.R., 2002. The virtual customer. Journal of Product Innovation
Management, 19(5), pp.332–353.
Dahan, E. & Srinivasan, V., 2000. The predictive power of internet-based product
concept testing using visual depiction and animation. Journal of Product Innovation
Management, 17(2), pp.99–109.
Dasu, S. & Eastman, C., 2012. Management of design: engineering and management
perspectives, Springer Science & Business Media.
Davenport, T.H. & Harris, J.G., 2007. Competing on analytics. Harvard Business School
Publishing Corporation, Boston.
Davenport, T.H., Harris, J.G. & Morison, R., 2010. Analytics at work: smarter decisions,
better results, Harvard Business Press.
Davis, P.K. & Blumenthal, D., 1991. The base of sand problem: A white paper on the
state of military combat modeling.
Davis, P.K., Kulick, J. & Egner, M., 2005. Implications of modern decision science for
military decision-support systems, Rand Corporation.
Dees, R.A., Dabkowski, M.F. & Parnell, G.S., 2010. Decision-focused transformation of
additive value models to improve communication. Decision Analysis, 7(2), pp.172–
184.
Delano, G. et al., 2000. Quality function deployment and decision analysis: a R&D case
study. International Journal of Operations & Production Management, 20(5),
pp.591–609.
Dickerson, C.E. & Mavris, D., 2013. A brief history of models and model based systems
engineering and the case for relational orientation. Systems Journal, IEEE, 7(4),
pp.581–592.
Dillon, R.L., John, R. & von Winterfeldt, D., 2002. Assessment of cost uncertainties for
large technology projects: A methodology and an application. Interfaces, 32(4), pp.52–
66.
Directive, DoD, 2013. 7045.14, The Planning, Programming, and Budgeting System
(PPBS).
Dyer, J.S. et al., 1992. Multiple Criteria Decision Making, Multiattribute Utility Theory:
The Next Ten Years. Management Science, 38(5), pp.645–654.
Dyer, J.S., 1990. Remarks on the analytic hierarchy process. Management science,
pp.249–258.
Edson, R., 2008. Systems thinking. Applied. A primer. ASYST Institute (ed.). Arlington,
VA: Analytic Services.
Edwards, W., Miles Jr, R.F. & Von Winterfeldt, D., 2007. Advances in decision analysis:
from foundations to applications, Cambridge University Press.
Estefan, J.A. & others, 2007. Survey of model-based systems engineering (MBSE)
methodologies. INCOSE MBSE Focus Group, 25.
Ewing, P.L., Tarantino, W. & Parnell, G.S., 2006. Use of decision analysis in the army
base realignment and closure (BRAC) 2005 military value analysis. Decision Analysis,
3(1), pp.33–49.
Felix, A., 2004. Standard approach to trade studies: A process improvement model that
enables systems engineers to provide information to the project manager by going
beyond the summary matrix. In International Council on Systems Engineering
Feng, T., Keller, L.R. & Zheng, X., 2008. Modeling Multi-Objective Multi-Stakeholder
Decisions: A Case-Exercise Approach. INFORMS Transactions on Education.
Fernandez, J.A., 2010. Contextual role of TRLs and MRLs in technology management.
Few, S., 2009. Now you see it: simple visualization techniques for quantitative analysis,
Analytics Press.
Few, S., 2004. Show me the numbers: Designing Tables and Graphs to Enlighten,
Analytics Press Oakland, CA.
Few, S., 2008. What ordinary people need most from information visualization today.
Perceptual Edge: Visual Business Intelligence Newsletter.
Figueira, J., Greco, S. & Ehrgott, M., 2005. Multiple criteria decision analysis: state of
the art surveys, Springer Science & Business Media.
Finger, S. & Dixon, J.R., 1989. A review of research in mechanical engineering design.
Part I: Descriptive, prescriptive, and computer-based models of design processes.
Research in engineering design, 1(1), pp.51–67.
Fischer, M.J., 2006. Computing and Analytic Science: A Useful Symbiosis. Making
Better Decisions.
Found, W.G., 2007. Stronger Practices Needed to Improve DoD Technology Transition
Processes. GAO-06-883.
Fox, C.R. & Clemen, R.T., 2005. Subjective probability assessment in decision analysis:
Partition dependence and bias toward the ignorance prior. Management Science, 51(9),
pp.1417–1432.
French, S., 1998. Decision making not decision theory. Journal of Multi-Criteria
Decision Analysis, 7(6), pp.303–303.
French, S. et al., 1998. Problem formulation for multi-criteria decision analysis: report of
a workshop. Journal of Multi-Criteria Decision Analysis, 7(5), pp.242–262.
Frey, D.D. et al., 2009. The Pugh Controlled Convergence method: model-based
evaluation and implications for design theory. Research in Engineering Design, 20(1),
pp.41–58.
Gaissmaier, W., Schooler, L.J. & Mata, R., 2008. An ecological perspective to cognitive
limits: Modeling environment-mind interactions with ACT-R. Judgment and Decision
Making, 3(3), pp.278–291.
Gardener, T. & Moffat, J., 2008. Changing behaviours in defence acquisition: a game
theory approach. Journal of the Operational Research Society, 59(2), pp.225–230.
Geis, J.P. et al., 2011. Blue horizons study assesses future capabilities and technologies
for the United States Air Force. Interfaces, 41(4), pp.338–353.
Geldermann, J. & Schöbel, A., 2011. On the Similarities of Some Multi-Criteria Decision
Analysis Methods. Journal of Multi-Criteria Decision Analysis, 18(3-4), pp.219–230.
Georgiadis, D.R., Mazzuchi, T.A. & Sarkani, S., 2013. Using multi criteria decision
making in analysis of alternatives for selection of enabling technology. Systems
Engineering, 16(3), pp.287–303.
Gibson, H., Faith, J. & Vickers, P., 2012. A survey of two-dimensional graph layout
techniques for information visualisation. Information Visualization.
Gilbride, T.J., Lenk, P.J. & Brazell, J.D., 2008. Market share constraints and the loss
function in choice-based conjoint analysis. Marketing Science, 27(6), pp.995–1011.
Girard, A.M., 2006. The Art and Science of Mathematical Modeling. Making Better
Decisions, p.28.
Girotra, K., Terwiesch, C. & Ulrich, K.T., 2010. Idea generation and the quality of the
best idea. Management Science, 56(4), pp.591–605.
Green, P.E. & Srinivasan, V., 1990. Conjoint analysis in marketing: New developments
with implications for research and practice. Journal of Marketing, 54(4), pp.3–19.
Grey, S., 1995. Practical risk assessment for project management, Wiley.
Griffin, A. & Hauser, J.R., 1996. Integrating R&D and marketing: a review and analysis
of the literature. Journal of product innovation management, 13(3), pp.191–215.
Griffin, A. & Hauser, J.R., 1993. The voice of the customer. Marketing science, 12(1),
pp.1–27.
Group, I.S.H.W. & others, 2011. INCOSE systems engineering handbook v. 3.2.2.
Hammond, J.S., Keeney, R.L. & Raiffa, H., 1998. The hidden traps in decision making.
Harvard Business Review, 76(5), pp.47–58.
Harris, J., Craig, E. & Egan, H., 2010. How successful organizations strategically manage
their analytic talent. Strategy & Leadership, 38(3), pp.15–22.
Harris, J.G., Craig, E. & Light, D.A., 2011. Talent and analytics: new approaches, higher
ROI. Journal of Business Strategy, 32(6), pp.4–13.
Harvey, C.M. & Østerdal, L.P., 2010. Cardinal scales for health evaluation. Decision
Analysis, 7(3), pp.256–281.
Hauser, J., Tellis, G.J. & Griffin, A., 2006. Research on innovation: A review and agenda
for marketing science. Marketing Science, 25(6), pp.687–717.
Hauser, J.R. & Toubia, O., 2005. The impact of utility balance and endogeneity in
conjoint analysis. Marketing Science, 24(3), pp.498–507.
Hicks, B. et al., 2009. A methodology for evaluating technology readiness during product
development. economics, 6, p.7.
Hillier, F.S., 1990. Introduction to Operations Research, 8th ed., Tata McGraw-Hill Education.
Hobbs, B.F., 1986. What can we learn from experiments in multiobjective decision
analysis? Systems, Man and Cybernetics, IEEE Transactions on, 16(3), pp.384–394.
Hollis, W.W. & Patenaude, A., 1999. Simulation Based Acquisition: Can We Stay the
Course? Army RD&A, May-June, pp.11–14.
Honour, E.C., 2004. 6.2.3 Understanding the Value of Systems Engineering. In INCOSE
International Symposium. pp. 1207–1222.
Honour, E.C., Valerdi, R. & Initiative, M.I.T.L.A., 2006. Advancing an ontology for
systems engineering to allow consistent measurement. In Conference on Systems
Engineering Research.
Hopkins, I.B.M.S. & McAfee, A., 2010. Putting the Science in Management Science.
Horvitz, E. & Barry, M., 1995. Display of information for time-critical decision making.
In Proceedings of the Eleventh conference on Uncertainty in Artificial Intelligence. pp.
296–305.
Howard, R. & others, 1968. The foundations of decision analysis. Systems Science and
Cybernetics, IEEE Transactions on, 4(3), pp.211–219.
Howard, R.A., 1988. Decision analysis: practice and promise. Management science,
34(6), pp.679–695.
Howard, R.A. & Matheson, J.E., 2005. Influence diagrams. Decision Analysis, 2(3),
pp.127–143.
Instruction, D., 2015. 5000.02, Operation of the Defense Acquisition System, 7 January
2015.
Instruction, D., 2013. Interim DoDI 5000.02. Operation of the Defense Acquisition
System, 11, p.250.
Ishizaka, A. & Labib, A., 2009. Analytic hierarchy process and expert choice: Benefits
and limitations. OR Insight, 22(4), pp.201–220.
Johnstone, D.J., Jose, V.R.R. & Winkler, R.L., 2011. Tailored scoring rules for
probabilities. Decision Analysis, 8(4), pp.256–268.
Jones, W.D., 1999. Arming the Eagle: A History of US Weapons Acquisition since 1775,
Defense Systems Management College Press.
Juan, Z., Wei, L. & Xiamei, P., 2010. Research on Technology Transfer Readiness Level
and Its Application in University Technology Innovation Management. In E-Business
and E-Government (ICEE), 2010 International Conference on. pp. 1904–1907.
Kälviäinen, M., 2010. Interdisciplinary Interaction for the Early Stages of Product and
Service Development. Handbook of Research on Trends in Product Design and
Development: Technological and Organizational Perspectives, p.39.
Kane, M.J. et al., 2004. The generality of working memory capacity: a latent-variable
approach to verbal and visuospatial memory span and reasoning. Journal of
Experimental Psychology: General, 133(2), p.189.
Kaniclides, A. & Kimble, C., 1995. A framework for the development and use of
executive information systems. Proceedings of GRONICS, 95, pp.47–52.
Katz, A. & Te’eni, D., 2007. The contingent impact of contextualization on computer-
mediated collaboration. Organization Science, 18(2), pp.261–279.
Keefer, D.L., Kirkwood, C.W. & Corner, J.L., 2004. Perspective on decision analysis
applications, 1990-2001. Decision analysis, 1(1), pp.4–22.
Keeney, R.L., 2002. Common mistakes in making value trade-offs. Operations Research,
50(6), pp.935–945.
Keeney, R.L., 1982. Decision analysis: an overview. Operations research, 30(5), pp.803–
838.
Keeney, R.L., 2013. Foundations for Group Decision Analysis. Decision Analysis, 10(2),
pp.103–120.
Keeney, R.L., 2004b. Making better decision makers. Decision Analysis, 1(4), pp.193–
204.
Keeney, R.L., 1974. Multiplicative utility functions. Operations Research, 22(1), pp.22–
34.
Keeney, R.L., 1994. Using values in operations research. Operations Research, 42(5),
pp.793–813.
Keeney, R.L. & Gregory, R.S., 2005. Selecting attributes to measure the achievement of
objectives. Operations Research, 53(1), pp.1–11.
Keeney, R.L., 2009. Value-focused thinking: A path to creative decisionmaking,
Harvard University Press.
Keeney, R.L. & Raiffa, H., 1976. Decision analysis with multiple conflicting objectives.
Wiley & Sons, New York.
Keeney, R.L. & von Winterfeldt, D., 2009. Practical value models.
Keisler, J.M. & Noonan, P.S., 2012. Communicating analytic results: A tutorial for
decision consultants. Decision Analysis, 9(3), pp.274–292.
Keller, L.R., Simon, J. & Wang, Y., 2009. Multiple-objective decision analysis involving
multiple stakeholders. TutORials in Operations Research: Decision Technologies and
Applications (INFORMS, Catonsville, MD), pp.139–155.
Keller, S.P., 1998. Simulation-based acquisition: real world examples. ARMY RD AND A,
pp.25–26.
Kendall, F., 2013. Performance of the Defense Acquisition System: 2013 Annual Report.
Kenley, C.R. & El-Khoury, B., 2012. An Analysis of TRL-Based Cost and Schedule
Models.
Killen, C.P., Walker, M. & Hunt, R.A., 2005. Strategic planning using QFD.
International Journal of Quality & Reliability Management, 22(1), pp.17–29.
Kirby, M.R., Mavris, D.N. & Largent, M.C., 2001. A Process for Tracking and Assessing
Emerging Technology Development Programs for Resource Allocation. Aircraft
Technology, Integration, and Operations Forum.
Kirkwood, C.W., 1992. An overview of methods for applied decision analysis. Interfaces,
22(6), pp.28–39.
Kogut, B., 1985. Designing global strategies: Comparative and competitive value-added
chains. MIT Sloan Management Review, 26(4), pp.15–28.
Kolko, J., 2010. Abductive thinking and sensemaking: The drivers of design synthesis.
Design Issues, 26(1), pp.15–28.
Korfiatis, M.P., Cloutier, R. & Zigh, T., Graphical CONOPS Development to Enhance
Model-Based Systems Engineering.
Kornish, L.J. & Ulrich, K.T., 2011. Opportunity spaces in innovation: Empirical analysis
of large samples of ideas. Management Science, 57(1), pp.107–128.
Kossiakoff, A. et al., 2011. Systems engineering principles and practice, Wiley.
Kou, G., Miettinen, K. & Shi, Y., 2011. Multiple criteria decision making: challenges and
advancements. Journal of Multi-Criteria Decision Analysis, 18(1-2), pp.1–4.
Krishnan, V. & Ulrich, K.T., 2001. Product development decisions: A review of the
literature. Management science, 47(1), pp.1–21.
Kwak, Y.H. & Ingall, L., 2007. Exploring Monte Carlo simulation applications for
project management. Risk Management, 9(1), pp.44–57.
Kyd, C., 2007. Don’t Discard Those Spreadsheets: The Power of Excel-Friendly OLAP.
Business Performance Management, 5(1), pp.15–20.
Lancaster, K., 1990. The economics of product variety: A survey. Marketing Science,
9(3), pp.189–206.
Law, A., 1991. Simulation model’s level of detail determines effectiveness. Industrial
Engineering, 23(10), pp.16–18.
Lee, B. et al., 2010. Sparkclouds: Visualizing trends in tag clouds. Visualization and
Computer Graphics, IEEE Transactions on, 16(6), pp.1182–1189.
Van Leer, B. & Roe, P.L., 1987. Cranfield Institute of Technology Cranfield, England
Richard W. Newsome, Major, USAF Air Force Wright Aeronautical Laboratories
NASA Langley Research Center.
Lennon, E., Besser, R. & Farr, J., 2010. Efficiency Metrics for Multi-Attribute Decision
Analysis Methods Used During the Concept Design of Novel Technologies. In CSER.
pp. 20–32.
Locascio, A. & Thurston, D., 1998. Transforming the house of quality to a multiobjective
optimization formulation. Structural optimization, 16(2-3), pp.136–146.
Lorenz, C., 1990. The Design Dimension: The New Competitive Weapon for Product
Strategy and Global Marking.
Madni, A.M. & Jackson, S., 2009. Towards a conceptual framework for resilience
engineering. Systems Journal, IEEE, 3(2), pp.181–191.
Maier, M.W. & Rechtin, E., 2000. The art of systems architecting, CRC press.
Mankins, J.C., 2009. Technology readiness and risk assessments: A new approach. Acta
Astronautica, 65(9), pp.1208–1215.
Markham, S.K. & Lee, H., 2013. Product development and management association’s
2012 comparative performance assessment study. Journal of Product Innovation
Management, 30(3), pp.408–429.
Markov, G., Hoffmann, A. & Creighton, O., 2011. Requirements engineering process
improvement: an industrial case study. Requirements Engineering: Foundation for
Software Quality, pp.34–47.
Masi, D.M.B., 2006. Avoiding a Simple Model’s Tragedy of Errors. Making Better
Decisions, p.18.
Maxwell, D.T., 2008. Decision analysis: Find a tool that fits. OR/MS Today, 35(5).
McDowell, I., 2006. Measuring health: a guide to rating scales and questionnaires,
Oxford University Press.
Mela, C.F., Roos, J. & Deng, Y., 2013. Invited Paper-A Keyword History of Marketing
Science. Marketing Science, 32(1), pp.8–18.
Michalek, J.J., Feinberg, F.M. & Papalambros, P.Y., 2005. Linking marketing and
engineering product design decisions via analytical target cascading. Journal of
Product Innovation Management, 22(1), pp.42–62.
Miller, G.A., 1956. The magical number seven, plus or minus two: some limits on our
capacity for processing information. Psychological review, 63(2), p.81.
Neches, R. & Madni, A.M., 2013. Towards affordably adaptable and effective systems.
Systems Engineering, 16(2), pp.224–234.
Nickel, J. & others, 2010. Using multi-attribute tradespace exploration for the
architecting and design of transportation systems.
O’Hara, J.J. et al., 2007. Advanced visualization techniques for trade space exploration.
In Proc of the 48th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and
Material Conference, No AIAA-2007-1878, Honolulu, HI, USA. pp. 23–26.
O’Neill, M.G. et al., 2010. Comparing and optimizing the DARPA system F6 program
value-centric design methodologies. In AIAA Space 2010 Conference and Exposition,
AIAA-2010-8828, Anaheim, California, Aug.
Ong, Y.S., Nair, P.B. & Keane, A.J., 2003. Evolutionary optimization of computationally
expensive problems via surrogate modeling. AIAA journal, 41(4), pp.687–696.
Osama, A., 2006. Multi-Attribute Strategy and Performance Architectures in R&D: The
Case of The Balanced Scorecard.
Palmer, A.R. & Larson, E.V., Cost Factors in the Army: Vol. 2, Factors, Methods, and
Models.
Papalambros, P.Y. & Wilde, D.J., 2000. Principles of optimal design: modeling and
computation, Cambridge university press.
Parkinson, A., 1995. Robust mechanical design using engineering models. Journal of
Vibration and Acoustics, 117(B), pp.48–54.
Parnell, G.S., 2007. Chapter 19, Value Focused Thinking. Methods for Conducting
Military Operational Analysis, pp.619–655.
Parnell, G.S. et al., 2013. Handbook of decision analysis, John Wiley & Sons.
Parnell, G.S., Driscoll, P.J. & Henderson, D.L., 2011. Decision making in systems
engineering and management, Wiley.
Patenaude, A., 1996. Study on the Effectiveness of Modeling and Simulation in the
Weapon System Acquisition Process.,
Phillips, L.D., 2006. How Raiffa’s RAND memo led to a multi-criteria computer
program. Journal of Multi-Criteria Decision Analysis, 14(4-6), pp.185–189.
Powell, R.A. & Buede, D.M., 2006. Decision-making for successful product
development.
Pugh, S., 1991. Total design: integrated methods for successful product engineering,
Addison-Wesley.
Ríos-Insua, S., Jiménez, A. & Mateos, A., 2003. Sensitivity analysis in a generic multi-
attribute decision support system. KJ Engemann and GE Lasker. Advances in decision
technology and intelligent information systems, The International Institute for
Advanced Studies in Systems Research and Cybernetics, Ontario, pp.31–35.
Raiffa, H., 1968. Decision analysis: introductory lectures on choices under uncertainty.
Ramaswamy, R. & Ulrich, K., 1993. Augmenting the house of quality with engineering
models. Research in engineering design, 5(2), pp.70–79.
Randall, T., Terwiesch, C. & Ulrich, K.T., 2007. Research note—user design of
customized products. Marketing Science, 26(2), pp.268–280.
Randolph, J.J., 2009. A guide to writing the dissertation literature review. Practical
Assessment, Research & Evaluation, 14(13).
Rangaswamy, A. & Lilien, G.L., 1997. Software tools for new product development.
Journal of Marketing Research, pp.177–184.
Ravasi, D. & Stigliani, I., 2012. Product design: a review and research agenda for
management studies. International Journal of Management Reviews.
Roark III, H.H., 2012. Value Centric Approach to Target System Modularization Using
Multi-Attribute Tradespace Exploration and Network Measures of Component
Modularity.
Roberto, M.A., 2013. Why Great Leaders Don’t Take Yes for an Answer: Managing for
Conflict and Consensus, Pearson Education.
Ross, A.M., McManus, H.L., et al., 2010. A Role for Interactive Tradespace Exploration
in Multi-Stakeholder Negotiations. AIAA Space 2010.
Ross, A.M., O’Neill, M.G., et al., 2010. Aligning Perspectives and Methods for Value-
Driven Design. In American Institute of Aeronautics and Astronautics: Space 2010
Conference and Exposition.
Ross, A.M. et al., 2004. Multi-attribute tradespace exploration as front end for effective
space system design. Journal of Spacecraft and Rockets, 41(1), pp.20–28.
Ross, A.M. et al., 2008. Responsive systems comparison method: Case study in assessing
future designs in the presence of change. AIAA Space 2008, p.9.
Ross, A.M. et al., 2010. Revisiting the Tradespace Exploration Paradigm: Structuring the
Exploration Process. AIAA Space 2010.
Ross, A.M. & Hastings, D.E., 2005. The tradespace exploration paradigm. In INCOSE
International Symposium 2005. p. 13.
Saari, D.G., 1995. A chaotic exploration of aggregation paradoxes. SIAM review, 37(1),
pp.37–52.
Saaty, T.L., 1998. That is not the analytic hierarchy process: what the AHP is and what it
is not. Journal of Multi-Criteria Decision Analysis, 6(6), pp.324–335.
Sage, A.P., 1981. A methodological framework for systemic design and evaluation of
computer aids for planning and decision support. Computers & Electrical
Engineering, 8(2), pp.87–101.
Sage, A.P. & Rouse, W.B., 2011. Handbook of systems engineering and management,
Wiley.
Sahraoui, A.E.K., Buede, D.M. & Sage, A.P., 2008. Systems engineering research.
Journal of Systems Science and Systems Engineering, 17(3), pp.319–333.
Sanfey, A.G. & Chang, L.J., 2008. Of two minds when making a decision. Scientific
American, 3.
Sauser, B. et al., 2009. Defining an integration readiness level for defense acquisition. In
Int Conf Int Council Syst Eng, INCOSE, Singapore.
Scagnetti, G. & Kim, S., 2012. The Diagram of Information Visualization. The Parsons
Journal for Information Mapping (PJIM), 4(4), pp.1–8.
Schneider, J. & Hall, J., 2011. Why most product launches fail.
Scholl, A. et al., 2005. Solving multiattribute design problems with analytic hierarchy
process and conjoint analysis: An empirical comparison. European Journal of
Operational Research, 164(3), pp.760–777.
Schoner, B., Choo, E.U. & Wedley, W.C., 1997. A Comment on “Rank Disagreement: A
Comparison of Multi-Criteria Methodologies.” Journal of Multi-Criteria Decision
Analysis, 6(4), pp.197–200.
Schwartz, M., 2014. Defense acquisitions: How DoD acquires weapon systems and
recent efforts to reform the process.
Shah, D. & Madden, L., 2004. Nonparametric analysis of ordinal data in designed
factorial experiments. Phytopathology, 94(1), pp.33–43.
Shim, J.P. et al., 2002. Past, present, and future of decision support technology. Decision
support systems, 33(2), pp.111–126.
Shocker, A.D. & Srinivasan, V., 1979. Multiattribute approaches for product concept
evaluation and generation: A critical review. Journal of Marketing Research, pp.159–
180.
Sillitto, H., 2005. Some really useful principles: A new look at the scope and boundaries
of systems engineering. In The 15th International Symposium of the International
Council on Systems Engineering (INCOSE), Rochester, NY.
Simon, P., Hillson, D. & Newland, K., 1997. Project risk analysis and management
(PRAM) guide. UK: The Association for Project Management, 16.
Smith, J.E. & Von Winterfeldt, D., 2004. Anniversary article: decision analysis in
management science. Management Science, 50(5), pp.561–574.
Smith, V.S., 2013. Data Dashboard as Evaluation and Research Communication Tool.
New Directions for Evaluation, 2013(140), pp.21–45.
Snowden, D.J. & Boone, M.E., 2007. A leader’s framework for decision making. Harvard
Business Review, 85(11), p.68.
Sridhar, P., Madni, A.M. & Jamshidi, M., 2008. Multi-criteria decision making in sensor
networks. Instrumentation & Measurement Magazine, IEEE, 11(1), pp.24–29.
Standard, I., 2015. Systems and software engineering - System life cycle processes.
ISO/IEC/IEEE 15288:2015.
Steuer, R.E., Qi, Y. & Hirschberger, M., 2006. Developments in multi-attribute portfolio
selection. Multiple Criteria Decision Making ‘05, pp.251–262.
Stewart, T.J., 1996. Robustness of additive value function methods in MCDM. Journal of
Multi-Criteria Decision Analysis, 5(4), pp.301–309.
Stolz, R.F., 2010. Dan Ariely on Irrationality and What to Do with It. Journal of
Financial Planning.
Strawbridge, Z., McAdams, D.A. & Stone, R.B., 2002. A computational approach to
conceptual design. In Proceedings of DETC2002, DETC2002/DTM-34001, Montreal,
Canada.
Stump, G. et al., 2009. Visual steering commands for trade space exploration: User-
guided sampling with example. Journal of Computing and Information Science in
Engineering, 9(4), p.044501.
Stump, G.M. et al., 2004. Trade space exploration of satellite datasets using a design by
shopping paradigm. In Aerospace Conference, 2004. Proceedings. 2004 IEEE. pp.
3885–3895.
Suh, N.P., 1990. The principles of design, Oxford University Press New York.
Sullivan, M.J., 2010. Defense Acquisitions: Many Analyses of Alternatives Have Not
Provided a Robust Assessment of Weapon Systems Options, DIANE Publishing.
Svenson, O., 1998. Multi-criteria decision aids and human decision making: two worlds?
Journal of Multi-Criteria Decision Analysis, 7(6), pp.352–354.
Tan, Y.S. & Fraser, N.M., 1998. The modified star graph and the petal diagram: Two
new visual aids for discrete alternative multicriteria decision making. Journal of Multi-
Criteria Decision Analysis, 7(1), pp.20–33.
Thomke, S. & Reinertsen, D., 2012. Six myths of product development. Harvard
Business Review, 90(5), pp.84–94.
Todd, P. & Benbasat, I., 1991. An experimental investigation of the impact of computer
based decision aids on decision making strategies. Information Systems Research, 2(2),
pp.87–115.
Toubia, O., Hauser, J. & Garcia, R., 2007. Probabilistic polyhedral methods for adaptive
choice-based conjoint analysis: Theory and application. Marketing Science, 26(5),
pp.596–610.
Tufte, E., 2005. PowerPoint Does Rocket Science-and Better Techniques for Technical
Reports. URL: http://www.edwardtufte.com/bboard/q-and-a-fetch-msg.
Tufte, E.R. & Graves-Morris, P., 1983. The visual display of quantitative information,
Graphics press Cheshire, CT.
Turskis, Z. & Zavadskas, E.K., 2011. Multiple criteria decision making (MCDM)
methods in economics: an overview. Technological and economic development of
economy, (2), pp.397–427.
Ullman, D.G., 1992. The mechanical design process, McGraw-Hill New York.
Ullman, D.G. & D’Ambrosio, B., 1995. Taxonomy for classifying engineering decision
problems and support systems. Artificial Intelligence for Engineering, Design,
Analysis and Manufacturing, 9(05), pp.427–438.
Ullman, D.G. & Spiegel, B.P., 2006. Trade studies with uncertain information. In
Sixteenth Annual International Symposium of the International Council On Systems
Engineering (INCOSE).
Ulrich, K., 1995. The role of product architecture in the manufacturing firm. Research
policy, 24(3), pp.419–440.
Ulrich, K.T., 2005. Estimating the technology frontier for personal electric vehicles.
Transportation Research Part C: Emerging Technologies, 13(5), pp.448–462.
Ulrich, K.T., 2003. Product design and development, Tata McGraw-Hill Education.
Ulrich, K.T. & Ellison, D.J., 1999. Holistic customer requirements and the design-select
decision. Management Science, 45(5), pp.641–658.
Urban, G.L. & Hauser, J.R., 1993. Design and marketing of new products, Prentice Hall,
Englewood Cliffs, NJ.
Verma, D. et al., 1999. Fuzzy set based multi-attribute conceptual design evaluation.
Systems Engineering, 2(4), pp.187–197.
Virtanen, K., Karelahti, J. & Raivio, T., 2006. Modeling air combat by a moving horizon
influence diagram game. Journal of guidance, control, and dynamics, 29(5), pp.1080–
1091.
Viscito, L., Richards, M.G. & Ross, A.M., 2009. Assessing the value proposition for
operationally responsive space. In AIAA Space.
Voss, C. et al., 1996. Managing New Product Design and Development: an Anglo-
German Study. Business Strategy Review, 7(3), pp.1–15.
Wallace, D.R. & Jakiela, M.J., 1993. Automated product concept design: Unifying
aesthetics and engineering. Computer Graphics and Applications, IEEE, 13(4), pp.66–
75.
Wallenius, J. et al., 2008. Multiple criteria decision making, multiattribute utility theory:
Recent accomplishments and what lies ahead. Management Science, 54(7), pp.1336–
1349.
Wand, Y. & Weber, R., 2002. Research commentary: information systems and
conceptual modeling—a research agenda. Information Systems Research, 13(4),
pp.363–376.
Wang, M. & Yang, J., 1998. A multi-criterion experimental comparison of three multi-
attribute weight measurement methods. Journal of Multi-Criteria Decision Analysis,
7(6), pp.340–350.
West, T.D. & Pyster, A., 2015. Untangling the Digital Thread: The Challenge and
Promise of Model-Based Engineering in Defense Acquisition. INSIGHT, 18(2), pp.45–
55.
Whitney, D.E., 1993. Nippondenso Co. Ltd: A case study of strategic product design.
Research in Engineering Design, 5(1), pp.1–20.
Van Wijk, J.J., 2006. Views on visualization. Visualization and Computer Graphics,
IEEE Transactions on, 12(4), pp.421–433.
Wilkie, W.L. & Moore, E.S., 2003. Scholarly research in marketing: Exploring the “4
eras” of thought development. Journal of Public Policy & Marketing, 22(2), pp.116–
146.
Winkler, R.L., 1990. Decision modeling and rational choice: AHP and utility theory.
Management Science, 36(3), pp.247–248.
Von Winterfeldt, D., 1980. Structuring decision problems for decision analysis. Acta
Psychologica, 45(1), pp.71–93.
Von Winterfeldt, D. & Edwards, W., 1986. Decision analysis and behavioral
research, Cambridge University Press, Cambridge.
Wunderlich, K.E., 2006. Is Your Model Designed to Fail? Making Better Decisions, p.11.
Wunderlich, K.E., Fischer, M.J. & Miller, H.G., 2006. Forging a Sword of Purpose.
Making Better Decisions.
Yamamoto, M. & Lambert, D.R., 1994. The impact of product aesthetics on the
evaluation of industrial products. Journal of Product Innovation Management, 11(4),
pp.309–324.
Zapatero, E.G., Smith, C.H. & Weistroffer, H.R., 1997. Evaluating Multiple-Attribute
Decision Support Systems. Journal of Multi-Criteria Decision Analysis, 6(4), pp.201–
214.
Zhai, L.-Y., Khoo, L.-P. & Zhong, Z.-W., 2009. Design concept evaluation in product
development using rough sets and grey relation analysis. Expert Systems with
Applications, 36(3), pp.7072–7079.
Vita
Visionary leader, Ph.D. candidate, and Supervisory Systems Engineer with two master’s
degrees and 20 years of experience developing proposals, securing resources, and leading
effective programs for the Department of Defense.
SUMMARY OF QUALIFICATIONS
Highly educated in the hard sciences, with engineering degrees from the University of
Pennsylvania, New York University Polytechnic, and Villanova University. Currently a Ph.D.
candidate at Stevens Institute of Technology.
Experienced leader of large, diverse teams, regularly drawing on finely tuned soft skills
and advanced emotional intelligence.
Accomplished public speaker with a proven track record of several hundred presentations,
delivered to audiences ranging from large annual conferences to small groups of defense
leaders at the Pentagon.
Author of a growing list of peer-reviewed, published papers.
creating trade space visualizations, driving consensus, and fostering understanding of the
final recommendation.
Earned the ARDEC System Engineering Fellow Scholarship for this state-of-the-art,
extended body of work.
Deputy Product Manager, Electromagnetic Gun (2004–2009)
Responsible for a $20M annual budget and led a cross-organizational team (government,
industry, and academia) to research and mature electromagnetic launch technology.
Created a vision to pursue radical technology innovation to sustain firepower capability
growth.
Presented the vision at the 14th International Electromagnetic Launch Symposium in a paper
entitled “EM Gun and the Battle of Crecy.” The paper received the symposium’s highest
honor when it was voted “Best Oral Presentation.”
Earned the National Defense Industrial Association (NDIA) Picatinny Chapter Firepower
Award for outstanding contributions during this period.
Deputy Project Officer (DPO), Guided Munition Technology Investigations (1997–2003)
Responsible for a $5M annual budget and led a government/industry team to research
technology to enable a family of guided munitions.
Earned the ARDEC Commander’s Award for Civilian Service during this period.
Systems Management Engineer (1994–1997)
Served as technical representative to the Fire Support Center at Fort Sill, Oklahoma.
Researched customer requirements and translated requirements into proposed solutions.
EDUCATION
PUBLICATIONS
Vision for a Multiple Objectives Decision Support Tool for Assessing Initial Business Cases
of Military Technology Investments. (March 2010)
Peer-reviewed and accepted paper, presented and published in the 8th Conference on
Systems Engineering Research (CSER) proceedings. Co-authored with Dr. Gregory Parnell,
Professor of Systems Engineering, U.S. Military Academy, West Point, NY.
Electromagnetic Gun and the Battle of Crecy. (June 2008)
Winner of best paper and presentation award at the 14th Electromagnetic Launch
Conference. Published in EML 2008 Proceedings.
Applying Integrated Product and Process Development (IPPD) on the U.S. Army Multi-Role
Armament and Ammunition System. (February 2002)
Peer-reviewed and accepted paper, published in International Council on Systems
Engineering (INCOSE) proceedings.
PROFESSIONAL AFFILIATIONS
CERTIFICATIONS