The use of multiple methods in the Joint Irregular Warfare Analytic Baseline
(JIWAB) study
Background
The U.S. wars in Iraq and Afghanistan introduced considerable urgency within the defense community around the problems of terrorism and insurgency and what the U.S. government could do about them. Focused initially on a repeat of
the quick and conventional 1991 Gulf War, the U.S. military found itself dealing with an entirely
different set of challenges than for which it had prepared. The attacks on the World Trade
Center and Pentagon on September 11, 2001; the 2001 U.S. invasion of Afghanistan; and the
2003 U.S. invasion of Iraq challenged many long-held U.S. defense-related practices and paradigms. The shift towards “irregular warfare” (IW) accelerated as the 2006 Quadrennial Defense Review (QDR) identified irregular, catastrophic, and disruptive challenges as areas that required greater emphasis alongside traditional challenges.1

Figure 1. Four Priority Areas for the 2006 Quadrennial Defense Review
1 U.S. Department of Defense, Quadrennial Defense Review Report, February 6, 2006, p. 19.
Source: Department of Defense, 2006 Quadrennial Defense Review
The defense analysis community had, over several decades, developed tools focused on the conventional military challenges of the Cold War and the Gulf War. The most
common approaches at the time were physics-based, force-on-force computer models and
simulations. The analytic community’s initial response to the IW challenge was to try to create or sponsor computer models and tools that took a similar approach towards the complex dynamics of counter-terrorism and insurgency. These new models and modeling environments included the Synthetic Environment for Analysis and Simulation (SEAS), an agent-based simulation environment developed by Simulex;2 a model developed by Northrop Grumman for the U.S. Marine Corps; and the Peace Support Operations Model (PSOM)
2Purdue Research Foundation, “Simulex celebrates open house with expansion, announcement
of two separate divisions,” April 16, 2008, accessed at
http://www.purdue.edu/uns/x/2008a/080416ChaturvediOpen.html on September 7, 2016.
developed by the UK Defence Science and Technology Laboratory (DSTL) and used to support operations in Afghanistan;3 and a systems dynamics model developed for the Joint Staff to illustrate the new counterinsurgency manual, but whose complicated visualization was derided in the New York Times.4 Perhaps the one computational backplane to rule them all was the Conflict Modeling, Planning and Outcomes Experimentation (COMPOEX) Program funded by the Defense Advanced Research Projects Agency (DARPA).5
Despite an interdisciplinary review of the scholarly work on the causes of terrorism,6 it was difficult for the
DoD analysis community to approach problems in non-quantitative ways. This may have been
due to the heavy representation within the community of analysts with quantitative
backgrounds, the overall lack of understanding of social science, and highly routinized
organizational norms within the community as to what constituted legitimate analysis. Thus,
years into the start of the wars in Iraq and Afghanistan, the analysis community continued to
lean on large computer models to try to analyze IW dynamics. This was highlighted by the 2009
Africa Study by the then-Program Analysis and Evaluation (PA&E) Simulation and Analysis Center
3 Ministry of Defence, “MOD scientists help shape Afghanistan operations,” April 28, 2011,
accessed at https://www.gov.uk/government/news/mod-scientists-help-shape-afghanistan-
operations on September 7, 2016.
4 Elisabeth Bumiller, “We have met the enemy and he is Power Point,” New York Times, April 26, 2010.
(SAC) within the Office of the Secretary of Defense (OSD), which used COMPOEX and other
models. The Africa Study was faulted for using quantitative data and relationships that were not validated, as well as for being a black box whose quantitative outputs did not readily translate into intuitive explanations. Dissatisfaction with the Africa Study ultimately prompted
the Marine Corps to volunteer to lead an analytic effort focused on IW that would use a non-
computational modeling approach. The resulting three-year effort was known as the Joint Irregular Warfare Analytic Baseline (JIWAB).

JIWAB was a significant departure from the practices and methods of the existing DoD
analysis community at the time. In addition to using different tools and disciplines, it was also
purposely designed to incorporate a very high level of participatory input from organizations
outside DoD. To facilitate this collaborative approach, the study was kept unclassified for as long as possible.
Overall Approach
The overall approach to JIWAB was one that was distinctly not focused on computer-based M&S, or anything quantitative. The reason for this was simple: very little in the realm of IW can be understood primarily through quantifiable means. The traditional defense analytic approach
can be characterized as quantifying relationships that are well understood and well defined,
and is often marked by large, integrative computer models and simulations that attempt to roll many relationships up into a single integrated result.
The traditional DoD analytic approach focuses on a single classified scenario that is developed by DoD over a compressed time period – a matter of weeks. The scenario is designed to test specific capabilities rather than for realism; it is adversary oriented and typically looks out 20-25 years into the future. The analytic approach is typically modeling & simulation
heavy, with a computational perspective. Wargames used in this traditional approach are often
computer-supported, focus on quantitative outputs, and often exist to inform further, downstream modeling. One complication in applying this approach to IW is that much of IW is qualitative. Another issue is that computational social science, perhaps the area closest to what DoD might hope to achieve in applying modeling & simulation to IW, is in many ways still a nascent field unrecognized by many.
In contrast, the JIWAB approach differed in many respects. The scenario development
process developed multiple vignettes and potential paths, had heavy interagency participation,
and involved an extended process over many months. In order to maximize collaboration with
interagency, academic, and non-governmental experts, the study process remained unclassified
for as long as possible. The study began with the existing conflict drivers and mitigators and
projected out only a few years, rather than beginning with a desired future conflict and
progressing towards that point.7 Longer time horizons multiply errors when projecting events
into the future. They are necessary in conventional war scenarios where the purpose is to
support the analysis of long-term weapons systems. Because IW solutions are not geared
towards weapons systems development, JIWAB had the freedom to stay in the near-term
future.
JIWAB also had an orientation towards drawing out the multiple actors in a space rather
than being adversary oriented. That is, rather than being focused on the U.S. versus an
adversary, the study was oriented toward understanding the actors in the region, whose interactions
would be the primary conflict drivers. Primary JIWAB methods were structured but qualitative
methods, and wargames were designed to explore concepts and feed scenario development.
Data was primarily qualitative, and the study had a significant qualitative data collection plan.
Because of its focus on expert-driven methods, the central epistemological challenge for the JIWAB study was its dependence on the judgment of outside experts.
Methods in JIWAB
JIWAB used a number of methods drawn from a wide variety of disciplines. Although
the following methods were chosen for the study, it would be possible to repeat the study
using an entirely different set of methods. Figure 2 below shows the various steps for the study.
7Harvey DeWeerd, A Contextual Approach to Scenario Construction (Santa Monica, CA: RAND
Corporation, 1973), pp. 4-5.
Figure 2. JIWAB Study Map
The starting point, the identification of the geography and timeframe, came from OSD
Policy. JIWAB was to examine Sudan approximately two years in the future from 2010, which
put the scenario at 2012. The rationale for a short projection into the future was better fidelity on the problem. However, this posed some interesting challenges for the study team. One was that the study timeline eventually outran the scenario timeline; the other was that Sudan split into two countries, Sudan and South Sudan, during the course of the study. The next two stages, scenario development and contextualized understanding, ran as independent efforts. The reason for spending so much time on scenario and contextualized understanding was to gain a detailed appreciation for the problem.
Scenario
Scenario development, typically a brief stage in what is now called Support to Strategic
Analysis (SSA) in DoD, was expanded considerably within the JIWAB process. The first step in scenario development was general morphological analysis (GMA). GMA, originally developed by Fritz Zwicky, can be executed as a group method that asks participants to identify the most important parameters of a problem and their values.8 For the JIWAB GMA, experts on Sudan were asked what they believed would be the key factors affecting the security environment in the area two years in the future.
Figure 3 below identifies the “morphological field” produced by participants. Across the
top are the parameters that are expected to factor into the security environment, such as
economic conditions, internal migration, and negotiations in Darfur and other areas. Below
each parameter are the values that the parameter can take on. For example, resource
distribution can become more unbalanced than at present, or more balanced in the future. The
green boxes represent one particular configuration of the morphological box, or one possible
combination of factors in the future. Cross-consistency analysis is also done with participants in
order to rule out inconsistent combinations of factors. Key to the GMA workshop was a facilitator experienced in the method.
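The mechanics of a morphological field and cross-consistency analysis can be sketched in a few lines of code. The following is a minimal illustration only; the parameters, values, and exclusion pairs are hypothetical, not drawn from the actual JIWAB field.

```python
# Sketch of general morphological analysis (GMA): enumerate all
# configurations of a morphological field, then prune those that
# fail cross-consistency. All parameter names and values below
# are illustrative placeholders.
from itertools import product

field = {
    "economic conditions": ["worsening", "stable", "improving"],
    "resource distribution": ["more unbalanced", "more balanced"],
    "Darfur negotiations": ["stalled", "progressing"],
}

# Pairwise exclusions identified during cross-consistency analysis
# (a hypothetical example of an inconsistent pairing).
inconsistent = {
    (("economic conditions", "improving"),
     ("resource distribution", "more unbalanced")),
}

def consistent(config):
    """True if no pair of parameter-value choices is excluded."""
    pairs = list(config.items())
    for i in range(len(pairs)):
        for j in range(i + 1, len(pairs)):
            if (pairs[i], pairs[j]) in inconsistent or \
               (pairs[j], pairs[i]) in inconsistent:
                return False
    return True

names = list(field)
configs = [dict(zip(names, values)) for values in product(*field.values())]
feasible = [c for c in configs if consistent(c)]
print(len(configs), len(feasible))  # 12 total configurations, 10 feasible
```

Even in this toy field the pruning matters: the full space grows multiplicatively with each parameter, while cross-consistency analysis cuts it down to the configurations worth discussing with experts.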
From the huge number of potential configurations, participants selected several
“interesting corners,” where they felt several potential future values intersected in insightful
ways. This created numerous vignettes that various DoD study stakeholders assessed using
criteria such as interest to their organization and general interest for IW. The next step within
the process was to use structured scenario fusion from a counterfactual reasoning (CFR)
approach developed for intelligence analysts projecting future trends.9 CFR structures the consideration of antecedent conditions, intermediate states, and consequent scenarios when making projections. Part of the method
involves fusing together events and trends in a structured way, and informed the creation of
five “proto-scenarios” using the GMA vignettes. The Pentagon-level study steering committee reviewed and approved these proto-scenarios.
In the next stage of scenario development, each of the five proto-scenarios was fleshed
out through a seminar wargame, Separating Sudan, designed and executed by the Center for
Naval Analyses (CNA).10 This involved another round of expert recruitment, and experts role-
played various actors from the starting point specified in each proto-scenario. The result was
five fleshed-out scenarios that were developed to the point of U.S. government intervention.
Participant roles included the governments of Sudan and South Sudan, the African Union,
9 Noel Hendrickson, Counterfactual Reasoning: a Basic Guide for Analysts, Strategists, and
Decision Makers, The Proteus Monograph Series, Vol. 2, Issue 5, October 2008 (Washington,
D.C.: National Intelligence University); and Alec Barker, “Applying Structured Scenario Fusion to
Address Multifaceted Strategic Challenges,” The Journal of the Operational Research Society,
65(11), November 2014.
10 Peter Perla, Michael Markowitz, and Lesley Warner. Separating Sudan: Final CNA Report.
nongovernmental organizations (NGOs), the United Nations – and the Lord’s Resistance Army (LRA) for a scenario involving the
LRA.
The JIWAB team used ethnographic notetaking to observe wargame participants.11 This enabled the team to observe the shift from
third person to first person as players settled into their roles, the extent to which players were
immersed in the scenario, and the degree to which they found the scenarios realistic.12 The
addition of formal, qualitative social science theory and methods to wargaming is another area
that is wide open for further development, and one with which the JIWAB team began experimenting.
Contextualized Understanding
As the scenarios developed, the JIWAB study team also began a parallel effort to
develop what we termed contextualized understanding of the problem. While the scenario
development process began with GMA, an atheoretical framework for structuring a variety of
problems, work on contextualized understanding started very much from a theory-based lens
of conflict.
This lens was the Interagency Conflict Assessment Framework (ICAF), a process normally run by the State Department’s then Office of the Coordinator for Reconstruction and Stabilization
11 Robert M. Emerson, Rachel I. Fretz, and Linda L. Shaw, Writing Ethnographic Fieldnotes, 1st
Edition, Chicago Guides to Writing, Editing, and Publishing, (Chicago: University of Chicago
Press), 1995.
12 The observation about shifting to first person was made by team member Deborah Frost;
such observations point to the possibility of being able to document wargame dynamics
through empirical observations.
(S/CRS). The ICAF process consists of four steps: 1) evaluating the context for the conflict, 2)
understanding core grievances and social and institutional resiliencies, 3) identifying key drivers
and mitigators of conflict, and 4) identifying windows of vulnerability and opportunity.13 The key “formula” for a conflict was a key actor mobilizing an identity group around a grievance, while the formula for reducing conflict was a key actor mobilizing a group around a resilience.
The JIWAB study team adapted ICAF for its study by taking a more analytically oriented approach to assessing the situation. Thus the study team developed the Interagency Conflict Assessment Framework – Analytic (ICAF-A).

Table 1. Differences Between ICAF and ICAF-A
The focus of ICAF-A was more research-oriented, and attempted to use sourced information on the framework elements. This information was captured in a wiki, also one of the tools introduced by the study.
Another addition in ICAF-A was a set of “systems maps” of conflict dynamics for different areas in the region, developed in workshops facilitated by experts in conflict and conflict resolution. This was done during three all-day workshops, one each on Darfur, South Sudan, and North Sudan. Participants were asked to identify the causal loops created by conflict dynamics, and to pull them together into an overall “map” of the dynamics in each area. Figure 6 below is a photo of one of these workshops.
Figure 7 below shows an example of one of the systems maps. As part of the effort, the team of conflict experts facilitating the event developed a “grammar” pictured in the systems maps. Within this grammar, R denoted “reinforcing” loops, where once around the loop reinforces the dynamic; and B denoted “balancing” loops, where once around the loop balances out the dynamic. The +/- signs show the direction of influence between the variables in the loop.
In this figure, red R1 is a reinforcing loop. Going once around the loop, a decrease in the capacity and legitimacy of the Government of South Sudan (GoSS) leads to an increase in unbalanced resource distribution. This in turn leads to an increase in political exclusion, which leads to a further decrease in GoSS legitimacy and capacity. The addition of systems mapping also helped visually link the conflict elements in a dynamic way, and was useful in conveying this understanding to stakeholders.14
14 One concern with the original ICAF was that it produced “lists of lists,” without being integrative. Feedback on the JIWAB systems maps, however, was that they were still limited because they did not integrate across the different regions of Sudan and South Sudan.
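The R/B grammar follows a standard systems-dynamics rule: a causal loop is reinforcing when it contains an even number of negative links, and balancing when that number is odd. A small sketch of that rule, using illustrative variable names rather than the actual contents of the JIWAB maps:

```python
# Sketch: classify causal loops as reinforcing (R) or balancing (B)
# by the product of link signs. Variable names are illustrative.

def classify_loop(links):
    """links: list of (source, target, sign) tuples, sign is +1 or -1.

    The loop is reinforcing if the product of signs is positive
    (an even number of negative links), balancing otherwise.
    """
    product = 1
    for _, _, sign in links:
        product *= sign
    return "R" if product > 0 else "B"

# A loop like R1 in the text: lower GoSS capacity/legitimacy -> more
# unbalanced resource distribution -> more political exclusion ->
# lower GoSS capacity/legitimacy. Two negative links, so reinforcing.
r1 = [
    ("GoSS capacity & legitimacy", "unbalanced resource distribution", -1),
    ("unbalanced resource distribution", "political exclusion", +1),
    ("political exclusion", "GoSS capacity & legitimacy", -1),
]

# A hypothetical balancing loop: one negative link.
b1 = [
    ("conflict intensity", "international pressure", +1),
    ("international pressure", "conflict intensity", -1),
]

print(classify_loop(r1))  # R
print(classify_loop(b1))  # B
```

The sign-product rule is why R1 in the figure reinforces: each trip around the loop pushes GoSS legitimacy further in the same direction it was already moving.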
Taking two fundamentally different approaches to understanding the nature of the
problems in Sudan and South Sudan, the scenario process and the contextualized
understanding process arrived at essentially the same information. The experts used in the two
separate processes were also different, with only a small amount of overlap – no small feat given the limited number of relevant experts located in the United States. This let the JIWAB
team have greater confidence in the results. One thing learned from taking both approaches
was that either approach appears sufficient for developing a deep level of context from which
to develop scenarios. In other words, with the same caliber of experts, analysis could probably proceed with either approach alone.

Policy Option
The next stage of the study was the development of the interagency policy option. DoD, U.S. Department of State, and U.S. Agency for International Development (USAID) participants jointly developed a whole-of-government response to the scenarios. Figure 8 shows the five lettered lines of effort they developed. Blue action cards represented DoD activities; pink represented State; light green represented USAID; and other colors represented actors such as the host nation government. Dependence and time synchronization were captured on the vertical axis, where tasks at the top of the board depend on, or are planned to succeed, tasks at the bottom. Smaller cards were used by participants to record supporting details.
Figure 8. Whole-of-Government Policy Option
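The vertical dependence structure on the board amounts to a directed acyclic graph of tasks, which can be ordered so that every task appears after the tasks it depends on. A sketch with hypothetical task names (the actual lines of effort and tasks are in Figure 8; `graphlib` requires Python 3.9+):

```python
# Sketch: order policy-option tasks by the dependence captured on
# the board's vertical axis, i.e. a topological sort of a task
# graph. Task names and dependencies below are hypothetical.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on (those placed
# below it on the board).
deps = {
    "deploy advisors": {"secure basing"},
    "deliver aid": {"secure basing"},
    "secure basing": {"coalition agreement"},
    "coalition agreement": set(),
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # dependencies always precede the tasks that need them
```

Reading the board this way also makes cycles detectable: `TopologicalSorter` raises an error if two cards each claim to depend on the other, which is exactly the kind of planning inconsistency a synchronization board is meant to surface.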
This interagency planning exercise within JIWAB raised some interesting intellectual
questions, since no formal interagency planning process actually exists at this level of the
government. The team debated whether the study should try to reflect governmental
processes as they were, or as they should be. The JIWAB exercise was more a reflection of the
latter, since it was difficult to say what happens in the absence of formal processes.
The interagency option then fed the development of military concept of operations
(CONOPS). At this point, Joint Staff led a planning exercise similar to its Multi-Service Force
Deployment (MSFD) planning exercises for SSA scenarios. The mention of specific military
forces and other details made the product classified, the first point in the JIWAB study where classification became necessary. Working from the CONOPS requirements, participants redid part of the strategic-level U.S. government response and created a response with a lighter DoD footprint. Although it was not necessarily the intent of the wargame, it moved the study down a path of offering more than one DoD approach to a particular scenario. The idea of preparing for different levels of military response to a challenge is not new, particularly for IW and other challenges where DoD is not the lead organization.
The second JIWAB wargame, Tumult and Transition, also presented an opportunity for
the team to bring structured narrative analysis into the analysis of seminar game dynamics, a
methodological coupling that had not been done before.15 In an article that must already move rapidly over the many methods used, without doing sufficient justice to any, we can only note that the use of actant analysis allowed the comparison of wargame narratives between actors over time. This allowed a systematic analysis of the narrative at crucial turning points in the game.16 The details of that analytic approach and its application to wargaming are dealt with elsewhere.
15 Peter Perla, Daniella Mak, Mike Markowitz, Catherine Norman, and Karen Grattan. Tumult and Transition: Gamebook. Alexandria, VA: Center for Naval Analyses, August 2013.
16 Yuna Huh Wong and Sara Cobb. “Narrative Analysis in Seminar Gaming.” Presentation to the Military Operations Research Wargaming Community of Practice; working paper submitted for the 2015 Military Operations Research Society Barchi Prize.
Table 2. JIWAB Steps, Methods, and Outputs
Success?
Study participants and the relevant senior Marine Corps official regarded the study as
very successful, even “mind-bending” in reframing the way in which campaign analyses should be conducted for irregular warfare. While one of the criticisms of previous DoD analytic community work on IW was that it insufficiently considered political, social, economic, and other factors in a credible way, JIWAB was considered very credible on this front. The office of the U.S. Special
Envoy to Sudan and South Sudan (USSESS) at the U.S. Department of State asked the JIWAB
team for “everything you have” prior to their development of strategy in Sudan and South
Sudan. USAID also used the JIWAB systems maps before making decisions on the ground for
development programs in South Sudan. This was unprecedented for DoD analysis community
products. JIWAB products were also used to conduct IW capability analyses and to identify
capability gaps for Joint Staff J7 and OSD Special Operations/Low Intensity Conflict (SO/LIC).
The methods introduced to the community through JIWAB also helped bring the community into contact with fields such as futures research, conflict analysis and resolution, intelligence analysis, and other areas. It also
pioneered new applications, such as the combination of GMA with wargaming; the use of
structured narrative analysis in wargaming; and an analytically more rigorous version of ICAF.
Many of the methodological steps that JIWAB employed are applicable to other problem sets. One follow-on effort, Project Cassandra, compressed the JIWAB sequence into one that takes hours rather than months to run. Project Cassandra was run at the 2016 Connections Wargaming Conference at Maxwell Air Force Base and at a 2016 MORS wargaming meeting.

There are several implications for the DoD analytic community from this study. One of
the most important is that the community has a significant need to draw from a wider pool of
analytic methods than what is currently institutionalized within DoD.18 The DoD analysis
community has found the limitations of its traditional quantitative, computer simulation-based
18 Yuna Huh Wong, “Preparing for Contemporary Analytic Challenges,” PHALANX: the Bulletin of Military Operations Research, Vol. 47, No. 4 (December 2014), pp. 35-39.
approaches for IW. The application of related methods such as systems dynamics and agent-based modeling to IW has run into similar limitations.

The solution to DoD’s conundrum is greater use of methods that have been and are being developed for these types of highly unstructured and challenging problems. One example is general morphological analysis, which has been around since the 1940s. Similar types of “judgment-based methods” have been
receiving increased attention in NATO operational analysis circles for exactly these types of
problems.19 JIWAB also drew heavily from theories in the conflict resolution field influential in
the development of ICAF, a field highly relevant to future analytic efforts in IW. Wargaming is another area where there is the possibility of future innovation, given the high interest that senior DoD leaders have recently shown in it.20 Expert-driven analysis follows a model that is counter to how many of us are taught to approach analysis, where the analysts
become the experts on the topic. However, this is not a model that scales in the complex world
we now inhabit, where there is significant need to incorporate interdisciplinary and interagency
expertise. Epistemic dependence is the idea that we all rely on the intellectual authority of
19 NATO, Code of Best Practice for Judgement-Based Operational Analysis, 2009; and Diederik J.
D. Wijnmalen and Neville J. Curtis, “A code of best practice for judgement-based operational
research,” OR Insight, Vol. 26, Issue 4 (December 2013), pp. 291-308.
20 Deputy Secretary of Defense, Memorandum for Secretaries of the Military Departments, Wargaming and Innovation, February 2015.
others, particularly experts, to form the basis of much of what we know.21 This was a
central feature of JIWAB: that at each stage, the product would be based on the expertise of
those with more substantial knowledge of conflict dynamics in Sudan and South Sudan,
interagency planning and activities, and joint military planning and activities than the core
JIWAB study team. This extended even to the very methods involved in the study, to the point
that the developers of the methods themselves were active participants in the study whenever
possible.22
At the same time, there are a number of difficulties to overcome to effectively use
experts. One practical difficulty is in identifying and recruiting the appropriate experts. The
literature on expertise discusses the problem of upward discrimination. We can easily discriminate downward, or gauge just how much less expert a person is than we are on a topic. However, the argument is that we cannot reliably discriminate upward, or meaningfully judge how much more expert than us someone is on a topic.23 Another practical consideration was the time and effort
required to identify and recruit experts, through both formal and informal channels. Many of
the methods selected for JIWAB were geared towards using expertise, such as GMA and
wargaming. These come from classes of methods that are inherently geared towards
facilitating and explicitly eliciting expert judgment. These are methods that are worth learning.
A variety of U.S. government offices were heavily involved with JIWAB in its different
phases. Participants from the State Department included the Special Envoy’s office (USSESS),
21 John Hardwig, “Epistemic Dependence,” Journal of Philosophy, Vol. 82, 1985, pp. 335-336.
22 This included Noel Hendrickson for counterfactual reasoning, Tom Ritchey for general morphological analysis, Peter Perla for wargaming, and Sara Cobb for narrative analysis, among others.
23 Harry Collins and Robert Evans, Rethinking Expertise (Chicago: University of Chicago Press, 2007), p. 15.
African Affairs (AF), Conflict and Stabilization Operations (CSO), International Organizations (IO),
and Political-Military Affairs (PM). USAID offices involved in the effort included the Bureau for
Africa (AFR) – especially the Office for Sudan and South Sudan Programs (AFR/SSSP) – Office of
Civil-Military Cooperation (CMC), Office of Conflict Management and Mitigation (CMM), Office
of U.S. Foreign Disaster Assistance (OFDA), and the Office of Transition Initiatives (OTI).
JIWAB also drew heavily from numerous DoD organizations, with active participation
from different offices within the Office of the Secretary of Defense for Policy (Strategy, African
Affairs, and Special Operations and Low Intensity Conflict), Joint Staff (JS) J7, JS J8, the Army,
the Air Force, the Navy, the Marine Corps, U.S. Africa Command (AFRICOM), U.S. Special
Operations Command (SOCOM), Combined Joint Task Force Horn of Africa (CJTF-HOA), Center
for Advanced Operational Culture Learning (CAOCL), Marine Corps Intelligence Activity (MCIA),
National Ground Intelligence Center (NGIC), Defense Intelligence Agency (DIA), National
Defense University (NDU), and U.S. Army Peacekeeping and Stability Operations Institute
(PKSOI).
Experts were also drawn from academia: Columbia University, George Mason University, James Madison University, Penn State University, the University of Florida, and other universities. Federally Funded Research and Development Centers (FFRDCs) RAND and CNA
were also part of the study at various points, as were other research organizations such as the
Center for Strategic and International Studies (CSIS), U.S. Institute of Peace (USIP), and Wilson
Center. Participants also came from the NGOs Resolve, Enough, and Refugees International, among others.
Study Governance
The JIWAB team also had a number of government oversight bodies reviewing its
activities. Direct oversight bodies in the Pentagon were the JIWAB General Officer Steering
Committee (GOSC), JIWAB O-6 level working group, Joint Analytic Agenda Steering Committee
working group, and the Irregular Warfare Executive Steering Committee (IW ESC). The Irregular
Warfare Modeling & Simulation Senior Coordinating Group (IW M&S SCG) provided the funding
for some tools development and also provided oversight. Another stakeholder body that the
study team coordinated with was the Irregular Warfare Working Group (IWWG), a group co-
chaired by Army Training and Doctrine Command (TRADOC) and the Marine Corps.
Contributors to Success
There are several reasons why JIWAB was able to accomplish as much as it did. The first
and perhaps most important factor in its success was the very long lead time that the study
team had to explore the problem and try ideas that could fail. Although organizations often
want innovation and new ways of thinking about complex problems, it is often hard for them to
provide their personnel with the luxury of time and to shelter them from the pressures of
showing immediate results. The strong belief in the Marine Corps that an alternative approach
to traditional modeling & simulation was needed to understand IW meant they were willing to
invest a significant amount of time and resources into the JIWAB team. This allowed the JIWAB
team to test approaches and discard those that showed little promise. The team began by concluding that existing DoD analytic efforts did not attempt to address the most important behavioral, social, and cultural issues that real-world operations would need to consider.24 We also spent a
considerable amount of time studying the other analytic efforts within DoD attempting to
address IW. Thus there was a few years’ worth of research and development leading up to the
effort itself.
A second factor for the study’s success was the organization’s willingness to hire social
scientists and allow them freedom in study design and methodology. While this is something
that may seem obvious in retrospect, given the applicability of social science to IW, this was not
the practice of the existing analytic organizations. Years of emphasis on physics-based models
had resulted in personnel whose skill sets were heavy in operations research and similar
disciplines. Social scientists, where retained, were usually used as subject matter experts
whom the operations research analysts would consult as they built their models. Or, as one
JIWAB operations research analyst put it, “you get a SME, and then you beat the data out of
them.”25 However, starting without the constraint that everything had to fit a computer model freed the study team to draw on a much wider range of methods.
Conclusions
The JIWAB study is a story of analysts willing to use and adapt analytic methods that were outside the norm for their community. It is difficult
to convey, within such a short paper, just how unusual and barrier-breaking the JIWAB study
was for the DoD analysis community. Yet it showed that the community was able to adapt and
come up with a solution completely different from its established methods after enough time.
At the same time, it is also a story of an adaptation that happened too late – analytic interest in
IW was immense in 2006, but was already beginning to wane during the course of the JIWAB
study. It is also a story of adaptations and changes not incorporated into the bureaucracy: both the reason for its success and the reason its developments were not adopted was that
it was developed outside normal DoD channels and processes. Hopefully, however, the next
time IW is of interest to DoD, we will have a better record of the analytic methods that are available.

References
Bumiller, Elisabeth. “We have met the enemy and he is Power Point,” New York Times, April 26,
2010.
Collins, Harry and Robert Evans. Rethinking Expertise. Chicago: University of Chicago Press,
2007.
Davis, Paul K. and Kim Cragin, editors. Social Science for Counterterrorism: Putting the Pieces
Together. Santa Monica, CA: RAND Corporation, 2009.
DeWeerd, Harvey. A Contextual Approach to Scenario Construction. Santa Monica, CA: RAND
Corporation, 1973.
Emerson, Robert M., Rachel I. Fretz, and Linda L. Shaw. Writing Ethnographic Fieldnotes, 1st
Edition. Chicago Guides to Writing, Editing, and Publishing. Chicago: University of Chicago
Press, 1995.
Hardwig, John. “Epistemic Dependence,” Journal of Philosophy, Vol. 82, 1985, pp. 335-349.
Hendrickson, Noel. Counterfactual Reasoning: a Basic Guide for Analysts, Strategists, and
Decision Makers. Proteus Monograph Series, Vol. 2, Issue 5, October 2008. Washington,
D.C.: National Intelligence University, 2008.
Kott, Alexander and Peter Corpac, Defense Advanced Research Projects Agency. “COMPOEX
Technology to Assist Leaders in Planning and Executing Campaigns in Complex Operational
Environments.” 12th International Command and Control Research and Technology
Symposium, June 2007, Newport, RI.
North Atlantic Treaty Organization. Code of Best Practice for Judgement-Based Operational
Analysis. 2009.
Perla, Peter. The Art of Wargaming: a Guide for Professionals and Hobbyists. Annapolis, MD:
Naval Institute Press, 1990.
Perla, Peter, Daniella Mak, Mike Markowitz, Catherine Norman, and Karen Grattan. Tumult and
Transition: Gamebook. Alexandria, VA: Center for Naval Analyses, August 2013.
Perla, Peter, Michael Markowitz, and Lesley Warner. Separating Sudan: Final CNA Report.
Alexandria, VA: Center for Naval Analyses, 2011.
Purdue Research Foundation. “Simulex celebrates open house with expansion, announcement
of two separate divisions,” April 16, 2008. Accessed at
http://www.purdue.edu/uns/x/2008a/080416ChaturvediOpen.html on September 7, 2016.
Ricigliano, Robert and Karen Grattan. "Advice to Policy Makers Who Would Tackle Syria: The
Problem with Problem Solving," PRISM, April 1, 2014.
UK Ministry of Defence. “MOD scientists help shape Afghanistan operations,” April 28, 2011.
Accessed at https://www.gov.uk/government/news/mod-scientists-help-shape-
afghanistan-operations on September 7, 2016.
U.S. Department of Defense. Quadrennial Defense Review Report. Washington, D.C., February
2006.
U.S. Department of Defense. Deputy Secretary of Defense, Bob Work. Memorandum for
Secretaries of the Military Departments, Wargaming and Innovation. Washington, D.C.:
February 2015.
Wijnmalen, Diederik J. D. and Neville J. Curtis. “A code of best practice for judgement-based
operational research,” OR Insight, Vol. 26, Issue 4 (December 2013), pp. 291-308.
Wong, Yuna Huh. “Preparing for Contemporary Analytic Challenges,” PHALANX: the Bulletin of
Military Operations Research, Vol. 47, No. 4 (December 2014), pp. 35-39.
Wong, Yuna Huh and Karen Grattan. "Joint Irregular Warfare Analytic Baseline: Ethnographic
Observations From a Seminar Wargame." Presentation at the 80th Military Operations
Research Society Symposium in Alexandria, VA on June 12, 2012.
Wong, Yuna Huh and Sara Cobb. “Narrative Analysis in Seminar Gaming.” Presentation to the
Military Operations Research Wargaming Community of Practice in Alexandria, VA on
September 10, 2014.
Wong, Yuna Huh and Sara Cobb. “Narrative Analysis in Seminar Gaming,” 2014. Unpublished
working paper submitted for the 2015 Military Operations Research Society Barchi Prize.
Author Biographies
Yuna Huh Wong, PhD, is a RAND Corporation researcher. She was previously an operations
research analyst for the Marine Corps Operations Analysis Division (OAD), where she was the
methodology lead for the JIWAB study. She is a member of the Military Operations Research
Society (MORS) Board of Directors.
Michael Bailey, PhD, was the Technical Director of the Marine Corps Operations Analysis
Division (OAD) and study lead for the JIWAB. He is a career civil servant, serving first as a
professor of Operations Research at the Naval Postgraduate School, then with OPNAV and
Marine Corps Combat Development Command.
Karen Grattan is the CEO of Engaging Inquiry, LLC, a firm specializing in designing collaborative
analysis and strategy workshops for federal and non-profit organizations. She was previously a
senior social scientist at Group W, where she worked on IW studies for Marine Corps
Operations Analysis Division (OAD). She is a graduate of the George Mason University (GMU)
School for Conflict Analysis and Resolution, as well as the GMU School of Public Policy, where
she focused on organizational learning and public policy analysis.
C. Steve Stephens was an operations research analyst at the Marine Corps Operations Analysis
Division (OAD) and is now retired. Steve focused on stochastic simulations of ground combat at
OAD and had previously served as an infantry officer in the Marine Corps.
Robert Sheldon, PhD, is a retired Air Force officer, having taught calculus and Operations
Research at the Air Force Academy and served as Chief Military Analyst for the Air Force Studies
and Analyses Agency. He is a Past President of the Military Operations Research Society
(MORS) and a MORS Fellow of the Society (FS). He is currently employed by Group W, Inc.
doing analysis and modeling in support of the Marine Corps Operations Analysis Division (OAD).
Bill Inserra is an operations research analyst at Operations Analysis Division (OAD), where he
supports the Marine Corps in IW and M&S, and is the lead analyst in support of Combat
Development Wargames. He has almost twenty years of operations research (OR) experience.
He served as a Marine infantry officer, a career complemented by two OR payback tours at the
Marine Corps Studies & Analysis Division and U.S. Special Operations Command (USSOCOM).