Article in The Journal of Defense Modeling & Simulation, January 2017. DOI: 10.1177/1548512916680917

The Use of Multiple Methods in the Joint
Irregular Warfare Analytic Baseline (JIWAB)
Study
Yuna Huh Wong, Michael Bailey, Karen Grattan, C. Steve Stephens, Robert Sheldon, and William Inserra

Background

The U.S. wars in Iraq and Afghanistan introduced considerable urgency within the Department of Defense (DoD) to find ways of analytically understanding the dynamics of insurgency and what the U.S. government could do about them. Having initially prepared for a repeat of the quick, conventional 1991 Gulf War, the U.S. military found itself dealing with an entirely different set of challenges. The attacks on the World Trade Center and Pentagon on September 11, 2001; the 2001 U.S. invasion of Afghanistan; and the 2003 U.S. invasion of Iraq challenged many long-held U.S. defense-related practices and paradigms. The shift towards "irregular warfare" (IW) accelerated as the 2006 Quadrennial Defense Review (QDR) identified irregular, catastrophic, and disruptive challenges as areas that DoD had to address.1

Figure 1. Four Priority Areas for the 2006 Quadrennial Defense Review
Source: Department of Defense, 2006 Quadrennial Defense Review

1 U.S. Department of Defense, Quadrennial Defense Review Report, February 6, 2006, p. 19.

The defense analysis community had, over several decades, developed tools focused on the conventional military challenges of the Cold War and the Gulf War. The most common approaches at the time were physics-based, force-on-force computer models and simulations. The analytic community's initial response to the IW challenge was to try to create or sponsor computer models and tools that took a similar approach towards the complex dynamics of counter-terrorism and insurgency. These new models and modeling environments included the Synthetic Environment for Analysis and Simulation (SEAS), an agent-based model developed by Simulex;2 Pythagoras, another agent-based model developed by Northrop Grumman for the U.S. Marine Corps; the Peace Support Operations Model (PSOM), developed by the UK Defence Science and Technology Laboratory (DSTL) and used by the Joint Staff;3 and the Counterinsurgency (COIN) Systems Dynamics Model by PA Consulting, developed for the Joint Staff to illustrate the new counterinsurgency manual, but whose complicated visualization was derided in the New York Times.4 Perhaps the one computational backplane to rule them all was the Conflict Modeling, Planning and Outcomes Experimentation (COMPOEX) Program funded by the Defense Advanced Research Projects Agency (DARPA).5

2 Purdue Research Foundation, "Simulex celebrates open house with expansion, announcement of two separate divisions," April 16, 2008, accessed at http://www.purdue.edu/uns/x/2008a/080416ChaturvediOpen.html on September 7, 2016.

Adaptation proved to be difficult. Despite sponsoring work such as RAND's interdisciplinary review of the scholarly work on the causes of terrorism,6 the DoD analysis community found it difficult to approach problems in non-quantitative ways. This may have been due to the heavy representation within the community of analysts with quantitative backgrounds, an overall lack of understanding of social science, and highly routinized organizational norms within the community as to what constituted legitimate analysis. Thus, years into the wars in Iraq and Afghanistan, the analysis community continued to lean on large computer models to try to analyze IW dynamics. This was highlighted by the 2009 Africa Study by the then Program Analysis and Evaluation (PA&E) Simulation and Analysis Center (SAC) within the Office of the Secretary of Defense (OSD), which used COMPOEX and other models. The Africa Study was faulted for using quantitative data and relationships that were not validated, as well as for being a black box whose quantitative outputs did not readily translate into intuitive explanations. Dissatisfaction with the Africa Study ultimately prompted the Marine Corps to volunteer to lead an analytic effort focused on IW that would use a non-computational modeling approach. The resulting three-year effort was known as the Joint Irregular Warfare Analytic Baseline (JIWAB).

3 Ministry of Defence, "MOD scientists help shape Afghanistan operations," April 28, 2011, accessed at https://www.gov.uk/government/news/mod-scientists-help-shape-afghanistan-operations on September 7, 2016.
4 Elisabeth Bumiller, "We have met the enemy and he is PowerPoint," New York Times, April 26, 2010, accessed at http://www.nytimes.com/2010/04/27/world/27powerpoint.html?_r=1 on September 6, 2016; and Headquarters, Department of the Army, Counterinsurgency, Field Manual 3-24, Washington, D.C.: December 2006.
5 Alexander Kott and Peter Corpac, Defense Advanced Research Projects Agency, "COMPOEX Technology to Assist Leaders in Planning and Executing Campaigns in Complex Operational Environments," 12th International Command and Control Research and Technology Symposium, Newport, RI, June 2007.
6 Paul K. Davis and Kim Cragin, editors, Social Science for Counterterrorism: Putting the Pieces Together (Santa Monica, CA: RAND Corporation, 2009).

The JIWAB Approach

JIWAB was a significant departure from the practices and methods of the DoD analysis community at the time. In addition to using different tools and disciplines, it was purposely designed to incorporate a very high level of participatory input from organizations outside DoD. To facilitate this collaborative approach, the study was kept unclassified for as many steps as possible.

Overall Approach

The overall approach to JIWAB was distinctly not focused on computer-based modeling and simulation (M&S), or anything quantitative. The reason for this was simple: very little in the realm of counterinsurgency, stabilization, or local conflict could be measured in quantitative terms, or understood primarily through quantifiable means. The traditional defense analytic approach can be characterized as quantifying relationships that are well understood and well defined, and is often marked by large, integrative computer models and simulations that attempt to roll up physics and conventional-warfare-related phenomena to higher operational levels. JIWAB represented a departure from this in several distinct ways.

The traditional DoD analytic approach focuses on a single classified scenario that is developed by the DoD over a compressed time period – a matter of weeks. The scenario's goal is to test specific capabilities rather than to achieve scenario realism; it is adversary-oriented, and typically looks 20-25 years into the future. The analytic approach is typically modeling and simulation heavy, with a computational perspective. Wargames used in this traditional approach are often computer-supported, focus on quantitative outputs, and often exist to inform further, downstream computer modeling efforts. Data in the traditional approach is predominantly quantitative and simulation driven. However, one important epistemological challenge in applying this approach to IW is that much of IW is qualitative. Another issue is that computational social science, perhaps the area closest to what DoD might hope to achieve in applying modeling and simulation to IW, is in many ways still a nascent field unrecognized by many.

In contrast, the JIWAB approach differed in many respects. The scenario development process developed multiple vignettes and potential paths, had heavy interagency participation, and involved an extended process over many months. In order to maximize collaboration with interagency, academic, and non-governmental experts, the study process remained unclassified for as long as possible. The study began with the existing conflict drivers and mitigators and projected out only a few years, rather than beginning with a desired future conflict and progressing towards that point.7 Longer time horizons multiply errors when projecting events into the future; they are necessary in conventional war scenarios, where the purpose is to support the analysis of long-term weapons systems. Because IW solutions are not geared towards weapons systems development, JIWAB had the freedom to stay in the near-term future.

JIWAB also had an orientation towards drawing out the multiple actors in a space rather than being adversary oriented. That is, rather than focusing on the U.S. versus an adversary, the study was oriented towards understanding the actors in the region, whose interactions would be the primary conflict drivers. The primary JIWAB methods were structured but qualitative, and wargames were designed to explore concepts and feed scenario development. Data was primarily qualitative, and the study had a significant qualitative data collection plan. Because of its focus on expert-driven methods, the epistemological challenge for the JIWAB approach was in finding good methods to structure and manage expertise.

Methods in JIWAB

JIWAB used a number of methods drawn from a wide variety of disciplines. Although the following methods were chosen for the study, it would be possible to repeat the study using an entirely different set of methods. Figure 2 below shows the various steps for the study, some of which ran in parallel.

7 Harvey DeWeerd, A Contextual Approach to Scenario Construction (Santa Monica, CA: RAND Corporation, 1973), pp. 4-5.

Figure 2. JIWAB Study Map

The starting point, the identification of the geography and timeframe, came from OSD Policy. JIWAB was to examine Sudan approximately two years into the future from 2010, which put the scenario at 2012. The rationale for a short projection into the future was better fidelity on the problem. However, this posed some interesting challenges for the study team. One was that the study timeline eventually outran the scenario timeline; the other was that Sudan split into Sudan and South Sudan in 2011.

Study Steps and Methods

Two steps in the study process, scenario development and contextualized understanding, ran as independent efforts. The reason for spending so much time on scenario development and contextualized understanding was to gain a detailed appreciation for the problem.
Scenario

Scenario development, typically a brief stage in what is now called Support to Strategic Analysis (SSA) in DoD, was expanded considerably within the JIWAB process. The first step in scenario development was the use of a problem-structuring method known as general morphological analysis (GMA). GMA, a non-quantitative modeling approach initially developed by Fritz Zwicky, can be executed as a group method that asks participants to identify the most important parameters of a problem and their values.8 For the JIWAB GMA, experts on Sudan were asked what the key factors affecting the security environment in the area would be two years into the future.

Figure 3 below shows the "morphological field" produced by participants. Across the top are the parameters expected to factor into the security environment, such as economic conditions, internal migration, and negotiations in Darfur and other areas. Below each parameter are the values that the parameter can take on. For example, resource distribution can become more unbalanced than at present, or more balanced, in the future. The green boxes represent one particular configuration of the morphological box, or one possible combination of factors in the future. Cross-consistency analysis is also done with participants in order to rule out inconsistent combinations of factors. Key to the GMA workshop was a facilitator experienced in GMA and in training others to conduct GMA workshops.
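The parameter-value structure and cross-consistency step described above can be sketched in a few lines of code: configurations are combinations of one value per parameter, filtered against pairs that participants judge inconsistent. This is an illustrative sketch only; the parameter names, values, and the inconsistent pair below are invented stand-ins, not the actual JIWAB morphological field.

```python
# Sketch of a morphological field with cross-consistency analysis.
# Parameters and values are hypothetical examples, not the JIWAB field.
from itertools import product

field = {
    "economic_conditions": ["deteriorating", "stable", "improving"],
    "internal_migration": ["increasing", "steady"],
    "darfur_negotiations": ["stalled", "progressing"],
}

# Pairs of (parameter, value) judged mutually inconsistent by participants.
inconsistent = {
    (("economic_conditions", "improving"), ("darfur_negotiations", "stalled")),
}

def configurations(field, inconsistent):
    """Enumerate configurations, ruling out those containing an inconsistent pair."""
    params = list(field)
    for values in product(*(field[p] for p in params)):
        config = tuple(zip(params, values))
        if not any({a, b} <= set(config) for a, b in inconsistent):
            yield dict(config)

configs = list(configurations(field, inconsistent))
print(len(configs))  # 3*2*2 = 12 raw configurations, minus 2 ruled out -> 10
```

Even this toy field shows why participants select only a few "interesting corners": the number of raw configurations grows multiplicatively with each added parameter.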

Figure 3. JIWAB Morphological Field

8 Tom Ritchey, "General Morphological Analysis: a General Method for Non-Quantified Modelling," Swedish Morphological Society, 1998 and 2003.
From the huge number of potential configurations, participants selected several "interesting corners," where they felt several potential future values intersected in insightful ways. This created numerous vignettes that various DoD study stakeholders assessed using criteria such as interest to their organization and general interest for IW. The next step within the process was to use structured scenario fusion from a counterfactual reasoning (CFR) approach developed for intelligence analysts projecting future trends.9 CFR is structured to function as a "checklist" for intelligence analysts to think through antecedent scenarios, intermediate states, and consequent scenarios when making projections. Part of the method involves fusing together events and trends in a structured way, and this informed the creation of five "proto-scenarios" using the GMA vignettes. The Pentagon-level study steering committee approved four of the proto-scenarios and suggested a fifth.

In the next stage of scenario development, each of the five proto-scenarios was fleshed out through a seminar wargame, Separating Sudan, designed and executed by the Center for Naval Analyses (CNA).10 This involved another round of expert recruitment, and experts role-played various actors from the starting point specified in each proto-scenario. The result was five fleshed-out scenarios that were developed to the point of U.S. government intervention. Participant roles included the governments of Sudan and South Sudan, the African Union, Darfur Arabs, Darfur Africans, the U.S. Department of State, non-governmental organizations (NGOs), the United Nations – and the Lord's Resistance Army (LRA) for a scenario involving the LRA.

9 Noel Hendrickson, Counterfactual Reasoning: a Basic Guide for Analysts, Strategists, and Decision Makers, The Proteus Monograph Series, Vol. 2, Issue 5, October 2008 (Washington, D.C.: National Intelligence University); and Alec Barker, "Applying Structured Scenario Fusion to Address Multifaceted Strategic Challenges," The Journal of the Operational Research Society, 65(11), November 2014.
10 Peter Perla, Michael Markowitz, and Lesley Warner, Separating Sudan: Final CNA Report (Alexandria, VA: Center for Naval Analyses, 2011).

Another methodological addition by the JIWAB team to seminar gaming was ethnography-style notetaking to observe wargame participants.11 This enabled the team to observe the shift from third person to first person as players settled into their roles, the extent to which players were immersed in the scenario, and the degree to which they found the scenarios realistic.12 The addition of formal, qualitative social science theory and methods to wargaming is another area that is wide open for further development, and one that the JIWAB team began experimenting with in both JIWAB wargames.

Contextualized Understanding

As the scenarios developed, the JIWAB study team also began a parallel effort to develop what we termed contextualized understanding of the problem. While the scenario development process began with GMA, an atheoretical framework for structuring a variety of problems, work on contextualized understanding started very much from a theory-based lens of conflict.

This lens was the Interagency Conflict Assessment Framework (ICAF), a process normally run by the then State Department Office of the Coordinator for Reconstruction and Stabilization (S/CRS). The ICAF process consists of four steps: 1) evaluating the context for the conflict, 2) understanding core grievances and social and institutional resiliencies, 3) identifying key drivers and mitigators of conflict, and 4) identifying windows of vulnerability and opportunity.13 The key "formula" for a conflict was a key actor mobilizing an identity group around a grievance, while the formula for reducing conflict was a key actor mobilizing a group around a resilience.

11 Robert M. Emerson, Rachel I. Fretz, and Linda L. Shaw, Writing Ethnographic Fieldnotes, 1st Edition, Chicago Guides to Writing, Editing, and Publishing (Chicago: University of Chicago Press, 1995).
12 The observation about shifting to first person was made by team member Deborah Frost; such observations point to the possibility of being able to document wargame dynamics through empirical observations.
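As a rough illustration, the ICAF conflict "formula" can be expressed as a small data structure: the same mobilization pattern drives conflict when organized around a grievance, and mitigates it when organized around a resilience. The class, field names, and example values below are illustrative inventions, not part of ICAF itself.

```python
# Sketch of the ICAF "formula": a key actor mobilizing an identity group
# around a grievance drives conflict; mobilizing around a resilience
# mitigates it. All names here are hypothetical, not ICAF terminology.
from dataclasses import dataclass

@dataclass
class Mobilization:
    key_actor: str
    identity_group: str
    around: str   # the grievance or resilience being mobilized
    kind: str     # "grievance" or "resilience"

    @property
    def effect(self) -> str:
        """Whether this mobilization drives or mitigates conflict."""
        return "driver of conflict" if self.kind == "grievance" else "mitigator of conflict"

driver = Mobilization("militia leader", "marginalized community",
                      "unequal resource distribution", "grievance")
print(driver.effect)  # driver of conflict
```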

Figure 4. The Interagency Conflict Assessment Framework (ICAF)

Source: U.S. Department of State

The JIWAB study team adapted ICAF for its study by taking a more analytically oriented approach, shifting it from its primary purpose of shared interagency understanding of a situation. Thus the study team developed the Interagency Conflict Assessment Framework – Analytic, or "ICAF-A". Table 1 below describes the differences.

13 U.S. Department of State, Interagency Conflict Assessment Framework (Washington, D.C.: 2008), p. 6.
Table 1. Differences Between ICAF and ICAF-A

ICAF: Primary purpose is shared understanding.
ICAF-A: Primary purpose is support to the analytic methodology.

ICAF: DC-based interagency analysis is the initial step.
ICAF-A: Expert groups with intimate knowledge of the region are the initial step ("Regional Context Assessment Workshops").

ICAF: Concepts of grievance and drivers of conflict developed by the interagency.
ICAF-A: Articulation of key context dynamics developed by experts (systems diagrams of major factors and their relationships).

ICAF: Preliminary analysis from DC drives additional research and in-country field interviews.
ICAF-A: Expert analysis and guidance drive detailed research by JIWAB analysts.

ICAF: Details captured in a single analytic narrative, report-based.
ICAF-A: Descriptive analysis of ICAF elements and empirical evidence of conflict dynamics captured in the JDS Wiki.

ICAF: Report informs Embassy, Agency, or COCOM planning and programming activities.
ICAF-A: Agencies use systems diagrams, expert-developed narratives, and wiki-based conflict evidence to inform priorities; leads to integrated options development.

The focus of ICAF-A was more research-oriented, and it attempted to use sourced material (particularly primary sources) and additional academic experts in generating information on the framework elements. This information was captured in a wiki, which was also one of the study products. Figure 5 below illustrates the tool.

Figure 5. Conflict Assessment Evidence Tool
Another addition in ICAF-A was the "systems maps" of conflict dynamics for different areas in the region, developed in workshops facilitated by experts in conflict and conflict resolution. This was done during three all-day workshops, one each on Darfur, South Sudan, and North Sudan. Participants were asked to identify the causal loops created by conflict dynamics, and to pull them together into an overall "map" of the dynamics in each area. Figure 6 below is a photo of one of the workshops.

Figure 6. Expert-Generated Regional Context Assessment
Figure 7 below shows an example of one of the systems maps. As part of the effort, the team of conflict experts facilitating the event developed a "grammar" pictured in the systems maps. Within this grammar, R denoted "reinforcing" loops, where going once around the loop reinforces the dynamic; and B denoted "balancing" loops, where going once around the loop balances out the dynamic. The +/- signs show the direction of the variables depicted in the loop.

Figure 7. South Sudan Regional Context Assessment
In this figure, red R1 is a reinforcing loop. Going once around the loop, a decrease in the capacity and legitimacy of the Government of South Sudan (GoSS) leads to an increase in unbalanced economic development, which leads to a decrease in the sufficiency and equity of resource distribution. This in turn leads to an increase in political exclusion, which leads to a further decrease in GoSS legitimacy and capacity. The addition of systems mapping also helped visually link the conflict elements in a dynamic way, and was useful in conveying this information to non-experts on Sudan and South Sudan.14

14 One concern with the original ICAF was that it produced "lists of lists," without being integrative. Feedback on the JIWAB systems maps, however, was that they were still limited because they did not integrate over the different regions of Sudan and South Sudan.
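The loop grammar can be checked mechanically: in standard causal-loop notation, a loop is reinforcing when it contains an even number of negative links and balancing when the number is odd. A minimal sketch, using the four link polarities of the R1 walk-through above (the variable names paraphrase the figure):

```python
# Sketch: classifying a causal loop as reinforcing (R) or balancing (B).
# Link polarity +1 means the linked variables move in the same direction;
# -1 means they move in opposite directions. The links paraphrase the
# South Sudan R1 loop described in the text.
from math import prod

r1_links = [
    ("GoSS capacity & legitimacy", "unbalanced economic development", -1),
    ("unbalanced economic development", "equity of resource distribution", -1),
    ("equity of resource distribution", "political exclusion", -1),
    ("political exclusion", "GoSS capacity & legitimacy", -1),
]

def loop_type(links):
    """'R' (reinforcing) if the product of link polarities is positive, else 'B'."""
    return "R" if prod(sign for _, _, sign in links) > 0 else "B"

print(loop_type(r1_links))  # R: an even number of negative links reinforces the dynamic
```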

Taking two fundamentally different approaches to understanding the nature of the problems in Sudan and South Sudan, the scenario process and the contextualized understanding process arrived at essentially the same information. The experts used in the two separate processes were also different, with only a small amount of overlap (which was especially hard to achieve given the small number of relevant experts located in the United States). This let the JIWAB team have greater confidence in the results. One thing learned from taking both approaches was that either approach appears sufficient for developing a deep level of context from which to develop scenarios. In other words, with the same caliber of experts, analysts could probably use either GMA or ICAF-A as a starting point, without having to do both.

Policy Option

The next stage of the study was the development of the interagency policy option. DoD, U.S. Department of State, and U.S. Agency for International Development (USAID) participants took part in a facilitated workshop to create an interagency approach to the presented scenarios. Figure 8 shows the five lettered lines of effort they developed: A) coalition formation, B) humanitarian aid, C) governance, D) development of livelihoods, and E) security. Blue action cards represented DoD activities; pink represented State; light green represented USAID; and other colors represented actors such as the host nation government. Dependence and time synchronization are captured on the vertical axis, where tasks at the top of the board depend on, or are planned to succeed, tasks at the bottom. Smaller cards were used by the different agencies to comment on the proposed activities of other organizations.
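The board's vertical axis behaves like a task dependency graph, which can be sketched with a topological sort: a bottom-to-top reading of the board is any ordering in which every task appears after the tasks it depends on. The tasks and dependencies below are invented examples, not the actual JIWAB action cards.

```python
# Sketch: the policy-option board as a dependency graph, where each task
# sits above the tasks it depends on. Task names are hypothetical.
from graphlib import TopologicalSorter

# task -> set of tasks it depends on (i.e., tasks placed below it on the board)
board = {
    "deliver humanitarian aid": {"secure distribution routes"},
    "secure distribution routes": {"form coalition"},
    "form coalition": set(),
    "local governance programs": {"deliver humanitarian aid"},
}

# A topological order recovers a valid bottom-to-top reading of the board.
order = list(TopologicalSorter(board).static_order())
print(order[0])  # form coalition (the only task with no dependencies)
```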

Figure 8. Whole-of-Government Policy Option

This interagency planning exercise within JIWAB raised some interesting intellectual questions, since no formal interagency planning process actually exists at this level of the government. The team debated whether the study should try to reflect governmental processes as they were, or as they should be. The JIWAB exercise was more a reflection of the latter, since it was difficult to say what happens in the absence of formal processes.

Military CONOPS, Interagency CONOPS, and Wargame

The interagency option then fed the development of the military concept of operations (CONOPS). At this point, the Joint Staff led a planning exercise similar to its Multi-Service Force Deployment (MSFD) planning exercises for SSA scenarios. The mention of specific military forces and other details made the product classified, the first point in the JIWAB study where anything was classified.

In a second interagency wargame aimed at better understanding the DoD capability requirements, participants redid part of the strategic-level U.S. government response and created a response with a lighter DoD footprint. Although it was not necessarily the intent of the wargame, it moved the study down a path of offering more than one DoD approach to a particular scenario. The idea of preparing for different levels of military response to a challenge is not new, particularly for IW and other challenges where DoD is not the lead organization, and should be further explored and developed in future analytic efforts.

The second JIWAB wargame, Tumult and Transition, also presented an opportunity for the team to bring structured narrative analysis into the analysis of seminar game dynamics, a methodological coupling that had not been done before.15 In an article that already has to move rapidly over the many methods used, without doing sufficient justice to any, we can only note that the use of actant analysis allowed the comparison of wargame narratives between actors over time. This allowed a systematic analysis of the narrative at crucial turning points in the game.16 The details of that analytic approach and its application to wargaming are dealt with more extensively in a separate paper.17

15 Peter Perla, Daniella Mak, Mike Markowitz, Catherine Norman, and Karen Grattan, Tumult and Transition: Gamebook (Alexandria, VA: Center for Naval Analyses, August 2013).
16 Yuna Huh Wong and Sara Cobb, "Narrative Analysis in Seminar Gaming," presentation to the Military Operations Research Wargaming Community of Practice, Alexandria, VA, September 10, 2014.
17 Yuna Huh Wong and Sara Cobb, "Narrative Analysis in Seminar Gaming," 2014, unpublished working paper submitted for the 2015 Military Operations Research Society Barchi Prize.

Table 2. JIWAB Steps, Methods, and Outputs

Scenario
Methods: General Morphological Analysis (GMA); Counterfactual Reasoning (CFR) and Scenario Fusion; seminar wargame.
Outputs: Morphological field of potential vignettes; combination of vignettes into several proto-scenarios; five developed scenarios.

Contextualized Understanding
Methods: Interagency Conflict Assessment Framework (ICAF) – Analytic; systems mapping.
Outputs: Key actors, identity groups, conflict drivers and mitigators; conflict assessment evidence tool (wiki); systems maps of societal grievances and resiliencies.

Policy Option
Methods: Interagency workshop.
Outputs: Interagency plan to address scenarios.

Military Concept of Operations
Methods: Joint Staff-led planning exercise with service participation and input from combatant commands.
Outputs: Military CONOPS and forces for a high-footprint response.

Interagency Wargame
Methods: Interagency wargame.
Outputs: Interagency CONOPS for a constrained response; narrative analysis of wargame roles.

Success?

Study participants and the relevant senior Marine Corps official regarded the study as very successful, even "mind-bending" in reframing the way in which campaign analyses should be approached for contexts such as peacemaking interventions, interagency coordination, and irregular warfare. While one of the criticisms of previous DoD analytic community work on IW was that it insufficiently considered political, social, economic, and other factors in a credible way, JIWAB was considered very credible on this front. The office of the U.S. Special Envoy to Sudan and South Sudan (USSESS) at the U.S. Department of State asked the JIWAB team for "everything you have" prior to its development of strategy in Sudan and South Sudan. USAID also used the JIWAB systems maps before making decisions on the ground for development programs in South Sudan. This was unprecedented for DoD analysis community products. JIWAB products were also used to conduct IW capability analyses and to identify capability gaps for Joint Staff J7 and OSD Special Operations/Low Intensity Conflict (SO/LIC).

The methods introduced through JIWAB also helped bring the analysis community up to speed on a number of methods in non-traditional operational research, conflict analysis and resolution, intelligence analysis, and other areas. JIWAB also pioneered new applications, such as the combination of GMA with wargaming; the use of structured narrative analysis in wargaming; and an analytically more rigorous version of ICAF. Many of the methodological steps that JIWAB employed are applicable to other problem sets. One example is Project Cassandra, a miniaturized variant of the GMA-CFR-wargaming sequence that takes hours rather than months to run. Project Cassandra was run at the 2016 Connections Wargaming Conference at Maxwell Air Force Base and the 2016 MORS Wargaming Special Meeting, and continues to be developed.

Implications for DoD Analytic Methods

There are several implications for the DoD analytic community from this study. One of the most important is that the community has a significant need to draw from a wider pool of analytic methods than what is currently institutionalized within DoD.18 The DoD analysis community has found the limitations of its traditional quantitative, computer simulation-based approaches for IW. Related methods such as systems dynamics and agent-based modeling also have significant limitations in this domain.

18 Yuna Huh Wong, "Preparing for Contemporary Analytic Challenges," PHALANX: the Bulletin of Military Operations Research, Vol. 47, No. 4 (December 2014), pp. 35-39.

The solution to DoD's conundrum is greater use of methods that have been and are being developed for these types of highly unstructured and challenging problems. Methodologically, JIWAB was heavily influenced by problem structuring methods such as GMA, which has been around since the 1940s. Similar types of "judgment-based methods" have received increasing attention in NATO operational analysis circles for exactly these types of problems.19 JIWAB also drew heavily from theories in the conflict resolution field that were influential in the development of ICAF, a field highly relevant to future analytic efforts in IW. Wargaming is another area where there is the possibility of future innovation, given the high interest that DoD currently has in reinvigorating wargaming.20

Greater Use of Structured Expertise in JIWAB


A hallmark of the JIWAB process was the degree to which outside experts and other government agencies contributed significantly to the production of study products. This model runs counter to how many of us are taught to approach analysis, where the analysts become the experts on the topic. However, that is not a model that scales in the complex world we now inhabit, where there is a significant need to incorporate interdisciplinary and interagency expertise. Epistemic dependence is the idea that we all rely on the intellectual authority of others, particularly experts, for much of what we know.21 This was a central feature of JIWAB: at each stage, the product would be based on the expertise of those with more substantial knowledge of conflict dynamics in Sudan and South Sudan, interagency planning and activities, and joint military planning and activities than the core JIWAB study team. This extended even to the very methods involved in the study, to the point that the developers of the methods themselves were active participants in the study whenever possible.22

19 NATO, Code of Best Practice for Judgement-Based Operational Analysis, 2009; and Diederik J. D. Wijnmalen and Neville J. Curtis, "A code of best practice for judgement-based operational research," OR Insight, Vol. 26, Issue 4 (December 2013), pp. 291-308.
20 Deputy Secretary of Defense, Memorandum for Secretaries of the Military Departments, Wargaming and Innovation, February 9, 2015.

At the same time, there are a number of difficulties to overcome to effectively use

experts. One practical difficulty is in identifying and recruiting the appropriate experts. The

literature on expertise discusses the problem of upward discrimination. We are easily able to

downward discriminate, or gauge just how much less expert a person is than we are on a topic.

However, the argument is that we cannot reliably discriminate upward, or meaningfully judge

how much more expert someone is on a topic.23 Another practical consideration was the time and effort

required to identify and recruit experts, through both formal and informal channels. Many of

the methods selected for JIWAB, such as GMA and wargaming, come from classes of

methods inherently designed to facilitate and explicitly elicit expert judgment. These

methods are worth learning.

A variety of U.S. government offices were heavily involved with JIWAB in its different

phases. Participants from the State Department included the Special Envoy’s office (USSESS),

21 John Hardwig, “Epistemic Dependence,” Journal of Philosophy, Vol. 82, 1985, pp. 335-336.
22 This included Noel Hendrickson for counterfactual reasoning, Tom Ritchey for general
morphological analysis, Peter Perla for wargaming, Sara Cobb for narrative analysis, and
23 Harry Collins and Robert Evans, Rethinking Expertise, Chicago (University of Chicago Press,

2007), p. 15.

African Affairs (AF), Conflict and Stabilization Operations (CSO), International Organizations (IO),

and Political-Military Affairs (PM). USAID offices involved in the effort included the Bureau for

Africa (AFR) – especially the Office for Sudan and South Sudan Programs (AFR/SSSP) – Office of

Civil-Military Cooperation (CMC), Office of Conflict Management and Mitigation (CMM), Office

of U.S. Foreign Disaster Assistance (OFDA), and the Office of Transition Initiatives (OTI). The

U.S. Department of Agriculture was involved in one event as well.

JIWAB also drew heavily from numerous DoD organizations, with active participation

from different offices within the Office of the Secretary of Defense for Policy (Strategy, African

Affairs, and Special Operations and Low Intensity Conflict), Joint Staff (JS) J7, JS J8, the Army,

the Air Force, the Navy, the Marine Corps, U.S. Africa Command (AFRICOM), U.S. Special

Operations Command (SOCOM), Combined Joint Task Force Horn of Africa (CJTF-HOA), Center

for Advanced Operational Culture Learning (CAOCL), Marine Corps Intelligence Activity (MCIA),

National Ground Intelligence Center (NGIC), Defense Intelligence Agency (DIA), National

Defense University (NDU), and U.S. Army Peacekeeping and Stability Operations Institute

(PKSOI).

Experts were also drawn from academia: Columbia University, George Mason

University, James Madison University, Penn State University, University of Florida, University of

Pennsylvania, University of Wisconsin, McGill University, Durham University, and Uppsala

University. Federally Funded Research and Development Centers (FFRDCs) RAND and CNA

were also part of the study at various points, as were other research organizations such as the

Center for Strategic and International Studies (CSIS), U.S. Institute of Peace (USIP), and Wilson

Center. Participants also came from NGOs Resolve, Enough, and Refugees International; and

from USG implementing partners AECOM and International Sustainable Systems.

Study Governance

The JIWAB team also had a number of government oversight bodies reviewing its

activities. Direct oversight bodies in the Pentagon were the JIWAB General Officer Steering

Committee (GOSC), JIWAB O-6 level working group, Joint Analytic Agenda Steering Committee

(JAASC)/Analytic Agenda Steering Committee (AASC), Joint Analytic Agenda/Analytic Agenda

working group, and the Irregular Warfare Executive Steering Committee (IW ESC). The Irregular

Warfare Modeling & Simulation Senior Coordinating Group (IW M&S SCG) provided the funding

for some tools development and also provided oversight. Another stakeholder body that the

study team coordinated with was the Irregular Warfare Working Group (IWWG), a group co-

chaired by Army Training and Doctrine Command (TRADOC) and then Marine Corps Operations

Analysis Division (OAD).

Contributors to Success

There are several reasons why JIWAB was able to accomplish as much as it did. The first

and perhaps most important factor in its success was the very long lead time that the study

team had to explore the problem and try ideas that could fail. Although organizations often

want innovation and new ways of thinking about complex problems, it is often hard for them to

provide their personnel with the luxury of time and to shelter them from the pressures of

showing immediate results. The strong belief in the Marine Corps that an alternative approach

to traditional modeling & simulation was needed to understand IW meant they were willing to

invest a significant amount of time and resources into the JIWAB team. This allowed the JIWAB

team to test approaches and discard those that showed little promise. The team even began by

developing a relatively simple agent-based counterinsurgency model, Pythagoras, before

concluding that it did not address the most important behavioral, social, and

cultural issues that real-world operations would need to consider.24 We also spent a

considerable amount of time studying the other analytic efforts within DoD attempting to

address IW. Thus there was a few years’ worth of research and development leading up to the

effort itself.

A second factor for the study’s success was the organization’s willingness to hire social

scientists and allow them freedom in study design and methodology. While this is something

that may seem obvious in retrospect, given the applicability of social science to IW, this was not

the practice of the existing analytic organizations. Years of emphasis on physics-based models

had resulted in personnel whose skill sets were heavy in operations research and similar

disciplines. Social scientists, where retained, were usually used as subject matter experts

whom the operations research analysts would consult as they built their models. Or, as one

JIWAB operations research analyst put it, “you get a SME, and then you beat the data out of

them.”25 However, starting without the constraint that everything had to fit a computer model

allowed an entirely different pathway.

24Zoe Henscheid, Donna Middleton, and Edmund Bitinas, “Pythagoras: An Agent-Based


Simulation Environment,” The Scythe, Issue 1, pp. 40-44; accessed at
harvest.nps.edu/scythe/Issue1/IDFW13-Scythe-Pythagoras.pdf on November 2, 2016.

25 Comment by Cortez “Steve” Stephens.

Conclusions

JIWAB represented an effort by a public organization, outside of normal processes, to

use and adapt analytic methods that were outside the norm for its community. It is difficult

to convey, within such a short paper, just how unusual and barrier-breaking the JIWAB study

was for the DoD analysis community. Yet it showed that, given enough time, the community could

adapt and devise a solution completely different from its established methods.

At the same time, it is also a story of an adaptation that happened too late – analytic interest in

IW was immense in 2006, but was already beginning to wane during the course of the JIWAB

study. It is also a story of adaptations and changes never incorporated into the bureaucracy:

development outside normal DoD channels and processes was both the reason for JIWAB’s

success and the reason its innovations were not adopted. Hopefully, however, the next

time IW is of interest to DoD, we will have a better record of the analytic methods that are

more suited to it.

References

Barker, Alec. “Applying Structured Scenario Fusion to Address Multifaceted Strategic


Challenges,” The Journal of the Operational Research Society, 65(11), November 2014.

Bumiller, Elisabeth. “We Have Met the Enemy and He Is PowerPoint,” New York Times, April 26,
2010.

Collins, Harry and Robert Evans. Rethinking Expertise. Chicago: University of Chicago Press,
2007.

Davis, Paul K. and Kim Cragin, editors. Social Science for Counterterrorism: Putting the Pieces
Together. Santa Monica, CA: RAND Corporation, 2009.

DeWeerd, Harvey. A Contextual Approach to Scenario Construction. Santa Monica, CA: RAND
Corporation, 1973.

Emerson, Robert M., Rachel I. Fretz, and Linda L. Shaw. Writing Ethnographic Fieldnotes, 1st
Edition. Chicago Guides to Writing, Editing, and Publishing. Chicago: University of Chicago
Press, 1995.

Hardwig, John. “Epistemic Dependence,” Journal of Philosophy, Vol. 82, 1985, pp. 335-349.

Hendrickson, Noel. Counterfactual Reasoning: a Basic Guide for Analysts, Strategists, and
Decision Makers. Proteus Monograph Series, Vol. 2, Issue 5, October 2008. Washington,
D.C.: National Intelligence University, 2008.

Henscheid, Zoe, Donna Middleton, and Edmund Bitinas. “Pythagoras: An Agent-Based


Simulation Environment,” The Scythe, Issue 1, pp. 40-44. Accessed at
https://harvest.nps.edu/scythe/Issue1/IDFW13-Scythe-Pythagoras.pdf on November 2,
2016.

Kott, Alexander and Peter Corpac, Defense Advanced Research Projects Agency. “COMPOEX
Technology to Assist Leaders in Planning and Executing Campaigns in Complex Operational
Environments.” 12th International Command and Control Research and Technology
Symposium, June 2007, Newport, RI.

North Atlantic Treaty Organization. Code of Best Practice for Judgement-Based Operational
Analysis. 2009.

Perla, Peter. The Art of Wargaming: a Guide for Professionals and Hobbyists. Annapolis, MD:
Naval Institute Press, 1990.

Perla, Peter, Daniella Mak, Mike Markowitz, Catherine Norman, and Karen Grattan. Tumult and
Transition: Gamebook. Alexandria, VA: Center for Naval Analyses, August 2013.

Perla, Peter, Michael Markowitz, and Lesley Warner. Separating Sudan: Final CNA Report.
Alexandria, VA: Center for Naval Analyses, 2011.

Purdue Research Foundation. “Simulex celebrates open house with expansion, announcement
of two separate divisions,” April 16, 2008. Accessed at
http://www.purdue.edu/uns/x/2008a/080416ChaturvediOpen.html on September 7, 2016.

Ricigliano, Robert and Karen Grattan. “Advice to Policy Makers Who Would Tackle Syria: The
Problem with Problem Solving,” PRISM, April 1, 2014.

Ritchey, Tom. “General Morphological Analysis: a General Method for Non-Quantified


Modeling.” Swedish Morphological Society, 1998 and 2003.

UK Ministry of Defence. “MOD scientists help shape Afghanistan operations,” April 28, 2011.
Accessed at https://www.gov.uk/government/news/mod-scientists-help-shape-
afghanistan-operations on September 7, 2016.

U.S. Department of Defense. Quadrennial Defense Review Report. Washington, D.C., February
2006.

U.S. Department of Defense. Deputy Secretary of Defense, Bob Work. Memorandum for
Secretaries of the Military Departments, Wargaming and Innovation. Washington, D.C.:
February 2015.

U.S. Department of State. Interagency Conflict Assessment Framework. Washington, D.C.,


2008.

Wijnmalen, Diederik J. D. and Neville J. Curtis. “A code of best practice for judgement-based
operational research,” OR Insight, Vol. 26, Issue 4 (December 2013), pp. 291-308.

Wong, Yuna Huh. “Preparing for Contemporary Analytic Challenges,” PHALANX: the Bulletin of
Military Operations Research, Vol. 47, No. 4 (December 2014), pp. 35-39.

Wong, Yuna Huh and Karen Grattan. “Joint Irregular Warfare Analytic Baseline: Ethnographic
Observations From a Seminar Wargame.” Presentation at the 80th Military Operations
Research Society Symposium in Alexandria, VA on June 12, 2012.

Wong, Yuna Huh and Sara Cobb. “Narrative Analysis in Seminar Gaming.” Presentation to the
Military Operations Research Wargaming Community of Practice in Alexandria, VA on
September 10, 2014.

Wong, Yuna Huh and Sara Cobb. “Narrative Analysis in Seminar Gaming,” 2014. Unpublished
working paper submitted for the 2015 Military Operations Research Society Barchi Prize.

Author Biographies
Yuna Huh Wong, PhD, is a RAND Corporation researcher. She was previously an operations
research analyst for the Marine Corps Operations Analysis Division (OAD), where she was the
methodology lead for the JIWAB study. She is a member of the Military Operations Research
Society (MORS) Board of Directors.

Michael Bailey, PhD, was the Technical Director of the Marine Corps Operations Analysis
Division (OAD) and study lead for the JIWAB. He is a career civil servant, serving first as a
professor of Operations Research at the Naval Postgraduate School, then with OPNAV and
Marine Corps Combat Development Command.

Karen Grattan is the CEO of Engaging Inquiry, LLC, a firm specializing in designing collaborative
analysis and strategy workshops for federal and non-profit organizations. She was previously a
senior social scientist at Group W, where she worked on IW studies for Marine Corps
Operations Analysis Division (OAD). She is a graduate of the George Mason University (GMU)
School for Conflict Analysis and Resolution, as well as the GMU School of Public Policy, where
she focused on organizational learning and public policy analysis.

C. Steve Stephens was an operations research analyst at the Marine Corps Operations Analysis
Division (OAD) and is now retired. Steve focused on stochastic simulations of ground combat at
OAD and had previously served as an infantry officer in the Marine Corps.

Robert Sheldon, PhD, is a retired Air Force officer, having taught calculus and Operations
Research at the Air Force Academy and served as Chief Military Analyst for the Air Force Studies
and Analyses Agency. He is a Past President of the Military Operations Research Society
(MORS) and a MORS Fellow of the Society (FS). He is currently employed by Group W, Inc.
doing analysis and modeling in support of the Marine Corps Operations Analysis Division (OAD).

Bill Inserra is an operations research analyst at Operations Analysis Division (OAD), where he
supports the Marine Corps in IW and M&S, and is the lead analyst in support of Combat
Development Wargames. He has almost twenty years of operations research (OR) experience.
He was a Marine infantry officer, a career complemented by two OR payback tours at the Marine
Corps Studies & Analysis Division and U.S. Special Operations Command (USSOCOM).

