Structured Methods or Intuition?
by Stephen Marrin
Recently, after the release of the 9/11 Commission and WMD Commission Reports, a substantial amount of money and attention has been devoted to the identification and development of structured analytic techniques for capturing, presenting, and evaluating data more effectively; deriving patterns from the data; understanding what those patterns mean; and communicating them to others. But less attention has been focused on the utility of structured methods, their fit with the reigning analytic culture, and when they should be used, or not used, by analysts. This article addresses these issues using medical practices to illustrate how the same kinds of dynamics can play out in other fields.
DEFINING TERMS: STRUCTURED METHODS VERSUS INTUITION
So what are structured analytic methods? They are those techniques which have a formal or structured methodology that is visible to external observers. Examples of structured methods include the key assumption check, analysis of competing hypotheses, devil's advocacy, Team A/Team B, and alternative futures.2
The primary value of analytic techniques or structured methods is that they provide a way to account for the analytic judgment; an analytic "audit trail," as it were. According to Rob Hubal, a researcher for RTI International, their use provides a mechanism for verifying or documenting "analysis so that any outside observer can understand the analytic basis of conclusions."3 Alan Schwartz, a teacher of structured techniques throughout the intelligence community, makes a similar point when he observes that "the great virtue" of analytic techniques is that they "make the analytic process transparent."4 While transparency in terms of methodology may be more useful as a way to structure analytic thinking than as something to share with consumers, in a global sense this transparency, according to Schwartz, facilitates collaboration, permits meaningful review by managers, and allows deeper understanding by customers. Given the possibility for failure and the high stakes associated with many judgments, transparency is essential. Without structured techniques, it is virtually impossible to understand the complete bases for critical analytic judgments. With an analytic audit trail, analysts and their colleagues can discover the sources of analytic mistakes when they occur and evaluate new methods or new applications of old methods. In this way, structured methods make it possible to advance analytic tradecraft.
"The great virtue" of analytic techniq ues is that they "make the analytic process transparent. "
Systematic analytic techniques are also a way for the intelligence analyst to approximate the scientific method in terms of analytic processes. The scientific method is a relatively simple process to understand in theory, but very complicated to apply effectively to real world problems, which are frequently detailed and messy. So analytic techniques are a short-hand way to approach certain problems, to make the study of them more methodical. Yet the foundation of most, if not all, of them remains the scientific method.
The scientific method-like intelligence analysis-involves description, explanation, and prediction. So in all fields where the scientific method is applied, methodologists tell practitioners to use good data that is reliable, and validate it if possible. When describing the characteristics of things, decide if the categories are nominal, ordinal, interval, or ratio, and derive inferences accordingly. When explaining, account for as many independent variables as you can; rank-order them based on how much they affect the dependent variable. Test alternative hypotheses about causation against each other to find out which explains the greatest amount of change in the dependent variable. And so on. The most basic analytic techniques will be those that link or display data so that they can be interpreted or communicated more easily. On the other hand, the more complicated analytic techniques-such as Bayesian analysis-are those that provide assistance in deriving meaning from the accumulated data.
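To make the comparison of alternative hypotheses concrete, the sketch below shows a minimal Bayesian update over two competing hypotheses. It is illustrative only; the hypotheses, prior probabilities, and likelihoods are invented for the example rather than drawn from this article.

def bayesian_update(priors, likelihoods):
    """Return posterior probabilities for each hypothesis given one piece of evidence."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: value / total for h, value in unnormalized.items()}

# Two competing hypotheses about an adversary's intent (hypothetical).
priors = {"mobilization is an exercise": 0.7,
          "mobilization precedes an attack": 0.3}

# How likely a new report (e.g., reserve call-ups) would be under each hypothesis.
likelihoods = {"mobilization is an exercise": 0.2,
               "mobilization precedes an attack": 0.8}

for hypothesis, posterior in bayesian_update(priors, likelihoods).items():
    print(f"{hypothesis}: {posterior:.2f}")

Even this toy version makes the analytic reasoning visible: anyone can inspect which hypotheses were considered, what weight the evidence carried, and how the judgment shifted.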
In contrast to structured methods are those techniques that are not structured, frequently identified as "intuition." According to Robert Folker, "non-structured methodology
is intuitive. Intuition is a feeling or instinct that does not use demonstrative reasoning processes and cannot be adequately explained by the analyst with the available evidence."5 He goes on to say that "intuition plays an inescapable role in analysis." Yet at the same time he argues that "structured methods have significant and unique value (over intuition) in that they can be easily taught to other analysts as a way to structure and balance their analysis. It is difficult, if not impossible, to teach an intelligence analyst how to conduct accurate intuitive analysis. Intuition comes with experience."6
A problem with Folker's definition, however, is that it equates intuition with lack of structure, which is not necessarily the case.7 Intuition can be structured, but perhaps in ways that are not visible to outside observers. Folker has since modified his distinction between structured methods and intuition, arguing that the "real difference between the two basic analytic processes is that intuition is an invisible analytic process, and the structured approach is a visible process."8 Folker goes on to say that if this distinction is correct, then a comparison between "intuitive" (or invisible) and "visible" analytic approaches is more appropriate than one between structured methods and intuition. Accordingly, when this paper uses the terms structured versus intuition, it is emphasizing the visible or verifiable aspects of the process versus the invisible or nonverifiable aspects, rather than how structured or unstructured the analytic approach might be.
The question to be addressed here is how much of a role structured methods should play in intelligence analysis.
The structured versus intuitive approach to intelligence analysis is something that has been debated in analytic circles for decades. According to Folker, "a long-standing debate exists within the Intelligence Community about whether more should be invested in structured methodologies to improve qualitative intelligence analysis. At the heart of this controversy is the question of whether intelligence analysis should be accepted as an art (depending largely on subjective, intuitive judgment) or a science (depending largely on structured, systematic analytic methods)."9
How much of a role should structured methods play?
According to Folker, proponents of intelligence analysis as an art "contend that qualitative intelligence analysis deals with an infinite number of variables that are impossible to
operationalize because they cannot be adequately quantified or fully collected ... Because in many cases the variables are so complex, countless, and incomplete, attempting to analyze them using scientific methods is pseudo-science. Therefore, any attempt to make predictions based on quantifying these variables is futile. (As a result, these proponents) argue that qualitative intelligence analysis is an art because it is an intuitive process based on instinct, education, and experience."10
By way of contrast, proponents of intelligence analysis as a science "assert that science is a necessary tool to use when conducting qualitative analysis. They argue that, although it is impossible to consider every variable when conducting analysis, one can identify key variables and weigh their importance. And although much may be unknown, identifying what is known and analyzing it scientifically is an effective approach ... (because) scientific methods help analysts determine the relevancy of information and form conclusions, a process that analysts do not perform well on their own."11
After evaluating each argument, Folker concludes that intelligence analysis is both an art and a science. As he puts it, "The fallacy in the art or science debate may be the 'either/or' proposition. If qualitative intelligence analysis is not exclusively an art nor a science, then it may best be considered a combination of both intuitive and scientific method."12
So now that terms are defined, the next question is: which is used more by intelligence analysts, structured methods or intuition?
FAILING TO USE STRUCTURED METHODS
Some outside observers with little experience in the 'real' world of intelligence analysis assume that it involves rigorous application of structured methodologies. This perspective is best captured by New York Times columnist David Brooks in an article that criticized CIA's analysis as being too structured, arguing that its reliance on 'sophisticated modeling' and 'probability calculations' results in a 'false scientism' that fails to take into account some of the intangibles that are important in determining the actions of an adversary.13 And in fact, according to Folker, "many proponents of exploiting scientific methodologies to aid intelligence analysis assume that these methods are regularly used in real-world analysis." However, Folker goes on to say that "this assumption that intelligence analysts use these various scientific approaches is based not on empirical evidence, but on the belief that experience alone does not make an
expert, and that an expert must have a tool for structuring knowledge to efficiently solve a problem."14
The problem with this assumption, though, is that it is inaccurate. Even though there are over 200 analytic methods that intelligence analysts could choose from,15 the intelligence analysis process frequently involves intuition rather than structured methods. As someone who worked at CIA from 1996 to 2000, I possess firsthand knowledge of the kind of analytic approaches used at the time. While I was there, the reigning analytic paradigm was based on generalized intuition; an analyst would read a lot, come up with some analytic judgment, and send that judgment up the line without much focus on either the process involved in coming to that judgment, or making that process transparent to others. No one I knew, except for maybe the economic analysts, used any form of structured analytic process that was transparent to others. No quantitative methods; no special software; no analysis of competing hypotheses; not even link charts.
The intelligence analysis process frequently involves intuition rather than structured methods.
Others have also made similar observations. Folker has said that "structured methodologies are severely neglected" in the intelligence community.16 In addition, Randy Pherson, a former CIA analyst and National Intelligence Officer who currently teaches analytic techniques across the community, has said that he "often tell(s) students that the traditional CIA method of analysis ... (involves three phases): Read as much as you have time to read that day; Think about it and suck an answer out your thumb; Write it down in as crisp a manner as possible."17
This intuitive approach to intelligence analysis, the dominant analytic approach, does not leave much room for structured methodologies. Pherson went on to say that "unfortunately, at CIA most of our energy and on-the-job training as analysts traditionally has gone into phase three, learning how to capture the essence of our analysis in a paragraph or page. Substantial resources also have been devoted to phase one, but we remain woefully behind what technology offers. And, until recently, we have largely ignored the need (or used the excuse we don't have the time) to develop the necessary skills to ensure more rigor and scientific method in our analytic process."18
Granted, this kind of anecdotal evidence cannot be generalized across the entire CIA, or across multiple organizations, because it is likely that certain analytic techniques were used by certain kinds of analysts for specialized purposes. For example, most counter-terrorism and counter-narcotics analysts probably either used or knew about using link charts or social network analysis as a way to display or represent connections between targets of interest, and perhaps other analytic techniques were used in other parts of the agency. But the analysts I worked with didn't use them at all. In other words, intuition frequently dominated the actual process of intelligence analysis instead of more structured methodologies.
I was initially appalled at the CIA's lack of structured analytic methodologies, because I prefer some kind of rigor as a foundation to analysis. So finding out who used analytic methodologies was a particular interest of mine. And I ran into, and identified with, methodologists who advocated greater rigor in analysis or those who experimented with innovative techniques. Foremost among these were Jack Davis and alternative analysis;19 Stanley Feder and FACTIONS or Policon and other kinds of modeling;20 the folks at Strategic Assessments Group who worked on scenario-based simulations;21 and Carole Dumaine and the Global Futures Project, which at the time tended to support future scenario work, among other things.22
But for a variety of different reasons, including some good ones, few analysts embraced these techniques. As Folker observes, these approaches were "rejected by analysts because the scientific methods were thought to be too narrowly focused and not relevant to the questions the intelligence analysts were addressing."23 He then quotes Richards Heuer as saying that most analysts tended "to be skeptical of any form of simplification inherent in the application of probabilistic models."24 He also points out that "the complexity of some scientific methodologies (such as Bayesian analysis) prevents them from being regularly exploited by most intelligence analysts."25
In addition, he observed that "opponents of the scientific approach ... denounce the amount of time it takes to scientifically analyze a problem." As he points out, "under the accelerating pressures of time, intelligence analysts feel that structured analytical approaches are too cumbersome."26 This observation is supported by Arthur Hulnick, a professor at Boston University who spent 35 years as an intelligence analyst and manager, who has said that "methodologists discovered that their new techniques were unpopular with analysts because they had no time to absorb these systems in the face of tight deadlines."27
Other reasons for the failure by analysts to adopt structured techniques include increased accountability and unproven benefits. In terms of accountability, Alan Schwartz points out that it is natural for analysts to prefer "to have their
intuitive process unexamined."28 Mistakes can be more easily explained if the process used is opaque, and credit can be taken for getting the right result for the wrong reason. Other analysts failed to adopt the techniques because their benefits remained unproven to a skeptical audience.
As Folker puts it, intelligence analysts "are not convinced that (structured methodologies) will improve their analysis."29 In an environment of constant change, with new analytic tools and techniques being sold, both figuratively and literally, as the next 'big thing,' many analysts adopted a cautious if not skeptical posture towards any new tool, technique, or methodology. This skepticism was heightened by the fact that the value of many new techniques or methods was asserted or argued rather than proven through scientific evaluation.
The value of many new techniques or methods was asserted or argued rather than proven.
Exceptions do, of course, exist. For example, Stanley Feder, a former CIA methodologist, argues that the use of a specific analytic model produced more precise forecasts than conventional intelligence analysis without sacrificing accuracy.30 He has pointed out that the use of a particular analytic model "helped avoid analytic traps and improved the quality of analyses by making it possible to forecast specific policy outcomes and the political dynamics leading to them."31 Feder's articles about analytic methodology are unusual for the intelligence community in that they involve an effort to evaluate and document the effectiveness of a structured methodology against the status quo intuitive approach used by most analysts. Another example is Folker's Master's thesis at the Joint Military Intelligence College, in which he showed that using one particular analytic method, structured hypothesis testing, produced better analysis than intuition.32 But few evaluative articles of this kind exist. As a result, the benefits, and limits, of most analytic techniques have not yet been specified.
In the end, once there are impediments to incorporating structured methods into how people do analysis, those impediments become embedded in the organizational culture. Feder said that "despite the advantages of (the models) the vast majority of analysts do not use them."33 Feder speculates that the models are not used because "this kind of systematic analysis does not fit into an organizational culture that sees an "analyst" as someone who writes reports, often evaluating and summarizing available information. In contrast, people who use models
and quantitative techniques are considered "methodologists."
Folker agrees, pointing out that "structured thinking is radically at variance with the way in which the human mind is in the habit of working. Most people are used to solving problems intuitively by trial and error. Breaking this habit and establishing a new habit of thinking is an extremely difficult task and probably the primary reason why attempts to reform intelligence analysis have failed in the past, and why intelligence budgets for analytical methodology have remained extremely small when compared to other intelligence functions."34
But perhaps organizational culture can change if the use of structured methods can be demonstrated to provide clear benefits to the practitioner, as Feder has attempted to do. And there, Richards Heuer has provided us with a little hope. In an article he wrote regarding the use of quantitative techniques in intelligence analysis, Heuer observed that "the initial attitude of country analysts toward our unconventional proposals typically ranged from skepticism to hostility. Equally typical, however, has been their post-project appraisal that the work was interesting and well worth doing."35
So perhaps it is possible for structured analytic techniques to improve analysis, although the fact that it has been almost 30 years since Heuer made that observation-and we are still having the same discussion-does provide a certain cautionary note about prospects for changing culture.
In the end, the debate continues between proponents of an art versus science approach to intelligence analysis, and whether or not structured analytic techniques are more effective than intuition. But the same kinds of debates exist in other fields and can be used to shed light on ways to resolve the current debate.
COMPARING INTELLIGENCE ANALYSIS TO MEDICAL DIAGNOSIS
Just as there is both an art and a science to intelligence analysis, there is also an art and a science to medicine, as well as a similar discussion about the value of structured methods versus intuition. The comparison between intelligence analysis and medicine has run through the intelligence literature for almost 60 years, if not longer. In 1949, Sherman Kent wrote that "intelligence is a simple and self-evident thing. As an activity, it is the pursuit of a certain kind of knowledge ... In a small way, it is what we all do every day ... When a doctor diagnoses an ailment-when almost anyone decides
upon a course of action-he usually does some preliminary intelligence work."36
In 1983, historian Walter Laqueur examined the analogy at length, and argued that medicine is more an art than a science because the process of diagnosis entails the use of judgment as a means to address ambiguous signs and symptoms.37 Laqueur also highlighted similarities between medicine and intelligence, including similarities in analytic processes, pointing out that "the student of intelligence will profit more from contemplating the principles of medical diagnosis than immersing himself in any other field. The doctor and the analyst have to collect and evaluate the evidence about phenomena frequently not amenable to direct observation. This is done on the basis of indications, signs, and symptoms ... The same approach applies to intelligence." As a result of the similarities, references to the medical analogy have recently become much more prevalent in the intelligence literature, in particular Rob Johnston's observation that the "practice of medicine is an art and a tradecraft"; Johnston goes on to describe how medicine relies on a base of science that has not yet been created to support intelligence analysis.38
The doctor and the analyst have to collect and evaluate the evidence about phenomena frequently not amenable to direct observation.
In medicine, the structured approach to diagnosis is known as 'differential diagnosis,' a process used to distinguish one disease from another with very similar symptoms. For intelligence practitioners unfamiliar with this process, differential diagnosis is the medical equivalent to Analysis of Competing Hypotheses. Richards Heuer has said that his discussion of the 'diagnosticity of evidence' in his book "Psychology of Intelligence Analysis" was drawn from the medical literature.39 So intelligence analysts who use ACH are applying a technique that is more or less equivalent to what doctors do when they diagnose patients. Good illustrations of differential diagnosis in practice can be found in the Fox TV show "House," which is about a master diagnostician, because it shows how hypotheses are developed and tested in a medical context, sometimes intuitively and sometimes rigorously.
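To show what the shared logic looks like in practice, the sketch below lays out a toy ACH-style matrix in a medical setting. It is illustrative only: the hypotheses, items of evidence, and consistency scores are invented, and a real workup would involve far more evidence and judgment about its diagnosticity. The scoring follows Heuer's emphasis on inconsistency: the hypothesis with the least evidence against it survives scrutiny best.

# Rows are items of evidence; columns record whether each item is
# consistent ("C"), inconsistent ("I"), or neutral ("N") with each hypothesis.
matrix = {
    "patient has fever":           {"strep throat": "C", "common cold": "C", "seasonal allergy": "I"},
    "rapid strep test positive":   {"strep throat": "C", "common cold": "I", "seasonal allergy": "I"},
    "symptoms recur every spring": {"strep throat": "I", "common cold": "I", "seasonal allergy": "C"},
}

hypotheses = ["strep throat", "common cold", "seasonal allergy"]

# Count the evidence that argues against each hypothesis (lower is better).
inconsistencies = {h: sum(1 for scores in matrix.values() if scores[h] == "I")
                   for h in hypotheses}

for hypothesis, count in sorted(inconsistencies.items(), key=lambda item: item[1]):
    print(f"{hypothesis}: {count} inconsistent item(s) of evidence")

The point of the exercise is not the arithmetic but the audit trail: the matrix records exactly which evidence was weighed against which hypothesis, which is what distinguishes the visible approach from intuition.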
Frequently, however, physicians don't work through this process in an observable way. Rather, they do so via structured 'intuition,' which may look to outsiders as if they were diagnosing patients off the top of their heads rather than through the use of structured methods. In many cases, the intuitive approach is used in medicine because of
the pressure on the doctor to make a diagnosis and come up with treatment options in a very short period of time. Sometimes doctors who work in large practices are limited to only a few minutes with each patient. This severely restricts how much time they can spend during the diagnostic process, which makes them resistant to using more structured methods. And those same pressures
explain why intelligence analysts tend to resist incorporating structured methods into their work processes as well.
There's an ongoing debate in the medical community over the value of structured methodologies, just as there is in intelligence, and some people have criticized the unstructured aspects of medical diagnosis, part of the art of medicine, for causing diagnostic failure. In medicine, the dominant approach to diagnosis is intuition: frequently structured, but relying on the practitioner's knowledge, memory, and reasoning skills, and these competencies can vary greatly between individuals. For example, according to Jeffrey Pfeffer and Robert Sutton in an article in the Harvard Business Review, medical decisions are frequently based on "obsolete knowledge gained in school, longstanding but never proven traditions, patterns gleaned from experience, the methods they believe in and are most
skilled in applying, and information from hordes of vendors with products and services to sell."40
For example, Jerome Groopman, a professor at Harvard Medical School and author of the book "How Doctors Think," has said that "conservatively about 15 percent of all people are misdiagnosed" because of "errors in thinking" primarily involving shortcuts.41 As he says, "most doctors, within the first 18 seconds of seeing a patient, will ... generate an idea in his mind (of) what's wrong. And too often, we make what's called an anchoring mistake-we fix on that snap judgment," which he says could be based on anything. He also describes other kinds of cognitive errors that physicians make in the diagnostic process. And he points out that there is a movement afoot to make the diagnostic process more rigorous.
Some have argued that medical diagnosis can be turned from an art into a science if physicians use structured methods to assist them in the process of making diagnoses. According to Groopman, "To establish a more organized structure (for their diagnoses), medical students and residents are being taught to follow preset algorithms and practice guidelines in the form of decision trees .... The trunk of the clinical decision tree is a patient's major symptom or laboratory result, contained within a box. Arrows branch from the first box to other boxes.
For example, a common symptom like "sore throat" would begin the algorithm, followed by a series of branches with
"yes" or "no" questions about associated symptoms. Is there a fever or not? Are swollen lymph nodes associated with
the sore throat? Have other family members suffered from this symptom? Similarly, a laboratory test like a throat culture for bacteria would appear farther down the trunk of the tree, with branches based on "yes" or "no" answers to the results of the culture. Ultimately, following the branches to the end should lead to the correct diagnosis and therapy."42
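A minimal sketch of such a decision tree follows. It is illustrative only: the branching order and the placeholder endpoints are invented, and only the questions Groopman names (fever, swollen lymph nodes, affected family members, a throat culture) are taken from the passage above; nothing here is medical guidance.

def sore_throat_pathway(fever, swollen_nodes, family_affected, culture_positive):
    """Walk from the trunk symptom ("sore throat") through yes/no branches to an endpoint."""
    if fever or swollen_nodes:
        # A laboratory test appears farther down the trunk and splits the branch.
        if culture_positive:
            return "bacterial pharyngitis suspected: antibiotics (placeholder endpoint)"
        return "likely viral pharyngitis: supportive care (placeholder endpoint)"
    if family_affected:
        return "probable viral illness circulating in the household (placeholder endpoint)"
    return "watchful waiting and follow-up (placeholder endpoint)"

print(sore_throat_pathway(fever=True, swollen_nodes=True,
                          family_affected=False, culture_positive=False))

Encoded this way, the reasoning is fully visible, which is exactly the property Groopman worries about when the branches do not fit the patient in front of the doctor.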
Whether this kind of diagnostic process is an improvement over the intuitive approach hasn't yet been decided in medicine, either. Groopman thinks that this more structured approach to diagnosis may be effective for
simple problems, but "quickly falls apart when doctors need to think outside the box, when symptoms are vague, or multiple and confusing, or when test results are inexact. In such cases-the kinds of cases where we most need a discerning doctor-algorithms discourage physicians from thinking independently and creatively. Instead of expanding a doctor's thinking, they can constrain it."
Algorithms discourage physicians from thinking independently and creatively.
This discussion about the use of structured methods in medicine is taking place within an even larger context of evidence-based medicine, which is a push to make medical practice in general less intuitive: less of an art, and more of a science. For example, an evidence-based approach to medicine would involve the implementation of evaluation processes within existing diagnostic procedures. A couple of years ago, the New York Times published an article describing how some radiologists had instituted a feedback loop in their diagnosis of breast cancer, which provided them with the opportunity to track cases over time, find out where their mistakes were, and use that knowledge to become better at diagnosing problems.43 I've been told by medical professionals that this example is unusual, even in medicine, because of the rigor involved in the process. If these processes are found to be more effective, an argument could be made to incorporate them into existing diagnostic doctrine.
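In spirit, that feedback loop amounts to nothing more exotic than recording each call alongside the eventually confirmed outcome and then looking at where the misses cluster. The sketch below is illustrative only; the case records are invented.

# Each record pairs an initial diagnostic call with the later-confirmed outcome.
records = [
    ("case-01", "benign",    "benign"),
    ("case-02", "malignant", "malignant"),
    ("case-03", "benign",    "malignant"),   # a missed cancer (false negative)
    ("case-04", "malignant", "benign"),      # an unnecessary workup (false positive)
]

false_negatives = sum(1 for _, call, outcome in records
                      if call == "benign" and outcome == "malignant")
false_positives = sum(1 for _, call, outcome in records
                      if call == "malignant" and outcome == "benign")
correct = sum(1 for _, call, outcome in records if call == outcome)

print(f"accuracy: {correct / len(records):.0%}, "
      f"false negatives: {false_negatives}, false positives: {false_positives}")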
Medicine has an established doctrine-a formalized process-for diagnosing and treating patients; a doctrine that is taught to new physicians in medical school. Doctrine does not necessarily reflect 'best practices,' although in an optimal world it might. But doctrine alone may not provide the answer to improving analytic quality across the board because there might not be a single best way to approach each specific problem. According to Hubal, "experts often don't agree.
There is a goal (e.g., to diagnose a patient's presentation), but two experts may go about acquiring information, forming relationships, and drawing conclusions quite differently; that is, they may use different structured approaches to reach the goal. A doctrinal process is simply 'a way' to go about the business that is formally set as 'the way,' even though there may be other perfectly valid ways to succeed. And doctrine must really be used as a guide, because initial conditions will vary and individuals will have to adapt."44
Yet even if doctrine has limits in application and as a result is not the only answer, it can still act as a mechanism to inculcate best practices into new practitioners. So how might this discussion about the art and science of medicine assist intelligence practitioners in improving practices in their own field?
SHOULD INTELLIGENCE ANALYSIS BECOME MORE LIKE MEDICINE?
Just as medicine is debating how to adjust its diagnostic doctrine to become less like an art and more like a science, the same discussion is taking place regarding intelligence analysis as well. Still an open question, however, is whether there is an equivalent kind of doctrine for intelligence analysis.
In 1957, Washington Platt observed that "in intelligence production as yet we find little study of methods as such. Yet a systematic study of methods in any given field by an open-minded expert in that field nearly always leads to worthwhile improvement."45 But it wasn't until the mid-1990s that CIA's Product Evaluation Staff put together its analytic tradecraft notes in an effort to formalize analytic doctrine, and in 1999 its Center for the Study of Intelligence released Richards Heuer's "Psychology of Intelligence Analysis," which provides the articulation, rationale, and description of the technique known as Analysis of Competing Hypotheses, which is fast becoming a part of informal analytic doctrine.
But is there a mechanism to inculcate this doctrine into new practitioners? The medical profession has formal practices for developing and teaching doctrine, primarily through medical schools and continuing education programs that are a required component of the medical profession. Medicine is a mix of the art and the science, which involves both professional knowledge and craftsmanship. But medicine also possesses educational requirements for entry into the profession. In medicine, a doctor spends a substantial amount of time in school to learn some of the knowledge that has been accumulated thus far on medical science-in other words, how to diagnose and treat the various diseases and problems that
affect human health-with skills development acquired through an apprenticeship program and on-the-job training.
Intelligence analysis has similar educational and apprenticeship requirements for proficiency, but lacks the infrastructure and formal requirements to acquire the requisite knowledge and skills. Intelligence analysis has, for the most part, been practiced more as a craft than a profession; or as Hubal has said, "the educational and experiential processes have not (yet) been formalized."46 As a result, professional practices are still pretty rudimentary.
If there is some benefit to making intelligence analysis more like a formal profession, then greater professionalization is necessary. Professionalization, or the process of shifting a profession from less formal to more formal practices, relates to the interaction of the art and the science in the field. If you think about it, a craftsman is really an artist whose skill is developed through training and experience, while a professional is someone who has been formally educated in the 'science' of his or her field, and then uses that knowledge in an applied way. Crafts rely primarily on the skill of the individual practitioner, which does not improve very much from generation to generation without the accumulation of knowledge that the formalization of education in the field entails, while professions build on the knowledge of past practitioners and relay it to new professionals through their educational process.
There is a growing consensus that some kind of improved professionalization process is necessary, but an open question is the mechanism for doing so. Some of the pieces of this process are being put together at the national level by the staff of the Office of the Director of National Intelligence (ODNI), as they try to define analytic competencies and standards for intelligence analysts across the many analytic disciplines of the intelligence community. Others, such as myself, are advocates for a less hierarchical approach through the efforts of a professional association, and a possible model could be the American Medical Association (AMA).
Medicine wasn't always a profession. One hundred fifty years ago it was actually a craft, very much like intelligence analysis is today. As my co-author, Dr. Jonathan Clemente, has pointed out, the mechanism that pushed medicine from mostly a craft to mostly a profession was the American Medical Association, through its effort from about 1850 to about 1910. We've argued that the most effective way to professionalize intelligence analysis is through an overarching professional association modeled on the AMA.47 This association would then work on the adoption of centralized knowledge accumulation efforts and
the development of formal personnel practices such as: a structured selection process built around analytic competencies; training, education, and development programs; performance standards; and a code of ethics. In addition, the creation of a centralized focal point for knowledge regarding best practices, or as Hubal puts it, "a 'center for lessons learned' that maintains a repository of best structured practices, as well as meta-data describing when, why, how to apply those practices,"48 would enable intelligence analysis as an occupation to learn and improve over time in a way that it is currently unable to.
Just as medicine is debating whether or not to become more structured, the intelligence community has been examining the value of structured methods in greater detail. According to Randy Pherson, there is "a growing realization that we have to instill more rigor into the analytic process and that we can no longer afford to say we are too busy to get it right."49 In addition, some writers have advocated a more methodological approach to intelligence analysis. Two of the more prominent are Rob Johnston's call for the incorporation of methodologists into teams of substantive experts,50 and Tim Smith's suggestion that, to increase the rigor of analysis, the community should create "knowledge factories" explicitly using scientific methodology to produce all-source finished intelligence.51 Smith's paper won the 2006 DNI Galileo essay contest, signaling the community's interest in making the analytic process more rigorous and structured.
We can no longer afford to say we are too busy to get it right.
But as analytic standards and doctrine are developed, questions will be raised about whether to promote certain analytic techniques over others; whether to teach, or mandate the use of, certain techniques such as ACH or link analysis. For example, parts of the intelligence community have been encouraging analysts to use more structured methods in their analysis. At the community level, the ODNI has created an "Analysis 101" course as a way to introduce structured techniques to all new analysts in the community.52
Additional courses are being introduced at the agency level. Pherson observes that "the CIA has been engaged in a major training effort to introduce structured analytic techniques into its work process including mandatory training of all analysis managers in the core techniques."53 In addition, his company is teaching structured analytic techniques to analysts at various intelligence community organizations, including 1,400 FBI analysts in 2006, and is "engaged at several agencies in helping analysts apply
these techniques to everyday challenges."54 Finally, the National Security Agency's David T. Moore has written a book on critical thinking, which he used to develop a course on critical thinking and structured methodology that is being taught at a number of intelligence community organizations.55
Additional efforts to develop structured analytic techniques and teach them to practitioners are also underway. For example, the software version of Richards Heuer's Analysis of Competing Hypotheses56 is being adapted by CIA to accommodate multiple users. According to Heuer, it is being "designed for use in a collaborative environment ... to facilitate collaboration across organizational boundaries, and through all the firewalls, within the Intelligence Community." It is hoped that the ability to collaborate online in a common virtual workspace will "change how the interagency coordination process works."57 Additional work on ACH, being done at Mercyhurst College, is also ongoing.58
Some might perceive the teaching of these techniques as just the next step in the professionalization process. To a degree it is. Knowledge advancement is an iterative process, and it is important to teach new practitioners the more structured cutting-edge approaches as well as those which are more traditional, such as intuition. Essentially, the use of structured methods, if only as a guide or framework for addressing a problem, is better than no structure at all. But before certain structured analytic techniques become de rigueur throughout the intelligence community, I think it is important to take a step back and ask how structured intelligence analysis should be, and whether or not that structure will actually lead to improved analysis.
At this point, given the general intuitive approach that analysts use and the relative paucity of data showing that structured techniques would improve accuracy, I'm agnostic about the value of mandating use of more structured methods. I think it is important to teach them, because if there is value in their application then it would be good for practitioners to know how to use them. But people think in different ways, and until the intelligence community can demonstrate that these techniques are an improvement over intuitive models for all practitioners, I don't think their use should be mandated. Instead, more time, money, and effort should be devoted to developing the capacity to evaluate the utility of these approaches rather than just to developing and teaching them. As Rob Johnston observed in 2003, "little work has been done comparing structured techniques to intuition."59
A step in the right direction was the January 2007 ODNI conference on 'improving intelligence analysis' intended to "examine a new way of doing business in which analytic
practices are based not on tradition, personal experience, or intuitive plausibility but on solid evidence of what works."60 Institutionalizing this kind of effort would be even better, and a recommendation by Steve Rieber would accomplish this through the creation of a National Institute for Analytic Methods, modeled on the National Institutes of Health, to sponsor research on the effectiveness of analytic tools and techniques.61 If this kind of research infrastructure is created, it will be possible to gain knowledge, scientifically, about the effectiveness of different approaches62 rather than rely primarily on anecdotal evidence as is currently the case. And as Folker has observed, "if empirical evidence shows that such an improvement will take place, then analysts may consider it worth the investment of time, effort, and risk required on their part to regularly exploit structured methodologies."63
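As a sketch of what that kind of evaluation could look like, the fragment below compares the judgment accuracy of a group using a structured technique against a group working intuitively. It is illustrative only: the counts are invented, and a real study would need careful design (randomized assignment, comparable questions, adequate sample sizes) that this toy comparison glosses over.

from math import sqrt

def compare_accuracy(hits_structured, n_structured, hits_intuitive, n_intuitive):
    """Return each group's accuracy and a rough z-score for the difference in proportions."""
    p_s = hits_structured / n_structured
    p_i = hits_intuitive / n_intuitive
    pooled = (hits_structured + hits_intuitive) / (n_structured + n_intuitive)
    standard_error = sqrt(pooled * (1 - pooled) * (1 / n_structured + 1 / n_intuitive))
    return p_s, p_i, (p_s - p_i) / standard_error

structured, intuitive, z = compare_accuracy(hits_structured=62, n_structured=100,
                                             hits_intuitive=48, n_intuitive=100)
print(f"structured: {structured:.0%}, intuitive: {intuitive:.0%}, z = {z:.2f}")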
Folker had it right when he observed that "the basic research question remains unanswered: Will the use of structured methods improve qualitative intelligence analysis?"64 ... "If qualitative intelligence analysis is an art, then efforts to improve it should focus on measuring the accuracy of one's intuition, selecting those analysts with the best track record, and educating them to become experts in a given field. If, on the other hand, qualitative intelligence analysis is a science, then analysts should be trained to select the appropriate method for a given problem from a variety of scientific methodologies and exploit it to guide them through the analytical process."65
In the end, since intelligence analysis is both an art and a science, efforts to improve it should incorporate both intuition and structured methods.66 But the same kind of rigor that advocates for structured methodologies argue should be used in intelligence analysis should also be applied to the evaluation of the utility of those very techniques.
Stephen Marrin, a former analyst with the CIA and the congressional Government Accountability Office, is an assistant professor in Mercyhurst College's Intelligence Studies Department. He is a doctoral candidate at the University of Virginia, and has written many articles on various aspects of intelligence studies, including one that led to the creation of CIA University. In 2004, the National Journal described him as one of the country's top ten experts on intelligence reform. He is a member of the editorial advisory board of the International Journal of Intelligence and Counterintelligence, the editorial review board of the National Military Intelligence Association's American Intelligence Journal, and American Military University's Intelligence Advisory Council, and serves on the Board of Directors of the International Association for Intelligence Education.
1 The core concepts behind this paper were derived from presentations given at the Naval War College in March 2006 ("Making Better Communicators") and the Eighth Annual Mercyhurst College International Colloquium on Intelligence in June 2006 ("Non-Intelligence Analytical Approaches").
2 For more information on particular structured analytic techniques, see: "A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis." CIA Sherman Kent School. Vol. 2. No. 2. June 2005; "Handbook of Analytic Tools and Techniques." Pherson Associates, LLC. 2005; Kristan J. Wheaton, Emily E. Mosco, and Diane E. Chido. "The Analyst's Cookbook." Mercyhurst College Institute of Intelligence Studies Press. 2006.
3 Rob Hubal; Email. December 26, 2006.
4 Alan Schwartz. Email. 25 December 2006.
5 Robert Folker. "Intelligence Analysis in Theater Joint Intelligence Centers: An Experiment in Applying Structured Methods." Occasional Paper Number Seven: Center for Strategic Intelligence Research. Joint Military Intelligence College. January 2000. 5.
6 Folker, 14.
7 I am beholden to Rob Hubal for suggesting this distinction. This failure to carefully define the concept of intuition with rigor is something that Rob Johnston also referenced in
"Foundations for Meta-Analysis: Developing a Taxonomy of Intelligence Analysis Variables." CIA Studies in Intelligence V. 47. No.3. (2003).
8 Robert Folker. Email. 15 June 2007.
9 Folker, 6.
10 Folker, 6-7; citing Richard K. Betts, "Surprise, Scholasticism, and Strategy: A Review of Ariel Levite's Intelligence and Strategic Surprises (New York: Columbia University Press, 1987)," International Studies Quarterly 33, no. 3 (September 1989): 338, and John L. Peterson, "Forecasting: It's Not Possible." Defense Intelligence Journal 3, no. 2 (Fall 1994): 37-38.
11 Folker, 9-10.
12 Folker, 13.
13 David Brooks. "The CIA: Method and Madness." The New York Times. February 3, 2004. A23.
14 Folker, 11.
15 Rob Johnston. Integrating Methodologists into Teams of Substantive Experts. Studies in Intelligence. Vol. 47. No. 1.
16 Folker, 14.
17 Randy Pherson. Email. December 24, 2006.
18 Randy Pherson. Email. December 24, 2006.
19 For more information on Jack Davis and his involvement with Alternative Analysis and other aspects of analytic methodology, see: http://www.oss.net/extra/page/?action=page_show&id=21&module_instance=4. For more information on alternative analysis in general, see: Roger Z. George. "Fixing the Problem of Analytical Mind-Sets: Alternative Analysis." International Journal of Intelligence and Counterintelligence. Vol. 17. No. 3. (2004). 385-404.
20 Stanley A. Feder. "FACTIONS and Policon: New Ways to Analyze Politics." In Inside CIA's Private World: Declassified Articles from the Agency's Internal Journal, 1955-1992, ed. H. Bradford Westerfield. New Haven: Yale University Press. 274-292.
21 According to the following website, the Strategic Assessment Group's "methodology ranges from the pooling of experts to the construction of scenarios." See: http://home.gwu.edu/~esialsf/Final%20Report%20-%20Fall%202003.pdf
22 For a dated description of the Global Futures Partnership, see:
T. Irene Sanders and Judith A. McCabe. "The Use of Complexity Science." Washington Center for Complexity and Public Policy. October 2003. 36. http://www.complexsys.org/pdf/ComplexityScienceSurvey.pdf Also see a description of the program contained in the mirror of an old CIA webpage at: http://cryptome.org/cia-di.htm
23 Folker, 9.
24 Richards J. Heuer, Jr., "Adapting Academic Methods and Models to Government Needs," in Quantitative Approaches to Political Intelligence: The CIA Experience, ed. Richards J. Heuer, Jr. (Boulder: Westview Press, Inc., 1978), 6.
25 Folker, 8.
26 Folker, 14.
27 Arthur S. Hulnick. Fixing the Spy Machine: Preparing American Intelligence for the Twenty-First Century. Praeger; Westport, CT. 1999. 53. These and the following quotes are from:
Stephen Marrin. "Homeland Security and the Analysis of Foreign Intelligence." Markle Foundation Task Force on National Security in the Information Age Background Paper. October 2002. Also reprinted under the same title in: The Intelligencer:
Journal of U.S. Intelligence Studies. Vol. 13. No. 2. (2003). 25-36.
28 Alan Schwartz. Email. December 25, 2006.
29 Folker, 15.
30 Stanley A. Feder. "Forecasting for Policy Making in the Post-Cold War Period." Annual Review of Political Science. Vol. 5. (2002). 111-125. http://polisci.annualreviews.org/cgi/content/full/5/1/111
31 Stanley A. Feder. "FACTIONS and Policon: New Ways to Analyze Politics." In Inside CIA's Private World: Declassified Articles from the Agency's Internal Journal, 1955-1992, ed. H. Bradford Westerfield. New Haven: Yale University Press. 275.
32 Robert D. Folker, Jr. Exploiting Structured Methodologies to Improve Qualitative Intelligence Analysis. Master's Thesis. Joint Military Intelligence College. July 1999. 2.
33 Feder. "Forecasting for Policy Making in the Post-Cold War Period."
34 Folker, 14.
35 Richards J. Heuer. Adapting Academic Methods and Models to Governmental Needs: The CIA Experience. Strategic Studies Institute, US Army War College. 31 July 1978. 5. Heuer has since said that the analysts were resistant to quantitative techniques but more receptive to qualitative approaches.
36 Sherman Kent. Strategic Intelligence for American World Policy. Princeton University Press; Princeton, NJ. 1951. (reprint of 1949 version). vii.
37 Walter Laqueur, "The Question of Judgment: Intelligence and Medicine," The Journal of Contemporary History, Vol. 18. 1983, (533-548). 535; as quoted in Stephen Marrin and Jonathan Clemente. "Improving Intelligence Analysis by Looking to the Medical Profession." International Journal of Intelligence and Counterintelligence. Vol. 18. No.4. (2005) (707-729).708.
38 In particular, see: Rob Johnston. "Foundations for Meta-Analysis: Developing a Taxonomy of Intelligence Analysis Variables." CIA Studies in Intelligence V. 47. No. 3. (2003).
39 Richards J. Heuer. Email. January 5, 2005.
40 Jeffrey Pfeffer and Robert I. Sutton, "Evidence-Based Management," Harvard Business Review, January 2006. As cited in the invitation for the 2007 ODNI Conference on Improving Intelligence Analysis.
41 Jerome Groopman, quoted in "The Doctor's In, But Is He Listening?" National Public Radio Morning Edition. March 16, 2007. http://www.npr.org/templates/story/story.php?storyId=8946558
42 Jerome Groopman. Excerpt from "How Doctors Think." National Public Radio Morning Edition. March 15, 2007. http://www.npr.org/templates/story/story.php?storyId=8946558
43 Michael Moss. "Mammogram Team Learns From Its Errors." New York Times. June 28, 2002.
44 Rob Hubal; Email. December 26, 2006.
45 Washington Platt. Strategic Intelligence Production: Basic Principles. Frederick A. Praeger: New York. 1957. 151.
46 Rob Hubal; Email. December 26, 2006.
47 Stephen Marrin and Jonathan Clemente. "Modeling an Intelligence Analysis Profession on Medicine." International Journal of Intelligence and Counterintelligence. Vol. 19. No.4. (Winter 2006-2007). 642-665.
48 Rob Hubal; Email. December 26, 2006.
49 Randy Pherson Email. December 24, 2006.
50 Rob Johnston. Integrating Methodologists into Teams of Substantive Experts. Studies in Intelligence. Vol. 47. No.1.
51 Timothy J. Smith. "Predictive Warning: Teams, Networks, and Scientific Method." Roger Z. George and James B. Bruce. Getting it Right: Expanding Frontiers for Intelligence Analysis. Georgetown University Press. 2007 (forthcoming).
52 Mary Louise Kelly. "Intelligence Community Unites for 'Analysis 101.'" NPR Morning Edition. 7 May 2007. http://www.npr.org/templates/story/story.php?storyId=10040625
53 Randy Pherson. Email. December 24, 2006.
54 Randy Pherson Email. December 24, 2006.
55 David T. Moore. Critical Thinking and Intelligence Analysis. Occasional Paper Number Fourteen (Washington, DC: National Defense Intelligence College, 2006). It includes the syllabus for NSA's critical thinking and structured analysis class in the appendix, pages 97-115.
56 Available at http://www2.parc.com/istl/projects/ach/ach.html
57 Richards J. Heuer, Jr. "The Future of 'Alternative Analysis.'" Presentation notes for the Director of National Intelligence conference on Improving Intelligence Analysis: What Works? How Can We Tell? Lessons from Outside the Intelligence Community, Chantilly, VA, January 9-10, 2007. 4.
58 Kristan Wheaton and Diane Chido. "Structured Analysis of Competing Hypotheses: Improving a Tested Intelligence Methodology." Competitive Intelligence Magazine, November/December 2006. http://www.mcmanis-monsalve.com/assets/publications/intelligence-methodology-1-07-chido.pdf Also see: "Structured Analysis of Competing Hypotheses: Theory and Application." Analytic Methodologies Project: Mercyhurst College Institute of Intelligence Studies Press. 2006.
59 Rob Johnston. "Foundations for Meta-Analysis: Developing a Taxonomy of Intelligence Analysis Variables." CIA Studies in Intelligence V. 47. No.3. (2003).
60 Conference materials. Improving Intelligence Analysis: What Works? How Can We Tell? Lessons from Outside the Intelligence Community." 9-10 January 2007. Chantilly, VA.
61 Steven Rieber and Neil Thomason. Creation of a National Institute for Analytic Methods. Studies in Intelligence. Vol. 49. No.4. (2005). 71-77.
62 Examples of this kind of research into analytic methods include: Leonard Adelman, Paul E. Lehner, Brant A. Cheikes, and Mark F. Taylor. "An Empirical Evaluation of Structured Argumentation Using the Toulmin Argument Formalism." IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans. Vol. 37. No. 3. (May 2007). 340-347. Also see: Paul E. Lehner, Leonard Adelman, Brant A. Cheikes, and Mark Brown. "Confirmation Bias in Complex Analyses." IEEE Transactions on Systems, Man, and Cybernetics. Forthcoming. The former paper reports the results of an experiment assessing the utility of a structured argumentation technique for intelligence analysis, and the latter paper reports the results of an experiment assessing the ability of ACH to mitigate confirmation bias.
63 Folker, 15.
64 Folker, 8.
65 Folker, 6.
66 For another perspective on improving the art and science of intelligence analysis, see: Stephen Marrin. "Adding Value to the Intelligence Product." Handbook of Intelligence Studies. (Ed. Loch Johnson) Routledge. 2006. 199-210.