To cite this article: David Waldner (2015) Process Tracing and Qualitative Causal Inference, Security
Studies, 24:2, 239-250, DOI: 10.1080/09636412.2015.1036624
Security Studies, 24:239–250, 2015
Copyright © Taylor & Francis Group, LLC
ISSN: 0963-6412 print / 1556-1852 online
DOI: 10.1080/09636412.2015.1036624
DAVID WALDNER
All three authors express strong preferences for using process tracing to study
particular historical cases. Mahoney identifies process tracing as a procedure
for developing explanations of individual historical cases, averring that “the
notion of an average causal effect . . . is not particularly useful when applied
to this kind of historical question.”1 Bennett privileges the particular as well,
observing that “policymakers are not very interested in knowing how most
countries usually respond to a policy instrument; rather, they want to know
how country X will respond this time to this instrument.”2 Tannenwald draws
connections to diplomatic history and urges process tracers to emphasize the
habit of “good storytelling” along with methodological rigor.3
Yet each author also gives reason, at least tacitly, to reconsider this
devaluation of average treatment effects. Tannenwald observes that an important contribution of process tracing is to identify relevant causal mechanisms; we might reasonably assume that these mechanisms will have relevance beyond the specific case being studied.4 Mahoney's reconstruction of
procedures for explaining individual historical cases echoes this observation,
as it rests at crucial moments on “relevant preexisting theories and gener-
alizations.”5 Bennett excoriates George W. Bush administration officials for
ignoring “well-developed academic theories” that would have predicted a
high likelihood of civil conflict and a low likelihood of successful democ-
ratization in Iraq.6 Each of these remarks points to the value of knowledge
about average treatment effects.
The relevant question is not whether to privilege case-specific explanations over average treatment effects, or vice versa; the question is how to advance our knowledge of the general and the specific simultaneously.
The standard I develop below is a procedure to merge the general and the
specific.
CAUSATION
1 James Mahoney, "Process Tracing and Historical Explanation," Security Studies 24, no. 2 (April–June 2015): 202–203.
2 Andrew Bennett, "Using Process Tracing to Improve Policy Making: The (Negative) Case of the 2003 Intervention in Iraq," Security Studies 24, no. 2 (April–June 2015): 229.
3 Nina Tannenwald, "Process Tracing and Security Studies," Security Studies 24, no. 2 (April–June 2015): 227.
4 Ibid.
5 Mahoney, "Process Tracing and Historical Explanation," 202.
6 Bennett, "Using Process Tracing to Improve Policy Making," 239 (quotation, 9).
7 Mahoney, "Process Tracing and Historical Explanation."
It should strike most readers as odd to talk about the causes of a fire with-
out any understanding of the underlying chemical processes that produce
fire, but this is precisely what Mackie’s INUS account does. This is not to
deny the importance of finding that a particular fire was caused by a short
circuit; rather it is to insist that a complete causal account of a fire requires
two types of features, the underlying causal model and the specific circum-
stances of the particular fire. Note, then, that insofar as we combine the general causal model with the reconstruction of the particular fire, we satisfy the desideratum of merging the general (average treatment effects) and the particular.
8 John L. Mackie, The Cement of the Universe: A Study of Causation (Oxford: Oxford University Press, 1974).
9 John L. Mackie, "Causes and Conditions," in Causation, ed. Ernest Sosa and Michael Tooley
Mahoney complements his discussion of causation as INUS conditions
with a discussion of sequence analysis and mechanisms. For many scholars, the defects of the regularity or Humean account of causality prompt the requirement that knowledge of causation include knowledge of the underlying mechanisms; this argument was introduced to the field of security
studies in David Dessler’s plea for a mechanism-based causal theory of war.11
Thus, Mahoney counsels process tracers to identify mechanisms such that X
→ M → Y. This is an excellent suggestion: producing causal chains marks
progress over establishing simple associations between X and Y , because
causal chains yield more fine-grained knowledge and they yield additional
opportunities for theory testing. The value of this suggestion is restricted,
however, because Mahoney defines mechanisms as “a factor that intervenes
between a cause and outcome. I treat mechanisms in the same way as causes
11 David Dessler, “Beyond Correlations: Toward a Causal Theory of War,” International Studies
Qualitative and Multi-Method Research 8, no. 2 (Fall 2010): 30–34; David Waldner, “Process Tracing
and Causal Mechanisms,” in The Oxford Handbook of Philosophy of Science, ed. Harold Kincaid (Oxford:
Oxford University Press, 2012), 65–84. Alexander L. George and Andrew Bennett, Case Studies and Theory
Development in the Social Sciences (Cambridge: MIT Press, 2005), 131–45, provides an early statement of
a mechanistic position, albeit without the critical property of invariance.
14 “Combustion,” Wikipedia.com, http://en.wikipedia.org/wiki/Combustion, accessed 24 September
2013.
INFERENTIAL VALIDITY
15 See Peter Machamer, Lindley Darden, and Carl F. Craver, “Thinking about Mechanisms,” Philosophy
of Science 67, no. 1 (March 2000): 1–25; Stuart Glennan, "Rethinking Mechanistic Explanation," Philosophy of Science 69, no. 3 (September 2002): S342–S353, for important philosophical statements of mechanisms
as invariant causal processes.
16 Of course the underlying causal model and mechanisms may be suppressed when specific causal
accounts are given; that the model is often taken as background knowledge is an indicator of our
confidence in the account, not an indicator of its insignificance.
17 The fundamental problem of causal inference is discussed by Paul Holland, “Statistics and Causal
Inference,” Journal of the American Statistical Association 81, no. 396 (December 1986): 945–60; Gary
King, Robert L. Keohane, and Sidney Verba, Designing Social Inquiry: Scientific Inference in Qualitative
Research (Princeton, N.J.: Princeton University Press, 1994), 79.
explain some event, and “smoking-gun tests,” whose passage is sufficient but
not necessary for the truth of a hypothesis.18 Furthermore, our beliefs about
the truth status of a hypothesis and its rivals conditional on these tests can be
expressed in terms of Bayesian probabilities.19 As Tannenwald summarizes,
“Not all evidence is equal.”20 Some evidence might have such high probative
value that it allows us to support (provisionally and fallibly, to be sure) one
hypothesis and reject its rivals.
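Tannenwald's point that "not all evidence is equal" has a direct Bayesian rendering. The sketch below uses hypothetical likelihoods (the numbers are illustrative, not drawn from any cited study): a smoking-gun test has high uniqueness, so the evidence is unlikely unless the hypothesis is true; a hoop test has high certitude, so the evidence is nearly certain if the hypothesis is true.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.5  # agnostic starting credence in hypothesis H

# Smoking-gun test: E is very unlikely unless H is true (high uniqueness).
# Passing it is sufficient-like: the posterior jumps.
print(round(posterior(prior, 0.40, 0.02), 3))  # 0.952

# Hoop test: E is nearly certain if H is true (high certitude).
# Passing it barely moves the posterior ...
print(round(posterior(prior, 0.95, 0.60), 3))  # 0.613

# ... but failing it (observing not-E) is necessary-like and nearly fatal.
print(round(posterior(prior, 0.05, 0.40), 3))  # 0.111
```

The probative value of a test thus lives entirely in the ratio of the two likelihoods, which is why uniqueness and certitude must be demonstrated rather than asserted.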
There is much to be said about this well-articulated and highly valuable
framework of hypothesis testing. Let me confine my remarks to three points:
interpretive debates about the status of a hypothesis test; ambiguity about
how much of a process must be tested before a hypothesis alleged to explain
it “passes” the test; and lingering questions about whether hoop and smoking
gun tests, even with the Bayesian armature, are sufficient to satisfy the criteria
of causal inference.
As we know, hoop and smoking gun tests are, respectively, necessary but not sufficient and sufficient but not necessary for the acceptance of a
hypothesis. But necessity and sufficiency are derived characteristics; Van Ev-
era originally wrote of degrees of uniqueness and certitude. The change is
subtle but highly consequential, because, as Van Evera notes, “interpretive
disputes also arise from quarrels over the uniqueness and certitude of predic-
tions.”21 Uniqueness and certitude, it must be emphasized, are not objective
features of a hypothesis. Perhaps in many cases there will be ready assent
to the degree of uniqueness or certainty of a set of hypotheses; but it is
equally reasonable to note, as Van Evera did, that many hypotheses lack
either uniqueness or certainty, making them “straw-in-the-wind tests” with
limited inferential value. Tannenwald’s discussion of rival process tracing
accounts of the end of the Cold War illustrates the difficulty of arbitrating
these interpretive disputes.22 At minimum, we should ask for clear guide-
lines for measuring the degree of uniqueness and certainty; these features of
a test must be demonstrated, not asserted. And as David Collier has already
18 Stephen Van Evera, Guide to Methods for Students of Political Science (Ithaca, NY: Cornell University Press, 1997), 30–34; Mahoney, "Process Tracing and Historical Explanation." For his efforts to
distinguish easy and hard tests, see James Mahoney, “The Logic of Process Tracing Tests in the Social
Sciences,” Sociological Methods & Research 41, no. 4 (November 2012): 570–97.
19 Andrew Bennett, “Process Tracing: A Bayesian Perspective,” in The Oxford Handbook of Political
Methodology, ed. Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier (Oxford: Oxford Uni-
versity Press, 2008), 702–21. More recently, Bennett has suggested modifications of the basic framework
that relax the unnecessarily restrictive categorical logic of necessity and sufficiency. See Andrew Bennett, "Process Tracing with Bayes: Moving beyond the Criteria of Necessity and Sufficiency," Qualitative
and Multi-Method Research 12, no. 1 (Spring 2014): 46–51, which draws on Macartan Humphreys and
Alan Jacobs, “Mixing Methods: A Bayesian Unification of Qualitative and Quantitative Approaches,” pre-
sented at the Annual Meeting of the American Political Science Association, Chicago, August 2013. These
developments complicate but do not fully obviate the critical points raised below.
20 Tannenwald, “Process Tracing and Security Studies,” 226.
21 Van Evera, Guide to Methods, 33.
22 Tannenwald, “Process Tracing and Security Studies.”
that case.”24 Yet the Bayesian logic favored by Mahoney and Bennett does
not necessarily imply this continuity criterion, which is conspicuously absent
from virtually all recent writings on process tracing.
Suppose, after all, that an analyst is confident that there are only five
plausible hypotheses to explain outcome Y. Each of the first four hypotheses
fails a hoop test on its independent variable and so all four are eliminated;
the fifth survives its hoop test. Must we continue tracing the process linking
X to Y, or shall we be satisfied by eliminative induction on the independent
variables? From a Bayesian perspective, the answer depends upon the priors and the likelihoods. Yet from the perspective of categorical logic, all
rivals have been eliminated and hence the sole survivor must be confirmed,
even without tracing the entire causal chain. Or suppose the first four of
these hypotheses each passes a hoop test on their independent variable and
so are not eliminated; but the fifth hypothesis passes both a hoop test and
a smoking gun test on its independent variable. By the categorical logic
of smoking gun tests, this passage should be sufficient for the truth of the
hypothesis. Must we continue tracing the process linking X to Y? It would
be highly valuable—perhaps even obligatory—for process tracers to address
these questions, for they seem to imply that the systematic tracing of an
entire process is not necessary for making inferences.
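The five-hypothesis scenario can be made concrete. Under categorical logic, a failed hoop test eliminates a rival outright; under a Bayesian reading in which failure is merely improbable rather than impossible, the surviving hypothesis is favored but not confirmed. All priors and likelihoods below are hypothetical:

```python
def renormalize(priors, likelihoods):
    """Posterior over mutually exclusive rivals, given P(evidence | H_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

priors = [0.2] * 5  # five rival hypotheses, equal prior credence

# Categorical reading: H1-H4 fail their hoop tests outright (likelihood 0),
# so H5 is "confirmed" without tracing the rest of its causal chain.
categorical = renormalize(priors, [0.0, 0.0, 0.0, 0.0, 0.9])
print(categorical[-1])  # 1.0

# Bayesian reading: the same failures are improbable, not impossible,
# so H5 is strongly favored yet short of certainty.
bayesian = renormalize(priors, [0.05, 0.05, 0.05, 0.05, 0.9])
print(round(bayesian[-1], 3))  # 0.818
```

The gap between the two final numbers is exactly the gap between eliminative induction on independent variables and a full tracing of the process.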
Finally, let us return to the question of whether the current approach
to process tracing, when successful, is equivalent to causal inference. I have
suggested that critical components of current statements of process tracing
rest on Humean or regularity notions of causation that are insufficient to
warrant causal inference. Explicating the Bayesian logic of process tracing
23 David Collier, “Understanding Process Tracing,” PS: Political Science and Politics 44, no. 3 (October
2011): 825.
24 George and Bennett, Case Studies and Theory Development, 207.
I have suggested three challenges for process tracers: to resolve the ten-
sion between the particular and the general (between unit-level and average
treatment effects); to articulate and operationalize a non-Humean notion of
causation; and to justify using process tracing as a means of valid causal
inference, given the constraints of the fundamental problem of causal infer-
ence. Here I show how to address all three concerns using what I have called
elsewhere the “completeness standard.”28 I illustrate the proposal using John
Owen’s work on the liberal origins of the democratic peace.29
Aeschylus, and the Foundations of Qualitative Causal Inference,” unpublished manuscript, University
of Virginia, 2015.
28 For further discussion and illustrations of the completeness standard, see David Waldner, “What
Makes Process Tracing Good? Causal Mechanisms, Causal Inference, and the Completeness Standard
in Comparative Politics,” in Process Tracing: From Metaphor to Analytic Tool, ed. Andrew Bennett and
Jeffrey T. Checkel (Cambridge: Cambridge University Press, 2014), 126–52.
29 John M. Owen IV, Liberal Peace, Liberal War: American Politics and International Security (Ithaca,
NY: Cornell University Press, 1997); Owen, “How Liberalism Produces Democratic Peace,” International
Security 19, no. 2 (Autumn 1994): 87–125.
30 James Woodward, Making Things Happen: A Theory of Causal Explanation (Oxford: Oxford
does not use the concepts of a causal graph, random variables, or joint probability distributions.
and thus does not represent a joint probability distribution. The combination of a causal graph with a set of event-history maps thus bridges the gap between the search for generality and the privileging of specific historical
outcomes and explanations. Figure 2 is a possible depiction of the events
in one of Owen’s early case studies, Franco-American relations from 1796
to 1798.32 By my proposal, Owen would construct a separate event-history
map for each of his ten historical case studies.
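The two-level pairing admits a minimal sketch: a causal graph carries the general variables and directed edges, while each case's event-history map keys concrete events to those abstract nodes. Apart from "refusal to fight democracies" and the Republican agitation event, which the article itself uses, the labels below are my own illustrative stand-ins for Owen's variables:

```python
# General level: a causal graph over the variables of the theory
# (node names and directed edges only; no probability distribution).
causal_graph = {
    "nodes": [
        "liberal_ideology",
        "perceive_other_as_liberal",
        "refusal_to_fight_democracies",
        "peace",
    ],
    "edges": [
        ("liberal_ideology", "perceive_other_as_liberal"),
        ("perceive_other_as_liberal", "refusal_to_fight_democracies"),
        ("refusal_to_fight_democracies", "peace"),
    ],
}

# Case level: an event-history map ties abstract nodes to concrete,
# case-specific events (illustrative entries for 1796-98).
event_history = {
    "refusal_to_fight_democracies":
        "Republicans agitate against war with France",
    "peace": "no Franco-American war, 1796-1798",
}

def unmapped_nodes(graph, history):
    """First correspondence check: which abstract nodes still lack a
    concrete event in this case's history?"""
    return [n for n in graph["nodes"] if n not in history]

print(unmapped_nodes(causal_graph, event_history))
# ['liberal_ideology', 'perceive_other_as_liberal']
```

One such map per case study makes the completeness standard auditable: any node without a corresponding event marks a step of the process left untraced.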
The third step in the procedure I am advocating is descriptive inference,
or checking the correspondence between the event-history map and the
causal graph. This procedure uses the standard tools of measurement theory:
construct validity, measurement reliability, and measurement validity. Do the
events in the map represent the conceptual connotation of the corresponding
node? Is the author’s evidence sufficient to confirm the descriptive inference?
For example, corresponding with the node “refusal to fight democracies =
true” in the causal graph is the event-history “republicans agitate against war
with France." Here we ask whether the evidence about the events supports the event-history as a concrete historical representation of the abstract node in
the causal graph. Owen provides a detailed and well-documented historical
narrative of Republican members of Congress introducing legislation against
the war and refusing to vote for a war declaration.33 Owen gives substantial
empirical support to his inference that although President John Adams clearly
wanted war with France, going so far as to draft a war message to Congress,
he did not present the message to Congress because he knew that Republican
opposition would block its passage. Notice that Owen provides three direct
pieces of evidence: Republican actions in Congress, such as introducing
32 Owen does not formally and graphically represent his narrative, and the historical narrative does
not precisely correspond to the template set by the causal graph. I thank Owen for helping me with the
reconstruction of this event-history map.
33 Owen, “How Liberalism Produces Democratic Peace,” 86–87; Owen, Liberal Peace, Liberal War,
107.
“Trust, but Verify: The Transparency Revolution and Qualitative International Relations,” Security Studies
23, no. 4 (October 2014): 663–88.
explanations that can potentially satisfy the demands of unit-level causal in-
ference. The framework can accommodate other understandings of process
tracing as well. Some qualitative scholars prefer to immerse themselves in
context and sequence; they look to process tracing as disciplined “soaking
and poking,” to use Richard Fenno’s time-honored phrase. Think of this
inductive narrative approach as the construction of an event-history map
without the corresponding construction of a causal graph. Mahoney ad-
vocates using process tracing for theory discovery, extracting causal factors
from fine-grained analysis of a case.35 Think of this approach as the inductive
construction of a causal graph from the details of an event-history map, con-
verting events into random variables. The completeness standard is also fully
consistent with Bayesian hypothesis testing, which would be used to make
inferences from event-history maps to causal graphs. The additional struc-
ture of the completeness standard, however, promises substantial epistemic
value-added. These various uses of process tracing all have considerable
value, but scholars using them should carefully tailor the claims that they
make to the standards that they satisfy. Scholars using process tracing that
does not identify invariant causal mechanisms and that does not identify an
alternative instrument for overcoming the fundamental problem of causal
inference should adjust their claims of causal inference accordingly.
ACKNOWLEDGMENTS
The author thanks Andrew Bennett, Colin Elman, James Mahoney, John
Owen, and two anonymous reviewers for Security Studies for their very
helpful comments on earlier drafts of this essay.