
BOOK REVIEWS

methodological, theoretical, and ethical issues that accompany this innovation in evaluation. In essence, the collection addresses the most salient factors in this adaptation process. The text demands not “immersion” (to use Bank’s terminology), but careful study and attention to the issues and experiences discussed by these authors.

Bank attempts a “personal reorganization” of the text and proceeds to evaluate the book within the confines of this new, but undefined, format. Bank asks and answers her own questions. These questions spring from “long conversations” with “a bona fide and well trained ethnographer,” as well as from Bank’s own personal search for “tips and techniques.” I was delighted to note that despite her distortion of the text’s structure, Bank found much material to answer her questions. She offers a “distilled” list of “insights, suggestions, examples, and aphorisms . . . for the novice in fieldwork,” at the end of her discussion. I am indebted to her for culling these “how-to-do-it” tips from the text. Her efforts, however, will not speak to those scholars, students, and policymakers who have more than a novice’s interest or experience in fieldwork and evaluation.

Bank would “like to see a more systematic comparison between the various perspectives in evaluation and ethnography”, an interesting topic, but not the focus of this book. I might recommend a new collection, which may be closer to the model she envisions: Educational Evaluation: Ethnography in Theory, Practice, and Politics, Fetterman, D. M. and M. A. Pitman (Beverly Hills, CA: Sage, 1986). This second collection presents the next stage in the evolution of this disciplinary endeavor and may anticipate her interests.

The collection in hand, however, presents the art and science of what ethnographers actually do in practice. A thorough review would have addressed the significance of using a Geertzian approach to discuss ethnography’s contribution to evaluation and noted the value of using a methods chapter as a lens through which to view the chapters. Moreover, the multilevel and multidimensional model used in one of the collection’s more significant national ethnographic evaluations, as well as the subject of programmatic and policy recommendations, would have been discussed. A critical reviewer would ask about the validity and reliability of the techniques used, the utility of the concept of culture on the program and evaluation project levels, the potentially contradictory nature of the term ethnographic evaluation, the conceptual crossroad of methodology and ethics (including “guilty knowledge and dirty hands”), the reflective nature of the endeavor for fieldworkers (across disciplines), and the issue of whether formalization negates ethnographicness. A technically informed reviewer would penetrate to more complex questions such as the role of cognitive theory and bias in ethnographic evaluation today or the relative merits of continuous versus non-continuous fieldwork in one’s own culture.

Bank’s discussion leaves one with the sense that the reviewer is not in touch with either the issues or the people involved in this enterprise. She left unasked both surface and substantial questions. The review provided insufficient insight into context or sensitivity to central issues. An informed book review, however, is itself a scholarly contribution that should be evaluated as thoroughly and responsibly as any other analytical endeavor.

Qualitative Data Analysis: A Sourcebook of New Methods, by Matthew B. Miles and A. Michael Huberman. Beverly Hills, CA: Sage, 1984. 263 pages.

Reviewer: Thomas A. Schwandt

“You know what you display” (p. 22) is the theme adopted by Miles and Huberman in their educating and sometimes irritating discussion of analyzing qualitative data. “Beware of Greeks bearing gifts” is an adage I think applies equally well to this book, but more on that later.

The five chapters which constitute the meat of the book are bounded by an interesting introduction in which the authors explain their methodological point of view and the nature of the book and by a very brief (2 pages) set of concluding remarks where they share some general pointers on data analysis. Within these boundaries Miles and Huberman discuss the results of their efforts of “casting about” for “manageable and straightforward methods for analyzing qualitative data.” They present 49 specific methods in a standardized format: analysis problem (the problem for which the method is a proposed solution), description, illustration (including how to build the display and enter and analyze the data), variations on the method, advice on use, and time required. One hundred charts and figures illustrate the methods. A field study of school improvement supplies the qualitative data used in the illustrations and discussions.

Chapter 2 begins by explaining the need for prestructured research design before starting a field study. The authors briefly discuss the merits of “inductive,” “loosely designed,” “emergent” approaches versus “tight,” “prestructured” designs. They state that their “stance lies off center, toward the structured end” of the continuum of designs, and they build a case accordingly.

Their method of focusing and bounding the collection of qualitative data is a four-stage process which includes building a conceptual framework, formulating research questions, sampling, and developing instrumentation.

The conceptual framework is a graphic illustration of the “key factors or variables” to be studied. In this way the researcher labels probable “bins” of discrete events and behaviors and achieves “some clarity of thought about their interrelationships” (p. 28). Research questions arise from the relationships stipulated in the conceptual framework. Through an iterative process the researcher generates general research questions as a first step toward “operationalizing” the conceptual framework. These first two stages aid in focusing data collection. The third stage, sampling, actually sets boundaries for data gathering. Here, the researcher decides which settings, actors, events, and processes to sample in order to answer the research questions. Explicit sampling decisions are necessary to avoid the pitfalls of “indiscriminate, vacuum-cleanerlike collection of every datum; accumulation of far more information than there will be time to analyze; and detours into . . . blind alleys” (p. 37).

Given a sampling frame, decisions must be made about instrumentation. Miles and Huberman briefly review arguments for and against the development of well-structured instruments prior to entering the field. They adopt an “it depends” stance based on choices the researcher makes between the following sets of parameters: exploratory versus confirmatory studies, single-site versus multiple-site studies, and site-specific versus cross-site studies. They maintain that in each pair of choices listed above, the former option typically calls for less front-end preparation of instrumentation than the latter. An excerpt from an interview guide is provided as an illustration of prior instrumentation. Throughout this discussion of their four-stage research planning process, the authors emphasize the need for revision and iteration.

Chapter 3 presents a set of 12 methods useful for analyzing data during the process of collection. The authors assume that the data to be analyzed (e.g., field notes) have already been converted to “write-ups” that “are intelligible to anyone, not just the fieldworker” (p. 50). The methods are presented in order of early to late in data collection and from simple to complex. They range from forms for recording a field contact and summarizing a document to procedures for preparing codes and doing coding. The authors discuss descriptive codes for summarizing data and “pattern codes” for identifying patterns, themes, or overarching constructs. Creating and revising codes and double-coding to achieve reliability are explained. This chapter also discusses the use of “reflexive” and “marginal” remarks in writing up field notes and conceptual memos for theorizing about variables and their relationships. Finally, forms for a site-analysis meeting (a meeting of fieldworkers to summarize the current status of events at the site) and an interim site summary (a synthesis of what the researcher knows about the site) are illustrated and explained. The latter is described as “the first attempt to derive a coherent account of the site” (p. 75).

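The double-coding idea is simple enough to sketch. The fragment below is my own illustration, not the authors’; it assumes two fieldworkers have independently coded the same set of write-up segments (the segment labels and codes here are invented) and computes the kind of simple percent-agreement figure such reliability checks typically rely on.

    # Illustrative sketch only: checking agreement between two coders who have
    # double-coded the same field-note segments. All labels and codes are invented.
    coder_a = {"seg-01": "TR-PLAN", "seg-02": "ADOPT", "seg-03": "TR-PLAN", "seg-04": "RESIST"}
    coder_b = {"seg-01": "TR-PLAN", "seg-02": "ADOPT", "seg-03": "CLIMATE", "seg-04": "RESIST"}

    shared = sorted(set(coder_a) & set(coder_b))          # segments coded by both fieldworkers
    agreements = sum(coder_a[s] == coder_b[s] for s in shared)
    print(f"Agreement: {agreements}/{len(shared)} = {agreements / len(shared):.0%}")
    for s in shared:
        if coder_a[s] != coder_b[s]:                      # disagreements to reconcile
            print(f"  {s}: {coder_a[s]} vs. {coder_b[s]}")

Disagreements flagged this way are the raw material for revising the code list, which is just the sort of iteration the chapter recommends.
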
Chapters 4 and 5 (about half of the book) discuss methods for analyzing data from a single site and from multiple sites, respectively. All of the methods involve various strategies for displaying data in charts, tables, checklists, matrices, or figures. The authors define “display” as a “spatial format that presents information systematically to the user.” Data entered into displays may be “short blocks of texts, quotes, phrases, ratings, abbreviations, symbolic figures” derived from the coded field note write-ups.

The authors claim that displays enhance the chances of drawing and verifying valid conclusions from qualitative data. They argue that displays are superior to narrative text as a tool of analysis. In their view, narrative text “is an extremely weak and cumbersome form of display,” and “it is hard on analysts” because it is “dispersed,” “sequential rather than simultaneous,” “usually only vaguely ordered, and it can get monotonous and overloading” (p. 79). According to the authors, narrative text is not only hard on analysts, but also cumbersome for readers of case studies.

Taking their cue from data displays produced by statistical packages such as SPSS, the authors generate various set-ups for presenting qualitative data. In Chapter 4, 19 methods are presented in order of simple to complex and from descriptive to explanatory. A context chart (somewhat akin to an organization chart) and a checklist matrix are simple devices used for description. Ordering or arranging data in some systematic way is made possible by time-ordered, role-ordered (roles of key players at the field site), and conceptually clustered matrices. To facilitate explanation the analyst might prepare an effects or site-dynamics matrix, an event listing, or, ultimately, a causal network. Causal networks are defined as “a visual rendering of the most important independent and dependent variables in a field study and of the relationships between them” (p. 132). Chapter 5 builds upon the previous chapter to show how displays can be constructed using data from multiple field sites. Eighteen methods for conducting cross-site synthesis are illustrated. Meta-matrices (called “monster dogs” by the authors) and various types of site-ordered, variable-ordered, and effect-ordered matrices are examined. The chapter concludes with a relatively lengthy (20 pages) discussion and illustration of causal modeling and cross-site causal networking of critical variables.

Chapter 6, only four pages long, summarizes key points for building displays, entering data into display cells, and analyzing that data.

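To give a flavor of what such a display amounts to, here is a sketch of my own (not one of the book’s one hundred charts): a small site-ordered matrix whose rows are field sites, whose columns are variables, and whose cells hold short text entries of the kind the authors describe. The sites, variables, and entries are invented for illustration.

    # Illustrative sketch only: a tiny site-ordered display (rows = sites,
    # columns = variables, cells = short coded text). All entries are invented.
    display = {
        "Site A": {"adoption": "mandated",  "training": "2-day workshop", "outcome": "routine use"},
        "Site B": {"adoption": "voluntary", "training": "none",           "outcome": "abandoned"},
        "Site C": {"adoption": "voluntary", "training": "peer coaching",  "outcome": "partial use"},
    }

    columns = ["adoption", "training", "outcome"]
    print(f"{'Site':<8}" + "".join(f"{c:<16}" for c in columns))
    for site, row in display.items():
        print(f"{site:<8}" + "".join(f"{row[c]:<16}" for c in columns))

Scanning down a column of such a matrix is exactly the kind of simultaneous, systematic reading the authors find narrative text unable to support.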

Chapter 7 discusses specific tactics for drawing and verifying conclusions from qualitative data displays. The authors first discuss 12 tactics for generating meaning including, among others, counting, noting patterns and themes, making metaphors, factoring, finding intervening variables, and building a logical chain of evidence. These are followed by 12 tactics for confirming conclusions including checking for representativeness and researcher effects, ruling out rival hypotheses and spurious relations, triangulating, and getting feedback from informants. The chapter concludes with a brief discussion of documenting and auditing the procedures of a field study.

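The humblest of these tactics, counting, can be sketched in a few lines; again, this is my illustration rather than anything from the book, and the pattern codes tallied below are invented.

    # Illustrative sketch only: the "counting" tactic as a tally of invented
    # pattern codes drawn from coded write-ups.
    from collections import Counter

    coded_segments = ["SUPPORT", "RESIST", "SUPPORT", "CLIMATE", "SUPPORT", "RESIST"]
    for code, n in Counter(coded_segments).most_common():   # most frequent themes first
        print(f"{code:<10}{n}")

Seeing that a theme recurs, and how often, is one modest way of “noting patterns” before an impression hardens into a conclusion.
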
This book is likely to be well received by quantitative researchers who find themselves having to cope with qualitative data. The discussion of the four-stage research process will feel as comfortable as an old shoe, and the attempt to find analogies to computer-generated data displays that will work with qualitative data will be gratifying. Putting words and coded symbols rather than numbers into display cells won’t appear to be such an odd idea. Likewise, researchers and evaluators schooled in the positivist/experimentalist tradition will find the idiom of the book consoling: the language of their paradigm (“independent and dependent variables,” “intervening variables,” “replicating,” “rival explanations,” “outliers,” “sampling frame,” “representativeness,” etc.) pervades the book. The authors admit that “we think of ourselves as logical positivists who recognize and try to atone for the limitations of that approach” (p. 19). Clearly this is a book written about qualitative analysis from the perspective of the quantitative tradition.

It should also be noted that this book promotes the pragmatic “primacy of method” solution to the debate between quantitative and qualitative research traditions. It reflects a viewpoint which the authors previously elaborated (Miles & Huberman, 1984), namely, that the debate between different epistemological positions is best left to philosophers of science and to those methodologists with a philosophical bent. In this sourcebook the authors foster this point of view by arguing that “any method that works is grist for our mill regardless of its antecedents” (p. 17).

The language and approach that pleases proponents of one tradition will no doubt peeve supporters and believers of another. The authors admit as much, although I believe they think of themselves as being a bit more ecumenical than they actually are. For example, one of the cardinal principles distinguishing the two traditions is the primacy of subject matter versus the primacy of method (Diesing, 1971). In the qualitative/ethnographic/naturalistic tradition, researchers are admonished to first be acted upon by the subject matter. Further, methods are to be chosen in view of their compatibility with the subject matter under investigation and not in view of their capacity to meet requirements of scientific investigation. Qualitative Data Analysis reveals the primacy of method approach on virtually every page. Despite these and similar errors which caution the reader to beware of Greeks bearing gifts, I believe that the sourcebook itself provides much grist for the qualitative researcher’s methodological mill. If qualitative researchers looking for better ways of displaying information can hold their displeasure with the book’s language and philosophy in abeyance long enough to inspect the suggestions that Miles and Huberman offer, I believe that they also will find much of value in this sourcebook. I found myself in the latter category, so I know whereof I speak.

These caveats notwithstanding, the book is not without its irritants, and I will devote the remainder of this review to discussing the major and minor annoyances. Under major irritants I would classify the following observations. First, Miles and Huberman frequently adopt a very patronizing tone toward practitioners of qualitative/naturalistic approaches and their methodologies. They refer to themselves as “savvy practitioners” (p. 48) and leave the reader with the implication that many who practice qualitative research are not. Consider the following two passages:

    So, unlike some schools within social phenomenology, we consider it important to evolve a set of valid and verifiable methods for capturing these social relationships with their causes. (p. 20)

    Our stance involves orderliness. There are many researchers who prefer intuitive, relaxed, nonobsessive voyages through their data, and we wish them well. But for us . . . thoroughness and explicitness are quite paramount. (p. 20)

In the passages above, I put the emphases where I felt it when reading. I found Miles and Huberman’s argument that we need a clearer understanding of the process of qualitative data analysis appealing. But I think that the case for clarity is set back by this sort of us versus them tactic. Second, the authors tend to bear false witness against their colleagues. Consider the following passage:

    Methodologically, our beef is with the somewhat magical approach to the analysis of qualitative data advocated on the grounds that such an approach is idiosyncratic, incommunicable and artistic and that only those who have been fully socialized and apprenticed in its practice can claim to comment upon it. (p. 20)

“Wizards and magicians we ain’t!” is the response this passage invoked from me. I resent being cast as an alchemist, and I know of no self-respecting qualitative researcher who would make a claim such as is presented here. A third major irritant of this sourcebook was the labeling scheme employed to identify illustrations.

Often I was bewildered by the scheme, searching for references in the text to Chart 13b or Box III.A.a. Minor irritations included a significant number of proofreading errors, poorly drawn charts, charts which weren’t typeset, and charts with print so small that it discouraged perusal. Often, I thought that if the chart in question was being proposed as a more palatable alternative to narrative displays, then the authors must be pulling my leg. Since displays were the theme of the book, one would expect that more attention would be paid to their display! Lastly, there were annoying errors in the reference list. For example, a reference to one of the authors’ previous publications was nowhere to be found in the bibliography.

Finally, I would recommend this sourcebook as an addition to educational research courses, with the caveat that it be an advanced course. Prior knowledge of epistemological traditions and research approaches is necessary to fully appreciate its contribution to the methodological literature.

REFERENCES
DIESING, P. (1971). Patterns of discovery in the social sciences. Chicago: Aldine.
MILES, M. B., & HUBERMAN, A. M. (1984). Drawing valid meaning from qualitative data: Toward a shared craft. Educational Researcher, 13, 20-30.
