EIGHTEEN

THE ROOTS OF UTILIZATION-FOCUSED EVALUATION

Michael Quinn Patton

Marv Alkin and Christina Christie have taken on a daunting task in attempting to locate various evaluation theories and theorists in relation to each other. No one is better positioned to take on this challenge. Alkin has taught an advanced-theory seminar for years, has contributed to evaluation theory in his own writings, and has carefully monitored theory development as the field of evaluation has matured.

In reacting to an early draft of the "Evaluation Theory Tree," I proposed a more fluid image of intersecting tributaries contributing to a Mississippi-like river of evaluation. The branches are a bit overly static for my taste. My sense of theory development in evaluation is better captured by an image of streams flowing back and forth with the waters intermingling as part of and contributing to a larger current. The flow would include various sections of turbulent white water, obstacles that divide the current at places before rejoining the main stream, and even the occasional historical flood or drought. In contrast, I find that the roots, main trunk, and branches of the theory tree depict linearity and solidity to an extent that underestimates crosscurrents and turbulence.

That said, the evaluation theory tree has the virtue of making clear distinctions and placing theorists on primary branches, which should foster interesting debate even as it helps students and evaluators sort through the theory muddle. In my case, my placement on the major use branch may help correct a common misconception about utilization-focused evaluation. Because I have also written extensively about qualitative evaluation methods (e.g., Patton, 2002), I find many students confused in thinking that utilization-focused evaluation advocates or emphasizes primarily qualitative methods.

UTILIZATION-FOCUSED EVALUATION

As Alkin and Christie have explained in their overview chapter, utilization-focused evaluation begins with the premise that evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that is done, from beginning to end, will affect use. Use concerns how real people in the real world apply evaluation findings and experience the evaluation process. Therefore, the focus in utilization-focused evaluation is on intended use by intended users. Since no evaluation can be value-free, utilization-focused evaluation answers the question of whose values will frame the evaluation by working with clearly identified, primary intended users who have the responsibility to apply evaluation findings and implement recommendations.

Utilization-focused evaluation is highly personal and situational. The evaluation facilitator develops a working relationship with intended users to help them determine what kind of evaluation they need. This requires negotiation in which the evaluator offers a menu of possibilities within the framework of established evaluation standards and principles.

Utilization-focused evaluation does not advocate any particular evaluation content, model, or method, including qualitative methods. Nor does utilization-focused evaluation advocate a particular kind of use. Rather, it is a process for helping primary intended users select the most appropriate content, model, methods, theory, and uses for their particular situation. Situational responsiveness guides the interactive process between evaluator and primary intended users. A utilization-focused evaluation can include any evaluative purpose (formative, summative, developmental); any kind of data (quantitative, qualitative, mixed); any kind of design (e.g., naturalistic, experimental); and any kind of focus (processes, outcomes, impacts, costs, and cost-benefit,
among many possibilities). Utilization-focused evaluation is a process for making decisions about these issues in collaboration with an identified group of primary users focusing on their intended uses of evaluation.

A psychology of use undergirds and informs utilization-focused evaluation: Intended users are more likely to use evaluations if they understand and feel ownership of the evaluation process and findings; they are more likely to understand and feel ownership if they've been actively involved; by actively involving primary intended users, the evaluator is training users in use, preparing the groundwork for use, and reinforcing the intended utility of the evaluation every step along the way.

ROOT INFLUENCES

Peace Corps in Africa

One of the early and lasting influences on my perspective can be traced to the years in the 1960s when I served as a Peace Corps volunteer in eastern Burkina Faso among the Gourma people. We were community development generalists working in very poor, rural villages where farmers engaged in subsistence agriculture, growing primarily millet and sorghum. Soils were poor. Water was scarce. Infant mortality was high. Infectious diseases were common and debilitating. Markets were underdeveloped. Resources were few. We were young, idealistic, hopeful, and clueless.

We began by talking with villagers, listening to their stories, gathering their histories, learning about their experiences, and working to understand their perspectives. Gradually, as we learned the language, engaged with the people, and began to understand the local setting, project possibilities emerged: well-digging projects, building one-room schools, introducing cash crops, new approaches to cultivation, organizing cooperatives, and initiating education efforts. But our role was always more one of facilitation than actual doing. We figured out shared interests, helped organize groups for action, and helped them find resources. Our efforts were highly pragmatic, just trying to find something that would work, that might create a little leverage that could be used to gather insights into and start to address larger problems. In the grand scheme of things, our efforts were very modest.

I learned how to figure out what someone cared about, how to bring people together to identify shared interests, and how to match initiatives and resources to those shared interests. I learned to ground my change efforts in the perspectives, values, and interests of those with whom I worked, the indigenous people who were there before I came and would be there after I left. I learned to appreciate and honor local villagers and farmers as the primary stakeholders in change and to see my role as facilitating their actions, not letting my interests and values drive the process, but rather deferring to and facilitating their interests and values. In that way, I tried to make myself useful to people struggling to survive in a harsh environment.

My approach to evaluation grew out of those seminal community development experiences in Africa. From the very beginning, it was clear to me that I was not going to be the primary user of the evaluation findings. My niche would be facilitating use by others. I could apply what I had learned about how to figure out what someone cared about, how to bring people together to identify shared interests, and how to match evaluation designs and resources to those shared interests. I drew on what I had learned about how to ground my Peace Corps efforts in the perspectives, values, and interests of those with whom I worked by grounding my evaluation efforts in the perspectives, values, and interests of those with whom I worked: the indigenous program participants, staff, administrators, funders, and other decision makers who were involved with the program before I came and would be there after I left. I learned to appreciate and honor these people as the primary stakeholders in program improvement efforts and to see my role as facilitating their actions, not letting my interests and values drive the process, but rather deferring to and facilitating their interests and values. In that way, I tried to make myself useful to people struggling to survive in harsh, demanding, and volatile human services, education, social change, and public policy environments.

Sociological Influences

In paying homage to root influences, I have to include the Beatles. Yes, the 60s rock group. You see, they are responsible for my becoming a sociologist, and sociology has shaped my worldview about evaluation use. The short version of the story begins in the Cold War. I went to high school during the post-Sputnik push to produce more scientists to compete with the Soviet Union in the "Space Race" and development of intercontinental ballistic missiles. As a result, I took a lot of advanced science and math courses, no social science, and was channeled into engineering in college. During my freshman
year, I had to take a required English course that assigned occasional essays. On the night before one of these essays was due, the Beatles made their historic appearance on the Ed Sullivan Show. I was amazed and enthralled by the hysterical screaming, unrestrained adulation, frenzied dancing, and occasional fainting, and so wrote my essay about what I had seen, wondering what it all meant. When the professor returned my paper, she suggested that I might want to take a sociology course because sociologists studied such phenomena. I had never heard of sociology, but I took a course on the sociology of youth, changed my major, and subsequently attained both undergraduate and graduate degrees in sociology.

Sociology gave me a solid grounding in quantitative methods, theory construction, philosophy of science, and sociology of knowledge, all of which have influenced my evaluation thinking and practice. Substantively, three particular sociological specializations have influenced my evaluation perspective: diffusion of innovations, sociological perspectives on power and conflict, and organizational sociology, which examines how people behave in institutions and organizations, much of which transfers to behavior in programs, the primary arena of evaluation studies.

The diffusion of innovations literature examines and attempts to explain the characteristics of innovations that affect adoption and dissemination (Rogers, 1962; Rogers & Shoemaker, 1971; Rogers & Svenning, 1969). This was the framework that informed my first empirical inquiry into evaluation use, an inquiry that was the basis for the first edition of Utilization-Focused Evaluation (Patton, 1978). We basically did case studies of evaluations to find out what characteristics were associated with use (a form of adoption from a diffusion of innovations perspective).

A related field in organizational sociology focuses on the characteristics of innovative organizations (e.g., Azumi & Hage, 1973; Etzioni, 1961, 1968; Hage & Aiken, 1970; Morgan, 1986, 1989). In a broader context, much of sociology concerns Hobbesian questions of order: What keeps society from falling apart? How does social change occur? What are barriers to change? How does conflict arise, and how is it managed by social processes? My graduate studies focused on these questions, and both the understandings and confusions that resulted from those studies have undergirded my inquiries into and thinking about evaluation utilization.

Another root sociological influence has been a theory of power that I have found instructive in helping me appreciate what evaluation offers stakeholders and intended users. Understanding this has helped me explain to intended users how and why their involvement in a utilization-focused evaluation is in their own best interest. It provides a basis for understanding how knowledge is power and led me to the following premise: Use of evaluation will occur in direct proportion to its power-enhancing capability. Power-enhancing capability is determined as follows: The power of evaluation varies directly with the degree to which the findings reduce the uncertainty of action for specific stakeholders. This view of the relationship between evaluation and power is derived from the classic organizational theories of Michael Crozier (1964) and James Thompson (1967).

    The power of prediction stems to a major extent from the way information is distributed. The whole system of roles is so arranged that people are given information, the possibility of prediction and therefore control, precisely because of their position within the hierarchical pattern. (Crozier, 1964, p. 158)

Whereas Crozier's analysis centered on power relationships and uncertainties between individuals and among groups within organizations, James Thompson found that a similar set of concepts could be applied to understand relationships between whole organizations. He argued that organizations are open systems that need resources and materials from outside and that "with this conception the central problem for complex organizations is one of coping with uncertainty" (Thompson, 1967, p. 13). He found that assessment and evaluation are used by organizations as mechanisms for reducing uncertainty and enhancing their control over the multitude of contingencies with which they are faced. Information for prediction is information for control: thus, the power of evaluation. To be power laden, information must be relevant and in a form that is understandable to users. Crozier (1964) recognized this qualifier in linking power to reduced uncertainty: "One should be precise and specify relevant uncertainty. . . . People and organizations will care only about what they can recognize as affecting them and, in turn, what is possibly within their control" (p. 158).

Unlearning Sociology: The Personal Factor

While sociology has constituted my intellectual foundation, the breakthrough in developing utilization-focused evaluation came from unlearning
and thinking beyond the sociological perspective into which I had been socialized in graduate school (University of Wisconsin). The dominant Weberian perspective in organizational sociology posits that organizations are made up of and operate based on positions, roles, and norms such that the individuality of people matters little because individuals are socialized to occupy specific roles and positions and behave according to specific learned norms, all for the greater good of the organization's goal attainment. Thus, I had been schooled in the notion that organizations are an impersonal collection of hierarchical positions. What I had to learn was that people, not organizations, use evaluation information. I learned this by studying actual evaluation use.

In the mid-1970s, as evaluation was emerging as a distinct field of professional practice, I undertook a study with colleagues and students of 20 federal health evaluations to assess how their findings had been used and to identify the factors that affected varying degrees of use. We interviewed the evaluators and those for whom the evaluations were conducted. That study marked the beginning of the formulation of utilization-focused evaluation. We asked respondents to comment on how, if at all, each of 11 factors extracted from the literature on diffusion of innovations and evaluation utilization had affected use of their study. These factors were methodological quality, methodological appropriateness, timeliness, lateness of report, positive or negative findings, surprise of findings, central or peripheral program objectives evaluated, presence or absence of related studies, political factors, decision maker/evaluator interactions, and resources available for the study. Finally, we asked respondents to "pick out the single factor you feel had the greatest effect on how this study was used."

From this long list of questions, only two factors emerged as consistently important in explaining utilization: (1) political considerations and (2) a factor we called "the personal factor." This latter factor was unexpected, and its clear importance to our respondents had, we believed, substantial implications for the use of program evaluation.

The personal factor is the presence of an identifiable individual or group of people who personally care about the evaluation and the findings it generates. Where such a person or group was present, evaluations were used; where the personal factor was absent, there was a correspondingly marked absence of evaluation impact.

The personal factor represents the leadership, interest, enthusiasm, determination, commitment, assertiveness, and caring of specific, individual people. These are people who actively seek information to make judgments and reduce decision uncertainties. They want to increase their ability to predict the outcomes of programmatic activity and thereby enhance their own discretion as decision makers, policymakers, consumers, program participants, funders, or whatever roles they play. These are the primary users of evaluation.

Presence of the personal factor increases the likelihood of long-term follow-through, that is, persistence in getting evaluation findings used. Though the specifics vary from case to case, the pattern is markedly clear: Where the personal factor emerges, where some individuals take direct, personal responsibility for getting findings to the right people, evaluations have an impact. Where the personal factor is absent, there is a marked absence of impact. Use is not simply determined by some configuration of abstract organizational dynamics; it is determined in large part by real, live, caring human beings. Sociology had not prepared me for that understanding.

Once understood, this became the foundation of utilization-focused evaluation. Many decisions must be made in any evaluation. The purpose of the evaluation must be determined. Concrete evaluative criteria for judging program success will usually have to be established. Methods will have to be selected and timelines agreed on. All of these are important issues in any evaluation. The question is: Who will decide these issues? The utilization-focused answer is: primary intended users of the evaluation.

In the early years of formulating and testing utilization-focused evaluation, based on the importance of the personal factor, I sought to identify primary intended users who were inclined to and predisposed toward evaluation use. Later, I came to place more importance on and gave more attention to people who are not inclined to use evaluation: people who are intimidated by, indifferent to, or even hostile to evaluation. I came to believe that a utilization-focused evaluator looks for opportunities and strategies for creating and training information users. Thus, the challenge of increasing use consists of two parts: (1) finding and involving those who are, by inclination, information users and (2) training those not so inclined. Just as in cards, you play the hand you're dealt; in evaluation, you sometimes have to play the stakeholders you're dealt.

Humanistic Values

One of the seminal experiences that I believe prepared me for the importance of the personal factor was 2 years I spent doing dissertation research at
the New School for Behavioral Studies in Education at the University of North Dakota, with Vito Perrone.

I dedicated the first edition of Utilization-Focused Evaluation (1978) to Vito Perrone. His philosophy of open education emphasized the importance and value of each individual child, which was part of the reason that Vito preferred rich case studies to standardized tests when examining student learning. Vito was the core of the North Dakota Study Group on Evaluation, where I first became involved in the qualitative-quantitative paradigm debate, which led to my first evaluation publication, Alternative Evaluation Research Paradigms (Patton, 1975). Having never had a course in qualitative methods at the highly quantitative department of sociology at the University of Wisconsin, I learned qualitative methods from Vito and his colleagues while doing an evaluation of open classrooms throughout North Dakota (basically learning by doing). I also learned the politics of methods. The North Dakota stakeholders were interested only in the rich qualitative data from classroom observations and interviews with teachers, parents, and students. My dissertation committee was interested only in linear regression analysis. As a result, I produced two separate documents, a qualitative evaluation for North Dakota users and, having coded the qualitative data to permit regression analysis, an entirely statistical dissertation for my doctorate. The North Dakota stakeholders never saw the statistical analysis, and my doctoral committee never saw the qualitative evaluation. Thus, I experienced firsthand the implications of working with people who value different methods and the need to adapt to the interests and perspectives of different users.

Evaluation Training

In 1973, the year I completed my doctorate, the National Institute of Mental Health (NIMH) funded a handful of evaluation methodology training programs at major universities, one of which was the University of Minnesota. I went to Minnesota that year as the program's first postdoctoral fellow in evaluation methodology. The program was highly interdisciplinary, involving professors from 17 different departments across the university. A year later, I became director of the program and established the Minnesota Center for Social Research as a place where participants in the program could conduct actual evaluations. We made the study of evaluation use the focus of the program, and it was participants in this program who conducted the utilization study that led to our discovery of the personal factor. Here are some of the things I learned directing that program.

The evaluation methodology training program was designed to teach evaluation models and research methods. But as program participants undertook real evaluations in local settings, we found much of our traditional methodological training to be irrelevant. We learned that evaluators need skills in building relationships, facilitating groups, managing conflict, walking political tight ropes, and effective interpersonal communications to capitalize on the importance of the personal factor. Technical skills and social science knowledge aren't sufficient to get evaluations used. People skills are critical. Ideals of rational decision making in modern organizations notwithstanding, personal and political dynamics affect what really happens. Evaluators without the savvy and skills to deal with people and politics will find their work largely ignored or, worse yet, used inappropriately.

We learned that a particular evaluation may have multiple levels of stakeholders and therefore need multiple levels of stakeholder involvement. For example, funders, chief executives, and senior officials may constitute the primary users for overall effectiveness results, while lower level staff and participant stakeholder groups may be involved in using implementation and monitoring data for program improvement.

We learned that the sophisticated methodological techniques that were highly valued for dissertation research had little applicability for small-scale, local, formative evaluations. We had to develop methods, including qualitative approaches, that were appropriate and responsive to local needs.

We also learned that the national funders at NIMH didn't value any of these learnings. The three national site visitors who evaluated the program criticized it for not teaching large-scale experimental designs for national studies. The site visit team dismissed as unimportant the 80 local evaluations we had conducted and the fact that we had placed graduates in important positions in local government, philanthropic foundations, nonprofits, and training units in corporations. They also dismissed the research we had conducted on evaluation use and the publication of Utilization-Focused Evaluation, noting that the sample size was insignificant (n = 20 case studies), that the findings were therefore meaningless, and that the very fact that we pointed to our utilization research as the centerpiece of the program demonstrated that the program was not fulfilling its purpose (sophisticated methodological training, in their judgment) and had taken a turn in the wrong direction. The program therefore
was not renewed, lost NIMH funding, and subsequently lost University of Minnesota support.

Early Encounters With Evaluation Luminaries

While the national NIMH site visitors found little of value in our work on utilization-focused evaluation, that work had attracted the attention of two of the luminaries in evaluation: Carol Weiss, who is credited with first making evaluation use a priority in the emerging field of evaluation, and Marv Alkin, who had founded and was still directing UCLA's Center for the Study of Evaluation. Carol Weiss published our utilization study in her important book on using social research for policymaking (Patton et al., 1977; Weiss, 1977). It was enormously encouraging to us that Carol Weiss would include in her book the work of a group of complete unknowns. When we received word that she had accepted the chapter, we had a boisterous party (well, boisterous by Minnesota standards).

Meanwhile, Sage Publications had sent the draft manuscript for Utilization-Focused Evaluation to Marv Alkin for review. His supportive and helpful review led not only to publication (occasion for yet another boisterous celebration) but also an invitation from him to participate in an extraordinary gathering with other evaluators at the UCLA Conference Facility at Malibu Beach. For 3 days, with Marv's facilitation, we discussed evaluation utilization: research, implications for practice, and potential new directions. Participants included Ross Connor, Ernie House, Michael Kean, Jean King, Susan Klein, Alex Law, Milbrey McLaughlin, and Carol Weiss. Those discussions and the long-term relationships formed have had a lasting impact on my work. In complexity (chaos) theory terms, those 3 days of butterfly wings flapping created a tempest that energized the study of evaluation use and set the agenda for issues that continue to challenge the field.

Consider the following exchange I had with Carol Weiss, who was arguing that findings must stand on their own rather than depend on interpersonal relationships, and Ernie House, who believes that evaluators are ethically obligated to consider the interests of the poorly educated or less powerful in society who are not in a position to advocate on their own behalf. Though this exchange took place a quarter century ago, rereading it now takes me back as if it happened yesterday, and I've been dealing with these issues ever since.

Carol Weiss: I think we limit ourselves too much if we think of interpersonal interaction as the critical component in utilization.

Michael Patton: From my perspective, I feel a great responsibility to serve my clients.

Ernest House: How far would you pursue this orientation? Surely you can't consider your only purpose to be meeting your client's interests?

Michael Patton: Tell me why I can't?

Ernest House: Why? It's an immoral position.

Michael Patton: I could argue. . .

Ernest House: You can't. You can't. It would be a long argument, which you'd lose.

Michael Patton: Let's go for it. (For the full exchange see Alkin, 1990, pp. 101-105.)

And so we did, with lasting impact on my thinking.

Over the years, I've been involved with many important exchanges and intense debates with colleagues at meetings of the American Evaluation Association and evaluation associations around the world. The feedback I've received on my work, the challenges that colleagues have offered, and the opportunities to hear the perspectives and approaches of others have been invaluable in further developing my thinking and fine-tuning my practice. But I never again experienced such an intense and expansive 3 days of dialogue about evaluation use as that which we had so long ago in Malibu.

Discovering Process Use

In recent years, the greatest influence on my thinking and practice has come from client feedback and follow-up utilization reviews of evaluations I've conducted. When I established the Minnesota Center for Social Research in the mid-1970s, I began the practice of following up every evaluation we conducted to find out how it was used.

Those evaluations are the basis for many of the stories in my writings. Part of my preparation for doing a new edition of Utilization-Focused Evaluation is reviewing client feedback from evaluations and workshops.
When, in the mid-1990s, I went to prepare the third edition of the book and began reflecting on what had happened in the field in the 10 years since the last edition, I was struck by something that my own myopia had not allowed me to see before. When I have followed up my own evaluations over the years, I have enquired from intended users about actual use. What I would typically hear was something like: "Yes, the findings were helpful in this way and that, and here's what we did with them." If there had been recommendations, I would ask what subsequent actions, if any, followed. But beyond the focus on findings and recommendations, what they almost inevitably added was something to the effect that "it wasn't really the findings that were so important in the end, it was going through the process." And I would reply: "That's nice. I'm glad you appreciated the process, but what did you really do with the findings?" In reflecting on these interactions, I came to realize that the entire field had narrowly defined use as use of findings. We have thus not had ways to conceptualize or talk about what happens to people and organizations as a result of being involved in an evaluation process: what I have come to call "process use" (Patton, 1997, 1998).

I have defined process use as relating to and being indicated by individual changes in thinking and behaving that occur among those involved in evaluation as a result of the learning that occurs during the evaluation process. Changes in program or organizational procedures and culture may also be manifestations of process impacts.

One way of thinking about process use is to recognize that evaluation constitutes a culture, of sorts. We, as evaluators, have our own values, our own ways of thinking, our own language, our own hierarchy, and our own reward system. When we engage other people in the evaluation process, we are providing them with a cross-cultural experience. They often experience evaluators as imperialistic, that is, as imposing the evaluation culture on top of their own values and culture, or they may find the cross-cultural experience stimulating and friendly. But in either case, and all the spaces in between, it is a cross-cultural interaction. Those new to the evaluation culture may need help and facilitation in coming to view the experience as valuable. One of the ways I sometimes attempt to engage people in the value of evaluation is to suggest that they may reap personal and professional benefits from learning how to operate in an evaluation culture. Many funders are immersed in that culture. Knowing how to speak the language of evaluation and conceptualize programs logically are not inherent goods, but can be instrumentally good in helping people get the things they want, not least of all to attract resources for their programs. They may also develop skills in reality testing that have application in other areas of professional and even personal life.

This culture of evaluation that we as evaluators take for granted in our own way of thinking is quite alien to many of the folks with whom we work. Examples of the values of evaluation include clarity, specificity, and focusing; being systematic and making assumptions explicit; operationalizing program concepts, ideas, and goals; distinguishing inputs and processes from outcomes; valuing empirical evidence; and separating statements of fact from interpretations and judgments. These values constitute ways of thinking that are not natural to people and that are quite alien to many. When we take people through a process of evaluation, at least in any kind of stakeholder involvement or participatory process, they are in fact learning things about evaluation culture and often learning how to think in these ways.

Process use is distinct from use of the substantive findings in an evaluation report. It's equivalent to the difference between learning how to learn versus learning substantive knowledge about something. Learning how to think evaluatively is learning how to learn. I think that facilitating learning about evaluation opens up new possibilities for positioning the field of evaluation professionally. It is a kind of process impact that organizations are coming to value because the capacity to engage in this kind of thinking can have more enduring value than a delimited set of findings, especially for organizations interested in becoming what has come to be called popularly "learning organizations." Findings have a very short "half-life," to use a physical science metaphor. They deteriorate very quickly as the world changes rapidly. In contrast, learning to think and act evaluatively can have a lasting impact on how they think, on their openness to reality testing, and on how they view the things they do.

Global Influences

In the last few years, I've had extraordinary opportunities to be part of evaluation conferences and professional associations throughout the world. I think it is fair to say that evaluation is being globalized, infusing the profession with new energy as we learn different ways of looking at what we do and face challenges about how to translate some of what we do for the rest of the
world. For example, I came back from Japan realizing that some aspects of the American approach to evaluation are quite macho and unlikely to work well in Asian cultures. We exhort: "Tell it like it is and let the chips fall where they may." The Japanese have a deep reverence for societal harmony, and so they're not interested in processes that will tear apart group harmony. In the short time I was in Japan, experiencing that different cultural context, I came to believe that they are going to challenge us to think in new ways about the social relations side of the work we do, the human side of our work, not to soften our message or judgments, but in fact to make our communications more effective. Given what I believe is the American tendency to blame and embarrass, we can learn much from Asian sensitivity to group harmony and social dynamics. By the same token, our Asian colleagues will be challenged to engage in straightforward reporting of the findings, what we call "telling it like it is", that is, to deal with looking at real results and the loss of face that can come from acknowledging that things weren't effective. They will learn from our commitment to "tell it like it is," and I think that we have a lot to learn about how to manage that process in a way that respects the importance of the social fabric.

From colleagues throughout the world, I find myself challenged to think more deeply about the cultural values and biases of Western approaches to evaluation and to include ethnocentrism, racism, and sexism as important factors affecting evaluation quality and use. As I look to the future, I expect to be increasingly influenced by international and cross-cultural dialogues as we define terms, share case examples, and explore innovative approaches as part of a global evaluation community.

Halcolm

How we communicate with nonevaluators affects their openness to use. I've had long-standing interest in increasing the effectiveness of my communications and interactions through metaphors and storytelling. My writings are infused with metaphors and stories, especially Sufi tales, which represent the tradition of stories with morals. Stories and metaphors are especially important for cross-cultural communications.

In my own work, I found that I needed a wise character, a kind of Sufi master, to express evaluation wisdom in my books, so I created the character of Halcolm (pronounced "how-come," as in "why?"). Writing Halcolm stories became an outlet for that part of me that wanted and needed to do creative writing and storytelling. I've long studied good writing and storytelling and have attempted to incorporate what I've learned in my writings, including a creative nonfiction book about my efforts to pass on the evaluation perspective to my eldest son (Patton, 1999). Here, as a way of ending this reflective journey, is Halcolm's interpretation of the personal factor:

    There are five key variables that are absolutely critical in evaluation use. They are, in order of importance: people, people, people, people, and people.

REFERENCES

Alkin, M. (Ed.). (1990). Debates on evaluation. Newbury Park, CA: Sage.
Azumi, K., & Hage, J. (1973). Organizational systems. Lexington, MA: D. C. Heath.
Crozier, M. (1964). The bureaucratic phenomenon. Chicago: University of Chicago Press.
Etzioni, A. (1961). A comparative analysis of complex organizations. New York: Free Press.
Etzioni, A. (1968). The active society: A theory of societal and political processes. New York: Free Press.
Hage, J., & Aiken, M. (1970). Social change in complex organizations. New York: Random House.
Morgan, G. (1986). Images of organization. Newbury Park, CA: Sage.
Morgan, G. (1989). Creative organizational theory. Newbury Park, CA: Sage.
Patton, M. Q. (1975). Alternative evaluation research paradigms. Grand Forks: University of North Dakota, North Dakota Study Group on Evaluation.
Patton, M. Q. (1978). Utilization-focused evaluation. Beverly Hills, CA: Sage.
Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage.
Patton, M. Q. (1998). Discovering process use. Evaluation, 4(2), 225-233.
Patton, M. Q. (1999). Grand Canyon celebration: A father-son journey of discovery. Amherst, NY: Prometheus Books.
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.
Patton, M. Q., Grimes, P. S., Guthrie, K. M., Brennan, N. J., French, B. D., & Blyth, D. A. (1977). In search of impact: An analysis of the utilization of federal health evaluation research. In C. Weiss (Ed.), Using social research in public policy making (pp. 141-164). Lexington, MA: D. C. Heath.
Rogers, E. M. (1962). Diffusion of innovations. New York: Free Press.
Rogers, E. M., & Shoemaker, F. F. (1971). Communication of innovations. New York: Free Press.
Rogers, E. M., & Svenning, L. (1969). Managing change. San Mateo, CA: Operation PEP.
Thompson, J. D. (1967). Organizations in action. New York: McGraw-Hill.
Weiss, C. (Ed.). (1977). Using social research in public policy making. Lexington, MA: D. C. Heath.

NINETEEN

CONTEXT-ADAPTED UTILIZATION
A Personal Journey

Marvin C. Alkin

In the immortal words of a noted seaman philosopher, "I y'am what I y'am" (Popeye). We are all modeled by our experiences, whether they be educational (formal or informal), events, or interactions with people. I acknowledged this when I wrote a chapter for the National Society for the Study of Education (NSSE) yearbook on evaluation (1991), reflecting on what had led to changes in my views on evaluation over a 25-year period. My roots, however, shaped not only my changes at any particular point but also my initial thoughts on evaluation as well as various changes over time. Aspects of my education and training had influence on how I came to think about evaluation. (I will, however, spare the reader stories of my early childhood.)

ROOTS

Our training and educational experiences have an impact not only on our individual knowledge base but also attitudinally in the way that we view the world and attack problems. I have a bachelor's degree in mathematics and have taught math at both the high school and community college levels. This orientation to systematic, sequential, and logical thinking was an influence
