Evidence-Based Practice
and Social Work
C. Aaron McNeece, PhD
Bruce A. Thyer, PhD

C. Aaron McNeece (E-mail: amcneece@garnet.acns.fsu.edu) and Bruce A. Thyer (E-mail: bthyer@mailer.fsu.edu) are affiliated with the School of Social Work, Florida State University, Tallahassee, FL 32306.

ABSTRACT. The essential features of contemporary evidence-based practice (EBP) are outlined, with specific reference to the applications of this model to various areas of social work, micro through macro. EBP is seen as a welcome addition to our field, representing a fuller and more comprehensive development of earlier and related positions such as empirical clinical practice within social work, and the delineation of empirically-supported therapies within psychology. Social work should proactively adopt EBP as its preferred conceptual model, reorient BSW and MSW training programs along the lines advocated by EBP, and inculcate these principles into the delivery of social work services. This is seen as both a professional and ethical imperative necessary for the survival of the
field.

KEYWORDS. Conceptual model, evidence-based practice, social work

From the recent attention that is given to evidence-based practice (EBP) in our academic journals, disciplinary meetings, and professional organizations, an observer from outside the profession could easily be misled into thinking that professionally trained social workers commonly use the current best scientific evidence available, rather than solely relying on practice wisdom, tradition, or “common sense” in deciding how to assist a client. However, a closer look at the EBP literature is more likely to discover essays on why we are not doing evidence-based practice (e.g., Gibbs & Gambrill, 2002) or arguments for why we should be doing it (e.g., Thyer, 1995, 2002a, 2002b) rather than descriptions of how evidence-based practice is being used (e.g., Thyer, 2001).
In a review of trends in knowledge development over the past quarter
century, Reid (2002) concludes that “the increasing role of research in
testing practice methods suggests that scientific methods may be gradu-
ally upstaging tests of time and consensus as a means of building core
knowledge” (p. 16, italics added). His somewhat optimistic reading of this trend sees clients as “self-directing collaborators with knowledge and strengths” and social workers as having “new awareness of the limits of
their expertise and knowledge” (p. 16). At least one school of social
work, Washington University, has adopted EBP as its organizing
framework (Howard, McMillen & Pollio, 2003). Many new social work
books devote increasing attention to EBP (Corcoran, 2000, 2003; Mac-
donald, 2001; McNeece & DiNitto, 1998; Roberts & Greene, 2002;
Springer, McNeece & Arnold, 2003; Thyer & Wodarski, 1998; Thyer &
Kazi, 2003; Wodarski & Thyer, 1998). Nevertheless, adoption of EBP
within the profession seems to continue at a slow pace, slower in some
areas of practice than in others.

WHAT IS EBP?

One of the simplest definitions of EBP is that it is ‘treatment based on the best available science.’ One might make this somewhat less narrowly clinical by changing “treatment” to “intervention.” The basic elements of EBP, according to Persons (1999, p. 2), describe an approach in which “The evidence-based practitioner:

• provides informed consent for treatment.
• relies on the efficacy data (especially from RCTs) when recommending and selecting and carrying out treatments.
• uses the empirical literature to guide decision-making.
• uses a systematic, hypothesis-testing approach to the treatment of
each case.”

This approach:

• “begins with careful assessment.
• sets clear and measurable treatment goals.
• develops an individualized formulation and a treatment plan based
on that formulation.
• monitors progress towards the goals frequently and modifies or
ends treatment as needed.”

Although this is also a clinically oriented definition, most of the elements could be modified to encompass many aspects of macro practice (e.g., community practice, administrative practice, and social policy practice). Cournoyer and Powers (2002) use the term “client
system” in their definition, rather than client. The essential concept of
EBP is to rely upon the best scientific evidence that is currently avail-
able. It also involves providing the client (whether a person, a commu-
nity, or an organization) with appropriate information about the
efficacy of different interventions and allowing the client to make the
final decision (Thyer, 2003). This is very different from the traditional
practice model in which the social worker rarely looked for empirical
evidence of treatment efficacy or presented treatment alternatives to
the client.
A more conservative (and perhaps more widely accepted) construc-
tion of evidence-based practice is found in the seminal textbook Evi-
dence-based Medicine: How to Practice and Teach EBM by Sackett,
Strauss, Richardson, Rosenberg and Haynes (2000) from which much
of the following content is derived. Evidence-based practice can be defined as the integration of the best research evidence with clinical expertise and client values in making practice decisions. ‘Best research evidence’ means clinically relevant research from basic and applied scientific investigations, especially drawing from intervention research evaluating the outcomes of social work services, and from studies on the reliability and validity of assessment measures. ‘Clinical expertise’ refers to our ability to use our education, interpersonal skills, and past experience to assess client functioning, diagnose mental disorders and/or other relevant conditions, including environmental factors, and to understand client values and preferences.
‘Client values’ refers to the unique preferences, concerns and expec-
tations each client brings to a clinical encounter with a social worker,
and which must be integrated into practice decisions if they are to serve
the client.

WHAT KIND OF EVIDENCE IS BEST EVIDENCE?

Evidence-based practice in social work makes practice decisions utilizing a variety of different sources of scientifically credible evidence, in conjunction with the social worker’s clinical skills and the client’s preferences. What distinguishes EBP from other models of social work intervention is that the practitioner is seen as having an ethical and professional obligation to seek out the best available evidence (if any), weigh its scientific credibility and applicability in terms of the client’s circumstances, values, and preferences, and apply it if appropriate as a first-choice treatment option. Scientific evidence is not simply an optional consideration in EBP; it is a mandatory factor to be incorporated into practice decision-making, where such knowledge has been developed.

Within the EBP model, not all forms of evidence are seen as equally informative. There is a clear and explicitly articulated hierarchy of research methods which science accepts as possessing the potential to provide credible answers to clinical questions. In rough order from high to low, in terms of their ability to reliably and directly inform practice, they are (a schematic sketch follows the list):

• Systematic Reviews/Meta-Analyses
• Randomized Controlled Trials
• Quasi-Experimental Studies
• Case-Control and Cohort Studies
• Pre-Experimental Group Studies
• Surveys
• Qualitative Studies
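
To make the ordering concrete, the hierarchy can be encoded and used to sort whatever studies a search turns up, so that appraisal begins with the most credible available source. This is an illustrative sketch only: the study titles are invented, and the numeric levels are merely our shorthand for the ranking above, not a standard scale.

```python
# Illustrative sketch: encode the evidence hierarchy above and sort
# retrieved studies so appraisal starts from the most credible source.
from enum import IntEnum

class EvidenceLevel(IntEnum):
    """Rough credibility ranking; higher values inform practice more directly."""
    QUALITATIVE = 1
    SURVEY = 2
    PRE_EXPERIMENTAL = 3
    CASE_CONTROL_OR_COHORT = 4
    QUASI_EXPERIMENTAL = 5
    RCT = 6
    SYSTEMATIC_REVIEW = 7

# Hypothetical studies retrieved for one answerable question.
studies = [
    ("Cohort study of group therapy outcomes", EvidenceLevel.CASE_CONTROL_OR_COHORT),
    ("RCT of individual therapy for cocaine abuse", EvidenceLevel.RCT),
    ("Client-satisfaction survey", EvidenceLevel.SURVEY),
]

# EBP asks for the best AVAILABLE evidence: start reading from the top.
for title, level in sorted(studies, key=lambda s: s[1], reverse=True):
    print(f"{level.name:<24} {title}")
```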

A systematic review is a genuinely comprehensive, interdisciplinary, worldwide compilation of published and unpublished (where accessible) research addressing a particular answerable question (see below), in which the studies are carefully critiqued and conclusions derived. The very best systematic reviews incorporate a statistical methodology called meta-analysis, which enables one to compare findings across different studies which used different outcome measures in evaluating the effects of a particular treatment when used with clients experiencing a specific problem. And the very best meta-analyses limit their parameters to include only randomized controlled trials (see below).
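
For readers unfamiliar with the mechanics, the core of a fixed-effect meta-analysis is inverse-variance weighting: each study's effect size is weighted by the precision of its estimate. The numbers below are invented purely for illustration; real meta-analyses also examine heterogeneity and publication bias.

```python
# Minimal fixed-effect meta-analysis sketch (inverse-variance weighting).
# The (effect size, variance) pairs are invented for illustration.
import math

studies = [(0.45, 0.04), (0.30, 0.02), (0.60, 0.09)]  # (d_i, v_i) per study

weights = [1.0 / v for _, v in studies]                        # w_i = 1 / v_i
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))                             # SE of pooled d

print(f"pooled effect = {pooled:.3f}, "
      f"95% CI = ({pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f})")
```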
Randomized controlled trials (RCTs) involve the random assignment of clients to differing conditions, such as an experimental treatment group, a standard treatment group, a placebo or bogus treatment group, or a no-treatment group. The publication describing the RCT should provide sufficient information for the reader to determine whether the assignment to such groups was legitimately random or possessed the potential for bias (which can cloud results). In the case of RCTs using placebo or bogus treatment conditions, clients must not know that they are receiving a less credible intervention than those assigned to the ‘real’ treatment, and assessors evaluating client outcomes must not know to which ‘condition’ the client was assigned. Such circumstances can be difficult to arrange in RCTs of social work treatments, but a number of such placebo-controlled social work outcome studies have been published (see LeCroy, 1985). In well-designed and executed RCTs, client attrition (dropping out, withdrawing from the study, deaths) across treatment/no-treatment groups should be roughly equivalent, and follow-up periods must be sufficiently long to permit inferences regarding the durability of any observed effects. Of course, RCTs, like all nomothetic outcome studies, are predicated on the employment of reliable and valid outcome measures, the appropriate use of inferential statistics, and sample sizes with adequate statistical power to detect reliable differences.
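
One of the quality checks just described, roughly equivalent attrition across arms, can be examined with a simple two-proportion z-test. The dropout counts below are hypothetical; this is a reader's sketch of the check, not a prescribed procedure.

```python
# Sketch: is attrition roughly equivalent across RCT arms?
# A two-proportion z-test on hypothetical dropout counts.
import math

dropped_tx, n_tx = 12, 80     # treatment arm: dropouts / randomized
dropped_ctl, n_ctl = 10, 78   # control arm

p_tx, p_ctl = dropped_tx / n_tx, dropped_ctl / n_ctl
p_pool = (dropped_tx + dropped_ctl) / (n_tx + n_ctl)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_tx + 1 / n_ctl))
z = (p_tx - p_ctl) / se

# |z| well below 1.96 suggests no reliable difference in attrition.
print(f"attrition {p_tx:.1%} vs {p_ctl:.1%}, z = {z:.2f}")
```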
The major distinction between RCTs and quasi-experimental research designs is that the latter do not employ randomization techniques to assign clients to different groups, but make use of naturally occurring groups that receive differing treatments. For example, clients self-selecting to receive individual therapy could be compared with clients with similar problems who self-selected group therapy as their preferred treatment modality, in order to compare the relative effectiveness of group versus individual treatment. The fact that clients ‘self-selected’ their treatment condition means that they are possibly not equivalent in terms of demographic or psychosocial factors, making it difficult to ascribe post-treatment differences solely to treatment modality.
Pre-experimental research designs lack control groups, and consist of simple efforts to evaluate outcomes, such as looking only at client functioning after receiving a service (a post-test-only study), or validly assessing client functioning before and after treatment, aggregating the results for a sufficiently large sample of clients with similar problems, and seeing if they improved post-intervention compared to before treatment (the pretest-posttest design). Pre-experimental nomothetic designs may permit inferences regarding average changes among group members, but they usually do not allow the researcher to confidently conclude whether or not social work intervention specifically caused any observed improvements. Changes may have been due not to intervention but to nonspecific factors such as the passage of time, repeated testing, placebo influences, and the like.
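
The aggregate pretest-posttest comparison just described amounts to a paired analysis of the same clients before and after service. A minimal sketch follows, with invented scores and assuming SciPy is installed; note the design's limit, echoed in the comments: even a significant change cannot be attributed to the intervention itself.

```python
# Pretest-posttest sketch: paired t-test on the same clients' scores
# before and after service. Scores are invented for illustration.
from scipy import stats  # assumes SciPy is installed

pre = [22, 30, 18, 25, 27, 33, 20, 24]    # e.g., symptom-scale scores
post = [17, 24, 15, 21, 26, 25, 18, 19]   # same clients after service

t, p = stats.ttest_rel(pre, post)
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)

# A significant drop shows average improvement, not that the intervention
# caused it (passage of time, retesting, and placebo effects remain).
print(f"mean change = {mean_change:.1f}, t = {t:.2f}, p = {p:.3f}")
```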
Purely qualitative research is very rarely used in the evaluation of social work outcomes, inasmuch as these methodologies are more suitable for the study of clients’ subjective reactions to service, or for the examination of the processes of change, than for the objective determination of meaningful improvements in client functioning.
Generally, outcome studies designed prospectively are seen as more
credible than retrospectively designed studies, and findings obtained by
independent researchers across different service settings with different
clients are seen as more reliable than consistent results reported by a sin-
gle researcher or research team.
Many important areas of social work practice suffer from a lack of
RCTs, which means that less credible research reports must be the pri-
mary foundation to guide practice decisions. Please note that EBP
does NOT mean that one can only make use of systematic reviews or
meta-analyses in order to make practice decisions. If these are lacking,
EBP suggests that the practitioner then find out what guidance can be obtained from quasi-experimental studies, cohort studies, or qualitative reports, if these indeed represent the very ‘best’ outcomes research available to inform practice. EBP says that one should rely on the best AVAILABLE evidence, not only on the BEST possible evidence, such as that based on RCTs (because often no such evidence is available).
According to Sackett et al. (2000), EBP consists of the following sequence of events (schematized in the sketch after the list):

1. Convert the need for information into one or more answerable questions (see next section).
2. Track down the best available evidence to answer each question.
3. Critically evaluate this evidence in terms of its validity, impact,
and potential relevance to our client.
4. Integrate relevant evidence with our own clinical expertise and
client values and circumstances.
5. Evaluate our expertise in conducting steps 1-4 above, and evalu-
ate the outcomes of our services to the client, especially focusing
on an assessment of enhanced client functioning and/or problem
resolution.
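
The five steps can also be read as a data flow, with each step consuming the previous step's product and step 5 feeding back into step 1. The stub functions below are hypothetical and only schematize that flow; they are not tools from Sackett et al. (2000).

```python
# Schematic data flow for the five steps. All functions are hypothetical
# stubs showing what each step consumes and produces, not real tools.
def ask(information_need: str) -> str:
    """Step 1: recast a vague need as an answerable question."""
    return f"What interventions have been shown to reduce {information_need}?"

def acquire(question: str) -> list[str]:
    """Step 2: track down the best available evidence (stubbed)."""
    return ["Systematic review A", "RCT B", "Cohort study C"]

def appraise(evidence: list[str]) -> list[str]:
    """Step 3: keep studies judged valid, impactful, and relevant."""
    return [e for e in evidence if "review" in e.lower() or "RCT" in e]

def apply_with_client(evidence: list[str], client_values: str) -> str:
    """Step 4: integrate evidence with expertise and client preferences."""
    return f"Offer the '{evidence[0]}' option, given the client's {client_values}"

def assess(plan: str) -> None:
    """Step 5: evaluate our own process and the client's outcomes."""
    print("Monitoring outcomes for:", plan)

assess(apply_with_client(appraise(acquire(ask("school absenteeism"))),
                         "preference for group formats"))
```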

Step 1–Answerable questions consist of:

1. A question with a verb, as in:

What has been shown to help . . . ? or
What psychosocial treatments work . . . ? or
What community-based interventions reduce . . . ? or
What group therapies improve . . . ? and

2. A question including some aspect of the client’s problem or condition, as in:

What psychosocial interventions reduce the risk of teenage pregnancy? or
What individual therapies are the most successful in getting clients to stop abusing crack cocaine? or
How can schools reduce student absenteeism? or
What treatments are effective in improving prenatal care adherence?

Step 2–Having framed one’s search for information into answerable questions, the next step consists of tracking down the best available evidence. Generally speaking, most journals do not publish many articles that can usefully and directly inform practice, and Sackett et al. (2000) actually suggest that practitioners can basically give up reading traditional journals! Instead, one can focus one’s reading on those few journals that emphasize reporting the results of empirical outcome studies in the human services. Among these are Research on Social Work Practice, sponsored by the Society for Social Work and Research (http://www.sswr.org), the Journal of Consulting and Clinical Psychology, published by the American Psychological Association, and Evidence-based Mental Health, which provides practitioner-friendly summaries of recently published outcomes research. Sackett et al. (2000) also suggest that traditional textbooks are not very useful to practitioners, and we agree, with some notable exceptions, such as:

The Handbook of Empirical Social Work Practice (Thyer & Wodarski, 1998; Wodarski & Thyer, 1998)
Effective Interventions for Child Abuse and Neglect: An Evidence-based Approach to Planning and Evaluating Interventions (Macdonald, 2001)
Evidence-based Social Work Practice with Families (Corcoran, 2000)
Clinical Applications of Evidence-based Family Interventions (Corcoran, 2003)
Substance Abuse Treatment for Criminal Offenders: An Evidence-based Guide for Practitioners (Springer, McNeece, & Arnold, 2003)
A Guide to Treatments that Work (Nathan & Gorman, 2002)

Other exceptionally valuable sources of systematic reviews are the web pages of the Campbell Collaboration (http://campbell.gse.upenn.edu/), the Cochrane Collaboration (http://www.cochrane.org), and the Bandolier Database (http://www.jr2.ac.uk/bandolier). These are international, interdisciplinary groups of human service professionals who have undertaken systematic reviews of the best available research evidence dealing with selected psychosocial and health problems. Anyone with access to the Internet can consult these regularly updated and constantly expanding databases and download top-quality research summaries providing answers to answerable questions.
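
These collaborations are searched through their own web interfaces; the sketch below only illustrates the underlying idea of matching an answerable question's terms against review titles, using an invented local collection rather than any real database API.

```python
# Illustrative keyword matching of an answerable question against an
# invented local collection of review summaries (not a real database API).
reviews = [
    {"source": "Campbell", "title": "Interventions to reduce teenage pregnancy"},
    {"source": "Cochrane", "title": "Needle exchange programs and HIV transmission"},
    {"source": "Campbell", "title": "School-based programs to reduce absenteeism"},
]

def find_reviews(question: str) -> list[dict]:
    """Return reviews sharing at least one substantive word with the question."""
    terms = {w.strip("?.,").lower() for w in question.split() if len(w) > 3}
    return [r for r in reviews if terms & set(r["title"].lower().split())]

for hit in find_reviews("How can schools reduce student absenteeism?"):
    print(hit["source"], "-", hit["title"])
```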
Step 3–Once one has gathered the best evidence, the social worker is faced with the task of critically evaluating this mass of information. Developing your own critical appraisal skills in evaluating research is of course a hallmark of the well-trained and conscientious professional, but often you will be faced with an overwhelming amount of information to digest and synthesize. This is where recent evidence-based systematic reviews really can prove their worth, since they offer succinct appraisals related to answerable questions. The textbooks cited above are good sources for this type of information, but the websites of the Cochrane Collaboration (focusing on health problems, including mental disorders) and the Campbell Collaboration (focusing on social work, education, and crime) are even better: likely more genuinely comprehensive and unbiased, as well as more up-to-date.
Step 4–How can you integrate relevant evidence with your own clinical expertise, client values, and circumstances? You can seek out conferences and workshops providing training in evidence-based psychosocial interventions, such as the Niagara Conference on Evidence-based Treatments for Childhood and Adolescent Mental Health Problems, offered 24-26 July 2003 in Niagara-on-the-Lake, Ontario, Canada. The annual conferences of professional associations that have an empirical focus to their presentations are also useful, such as those provided by the Association for Advancement of Behavior Therapy, the Association for Behavior Analysis, and the Society for Social Work and Research. As an individual practitioner you should enquire as to the evidence-based foundations of practice-skills training programs offered through continuing education programs, and let CEU providers know of your interest in such content. Decline to attend those lacking a substantive empirical foundation. Students can opt to earn their MSW or PhD at reputable social work programs sympathetic to EBP (ask them!), and choose evidence-based practice as their focus.
Step 5–How can I evaluate the outcomes of my own (or my program’s) practice? One useful approach is to conduct simple group outcome studies, or single-system research designs with individual clients or client systems (see Royse, Thyer, Padgett & Logan, 2001). Lacking sufficient expertise or resources, you can contract with outside agencies to do this (e.g., your local School of Social Work). You could also seek out funding from evaluation research grants supported by the federal government, the state, private foundations, corporations, etc. Do this solo, or in partnership with a local university or your local School of Social Work.
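
For the single-system designs mentioned above, the simplest analysis compares a client's baseline (A) phase with the intervention (B) phase. A minimal sketch with invented weekly ratings follows; Royse et al. (2001) describe fuller visual and statistical analyses.

```python
# Minimal single-system (AB) design sketch: compare a client's baseline
# phase with the intervention phase. Weekly ratings are invented.
baseline = [9, 8, 9, 10, 9]         # phase A: weekly problem ratings
intervention = [8, 7, 6, 5, 5, 4]   # phase B: after treatment begins

mean_a = sum(baseline) / len(baseline)
mean_b = sum(intervention) / len(intervention)

# One common visual-analysis aid: count B-phase points that fall
# below the lowest (best) baseline point.
below = sum(1 for x in intervention if x < min(baseline))
print(f"A mean = {mean_a:.1f}, B mean = {mean_b:.1f}, "
      f"{below}/{len(intervention)} B points beat the best A point")
```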

EVIDENCE-BASED CLINICAL PRACTICE

An evidence-based approach to social work practice in clinical contexts is a much more feasible undertaking at present than in other interventive venues. For many (if not most) answerable questions capable of being posed by clinical social workers, a review of the empirically-oriented practice textbooks and evidence-based practice websites listed earlier in this article will reveal a considerable amount of information relating to effective psychosocial interventions, information with direct applications to practice.

EVIDENCE-BASED MACRO PRACTICE

Relatively little has been written about evidence-based macro practice. Thyer (2001) discussed evidence-based approaches to community practice in a recent book, but his examples dealt mostly with evidence-based clinical practice within the community context, not with the applications of EBP to social action, social planning, or locality development approaches to community problems. Thyer (1996) also wrote about the application of the principles of behavior analysis to social welfare policy as a way of explaining how the usual nonempirical approach employed in policy practice frequently leads to policy failure. The macro area of social work, generally speaking, is characterized by much less concern with the role of research and scientific knowledge than is clinical or direct practice.

COMMUNITY PRACTICE

For the most part, social work abandoned community practice as a separate area of practice with the disappearance of community action strategies and the Office of Economic Opportunity. Few schools provide students with the opportunity to focus on community practice or
community organization as a practice method. Much of community
practice has been merged with administrative practice or other “macro”
practice designations. Course content in community practice has been
either severely diluted or integrated with course content related to hu-
man behavior in the social environment. To the cynic, the role of EBP in
community practice would seem to be irrelevant, since community
practice hardly exists as a separate field in contemporary social work.
As optimists, the authors believe otherwise–to the extent that commu-
nity interventions are hypothesized to bring about community change,
these changes can be scientifically evaluated and an evidentiary founda-
tion developed.
Hundreds of community social action programs were evaluated dur-
ing the 1960s and 1970s: compensatory education programs (McDill,
McDill & Sprehe, 1969; Watts & Horner, 1969), manpower training pro-
grams (Somers, 1968), housing and urban renewal programs (Rothenberg,
1967), juvenile delinquency prevention programs (Cloward & Ohlin,
1960), nutrition programs (Macdonald, 1977), and income maintenance
programs (Kershaw, 1972), among others. However, research designs
were typically very weak, and there was little or no social work involve-
ment in the development of these evaluation projects. The most com-
mon design was a simple “before and after” study. For the most part,
research was conducted by economists, political scientists, sociologists,
and other social scientists.
Within recent years there have been several much more rigorous
studies evaluating community needle exchange programs (NEP) or sy-
ringe exchange programs (SEP), including some RCTs. The over-
whelming majority of these studies indicate positive benefits from the
programs. However, almost all of them were conducted by disciplines other than social work. A few juvenile delinquency and community policing programs have been studied by social workers (Springer, Shader & McNeece, 1999), but the research designs were limited to process evaluations or pre-experimental pretest/posttest studies. Altogether, well-
designed outcome studies of community practice conducted by social
workers are quite scarce.

ADMINISTRATIVE PRACTICE

There is a quite large body of knowledge in the administrative practice literature generated from well-designed experimental and quasi-experimental studies. Although Max Weber developed the first modern
theories about the functioning of bureaucracies, the origins of “scien-
tific management” are usually traced to Frederick W. Taylor. Taylor’s
theories stimulated a great deal of research on the best ways to organize work groups and on management styles that result in more efficient functioning of the organization. Herbert Simon’s work during the 1950s
on administrative behavior helped him win a Nobel Prize and has con-
tinued to influence research for six decades.
Until recent years most of the research in administration was the re-
sult of efforts by scholars in management, public administration, social
psychology and related disciplines. A relatively small number came
from professional social work. Although early studies described a wide
range of management competencies and skills, very few were empirical
works (Menefee, 2000). Hasenfeld (see Hasenfeld & English, 1974),
one of the early social work scholars noted for his research on organiza-
tions and his contributions to organization theory, wrote about human
service organizations as “people-processing” organizations. Patti (1977)
conducted one of the earliest and most definitive studies of social wel-
fare administration, focusing on management activities of 90 agency
managers. However, the sudden popularity of administrative tracks in
schools of social work in the 1970s supported the dramatic increase in
our profession’s concern with research literature in this field, exempli-
fied by the Resnick and Menefee (1993) empirical study that generated
a model of management for social work practice.
There is currently an impressive array of research literature on ad-
ministrative practice written by and for social workers, ranging from the
study of organizational innovation (Shin & McClomb, 1998; Burt &
Taylor, 2003), organizational collaboration (Foster & Meinhard, 2002)
and organizational structure (Crook, 2002) to staff burnout (Zunz, 1998), and the organizational utilization of research (Anderson, 2001). A parallel empirical literature developed by behavior analysts related to
administrative practice also exists, which is positively impacting the
field and can be usefully drawn upon by social workers (e.g., Komaki,
1998; McKenzie & Lee, 1998; Daniels, 1989, 2000; the Journal of Or-
ganizational Behavior Management, published by Haworth Press). Per-
haps a more relevant issue is whether this literature has had any
significant impact on social service organizations. It is our impression
that it has not. One possible reason is that the administrators of those or-
ganizations may not be aware of the research literature. Until quite re-
cently textbooks on administration were more likely to cite organizational
theory or “practice wisdom” (just as in clinical practice) than the best avail-
able scientific evidence (Neugeboren, 1985; Perlmutter & Slavin, 1980).
A second explanation is that career bureaucrats, rather than profession-
ally trained social work administrators, frequently provide leadership to
those organizations. A third reason is that public social service organi-
zations are tightly constrained by statutes and administrative codes that
don’t permit much flexibility.
The status of EBP in administration has evolved tremendously since
Patti’s (1983) seminal work and it continues to change. Today the lead-
ing textbooks used in social work administration or management
classes are likely to be replete with references to the best available cur-
rent scientific knowledge (Patti, 2000). There is also greater pressure on
social welfare agencies to adopt management practices and principles
that have been developed for business organizations and have stood the
test of the competitive marketplace (Hasenfeld, 2000). It will be some time, however, before current students are able to put this knowledge into practice, and there will continue to be pressures (as mentioned above) to rely on traditional non-EBP management principles.

POLICY PRACTICE

The number of professional social workers engaged full-time in policy practice is considerably smaller than in other fields of social work practice, partly because other disciplines established legitimate claims to expertise in this field much earlier, and partly because not many social workers are drawn to this area. The policy literature is rife
with books and articles by social work scholars, but most of it is either
descriptive or hortatory. Pick up any textbook on social welfare policy,
and it will almost always provide a thorough description of the policy
process, as well as a number of theories (group, elite, institutional, etc.) about how policies are developed and implemented. Few will cite any research regarding the most efficacious ways to achieve policy outcomes, beyond a simple case study of a major policy issue. Thyer (1996) suggested that one of the major problems in the design and evaluation of social welfare policy is that a nonempirical approach is often taken. This occurs partly because the custom of this country is to evaluate such policies by the goodness of their intentions and not their results (Caplow, 1994). In doing so, governments frequently provide incentives that are just the opposite of those necessary to achieve a desired outcome. For example, throughout the history of income maintenance programs, we have penalized clients for being successful. Prior to the implementation of Temporary Assistance to Needy Families (TANF), if a client managed to find employment, she would lose the family’s Medicaid benefits.
A further complication is that there are many programs which serve
clients for whom we have decided that noncontingent entitlement bene-
fits should be provided. Persons who are developmentally disabled or
terminally ill may simply “deserve” a benefit in order to make them
more comfortable or better off in some way. Providing the service is an
end in itself.
We also seem to have forgotten the important role that contingencies
play in the behavior of the policy-makers. According to several political
theories, policies are frequently developed as a way to secure the vote of
the citizenry (Dye, 2002). The behavior change sought by the policy-
makers is simply that the voters will remember and vote for them on
election day. Any other behavior change on the part of the policy recipi-
ent is purely incidental. Any policy that keeps the policy maker in office
is, ipso facto, a successful policy. This phenomenon was identified by
Roberto Michels as the “iron law of oligarchy” (cited in Cassinelli,
1953). Another political fact of life frequently overlooked is that poli-
cies can, and frequently do, provide only a symbolic reward to the target
audience (Edelman, 1964). For example, the politics of Confederate
symbols on state flags has cost many a Southern politician an election.
How does one rationally evaluate the value of either maintaining or dis-
carding Confederate symbols? It begins to make sense only when one
considers the political economy of such symbolism. Symbols can serve
as incentives just as powerful as money, education, or health care, and
they have minimal impact on the budget–a politician’s dream!
Despite these problems and complications, however, there is a con-
siderable amount of research that is valuable to the development and
implementation of evidence-based social welfare policy. First there are empirical clinical studies. RCTs of pharmacotherapy programs to treat
addiction should inform any decision at the community, state, or na-
tional level about whether to include such treatment as a covered ex-
pense under Medicaid or Medicare. Similar studies of the efficacy of
drug therapies for mental health problems should enlighten legislators
and other politicians who fashion mental health policy. Studies on the
efficacy of family therapy for incarcerated inmates have already helped
to shape policies within both the state and federal prison systems. Simi-
lar claims could be made in child welfare, aging, and other areas of so-
cial policy.
Second, there are many studies of community programs that could
also inform policy development. The Ford Foundation’s Gray Areas
Project in New York City served as a model for thousands of subse-
quent programs funded by the Office of Economic Opportunity. There
are numerous well-designed studies of community-based needle and/or
syringe exchange programs that demonstrate their positive effects, such
as decreases in the rate of disease transmission (see Gibson, Flynn &
Perales, 2001). The reader may ask–“Then why are NEPs and SEPs not
more common?” There is a symbolic value involved in this area of pol-
icy. Many policy makers and voters interpret (incorrectly, we believe),
the presence of a NEP in their community as an endorsement of drug
use. The symbolic reward is more powerful than any imagined im-
provements in health status.
Third, there are many larger “social experiments” that are helpful in
policy practice (see DiNitto, 1983; Orr, 1999; Campbell & Russo,
1999). Few of them, however, meet the standard of the preferred re-
search design–the randomized clinical trial. Among the better designed
are the New Jersey Negative Income Tax Experiment (Kershaw, 1972), the Westinghouse evaluation of the Head Start program (Williams & Evans, 1969), and recent research on the impact of
welfare reform.
The New Jersey study actually randomly assigned clients to one of
eight different negative tax plans or to a control group. A major problem
was that during the experiment the state of New Jersey changed the ben-
efits for persons in the control group so that they actually received more
cash assistance than the highest-paid persons in the experimental
groups! Then there were legal questions about whether some persons might be eligible for both AFDC-UP and the experimental payments.

The Westinghouse study used an ex post facto design, and it was also
plagued by criticisms that instrumentation was faulty and that important
goals related to health, nutrition, and community involvement were ig-
nored. While the official pronouncement from the Nixon White House
was that “the long-term effect of Head Start appears to be extremely
weak” (Williams & Evans, 1969, p. 118), liberals continued to argue on
the basis of the same study that Head Start was a worthwhile program.
The Personal Responsibility and Work Opportunity Reconciliation Act of 1996 was enacted without benefit of any well-designed pre-
liminary studies on a smaller scale. Recent studies on welfare reform
agree on one thing: we have significantly reduced the welfare rolls. To
conservatives, that is sufficient information to establish the success of
welfare reform. To liberals, the more important question is whether the
clients who have been terminated from the TANF program have been
able to achieve economic self-sufficiency. While the evidence is mixed, it
appears that most have not become self-sufficient (Cancian, Haveman,
Meyer & Wolfe, 1999). So how one feels about PRWORA is probably
based on political leanings rather than any credible scientific evidence.
Real social experiments are difficult to accomplish. The issue of ran-
domization is moot for most social welfare services, since there are ei-
ther legal or moral obstacles to the randomization of clients to services
when an entitlement is involved. In most cases the best that we can hope
for is a good quasi-experimental design with treatment and comparison
(not no-treatment control) groups. The ensuing threats to validity leave
enough “wiggle room” for critics and proponents alike to support their
position.

CONCLUSION

Evidence-based practice is a useful perspective for social work practice, across the spectrum of services, micro through macro. The intellectual successor in many respects to the earlier empirical clinical practice movement within social work, and to the empirically-supported treatments projects within the American Psychological Association during the 1990s, EBP is a more comprehensive and more carefully laid out program for gradually enhancing the empirical and ethical justifications of social work services. A simple literature search using ‘evidence-based practice’ as a search term will reveal the astonishing impact EBP is having within diverse human services. It is most well-developed in medicine and other areas of physical health care; it is making substantial inroads in mental health care; and it has extended tentative feelers into the areas of macro social work practice.
As important as the evidentiary state of knowledge in various areas of social work intervention is the existing template for the expansion of EBP (see Sackett et al., 2000) into areas not yet well developed. We look forward to an augmented role for science within social work practice, used in collaboration with our well-established values and ethics.

REFERENCES
Anderson, S.G. (2001). The collaborative research process in complex human service
agencies: Identifying and responding to organizational constraints. Administration
in Social Work, 25, 1-19.
Burt, E. & Taylor, J. (2003). New technologies, embedded values, and strategic
change: Evidence from the U.K. voluntary sector. Nonprofit and Voluntary Sector
Quarterly, 32, 115-127.
Campbell, D.T. & Russo, M.J. (1999). Social experimentation. Thousand Oaks, CA: Sage.
Cancian, M., Haveman, R., Meyer, D., & Wolfe, B. (1999). Before and after TANF:
The economic well-being of women leaving welfare. Madison: University of Wis-
consin, Institute for Research on Poverty.
Caplow, T. (1994). Perverse incentives: The neglect of social technology in the public
sector. Westport, CT: Praeger.
Cassinelli, C.W. (1953). The law of oligarchy. American Political Science Review, 47,
773-784.
Cloward, R.A. & Ohlin, L.E. (1960). Delinquency and opportunity: A theory of delin-
quent gangs. Glencoe, IL: Free Press.
Corcoran, J. (2000). Evidence-based social work practice with families. New York:
Springer.
Corcoran, J. (2003). Clinical applications of evidence-based family interventions. New
York: Oxford.
Cournoyer, B.R. & Powers, G.T. (2002). Evidence-based social work: The quiet revo-
lution continues. In A. R. Roberts & G. Greene (Eds.), Social workers’ desk refer-
ence (pp. 798-807). New York: Oxford.
Crook, W.P. (2002). Trickle-down bureaucracy: Does the organization affect client re-
sponses to programs? Administration in Social Work, 26, 37-60.
Daniels, A.C. (1989). Performance management: Improving quality and productivity
through positive reinforcement. Tucker, GA: Author.
Daniels, A.C. (2000). Bringing out the best in people: How to apply the astonishing
power of positive reinforcement. New York: McGraw-Hill.
DiNitto, D.M. (1983). Time series analysis: An application to social welfare policy.
Journal of Applied Behavioral Science, 19, 507-518.
Dye, T.R. (2002). Understanding public policy (10th edition). New York: Prentice-
Hall.

Edelman, M. (1964). The symbolic uses of politics. Chicago, IL: University of Illinois
Press.
Foster, M.K. & Meinhard, A.G. (2002). A regression model explaining predisposition
to collaborate. Nonprofit and Voluntary Sector Quarterly, 31(4), 549-564.
Gibbs, L. & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objec-
tions. Research on Social Work Practice, 12, 452-476.
Gibson, D.R., Flynn, N.M., & Perales, D. (2001). Effectiveness of syringe exchange
programs in reducing HIV risk behavior and HIV seroconversion among injecting
drug users. AIDS, 15, 1329-1341.
Hasenfeld, Y. & English, R. (Eds.) (1974). Human service organizations. Ann Arbor,
MI: University of Michigan Press.
Howard, M.O., McMillen, C. J., & Pollio, D.E. (2003). Teaching evidence-based prac-
tice: Toward a new paradigm for social work education. Research on Social Work
Practice, 13, 234-259.
Kershaw, D.N. (1972). Issues in income maintenance experimentation. In P.H. Rossi &
W. Williams (Eds.), Evaluating social programs: Theory, practice, and politics
(pp. 221-245). New York: Seminar Press.
Komaki, J. (1998). Leadership from an operant perspective. New York: Routledge.
LeCroy, C.W. (1985). Methodological issues in the evaluation of social work research.
Social Service Review, 59, 345-357.
Macdonald, G. (2001). Effective interventions for child abuse and neglect: An evi-
dence-based approach to planning and evaluating interventions. New York: Wiley.
Macdonald, M. (1977). Food, stamps, and income maintenance. Madison, WI: Insti-
tute for Research on Poverty.
Manpower Research Demonstration Corporation. (1994). The national evaluation of
welfare-to-work strategies: Early lessons from seven sites. New York: Author.
McKenzie, R.B. & Lee, D.R. (1998). Managing through incentives: How to develop a
more collaborative, productive, and profitable organization. New York: Oxford.
McDill, E.L., McDill, M.S., & Sprehe, T. (1969). Strategies for success in compensatory
education: An appraisal of evaluation research. Baltimore, MD: Johns Hopkins Press.
McNeece, C.A. & DiNitto, D.M. (1998). Chemical Dependency: A systems approach.
Boston, MA: Allyn & Bacon.
Mechanic, D. (1999). Mental health and social policy: The emergence of managed
care (4th edition). Boston: Allyn & Bacon.
Nathan, P. E. & Gorman, J. (Eds.) (2002). A guide to treatments that work (second edi-
tion). New York: Oxford.
Neugeboren, B. (1985). Organization, policy and practice in the human services. New
York: Longman.
Orr, L.L. (1999). Social experiments: Evaluating public programs with experimental
methods. Thousand Oaks, CA: Sage.
Patti, R. (1983). Social welfare administration. Englewood Cliffs, NJ: Prentice-Hall.
Patti, R. (Ed.) (2000). The handbook of social welfare management. Thousand Oaks,
CA: Sage.
Perlmutter, F.D. & Slavin, S. (1980). Leadership in social administration: perspectives
for the 1980s. Philadelphia: Temple University Press.

Persons, J. (1999). Evidence-based psychotherapy: A graduate course. Clinical Science, 2, 12.
Reid, W.J. (2002). Knowledge for direct social work practice: An analysis of trends.
Social Service Review, 76, 6-33.
Resnick, H. & Menefee, D. (1993). A comparative analysis of organization develop-
ment and social work, with suggestions for what organization development can do
for social work. Journal of Applied Behavioral Science, 29, 432-445.
Roberts, A.R. & Greene, G. (Eds.) (2002). Social worker’s desk reference. New York:
Oxford.
Rothenberg, J. (1967). Economic evaluation of urban renewal. Washington, DC:
Brookings Institution.
Sackett, D.L., Strauss, S.E., Richardson, W.S., Rosenberg, W., & Haynes, R.B. (2000).
Evidence-based medicine: How to practice and teach EBM. New York: Churchill-
Livingstone.
Shin, J. & McClomb, G.E. (1998). Top executive leadership and organization innova-
tion: An empirical investigation of nonprofit human service organizations (HSOs).
Administration in Social Work, 22(3), 1-21.
Somers, G. (Ed.) (1968). Retraining the unemployed. Madison, WI: University of Wis-
consin Press.
Springer, D.W., McNeece, C.A., & Arnold, E.M. (2003). Substance abuse treatment
for criminal offenders: An evidence-based guide for practitioners. Washington,
DC: American Psychological Association.
Springer, D.W., Shader, M.A., & McNeece, C.A. (1999). Operation of juvenile assess-
ment centers: trends and issues. Journal for Juvenile Justice and Detention Ser-
vices, 14, 45-61.
Thyer, B.A. (1995). Promoting an empiricist agenda within the human services: An
ethical and humanistic imperative. Journal of Behavior Therapy & Experimental
Psychiatry, 26, 93-98.
Thyer, B. (1996). Behavior analysis and social welfare policy. In M.A. Mattaini & B.
A. Thyer (Eds.), Finding solutions to social problems: Behavioral strategies for
change (pp. 41-60). Washington, DC: American Psychological Association.
Thyer, B.A. (2001). Evidence-based approaches to community practice. In H.E. Briggs &
K.J. Corcoran (Eds.), Social work practice: Treating common client problems (pp.
54-65). Chicago, IL: Lyceum.
Thyer, B.A. (2002a). Principles of evidence-based practice and treatment develop-
ment. In A.R. Roberts & G.J. Green (Eds.), Social workers’ desk reference (pp. 739-
742). New York: Oxford.
Thyer, B.A. (2002b). Evidence-based practice and clinical social work. Evidence-
based Mental Health, 6, 6-7.
Thyer, B.A. (2003). Evidence-based practice in the United States. In B.A. Thyer &
M.A.F. Kazi (Eds.), International perspectives on evidence-based practice in social
work. Birmingham, UK: Venture Press.
Thyer, B.A. & Kazi, M.A.F. (Eds.) (2003). International perspectives on evidence-
based practice in social work. Birmingham, UK: Venture Press.
Thyer, B.A. & Wodarski, J.S. (Eds.) (1998). Handbook of empirical social work prac-
tice: Volume 1, mental disorders. New York: Wiley.

Watts, H.W. & Horner, D.L. (1969). The educational benefits of Head Start: A quantitative analysis. Madison, WI: Institute for Research on Poverty.
Williams, W. & Evans, J.W. (1969). The politics of evaluation: The case of Head Start. The Annals of the American Academy of Political and Social Science, 385, 118-132.
Wodarski, J.S. & Thyer, B.A. (Eds.) (1998). Handbook of empirical social work practice: Volume 2, psychosocial problems and practice issues. New York: Wiley.
Zunz, S.J. (1998). Resiliency and burnout: Protective factors for human service man-
agers. Administration in Social Work, 22, 39-54.
