
ISSUES AND OPPORTUNITIES IN OUT-OF-SCHOOL TIME EVALUATION

NUMBER 8 HARVARD FAMILY RESEARCH PROJECT AUGUST 2004

PROMOTING QUALITY THROUGH PROFESSIONAL DEVELOPMENT
A Framework for Evaluation

Harvard Family Research Project’s (HFRP) Issues and Opportunities in Out-of-School Time Evaluation briefs highlight current research and evaluation work in the out-of-school time field. These documents draw on HFRP’s research work in out-of-school time to provide practitioners, funders, evaluators, and policymakers with information to help them in their work. Recognizing the critical role that staff play in promoting quality out-of-school time (OST) programs, in this brief we examine OST professional development efforts and offer a framework for their evaluation.

There is no question that staffing is a key component of quality in out-of-school time (OST) programs.1 Many programs attribute their success to skillful providers, and research demonstrates the importance of positive staff-child relationships for youth outcomes (Eccles & Gootman, 2002; McLaughlin, 2000; Rosenthal & Vandell, 1996; Tolman, Pittman, Yohalem, Thomases & Trammel, 2002). Recently, when a set of leading experts in the OST field was asked to identify the single most important ingredient for creating and sustaining quality improvement systems in OST, five of the eight respondents articulated issues of staff recruitment, training, and development (Little, 2004).

Professional development initiatives have been a key mechanism for increasing the skills of everyone from new personnel to longtime staffers, in part because many OST workers are paraprofessionals who have little preservice training (Partnership for Afterschool Education, 1999). However, evaluating the effectiveness of these initiatives is a relatively recent development. Given the importance of staff development for both higher quality programs and better youth outcomes, it is critical that we design evaluations that will help programs understand the benefits of their professional development efforts.

This brief offers a preliminary framework for understanding and implementing evaluations of OST professional development initiatives. In so doing, it draws on research from early childhood and elementary education and highlights the evaluation methods and results of recent OST professional development initiatives at both local and national levels.2

What Is Professional Development?

Professional development is a broad term that can refer to a variety of education, training, and development opportunities. For the purpose of this brief, the term will be applied to a full range of activities that have the common goal of increasing the knowledge and skills of staff members and volunteers. Professional development programs target many specific outcomes, including increased staff knowledge about child and adolescent development, use of effective strategies for activity programming, and implementation of methods for promoting positive relationships with youth. Many professional development initiatives also have the secondary goal of improving the quality and sustainability of the out-of-school time workforce, by increasing providers’ marketable skills and by garnering public support for the youth development field.
© 2004 President & Fellows of Harvard College. Published by Harvard Family Research Project, Harvard Graduate School of Education. All rights reserved. No part of this publication may be reproduced in any way without the written permission of the publisher.

DEFINING ESSENTIAL SKILLS FOR YOUTH WORKERS

An increasing number of stakeholders are asking the question, what defines a skillful out-of-school time provider? The answer is a complex one, particularly given the diverse experiences, responsibilities, and needs of youth workers.

One way to address the issue is through youth worker competency standards. Several organizations have developed competency frameworks, and a framework from the National Collaboration for Youth aims to integrate these into one unified model (see Select Professional Development Resources on page 11). Competency standards come from several sources, including the voices of youth workers themselves, research from the fields of early childhood education and daycare, standards from the U.S. Department of Labor on successful workplace skills, developmental research on the needs of youth (e.g., the National Research Council’s publication Community Programs to Promote Youth Development), and, most recently, youth reports (e.g., qualitative research by Reed Larson from the University of Illinois at Urbana-Champaign).

While no clear consensus yet exists, several factors have been consistently identified from these sources, including, but not limited to, possessing a strong knowledge of youth development; engaging in warm, positive relationships; promoting positive peer interactions; and encouraging youth to be actively involved in shaping their programs and experiences. Identifying other areas of overlap is an important area for future work, in order to provide a comprehensive picture of a skillful provider.

Professional development for OST providers can occur in many settings and at many stages throughout a provider’s career. These activities may be formal or casual, highly structured or flexible, conducted before the job application process or throughout the career cycle. Professional development opportunities are diverse largely because OST workers represent a range of backgrounds—there is no standard route to becoming a provider. A large number of staff who work directly with youth have no preservice training and many programs do not require specific credentials or academic degrees; in fact, very few colleges and universities offer courses or degrees in OST or after school (Partnership for Afterschool Education, 1999). Therefore, a range of professional development opportunities can be beneficial, and overarching “professional development systems,” or combinations of various development activities, are likely to show the largest and most sustained effects when they are evaluated (Costley, 1998). Professional development settings include:

• Higher education, such as continuing education courses and degree programs
• Preservice training and new-staff orientation
• In-service training provided by programs to current staff
• Training seminars and resource centers provided by external organizations outside the program setting
• Local and national credentialing systems and programs
• Local and national conferences
• Mentoring programs and relationships
• Ongoing informal resources, such as newsletters, online discussion boards, and “brown bag” lunches for staff members to share ideas and expertise

In many national organizations (e.g., Girls Inc. and 4-H), staff development is coordinated by a central office and integrated into the national infrastructure. For smaller local programs, access to professional development opportunities is often facilitated by regional “clearinghouses” and advocacy organizations. A recent and promising approach to professional development is to harness audio, video, and computer technologies to provide distance-learning programs for teacher and staff training.

From college courses to staff training seminars, professional development activities vary according to program characteristics and providers’ needs. At the program level, operating budgets and program infrastructure have a large impact on the kinds of trainings and other opportunities offered. In addition, the type and setting of professional development activities depend, in part, on the program’s staffing model (i.e., who is hired and for what purpose). OST staffing models range from employing full-time directors who harness volunteer energy, to engaging older youth as staff members, to contracting with content-specific consultants to work on special arts and music programs. Many programs incorporate several of these methods.

For example, the New York City Beacons Initiative3 offers a career ladder, which can eventually lead from the volunteer level to paid employment. The program encourages youth to run many activities themselves and to become future providers, offers the MOSAIC Employment Club to provide job skills training to community members, employs only college graduates or current students to promote higher education for youth participants, and holds mandatory weekly training meetings. LA’s BEST4 provides opportunities for “alumni” to return to the program as volunteers and eventually as paid staff members, thus continuing to engage young people in the program


even after they have “graduated.” The San Francisco Beacons Initiative5 also offers a variety of training models for their diverse staff, including a citywide training network for directors and administrators, weekly training for Americorps volunteers, and on-the-job training for contractual employees, such as visiting artists.

THE APPRENTICESHIP MODEL

A new and promising approach to professional development is the apprenticeship model, in which youth organizations provide employment in out-of-school time settings, along with training and college-level courses. This model benefits both staff and programs: It provides staff members with training and experience, which will increase their skills and job opportunities, and provides programs with dedicated staffers who are committed to youth development.

The Youth Development Practitioner Apprenticeship Program (YDPA)1 is a national initiative of the U.S. Department of Labor (DoL) to provide a special out-of-school time certification for current and future youth workers. Local YDPA programs combine mentoring, direct instruction, and on-the-job training, on topics ranging from youth development to program organization. DoL is working with organizations around the country, including 4-H,2 to implement this initiative.

The Teaching Fellowship Program3 is another apprenticeship model, run by Americorps, in partnership with Boston’s Citizen Schools, Lesley University, and local nonprofit and commercial organizations. Participating Teaching Fellows work at local organizations in the mornings, teach at Citizen Schools in the afternoons, and work toward a master’s degree in education with a special concentration in Out-of-School Time Education.

1 For more information on YDPA, visit the National Clearinghouse for Youth Development Practitioner Apprenticeship Programs website at www.ydpaclearinghouse.org.
2 For more information on the National 4-H Youth Development Practitioner Apprenticeship Certificate Program, visit www.nae4ha.org/ydpa.
3 For more information on the Teaching Fellowship Program, visit the Citizen Schools website at www.citizenschools.org/teachingfellows/index.cfm.

How Does Professional Development Impact Program Quality?

Professional development activities can impact program quality on at least two levels. First, and most important, staff development can affect youth outcomes. Evidence of this process comes from other fields related to OST. Early childhood education studies have found that the quality of daycare children receive is associated with providers’ educational attainment and participation in training workshops (NICHD Early Child Care Research Network, 2001; Norris, 2001), and that quality of care is in turn linked with children’s more positive social and cognitive outcomes (NICHD Early Child Care Research Network, 2001). Similarly, an education reform study showed that elementary school students performed higher in reading and math when their teachers had participated in and given high ratings to a professional development program that provided content knowledge and instructional strategies aligned with state education standards (Westat & Policy Studies Associates, 2001).

The positive impact of staff development most likely operates through a chain of events: Professional development activities affect the practices that youth workers use in their programs, and these practices in turn affect youth participants. For example, well-trained providers use their knowledge of child and adolescent development to interact positively and effectively with youth. Research from developmental psychology has shown that having at least one positive and caring relationship with an adult can buffer children against risk factors in their lives, and in fact, can predict positive outcomes in later life (e.g., Gambone, Klem & Connell, 2002; Werner & Smith, 1982). OST programs provide a vital opportunity for such connections, whether the program’s focus is on academics, leadership skills, or other outcomes (Seligson & Stahl, 2003). Staff training can also help providers plan enriching activities, connect with participants’ families, and communicate with community stakeholders—all of which are elements of high quality programs (Eccles & Gootman, 2002; Rosenthal & Vandell, 1996; Tolman et al., 2002).

Secondly, professional development is important because of its potential impact on the sustainability of the youth development workforce. OST providers who have professional development opportunities have reported feeling more confident and more satisfied with their jobs (Center for School and Community Services, Academy for Educational Development, 2002; Girls Inc., 1996). These outcomes could lead to better staff retention and lower turnover rates. It is important to emphasize the possibility, however, that staff retention may improve only if staff who have training opportunities also have opportunities for increased responsibility and higher compensation. One method for addressing this need is the career development ladder, a system that would tie increases in


staff salaries and responsibilities to ongoing education and experience. Advocates propose that a career development ladder would increase the professional status of OST providers and, correspondingly, staff retention rates (Brown, 2002). Findings of a recent pilot study indicate that a wage ladder in early childhood education had positive effects on staff retention and morale (see the box on this page). Although there are some distinctions between early childhood and OST (most obviously, the age of the children served), early childhood education initiatives may serve as models for the OST field, and there is some evidence to suggest that including a career ladder is a key component for system building in OST (National Institute on Out-of-School Time & Academy for Educational Development Center for Youth Development and Policy Research, 2003).

PROFESSIONALIZING THE WORKFORCE: WASHINGTON STATE’S CAREER LADDER INITIATIVE

The Washington State Early Childhood Education Career and Wage Ladder is a pilot project run by the Economic Opportunity Institute. It uses public funding to tie salary increments for childcare workers to continuing education, experience, and responsibility. Its goals are to professionalize the status of childcare workers, increase job satisfaction and retention, and improve the quality of childcare. Evaluation results from the pilot study showed that childcare workers in centers with career ladders had higher job retention rates, exhibited higher morale, and were more likely to complete higher education courses than childcare workers in centers without career ladders. For more information on this project and its results, visit www.eoionline.org/Policy-EarlyLearningAndCare.htm#ECE.

Why Evaluate Professional Development Initiatives?

Despite the prevalence of professional development activities and their numerous potential benefits, the effectiveness of these efforts cannot be ensured by good intentions alone. Evaluations are needed to measure and verify their quality and usefulness. However, evaluations of professional development have been sparse. According to a scan of our Out-of-School Time Program Evaluation Database,6 very few program evaluations include information about training or staff development, and staffing information in these evaluations is usually limited to hiring qualifications and staff/child ratios. This could be due to a number of barriers, including limited time and resources for evaluation and limited evaluation skills. Educators and other youth workers often report feeling that evaluation of professional development wastes valuable staff time (Guskey, 2000). However, evaluation is a critical part of the professional development process because it identifies which program elements are (or are not) successful, which ultimately leads to the creation of more effective and efficient programs.

Although many core features of assessment are consistent across contexts, there are some features of evaluating professional development programs and initiatives that are unique. Most important, and unlike many other program elements, it is necessary to demonstrate the effect of professional development activities at multiple levels of outcomes. The next section of this brief describes a multilevel framework for understanding and implementing evaluations of professional development in OST programs.

A Multilevel Framework for Evaluation

Professional development experts agree that evaluations can and should assess the effects of professional development programs at multiple levels (Guskey, 2000; Killion, 1998; Kirkpatrick, 1998; Mizell, 2003). Some experts recommend three levels, while others add a fourth. Kirkpatrick’s framework, which has been a model for evaluating training initiatives for several decades, details four levels of evaluation: (1) reaction to the training, (2) learning of information and practices from the training, (3) transfer of this knowledge into practice, and (4) results for key stakeholders. Table 1 below presents these four levels and applies them to professional development in OST.

TABLE 1. THE FOUR-LEVEL EVALUATION MODEL APPLIED TO OUT-OF-SCHOOL TIME (OST)

Level       | Applied to OST
1. Reaction | Feedback from OST providers about training needs and satisfaction
2. Learning | OST providers’ knowledge of best practices
3. Transfer | The practices used by OST providers
4. Results  | Positive developmental outcomes for youth and other stakeholders, such as families and communities

As noted above, the reality of evaluation in the context of limited time, resources, and evaluation skills often translates into a scarcity of much-needed evaluation information that programs can use for accountability and program improvement. The four-level model can facilitate professional development evaluations in OST, because it describes how evaluation can be conducted—and how it can be useful—at each level.

Before planning an evaluation strategy or choosing an evaluation level, it is important for professional development evaluators to define their goals. These goals may be affected by the program’s stage of development. For example, evaluation of a first-time program may place more emphasis on implementation and provider opinions; after implementation has been studied and improved, evaluators may then turn more attention to youth outcomes. Each of the four evaluation levels plays an important role in facilitating learning about the initiative and, ultimately, in improving it. The sections that follow describe the uses of each level, present examples of evaluations at each level, and, finally, offer recommendations for future evaluation.

Our review of evaluations reveals that most current professional development evaluations focus on the first level: They collect information about provider needs and program satisfaction that can be used to reflect on program practice. Only a few program evaluations have reported outcomes on the second and third levels of provider knowledge and program practices, and, to our knowledge, no completed studies have yet examined the fourth level—how professional development efforts impact participant outcomes. The results of our scan of OST professional development evaluations and their findings are provided below and are also compiled in Table 2 on page 8.

Level 1 – Reaction: Understanding Providers’ Needs and Satisfaction

The first level of evaluation is quite common among OST professional development initiatives: Many initiatives describe the demographics of participating youth workers and their satisfaction with the programs. The question answered by these surveys relates to staff members’ perceptions of the training programs. As such, the results can be helpful in improving the professional development program from the perspective of its participants, which could lead to higher participation and higher levels of engagement.

Level 1 evaluations usually take the form of brief participant surveys, either immediately following the training or soon after. For instance, the After-School Corporation (TASC)7 studies perceptions of their professional development programs from multiple perspectives, including feedback from the participants at the conclusion of the workshops, self-assessments by the contractors who run the trainings, and annual surveys of site coordinators at each OST program. Similarly, Bridging the Gap,8 a training program run by Boston Public Schools to link OST learning with school learning, administers surveys of perceptions about the training’s usefulness immediately after the program and also later in the year for follow-up. Both organizations report that this feedback has helped them to better understand the needs of OST providers and to improve their training initiatives.

Levels 2 and 3 – Learning and Transfer: Increasing Knowledge and Improving Program Practices

Some evaluations conducted at both the state and national levels have taken additional steps to document the effects of professional development on the second and third outcome levels—OST provider learning and transfer of this knowledge into practice. The learning and transfer categories can refer to both general principles, such as youth development and effective management, and to specific content, such as the application of a lesson or activity that was taught during a professional development workshop. Level 2 and 3 evaluations can help establish the effects of professional development initiatives on OST providers, providing valuable information about whether the initiatives should be continued, how they work, and how they can be improved. Below we feature five efforts to examine increased knowledge and improved program practices through evaluation levels 2 and 3.

Making the Most of Out-of-School Time

Making the Most of Out-of-School Time (MOST)9 was a multiyear initiative, designed to build community-wide support and infrastructure for OST programs in Boston, Seattle, and Chicago. As part of its system-building efforts, MOST supported the creation and availability of college-level courses, tuition subsidies, staff mentoring, on-site training, and other professional development activities. These professional development efforts were evaluated with a descriptive methodology, using observations, interviews, and surveys to collect information at participating programs (Halpern, Spielberger & Robb, 2001). Specifically, evaluators examined a variety of OST program features, including staff training and quality.

Evaluators of the MOST initiative found that the staff development efforts were effective, but more so for some staffers than others: Program directors benefited more than “front-line” providers. For front-line staff, workshops and college courses increased their knowledge of youth development, but this knowledge was not often integrated into their program practices. Furthermore, there were particular barriers to the effectiveness of college course offerings: Staff members were reluctant to enroll, frequently did not complete courses, faced time and financial constraints, and often found that course completion did not result in higher salaries.



San Jose 4 Quality

A similar methodology was used to evaluate San Jose 4 Quality,10 a multifaceted initiative for improving OST programs in the San Jose, California, area. San Jose 4 Quality was designed to affect four desired outcomes, including creation of a better-trained workforce in local programs. Staff improvement components included four staff trainings per year with follow-up meetings and presentations to discuss and reinforce implementation of training material, regular meetings of lead staff from local programs, and site-specific technical assistance. The initiative has been evaluated using a range of methods, from observations to interviews to surveys, with findings due out in an annual report in 2004.

Like MOST, preliminary results of San Jose 4 Quality suggest a pattern of positive results, accompanied by some challenges. Notably, staff members reported that the initiative broadened their horizons, heightened their sense of professional affiliation, and increased opportunities for networking and support. Other results of the initiative may also be related in part to the staff training component, such as improved staff interactions with youth.

Girls Inc. Strong, Smart and Bold Initiative

An evaluation of Girls Inc.’s Strong, Smart and Bold11 initiative used baseline data12 to move the field toward a higher level of rigor (Girls Inc., 1996). The initiative provided intensive training for program directors in developing skills for planning, implementing, managing, and evaluating best practices, and in integrating the Girls Inc. mission at local affiliate programs. The training was grounded in a combination of theories of youth development, organizational change, and adult learning and was envisioned as a systems approach in which program directors and executive directors worked together.

The initiative was assessed using two types of evaluations. An internal evaluation conducted by Girls Inc. staff was used primarily for continuous program improvement, while a case study evaluation of four affiliate programs was conducted by external evaluators. The case study evaluation used a qualitative methodology13 to assess organizational culture, operations, and programs of four representative affiliates. Data were collected from site visits at four time points, using semi-structured interviews (with directors, staff, youth participants, their parents, and community stakeholders), structured program observations, and site “portfolios” of program records and self-assessments.

The evaluation illuminated many positive effects at evaluation levels 2 and 3, including (but not limited to) increased confidence, enthusiasm, and understanding of Girls Inc.’s mission among program directors; increased capacity to serve traditionally underserved populations; implementation of more varied programming and increases in youth participants’ active involvement in program planning; and directors’ use of more intentional planning strategies in both the short and long term.

However, the evaluation also revealed some less positive findings. Most notably, staff turnover remained a significant challenge for the affiliate programs after the initiative, and the goal of fostering teamwork between program directors and executive directors was not always fulfilled. In addition, although respondents reported increased awareness of the importance of parental involvement, diversity in staff recruitment, and ongoing staff training, little progress was made in these areas. Results revealed that front-line staff needed more ongoing training in youth development and appropriate activity structure.

North Carolina Quality Enhancement Initiative

Baseline comparisons were also used in a study evaluating the North Carolina Quality Enhancement Initiative (NC QEI) (Hall & Cassidy, 2002), a project designed to prepare selected OST programs in North Carolina for the process of accreditation by the National School-Age Care Alliance (NSACA) (now known as the National Afterschool Association). Like the MOST Initiative, NC QEI was a comprehensive program improvement project that included a professional development component. The initiative began with a 2-day staff workshop and provided ongoing consultation and support for 9 months.

Both before and after the initiative, standardized observations and staff interviews were conducted to obtain quantitative data on OST program environments and staff-child relationship quality. Participating programs showed statistically significant improvements in each of these areas. Furthermore, the largest improvements were in programs that initially scored lowest on a predetermined set of quality indicators, i.e., those which served larger numbers of youth, did not have state licenses, and were led by directors without college degrees. One caveat should be noted, however: It is possible that program improvements were due not only to the training program, but also to awareness of and increased effort for the ensuing accreditation process.

Building Exemplary Systems for Training Youth Workers

An evaluation of the Building Exemplary Systems for Training Youth Workers (BEST) initiative14 also reported statistically significant changes from baseline to follow-up at outcome levels 2 and 3 (Center for School and Community Services, Academy for Educational Development, 2002). To address the problem of fragmented, disjointed training efforts, BEST was created as a cohesive and theory-based approach to professional development. The primary goal of the program was to provide staff training, with a secondary goal of building a


community-wide career support infrastructure for youth workers (e.g., availability of college degree programs, accreditation systems, etc.). While the 15 national sites differed in minor ways, all relied heavily on a positive youth development curriculum, which emphasized the strengths and assets of youth.

Built in to the BEST initiative was an evaluation study that used surveys, interviews, and diaries administered before and after the program to assess whether the training program changed the practices of participating youth workers. Using this method, researchers found that, for the majority of youth workers (over 70%), BEST had a positive impact on the practices they used in their programs. Specifically, data indicated the following significant changes after the training: Youth workers (1) were more likely to integrate youth development knowledge and theory into their program planning, (2) were more likely to involve youth participants in the process of running the program, (3) learned a common language of positive youth development, which allowed them to communicate more effectively, and (4) experienced a greater amount of networking and information sharing with other providers.

The BEST program also partially achieved its second goal: Youth workers and community stakeholders reported an increase in the amount of support provided by their organizations in the form of staff mentoring and supervision. However, youth workers continued to request more support after the program. In interpreting these results, it is important to note that the BEST study assessed training participants’ reports of whether they changed their youth work practices, and did not include more objective measures such as observations by external evaluators.

The level 2 and 3 evaluations reported here demonstrate both the potential of staff development initiatives and the

comes. Clearly, level 4 evaluations play an essential role in demonstrating the ultimate goal of all staff work—to improve the lives of youth. These evaluations can also play an important role in supporting sustainability for professional development initiatives. However, program directors and evaluators should be aware that positive youth outcomes, while most conclusive, are also the most difficult to establish. A multitude of factors operate to influence youth outcomes, both within and outside of OST programs. Furthermore, OST programs aim to affect a broader range of outcomes and operate on a less structured and defined schedule than school-day programs. The complexity of these features makes it difficult to demonstrate that youth outcomes are due specifically to staff training. Evaluators should also consider the age and stage of their professional development initiatives; affecting youth outcomes takes time, and newer programs may wish to first investigate levels 1, 2, and 3. For these reasons, it is important to examine youth outcomes in combination with the other three levels of outcomes (satisfaction, knowledge, and practices).

To date no studies have reported on the fourth outcome level—whether professional development programs affect outcomes for the youth served by participating OST providers. Level 4 outcomes will be part of an in-progress evaluation of the New York State School-Age Care Credentialing Project (SACC). SACC, begun in 1999, promotes quality youth services by providing specific standards, training, and evaluation of school-age care providers. In 2002, the Cornell Early Childhood Program began to evaluate this new statewide credential, using a multistage evaluation approach similar to the framework we present in this brief.15

To date, this multifaceted evaluation has focused on three main goals. The first goal is to document the history and development of SACC through in-depth interviews with SACC development leaders. The second goal is to
need for additional strategies and investments to translate examine the effects of SACC on the following level 1
this knowledge into practice. A common theme across all and 2 evaluation outcomes: school-age care providers’
the evaluations was that the staff initiatives were beneficial participation in professional development; beliefs and
but not sufficient on their own, particularly for front-line attitudes about providing school-age care; and employ-
youth workers. These studies also offer important lessons ment characteristics, such as salary, responsibilities, and
about evaluation design: Each evaluation approach had job retention. Pre- and post-questionnaires are adminis-
unique strengths (e.g., MOST provided rich descriptive tered to participating providers and a comparison group
information, BEST used baseline comparisons, Strong, of providers for this purpose. The third goal is to assess
Smart, and Bold used program observations), but none the impact of SACC on the overall quality of school-
combined these methods for the most rigorous possible age programs, with a particular emphasis on providers’
evaluation design. practices and interactions with children (evaluation level
3), using pre- and post-program observations. Analyses
Level 4 – Results: Assessing Impact on of these questions are currently underway, with plans to
Participants disseminate findings widely.16 With additional funding,
The studies described above represent important advances the evaluation will expand to include an examination of
in evaluation. However, they do not demonstrate the im- evaluation level 4, i.e., how SACC affects the social and
pact of professional development initiatives on youth out- academic outcomes of children in school-age programs.
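The pre- and post-questionnaire design with a comparison group described above can be sketched in a few lines of code. This is an illustrative sketch only; the scores and group sizes below are hypothetical, not data from the SACC or BEST evaluations:

```python
# Illustrative sketch (hypothetical data): comparing pre/post knowledge scores
# for providers who received training against a comparison group of providers
# who did not, as in the SACC evaluation design described above.

def mean(xs):
    return sum(xs) / len(xs)

def pre_post_gain(pre, post):
    """Average change from baseline to follow-up for one group."""
    return mean(post) - mean(pre)

# Hypothetical level-2 knowledge scores (0-100 scale)
trained_pre = [52, 61, 48, 70, 55]
trained_post = [68, 74, 63, 81, 70]
comparison_pre = [50, 63, 47, 69, 57]
comparison_post = [53, 64, 49, 71, 58]

gain_trained = pre_post_gain(trained_pre, trained_post)
gain_comparison = pre_post_gain(comparison_pre, comparison_post)

# The difference between the two gains estimates the change attributable
# to the training rather than to outside factors.
net_effect = gain_trained - gain_comparison
print(round(gain_trained, 1), round(gain_comparison, 1), round(net_effect, 1))
```

Comparing the trained group's gain against the comparison group's gain, rather than looking at the trained group's gain alone, helps separate the effect of the training from changes that would have occurred anyway.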



TABLE 2: THE MULTILEVEL EVALUATION MODEL IN PRACTICE: EXAMPLES FROM THE FIELD
This table illustrates the four-level method of evaluating professional development through examples from out-of-school time (OST) professional development initiatives. This method can assess outcomes for OST providers participating in professional development, their programs, and the youth they serve.

Level 1: Reaction
Purpose:
• Understanding OST providers’ needs (e.g., skills, knowledge, etc.)
• Assessing participating providers’ satisfaction
• Improving professional development initiative from providers’ perspectives and increasing participation
OST professional development initiatives:
• Bridging the Gap
• TASC
• Numerous other unpublished initiatives
Sample evaluation questions:
• How do staff and administrators perceive training workshops?
• What additional training is needed?
Sample evaluation methods and sources:
• Surveys of workshop participants (pre/post or post only)
• Trainer surveys
• Participating providers’ self-assessments
• Annual program coordinator surveys
Findings:
• Participating OST providers perceived workshops to be useful
• Trainers reported greater knowledge of participant needs
• Trainers reported workshop improvement due to feedback

Level 2: Learning
Purpose:
• Assessing whether participating OST providers learned new information from the initiative
• Determining how the initiative worked, how it could be improved, and whether it should be continued
OST professional development initiatives:
• BEST
• Girls Inc. Strong, Smart and Bold
• MOST
• New York State School-Age Care Credentialing Project (in progress)
• San Jose 4 Quality
Sample evaluation questions:
• Did youth workers have greater knowledge of youth development principles after a multi-session training program?
• Did program directors have a clearer understanding of the national program mission following a leadership institute?
Sample evaluation methods and sources:
• Interviews
• Surveys of participating providers
• Diaries kept by providers during the professional development initiative and their OST programs
Findings:
• Front-line staff reported increased knowledge of youth development after workshops and college courses
• Providers experienced a heightened sense of professional affiliation
• Program directors reported increased confidence, enthusiasm, and understanding of program mission
• Staff recounted “broadened horizons”
Challenges:
• Front-line staff reported needing more training in youth development

Level 3: Transfer
Purpose:
• Assessing whether participating OST providers applied the information from the initiative in their programs
• Determining how the initiative worked, how it could be improved, and whether it should be continued
OST professional development initiatives:
• BEST
• Girls Inc. Strong, Smart and Bold
• MOST
• North Carolina Quality Enhancement Initiative
• New York State School-Age Care Credentialing Project (in progress)
• San Jose 4 Quality
Sample evaluation questions:
• Did front-line providers plan and implement age-appropriate activities after youth development training, and did they involve program youth in activity planning?
• Did staff and youth develop more positive relationships?
• Did the overall program environment improve?
• Was there an increase in networking and communication among program directors after attending a multi-session leadership institute?
Sample evaluation methods and sources:
• Interviews
• Surveys of participating providers
• Observations
• Case study method
• Site portfolios (program records and self-assessments)
• Diaries kept by providers during the professional development initiative and their OST programs
Findings:
• Staff implemented more varied programming and increased youth’s active involvement in program planning
• Participating providers were more likely to integrate youth development principles into program planning
• Program directors used more intentional planning strategies
• OST programs’ capacity to serve traditionally underserved populations increased
• Staff-child relationships improved
• Program environments were rated as more positive
• Providers communicated more effectively
• Youth workers reported more support (e.g., mentoring and supervision) and networking
Challenges:
• Program directors benefited more than front-line providers
• Improved knowledge of youth development was not integrated into practices by front-line staff
• Front-line staff needed more ongoing training in youth development and appropriate activity structure
• Staff turnover continued
• Teamwork skills did not improve
• Staff members were reluctant to enroll in college courses, frequently did not complete courses, faced time and financial constraints, and often found that course completion did not result in higher salaries

Level 4: Results
Purpose:
• Assessing whether the initiative improved youth outcomes
• Determining whether the program should be continued
OST professional development initiatives:
• New York State School-Age Care Credentialing Project (in planning stage)
Sample evaluation questions:
• Did youth’s social outcomes improve? (planned)
• Did youth’s academic outcomes improve? (planned)
Sample evaluation methods and sources:
• Not available
Findings and challenges:
• Not available
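For evaluators who want to turn the four levels above into a planning checklist, the model can be encoded as a simple data structure. The sketch below is our own illustration; the field names and condensed wording are ours, not an instrument from any of the initiatives listed:

```python
# A minimal sketch (names and wording are our own, condensed from Table 2)
# of the four-level evaluation model as a planning checklist.

EVALUATION_LEVELS = {
    1: {"name": "Reaction",
        "asks": "Were participating providers satisfied?",
        "sources": ["participant surveys", "trainer surveys"]},
    2: {"name": "Learning",
        "asks": "Did providers gain knowledge or skills?",
        "sources": ["interviews", "surveys", "provider diaries"]},
    3: {"name": "Transfer",
        "asks": "Did providers apply what they learned?",
        "sources": ["observations", "site portfolios", "case studies"]},
    4: {"name": "Results",
        "asks": "Did outcomes improve for youth?",
        "sources": ["youth reports", "school records"]},
}

def plan_outline(max_level):
    """List the questions an evaluation must answer up through max_level."""
    return [EVALUATION_LEVELS[k]["asks"]
            for k in sorted(EVALUATION_LEVELS) if k <= max_level]

# A newer program might begin with levels 1-3, as the brief recommends,
# before attempting to measure youth outcomes at level 4.
for question in plan_outline(3):
    print(question)
```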



Implications for Future Evaluation and Practice

Our review confirms that professional development initiatives are associated with positive outcomes, including providers’ satisfaction and use of a positive youth development framework. Also of note, several of these evaluation reports highlighted the role of professional development in improving opportunities for networking, information sharing, and social support among providers. However, to date, evaluations of OST professional development efforts have often been limited to the first evaluation level—participants’ immediate reports of satisfaction with the training (Killion, 1999; Mizell, 2003). The true prevalence of these satisfaction surveys is not known because most are unpublished and used internally by programs; however, our scan of the field reveals them to be a very common practice. This first evaluation level is an important one for learning; training programs and trainers can gain valuable information by attending to the voices of the providers they serve. However, to really understand the benefits of professional development efforts, programs need to engage in more comprehensive evaluations that address issues of knowledge and skill building, best practice, and effectiveness.

To facilitate this goal and to demonstrate the important role that professional development plays in contributing to quality OST programs for young people, we make the following recommendations for future evaluation. These recommendations apply to all professional development initiatives, but some are more critical at specific evaluation levels, as noted below.

Plan the evaluation before the professional development program starts. Evaluation planning should begin at the same time as program planning. First, this improves the program by encouraging planners to develop a theory of how the program will affect outcomes. Second, it improves the quality of the evaluation by providing the opportunity to collect baseline, or pre-program, measures of the intended outcomes. Incorporating this information into data analyses makes for a stronger research design. This recommendation is particularly important for evaluation levels 2, 3, and 4.

Plan a timeline of evaluation. Effectively assessing all four levels of outcomes requires strong organizational skills, a specific timeline outlining when each aspect of the process will occur, and a commitment to following this plan (Mizell, 2003).

Use “backward planning.” Evaluator Thomas Guskey (2001) highlights the importance of starting with the desired outcomes for youth and working backward to design activities that reflect these goals. This is particularly important in trying to affect level 4 (youth) outcomes. Many professional development programs use the opposite method, starting with activities requested by providers, even if there is no proof that these activities are linked with outcomes (Killion, 1999). While the latter method may score highly on satisfaction, it may not affect youth outcomes. According to staff development expert Hayes Mizell (2003), “It is what students need rather than what adults want that should shape educators’ learning.”

Choose outcome indicators based on the goals of the program. Outcome data at levels 2, 3, and 4 should be collected differently depending on the goals of the program. If the goal is to increase OST providers’ tutoring skills, indicators might include providers’ knowledge of tutoring strategies, as well as students’ end-of-year grades. If the goal of the program is to increase positive relationships between students and staff, indicators might include staffers’ and youth’s communication skills and reports of mutual trust. Evaluators should consider using multiple outcome indicators, particularly if the program has multiple goals.

Collect data from appropriate sources. After choosing indicators, the next decision should concern the appropriate sources: From whom should the outcome data be collected? Data can be collected from OST providers, program youth, parents, school records, and other sources. The choice of data source largely depends on the evaluation level. For example, to measure satisfaction and knowledge (levels 1 and 2), evaluators should collect data from OST providers themselves. To measure youth outcomes (level 4), data should be collected from youth’s own reports and also from archival documents, such as school records and police reports. (For more information on choosing appropriate sources and data collection measures, see our Out-of-School Time Evaluation Snapshots, available online at www.gse.harvard.edu/hfrp/projects/afterschool/resources.)

Assess long-term outcomes. The process of developing and applying knowledge takes time; the results of professional development programs may not be apparent immediately after program completion. Accordingly, evaluations should assess outcomes for providers’ practices (level 3) and youth (level 4) over multiple years (Guskey, 2000).

Base professional development activities and evaluations on strong theory and research. As in the field of education, there is a growing push for OST programs and practices that are based on rigorous research studies, which the government refers to as scientifically based research (Bouffard, 2003). Theory and research from youth development fields provide a reliable framework for understanding both what works and why. As such, research-driven practices are an important complement to the personal experiences and expertise that OST professionals bring to bear on staff training. They are



also an important part of establishing how to evaluate professional development programs, since evaluations should assess only those outcomes they intend to affect. Although the use of research-based trainings is increasing (e.g., the BEST initiative), it is not yet the norm across staff training programs.

Just as professional development initiatives play a key role in ensuring quality out-of-school time programming, evaluation plays a key role in ensuring effective professional development initiatives. Evaluation results can be used to understand and build on the strengths and weaknesses of these initiatives, to meet accountability requirements, and to advocate for more investment in professional development. As a result, they can help to establish a more coherent and consistent framework for professional development activities, including a common language and set of goals. This common framework has the power to bring information—and quality—to all professional development programs, regardless of size, location, or focus.

Suzanne Bouffard, Research Analyst
Priscilla M. D. Little, Associate Director and Project Manager

Acknowledgements

Preparation of this brief was made possible through the support of the Charles Stewart Mott Foundation and the W. K. Kellogg Foundation. We would also like to thank Julia Coffman of Harvard Family Research Project, Pam Garza of the National Assembly of Health and Human Service Organizations/National Collaboration for Youth, Nancy Peter of the Out-of-School Time Resource Center at the University of Pennsylvania, and Michelle Seligson of the Wellesley Centers for Women, for their insightful review and feedback.

Notes
1 In this brief we use the terms OST provider and youth worker to refer to individuals who work with young people, ages 5–18, in programs that occur outside of school hours. These out-of-school time (OST) programs include before and after school programs, tutoring and mentoring programs, and community-based youth development programs.
2 This publication is designed to focus on evaluation rather than to provide a comprehensive portrait of professional development programs; for readers interested in professional development resources, a list is provided on page 11.
3 For more information on the New York City Beacons Initiative, visit the website of the Fund for the City of New York at www.fcny.org/html/youth/school.htm.
4 For more information on LA’s BEST, visit their website at www.lasbest.org.
5 For more information on the San Francisco Beacons Initiative, visit the website of the Community Network for Youth Development at www.cnyd.org.
6 Our database contains profiles of OST program evaluations, which are searchable on a wide range of criteria. It is available in the OST section of the HFRP website at www.gse.harvard.edu/hfrp/projects/afterschool/evaldatabase.html.
7 For more information on TASC, visit their website at www.tascorp.org.
8 For more information on Bridging the Gap, contact Dishon Mills at 617-635-1578 or dmills@boston.k12.ma.us.
9 MOST was a collaborative effort by many stakeholders from 1994 to 2001, facilitated by the Wallace Foundation, the National Institute on Out-of-School Time, the Center for Career Development in Early Care and Education at Wheelock College, and local stakeholders in Boston, Chicago, and Seattle.
10 San Jose 4 Quality was a collaboration between the National Institute on Out-of-School Time, the California School-Age Consortium, San Jose area service providers, and the David and Lucile Packard Foundation.
11 The Strong, Smart and Bold initiative was conducted from 1993 to 1996 with support from the Wallace Foundation.
12 Comparing data before the program—at baseline—and after the program can establish whether providers’ knowledge and practices did in fact change over the period of the program.
13 Qualitative data provide descriptive details, often collected from a purposive sample of interviews, focus groups, or observations. In contrast, quantitative data provide objective, numerical data.
14 BEST was a 1996 national pilot program, funded by the Wallace Foundation.
15 The New York SACC evaluation is supported by funds from the Cooperative State Research, Education and Extension Service and the New York State Office of Children and Family Services.
16 For more information on the SACC evaluation, contact Lisa McCabe, Ph.D., Evaluation Project Manager, at lam4@cornell.edu.

References
Bouffard, S. (2003). Doing what works: Scientifically based research in education. The Evaluation Exchange, 9(1), 15, 17. [Available at www.gse.harvard.edu/hfrp/eval/issue21/bbt1.html.]
Brown, J. (2002). Washington State Early Childhood Education Career and Wage Ladder pilot project: Evaluation report of Year 1. Seattle, WA: Economic Opportunity Institute.
Center for School and Community Services, Academy for Educational Development. (2002). BEST strengthens youth worker practice: An evaluation of building exemplary systems for training youth workers. New York: Author. [Available at www.aed.org/ToolsandPublications/upload/best.pdf (Acrobat file).]
Costley, J. (1998). Building a professional development system that works for the field of out-of-school time. Wellesley, MA: National Institute on Out-of-School Time, Center for Research on Women, Wellesley College.
Eccles, J. S., & Gootman, J. A. (Eds.). (2002). Community programs to promote youth development. Washington, DC: National Academy Press.
Gambone, M. A., Klem, A. M., & Connell, J. P. (2002). Finding out what matters for youth: Testing key links in a community action framework for youth development. Philadelphia: Youth Development Strategies and Institute

continued on page 12



SELECT PROFESSIONAL DEVELOPMENT RESOURCES

Achieve Boston, a partnership of local public and private organizations, provides a Web-based clearinghouse of out-of-school time (OST) professional development trainings, seminars, college courses, and other offerings in the Boston, Massachusetts, area. www.achieveboston.org

Bringing Yourself to Work: Caregiving in After-School Environments is a program run by researchers at Wellesley College, which uses a research-driven framework to promote reflective practice and emotional intelligence among OST providers. For more information on the project and on the recent book written by project directors Michelle Seligson and Patricia Jahoda Stahl, see the program’s website at www.bringingyourselftowork.com.

Harvard Family Research Project’s Out-of-School Time Program Evaluation Database is a compilation of profiles of evaluations of OST programs and initiatives. It provides accessible information about evaluation work of both large and small OST programs to support the development of high quality evaluations and programs in the out-of-school time field. www.gse.harvard.edu/hfrp/projects/afterschool/evaldatabase.html

The Massachusetts School-Age Coalition (MSAC) has developed a set of Core Competencies for Massachusetts School-Age Practitioners through the PASS (Professional Advancement of School-Age Staff) Initiative. MSAC has also conducted training workshops across the state, posted a registry of professional development opportunities, and advocated for a career ladder system. www.mass-sac.org

The National Afterschool Association (formerly the National School-Age Care Alliance) Standards for Quality School-Age Care include standards for OST staff members among their six categories of program quality standards. www.naaweb.org

The National Collaboration for Youth of the National Assembly of Health and Human Service Organizations has produced a set of Youth Development Worker Competencies. The competencies are “the knowledge, skills, and personal attributes needed by entry-level youth development workers to support the critical features of positive youth development settings.” www.nydic.org/nydic/documents/FinalCompetencies1.doc (Word file)

The National Collaboration for Youth’s special initiative, the National Youth Development Learning Network, is “designed to build and strengthen youth workers’ performance by analyzing, synthesizing, and linking existing information and training.” Information and professional development resources (including a professional development e-newsletter) are available at www.nassembly.org/nassembly/2003/initiatives.htm.

The National Institute on Out-of-School Time (NIOST) at Wellesley College is a national voice for promoting a skilled and stable workforce. NIOST provides both in-the-field trainings and biannual seminars for OST staff members of all levels, publications about the OST workforce and the field in general, and other resources. www.niost.org

The National Staff Development Council (NSDC) provides extensive resources on professional development. Although focused primarily on education, many of the resources are also helpful for the OST field. They are available at www.nsdc.org. See also the fall 2003 issue of NSDC’s Journal of Staff Development, which focuses on evaluating professional development.

A fall 2004 volume of New Directions for Youth Development will focus on professional development in the youth development field. Edited by Pam Garza, Lynne Borden, and Kirk Astroth, the volume will contain articles on the history and current state of professional development initiatives, the roles of various organizations and stakeholders, and recommendations for the field.

The North Central Regional Educational Laboratory (NCREL) has created a National Awards Program for Model Professional Development to recognize schools with highly effective staff development programs. NCREL has carefully examined the programs and practices of these schools in order to compile a list of best practices in professional development. NCREL has also created a professional development toolkit that includes worksheets for planning and conducting all stages of the professional development process, including evaluation. Although focused on education, this resource can be used for OST as well. www.ncrel.org/pd/toolkit.htm

The Out-of-School Time Resource Center is housed in the University of Pennsylvania’s Center for Research on Youth and Social Policy. The center aims to help connect OST programs with resources, and features professional development tools and research. The center’s resources include the recent publication, Out-of-School Time (OST) Professional Development Workshops: An Evaluation Framework. www.ssw.upenn.edu/ostrc

The Partnership for After School Education is a “network of professionals … committed to quality education in after school settings” and offers professional development resources. www.pasesetter.com

The After-School Corporation (TASC) has produced a toolkit, Building the Skills of After-School Staff, which includes worksheets for planning, evaluation, and more. It is available at www.tascorp.org/publications/catalog/indrep_n. TASC has also published a series of resource briefs on staffing issues, which are available at www.tascorp.org/publications/catalog.

YouthNet of Greater Kansas City focuses on promoting a set of youth worker standards, providing training and other professional development opportunities, and establishing a monitoring and assessment process. The organization has established a set of standards for providers who work with school-age children and another set for those who work with teens. Information on these standards can be found at www.kcyouthnet.org/developing_standards.asp. Also visit YouthNet’s homepage at www.kcyouthnet.org.



continued from page 10

for Research and Reform in Education. [Available at www.ydsi.org/YDSI/pdf/WhatMatters.pdf (Acrobat file).]
Girls Inc. (1996). Becoming Strong, Smart and Bold: Girls Incorporated program directors as change agents. Indianapolis, IN: Girls Incorporated National Resource Center.
Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press.
Guskey, T. R. (2001). JSD Forum: The backward approach. Journal of Staff Development, 22(3), 60.
Hall, A. H., & Cassidy, D. J. (2002). An assessment of the North Carolina School-Age Child Care Accreditation Initiative. Journal of Research in Childhood Education, 17, 84–96.
Halpern, R., Spielberger, J., & Robb, S. (2001). Evaluation of the MOST (Making the Most of Out-of-School Time) initiative: Final report and summary of findings. Chicago: Chapin Hall Center for Children at the University of Chicago. [Available at www.chapinhall.org/article_abstract_new.asp?ar=1316&L2=62&L3=105.]
Killion, J. (1998). Scaling the elusive summit. Journal of Staff Development, 19(4), 12–16.
Killion, J. (1999). Design staff development with student needs in mind. Results. Retrieved November 15, 2003, from http://www.nsdc.org/library/publications/results/res4-99killion.cfm
Kirkpatrick, D. L. (1998). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler.
Little, P. M. D. (2004). A recipe for quality out-of-school time programs. The Evaluation Exchange, 10(1), 18–19. [Available at www.gse.harvard.edu/hfrp/eval/issue25/expert3.html.]
McLaughlin, M. W. (2000). Community counts: How youth organizations matter for youth development. Washington, DC: Public Education Network.
Mizell, H. (2003). Facilitator: 10, Refreshments: 8, Evaluation: 0. Journal of Staff Development, 24(4), 10–13. [Available at www.nsdc.org/library/publications/jsd/mizell244.cfm.]
National Institute on Out-of-School Time, & Academy for Educational Development Center for Youth Development and Policy Research. (2003). Strategic plan: Building a skilled and stable out-of-school time workforce. Wellesley, MA: Author. [Available at www.niost.org/about/stategic_plan_building_skilled.pdf (Acrobat file).]
NICHD Early Child Care Research Network. (2001). Nonmaternal care and family factors in early development: An overview of the NICHD Study of Early Child Care. Journal of Applied Developmental Psychology, 22, 457–492.
Norris, D. J. (2001). Quality of care offered by providers with differential patterns of workshop participation. Child & Youth Care Forum, 30, 111–121.
Partnership for Afterschool Education. (1999). Developing the afterschool professional and the profession: Addressing quality and scale. New York: Author. [Available at www.pasesetter.com/publicationResources/Publications/PDF/DevelopingAfterschoolProfession.pdf (Acrobat file).]
Rosenthal, R., & Vandell, D. L. (1996). Quality of care at school-aged child-care programs: Regulatable features, observed experiences, child perspectives, and parent perspectives. Child Development, 67, 2434–2445.
Seligson, M., & Stahl, P. (2003). Bringing yourself to work: A guide to successful staff development in after-school programs. New York: Teachers College Press.
Tolman, J., Pittman, K., Yohalem, N., Thomases, J., & Trammel, M. (2002). Moving an out-of-school agenda: Lessons across cities. Takoma Park, MD: Forum for Youth Investment.
Werner, E. E., & Smith, R. S. (1982). Vulnerable but invincible: A longitudinal study of resilient children and youth. New York: McGraw-Hill.
Westat, & Policy Studies Associates. (2001). The longitudinal evaluation of school change and performance (LESCP) in Title 1 schools. Final report, volume 2: Technical report. Rockville, MD and Washington, DC: Author. [Available at www.policystudies.com/studies/school/lescp.html.]

ABOUT HARVARD FAMILY RESEARCH PROJECT

Founded in 1983 by Dr. Heather Weiss, Harvard Family Research Project conducts research about pro-
grams and policies that serve children and families throughout the United States. Publishing and dissemi-
nating its research widely, HFRP plays a vital role in examining and encouraging programs and policies
that enable families and communities to help children reach their potential.

Harvard Family Research Project


Harvard Graduate School of Education
3 Garden Street
Cambridge, MA 02138
Tel: 617-495-9108
Fax: 617-495-8594
Email: hfrp@gse.harvard.edu
Website: www.hfrp.org

