Summer of Service Evaluation Toolkit

Innovations in Civic Participation © 2011

Summer of Service

Evaluation Toolkit
This toolkit was published with generous support
from Lumina Foundation for Education
Written by Alan Melchior
The Center for Youth and Communities
Heller School for Social Policy and Management
Brandeis University
Waltham, MA


The SOS Program Model

Program Goals
 Develop Positive and Empowered Identity
 Civic Agency
 Increased Academic Engagement and Success
 Meaningful Community Impact

Key Program Elements
 Intentional Program Design and Planning
 Meaningful Service
 Summer-Style Learning
 Youth Voice
 Qualified Staffing
 School, Community and Family Involvement
 Continuity and Intensity
 Present, Reflect, and Celebrate
 Monitor, Evaluate, and Sustain

Participant Outcomes
 Understanding Context
 Civic Agency
 Initiative and Action
 Efficacy and Connection
 Future Civic Roles
 Academic Engagement and Success

Community Outcomes
 Meaningful Service to the Community


The SOS Program Evaluation Toolkit Overview

Evaluation Planning: Bring stakeholders together, define your questions, build your plan.
Data Collection: Select your method and tools, create a data collection strategy.
Analysis: Organize and interpret your results.
Using the Results: Use your results to make your case, plan for program improvement.

Table of Contents

Chapter 1: Introduction to the Summer of Service Evaluation Toolkit
1.1 What is in the SOS Evaluation Toolkit?
1.2 How the SOS Evaluation Toolkit was Developed
1.3 The Summer of Service Program Model
1.4 How to Use the SOS Evaluation Toolkit

Chapter 2: A Program Evaluation Primer
2.1 What is Program Evaluation?
2.2 Why Do Program Evaluation?
2.3 What are the Different Types of Program Evaluation?
2.4 What Kinds of Data Can I Use in an Evaluation?
2.5 What Role do Surveys Play in Program Evaluation?
2.6 Using Surveys to Assess Outcomes and Impacts
2.7 Assessing Impacts – Where Does a Comparison Group Come In?
2.8 A Few More Terms You May Encounter

Chapter 3: Preparing for Your Program Evaluation
3.1 Getting Started: Create an Evaluation Team
3.2 Using a Logic Model to Define Your Program
3.3 Why Use Logic Models?
3.4 How Do I Develop a Logic Model for My Program?
3.5 Using Your Logic Model to Plan Your Evaluation
3.6 Developing an Evaluation Plan
3.7 What Questions Do You Want to Ask?
3.8 What Kinds of Information Can You Use to Answer Your Questions?
3.9 What Methods Should I Use to Collect Data?
3.10 How Are You Going to Use the Results?
3.11 How Can You Make it Do-Able?
3.12 Final Thoughts as You Prepare and Plan Your Program Evaluation

Chapter 4: Implementing Your Evaluation Using the SOS Evaluation Toolkit
4.1 What is in the SOS Evaluation Toolkit?
4.2 The Pre/Post Participant Surveys
4.3 Comparison Group Surveys
4.4 Post-Program Only Survey
4.5 Program Staff/Program Implementation Survey
4.6 Community Partner Survey
4.7 Administering the Surveys: Protecting the Rights of Participants and Partners
4.8 Administering the SOS Participant Surveys
4.9 Final Thoughts on Data Collection

Chapter 5: Analyzing and Using Your Data
5.1 Analyzing Your Data
5.2 What Do You Do with the Results?
5.3 Conclusion


Chapter 6: Evaluation Planning and Reporting Tools
6.1 Logic Model Worksheet
6.2 Planning Your Evaluation: Key Questions
6.3 Data Planning Worksheet
6.4 Building an Evaluation Plan: Tasks, Roles and Timeline
6.5 Presenting Your Results
6.6 Domains and Scales


Chapter 7: Survey Materials
7.1 Pre/Post Participant Survey Materials
7.2 Pre/Post Comparison Group Survey Materials
7.3 Post-Only Participant Survey Materials
7.4 Community Partner Survey
7.5 Semi-Structured Interview Questions



Chapter 1: Introduction to the Summer of Service Evaluation Toolkit

The Summer of Service Evaluation Toolkit is designed to provide a basic understanding of program
evaluation and a set of practical evaluation tools for the emerging network of programs providing
summer of service experiences for middle and high school-aged young people. The Toolkit is the
product of a collaborative effort between Innovations in Civic Participation (ICP), a national non-profit
supporting the development of innovative high-quality youth civic engagement policies and programs,
Brandeis University’s Center for Youth and Communities, and a working group of service-learning
practitioners and researchers. The Toolkit is one of a set of Summer of Service (SOS) resources being
developed by ICP with the support of Lumina Foundation for Education. Other products include a SOS
Program Design Toolkit, a program design training module, and a SOS Online Resource Center.

The Summer of Service (SOS) Evaluation Toolkit has been developed in response to the emergence of
summer of service as a new model for providing intensive, hands-on service-learning experiences for
young people, with a particular focus on summer service-learning as a rite of passage for young people
as they make the transition from middle to high school. The goals of SOS programs are to engage young
people in helping to solve critical community problems and, through that experience, build their civic,
leadership and academic skills and promote higher education aspirations. The impetus towards the
development of a targeted SOS initiative came through the work of ICP in the early part of the 2000s,
whose research on summer of service programs culminated in the publication of Summer of Service: A
New American Rite of Passage? in 2006. Continued interest in SOS programming led to pilot grants
for a small group of programs in 2007 from the Corporation for National and Community Service and,
ultimately, establishment of a new funding stream for summer of service programs through the Edward
M. Kennedy Serve America Act in 2009. The first programs funded under the SOS legislation are being
implemented in the summer of 2010.

The Summer of Service Evaluation Toolkit itself is based on two basic premises:

1. That well-designed, well-implemented evaluations are an essential part of
effective program management and growth. As we’ll discuss later,
evaluations provide the data programs need to both demonstrate effectiveness
(particularly to funders) and to inform program improvement. In the words of
the W.K. Kellogg Foundation’s Evaluation Handbook, evaluation can both prove
and improve programs. If Summer of Service programs are to thrive, they need
the capacity to examine their implementation and impacts and to demonstrate
that they are making a difference for the youth and communities they serve.

2. That service-learning program directors and staff typically do not have the
time and resources to develop a set of evaluation strategies and tools from
scratch. As such, an evaluation toolkit that provides basic guidance on
evaluation and a set of ready-to-use tools can help practitioners begin to build
evaluation into their programs from the start.

Finally, the SOS Evaluation Toolkit also grew out of the recognition that there has been little direct
research on summer of service programs, since they are relatively new, and that the development of a
common set of evaluation tools may help to generate data that can be aggregated across programs to
help assess the effectiveness and strengthen the case for summer of service programs generally. It is

hoped that, as programs begin using the tools in the SOS Evaluation Toolkit, they will share their data
with ICP and others to further inform the development of Summer of Service as a vital and important
element in the service-learning field.


1.1 What is in the SOS Evaluation Toolkit?

The SOS Evaluation Toolkit has been designed to provide an introductory set of knowledge, skills and
tools to help summer of service practitioners plan, conduct, and use the results of credible and reliable
program evaluations as part of their on-going summer of service efforts. Specifically, we have written
the SOS Evaluation Toolkit with the following two goals in mind:

1. To provide service-learning practitioners with a basic understanding of
evaluation methods and approaches, so they can effectively plan and implement a
basic evaluation of their own programs; and

2. To provide a basic set of tools that practitioners can use to get started,
recognizing that, over time, practitioners may want to revise and customize these
tools, or develop their own to better address specific aspects of their own
programs. There is no such thing as a “one size fits all” evaluation, but it is often
easier to start with a basic set of tools and procedures and to build from there.

Reflecting these goals, the SOS Evaluation Toolkit is organized into two parts:

Part One is an introductory evaluation primer, designed to provide service-learning practitioners
with a basic understanding of what evaluation is and to outline the steps involved in planning,
conducting, and using a program evaluation.

Part Two provides the tools (planning tools, survey instruments, forms and instructions, etc.)
that SOS practitioners can use in implementing an evaluation of their own programs.

Taken together, Parts One and Two are intended to provide service-learning practitioners with both the
background knowledge and skills and the practical tools they need to begin proving and improving
the effectiveness of summer of service programs around the country.


1.2 How the SOS Evaluation Toolkit was Developed

The SOS Evaluation Toolkit was designed by the Center for Youth and Communities at Brandeis
University’s Heller School for Social Policy and Management and Innovations in Civic Participation in
conjunction with an evaluation working group comprised of practitioners and researchers with
experience developing and implementing summer of service programs. That working group helped to
develop an initial logic model for summer of service programs with the goal of identifying program
goals, strategies, and outcomes that might be common across a wide variety of summer of service
efforts. The logic model was then used to develop a set of surveys for inclusion in the Toolkit. The draft
logic model, surveys and the toolkit were then reviewed by the working group members and circulated
more broadly to service-learning practitioners for feedback and revisions. The Toolkit also builds on
similar evaluation toolkits developed for the service-learning field – in particular, the Making Knowledge
Productive evaluation toolkit developed by Brandeis for the Massachusetts Department of Education’s
Office of Community Service-Learning.

1.3 The Summer of Service Program Model

As we will discuss in more detail in Part One, the starting point for any effective evaluation is an
understanding of the goals, key strategies or activities, and expected outcomes of the program being
evaluated. As noted above, there is no “one size fits all” evaluation. Rather, effective evaluation design
means making sure that the data you are collecting and what you are measuring are relevant and
appropriate for the program that you are evaluating.

That said, this evaluation toolkit was developed to reflect a set of basic program goals, activities, and
outcomes that we (Brandeis, ICP, and the evaluation working group) believe are relevant and
appropriate for many of the newly emerging summer of service programs. Those goals, strategies and
assumptions are outlined in the SOS Program Logic Model summarized inside the cover of this Toolkit
and outlined in detail at the end of this section. While programs are certainly likely to differ in many of
their details, we believe that the program model outlined here, and the tools developed to reflect that
model, can be used across a variety of summer of service programs as the starting point for their evaluation
efforts.

At the heart of the SOS Program Logic Model is a set of four primary goals for the Summer of Service
Program:

 Develop Positive and Empowered Identity: A sense of empowered identity,
strength and purpose (efficacy), an ethic of service and social responsibility, and
ability to conduct critical reflection on themselves and issues in their community.
Summer of Service should foster a generation of public intellectuals.

 Civic Agency: Youth voice and leadership, and a commitment to action and service.
Summer of Service should help prepare young people to become active contributors
in their communities.

 Increased Academic Engagement and Success: Improved academic skills, including
basic literacy; improved higher-order thinking skills; increased motivation;
increased school success in transitional years; and increased awareness and
understanding of college and careers. Summer of Service should provide young
people with opportunities to apply their learning, experience academic success, and
learn about educational opportunities.

 Meaningful Community Impact: Impacts on community needs, increased school-
community collaborations, increased youth voices in community reform, and changes in
community and educational programming. Summer of Service should engage
youth in making a real difference in their community.

It is anticipated that those goals will be met through a summer service-learning experience that
integrates meaningful service, summer-style (hands-on, engaging, project-based) learning, and youth
voice, led by well-qualified, high-quality staff within an effective, high-quality overall program design.



In turn, we believe that these experiences can result in changes in civic, social, and academic attitudes
and skills for participating young people. Based on the work of the evaluation workgroup, we have
defined six broad sets of participant outcomes that, we believe, fit a broad array of summer of service
programs. They are:

Understanding Context (Analysis):
 Able to reframe issues from the personal to the social/political.
 Able to place issue in broader community/ social context.
 Able to effectively communicate multiple sides of an argument.
 Distinguish between direct service and root cause social change efforts.

Civic Agency:
 Willingness to step up and speak out. Willingness to talk with peers about issues and engage
them in action.
 Willingness to take action about issues that one cares about, on a range of issues (personal to
societal).
 Take action on issues of concern (examples: initiate a project; start school organization; create
community programs; join existing program/project).

Initiative and Student Action (Civic Skills):
 Understand how public decisions are made and how to influence them.
 Ability to communicate effectively with decision makers.
 Ability to gather and use information needed for civic action.
 Articulate a long-term civic engagement action plan.
 Plan and implement a project addressing an issue in the community.
 Ability to assess the impacts of their actions and revise plans as needed.

Efficacy and Connection:
 Sense of personal and civic efficacy.
 Feel part of community.
 Sense that it is important to be engaged.
 See self as important to community.
 Sense of solidarity as motivation for actions.
 Feel ownership of community resources.

Future Civic Roles:
 Understand potential roles that might be taken in society.
 Increased commitment to future civic involvement.

Academic Outcomes:
 Increased academic motivation.
 Greater school attendance, homework, reduced discipline problems.
 Enhanced reading/literacy/science/math skills (depending on project).
 Improved communication and presentation skills.
 Science and math skills for environmental projects.
 Improved metacognition/learning to learn/learning strategies.
 Understand link between school decisions and desired career.

 Understand steps that need to be taken to become college-ready (course choices, etc.).
 Know how to find information on and apply to college.

We also expect that the work of the young people involved in summer of service will result in
meaningful contributions to their communities.

These goals, program elements, and outcomes form the framework for the tools developed for the SOS
Evaluation Toolkit. Not every outcome has been addressed – to do so would result in tools too
cumbersome for use in a relatively short summer program. But we have attempted to build tools that
address all of the domains to at least some degree, and look forward to developing additional evaluation
tools and procedures as the Toolkit and the field evolve over time.


1.4 How to Use the SOS Evaluation Toolkit

We recognize that service-learning practitioners vary widely in their familiarity and experience with
evaluation. Some are entirely new to evaluation and need a basic introduction; some may feel that they
have heard altogether too much about evaluation (“Oh, no! Not again!”), but might benefit from a new,
more user-friendly approach; and some may have substantial experience and feel comfortable with the
basic concepts. We recommend that everyone start by reading through Part One of the Toolkit, with a
more careful read if you are relatively new to evaluation. If you already have a great deal of experience
with conducting program evaluations, some of this may seem elementary. Even so, you may want to
read through this section to refresh your memory around some of the technical terminology.

Part Two contains a number of evaluation planning, data collection, and analysis tools. Again, we
recommend that before using these tools, you finish reading Part One (Chapters Two through Five), which
provide an overview of the four stages of the evaluation process: Preparing for an Evaluation, Collecting
Data, Analyzing Data and Using the Results.





Summer of Service Program/Evaluation Logic Model

Columns: Mission Assumptions/Theory of Change | Strategies/Activities | Civic and Academic Outcomes
Effective Summer of Service programs are
designed to accomplish the following goals for
participating youth and the communities they
serve:
 Develop Positive and Empowered
Identity (a sense of empowered identity,
strength and purpose (efficacy), an ethic
of service and social responsibility, and
ability to conduct critical reflection on
themselves and issues in their
community): Summer of Service should
foster a generation of public intellectuals.
 Civic Agency (youth voice and
leadership, and a commitment to action
and service): Summer of Service should
help prepare young people to become
active contributors in their communities.
 Increased Academic Engagement and
Success (improved academic skills,
including basic literacy; improved higher-
order thinking skills; increased motivation;
and increased school success in
transitional years, increased awareness
and understanding of college and
careers): Summer of Service should
provide young people with opportunities to
apply their learning, experience academic
success, and learn about educational
opportunities.
 Meaningful community impact (impacts
on community needs, increased school-
community collaborations, increased youth
voices in community reform, changes in
community and educational
programming): Summer of Service should
engage youth in making a real difference
in their community.
 High-Quality Staff: Quality of the personnel
is paramount. Staff need to be adequately
trained and have enthusiasm and
commitment. Staff must understand and be
committed to service-learning. Staff need a
solid understanding of adult role in youth
programs: when to lead and direct and when
to step back. Community partners need to
be receptive and full partners in planning,
delivery of service, and reflection/
assessment. Programs need to be
embedded within the community culture to provide
for ongoing opportunities.
 Applied Academics: Academics become
relevant through opportunities to apply skills
in the real world. Service-learning
demonstrates relevance of academics;
deepens and intensifies learning. Academic
learning needs to be an intentional part of
program design and integrated through
hands-on learning.
 High-Quality Program: Programs need to
incorporate elements of quality youth
programs: low adult-student ratio; small
group and one-on-one learning opportunities;
“summer style” hands-on learning that offers
an alternative to institutional opportunities;
emphasis on collaboration and teamwork;
participation of caring adults; and fostering a
sense of safety and belonging. Relationships
matter and are an essential part of program
quality.
 Youth Voice: Youth must have voice and
opportunities for leadership. Service-
learning includes dialogue on the bigger
picture and root causes, and an emphasis on
youth decision-making and leadership to give
youth a sense of their own power and efficacy.

Understanding Context, Analysis:
 Participants engage in community
mapping and/or other activities to
identify issues/needs in the community.
 Participants have the opportunity to
explicitly discuss connection between
community issues, service work, and
root causes of problem or issue.
 Participants have direct contacts with
community partners that include
discussions of context/rationale for
service projects.
 Activities expose participants to diverse
points of view.
Understanding Context, Analysis:
 Able to reframe issues from the personal
to the social/political.
 Able to place issue in broader community/
social context.
 Able to effectively communicate multiple
sides of an argument.
 Distinguish between direct service and
root cause social change efforts.
Civic Agency:
 Participants have opportunities to make
important decisions regarding their
projects.
 Participants have opportunities to
exercise leadership roles on some
aspect of their projects.
 Participants have real
responsibilities/tasks on a project that
they see as significant.
 Students take meaningful action on an
issue in the community.
 Participants have opportunities to talk
with/engage others (students,
community members, etc.) in their
projects.
 Programs/projects have sufficient
duration and intensity so that
participants can become deeply
involved in a community issue/problem
and carry out a meaningful project.
 Participants have opportunities to reflect
on their experience and what they are
learning about their role in the
community.
Civic Agency:
 Willingness to step up and speak out.
Willingness to talk with peers about issues
and engage them in action.
 Willingness to take action about issues
that one cares about, on a range of issues
(personal to societal).
 Take action on issues of concern
(examples: initiate a project; start school
organization; create community programs;
join existing program/project).



 Constructive, safe, sustained and
intensive opportunities. Meaningful
service takes time, so there has to be time
for meaningful service: summer provides a
unique opportunity to accomplish this.
Summer of service should be of sufficient
duration and intensity to provide meaningful
experiences for participating youth. Out of
school time provides time to foster youth
engagement; focus on complex thinking;
and make a difference in the community.
Summer also provides time to access
resources, community, adults, and other
youth.
Initiative and Student Action (Civic Skills):
 Participants conduct research on an
issue or problem, and on the policies and
public decision-makers/civic leaders
who need to be involved in
addressing the problem.
 Participants have opportunities to meet
with/communicate with decision-makers
related to their project.
 Participants develop an action plan for a
project addressing a community need.
 Participants implement action plan,
including revising plan as needed and
tracking progress towards goals.
 Participants take steps to assess the
community impacts of the project and
assess possible next steps.
Initiative and Student Action (Civic Skills):
 Understand how public decisions are
made and how to influence them.
 Ability to communicate effectively with
decision makers.
 Ability to gather and use information
needed for civic action.
 Articulate a long-term civic engagement
action plan.
 Plan and implement a project addressing
an issue in the community.
 Ability to assess the impacts of their
actions and revise plans as needed.
Efficacy and Connection:
 Participants have the opportunity for
team activities and problem solving.
 Participants have opportunity to present
project results to other community
members and/or community leaders.
Efficacy and Connection:
 Sense of personal and civic efficacy.
 Feel part of community.
 Sense that it is important to be engaged.
 See self as important to community.
 Sense of solidarity as motivation for
actions.
 Feel ownership of community resources.
Future Civic Roles:
 Participants have multiple opportunities
to reflect on what they have learned
through their project and how they might
continue as civic actors.
Future Civic Roles:
 Understand potential roles that might be
taken in society.
 Increased commitment to future civic
involvement.
Academic Outcomes:
 Teachers make clear connections
between service activities and academic
skills.
 Reading, writing, and critical thinking/
problem-solving are integrated into
community action through intentional
activities/exercises.
 Participants have opportunities to use
communications and presentation skills,
such as writing letters or brochures,
making public presentations, etc.
 Participants have an opportunity to learn
about career and college options related
to their service-learning activities.
 Participants have opportunities to learn
about resources that can be used in
learning about and applying to college.
 Student reflection includes reflection on
uses of academic skills in service-
learning and the relationship of school
success, college-going, and careers.
Academic Outcomes:
 Increased academic motivation.
 Greater school attendance, homework,
reduced discipline problems.
 Enhanced reading/literacy/science/math
skills (depending on project).
 Improved communication and presentation
skills.
 Science and math skills for environmental
projects.
 Improved metacognition/learning to learn/
learning strategies.
 Understand link between school decisions
and desired career.
 Understand steps that need to be taken to
become college-ready (course choices,
etc.).
 Know how to find information on and apply
to college.
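
For programs that want to work with a logic model programmatically – for example, to check that every
outcome domain has at least one planned activity behind it – the structure above can be captured in a
simple data structure. The sketch below is purely illustrative: the abbreviated entries and the idea of
encoding the model in code are ours, not part of the Toolkit itself.

```python
# Illustrative sketch: encoding a few SOS logic model domains as a simple
# mapping, so a program can check that every outcome domain has at least
# one planned activity. Entries are abbreviated from the model above.

logic_model = {
    "Understanding Context": {
        "activities": ["community mapping", "root-cause discussions"],
        "outcomes": ["reframe issues from personal to social/political"],
    },
    "Civic Agency": {
        "activities": ["youth-led project decisions", "leadership roles"],
        "outcomes": ["willingness to take action on issues of concern"],
    },
    "Future Civic Roles": {
        "activities": [],  # e.g., a domain a program has not yet planned for
        "outcomes": ["increased commitment to future civic involvement"],
    },
}

# Flag any outcome domain with no planned activities behind it.
gaps = [domain for domain, parts in logic_model.items()
        if not parts["activities"]]

print("Domains missing planned activities:", gaps)
```

This mirrors, in code, the kind of gap-checking a program might do on paper with the Logic Model
Worksheet in Chapter 6: does every outcome you hope to measure have a strategy behind it?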

Chapter 2: A Program Evaluation Primer



“Program evaluation is the systematic collection of information about the activities, characteristics, and
outcomes of programs to make judgments about the program, improve program effectiveness, and/or
inform decisions about future programming.”
Michael Quinn Patton, Utilization-Focused Evaluation

“Evaluation is the systematic assessment of the operation and/or the outcomes of a program, compared
to a set of explicit or implicit standards, as a means of contributing to the improvement of the program
or policy.”
Carol H. Weiss, Evaluation

“Project-level evaluation can be defined as the consistent, ongoing collection and analysis of information
for use in decision-making.”
W.K. Kellogg Foundation, Evaluation Handbook


Much has been written about evaluation. There are toolkits, handbooks and textbooks that cover a
wide array of content areas and structures using various approaches and degrees of sophistication. (See
the bibliography provided in the appendix). This chapter of the SOS Evaluation Toolkit is not meant to
replace these existing resources – but it does highlight a few of the key lessons and components that are
essential to making an evaluation process useful and credible.

We recognize that most program directors, service-learning coordinators, and educators are stretched
for time – that you would love to better document your successes, identify shortcomings, and gather
information needed to inform changes, but do not have lots of time to invest in planning and conducting
an evaluation.

At the same time, we recognize the types of problems that can arise if you neglect to begin evaluating
your programs early on, or if you try to throw together an evaluation at the last minute. Evaluations
have become essential tools in the effort to maintain funding for service-learning and other youth and
education programs. And evaluating service-learning, like service-learning itself, is a process that
requires preparation, action and reflection. Knowing why, how and for whom you are doing an
evaluation is as important as administering a survey and collecting information. This chapter provides
an overview of program evaluation to help you understand what a program evaluation is and
whether (and how) you should be doing one.

Finally, it is difficult to go too far into the world of program evaluation without encountering a good deal
of “jargon.” This is unfortunate, and we will do our best not to add to the problem. We hope the rest
of this chapter will help de-mystify the practice and language used by professional researchers and
evaluators and explain evaluation basics in simple, clear and direct language.





2.1 What is Program Evaluation?

Any of the three definitions at the beginning of this section could serve as an effective definition of
program evaluation, and all three emphasize a common set of ideas:

 That evaluation is based on the systematic or consistent collection of information
(“data”). Evaluation is more than just telling stories or coming up with examples.
Rather, evaluation requires a systematic approach to collecting data so that we
know we are basing the evaluation on information that fairly represents what is
going on in the program.

 That the data/information is used to make judgments about the program, generally
by comparing evaluation results to some standard or expectation. Doing evaluation
requires you to be clear about what you are trying to accomplish in your program.

 That the results are intended to be used to inform program decisions and improve
program operations. Evaluation is a practical enterprise – it is intended to provide
practical, useful information for practitioners and policy makers. That means that
the questions that guide your evaluation should be questions that you and your
colleagues want to answer, and the data that is collected should help you better
understand how your program is operating and whether it is accomplishing the
goals that you have defined.

At its simplest, then, program evaluation is the systematic collection and analysis of data to prove and
improve the effectiveness of programs and/or organizations.
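
To make the idea of systematic data collection concrete, the sketch below shows the simplest kind of
pre/post comparison used with participant surveys: averaging a participant's item responses into a
scale score, then comparing the group average before and after the program. All names and numbers
here are invented for illustration; they are not drawn from the Toolkit's actual survey instruments.

```python
# Illustrative sketch: average pre/post change on a hypothetical
# "civic agency" survey scale (1-5 items). Data are invented for
# demonstration only.

def scale_mean(responses):
    """Average one participant's item responses into a scale score."""
    return sum(responses) / len(responses)

# Each participant: item scores at pre-test and at post-test.
pre = {
    "participant_1": [2, 3, 2],
    "participant_2": [3, 3, 4],
    "participant_3": [2, 2, 3],
}
post = {
    "participant_1": [4, 4, 3],
    "participant_2": [4, 3, 5],
    "participant_3": [3, 3, 4],
}

pre_scores = [scale_mean(items) for items in pre.values()]
post_scores = [scale_mean(items) for items in post.values()]

pre_avg = sum(pre_scores) / len(pre_scores)
post_avg = sum(post_scores) / len(post_scores)

print(f"Pre-program mean:  {pre_avg:.2f}")
print(f"Post-program mean: {post_avg:.2f}")
print(f"Average change:    {post_avg - pre_avg:+.2f}")
```

Even this small calculation embodies the ideas above: data are collected the same way for every
participant, post-program results are compared to a pre-program baseline, and the resulting change
score is something a program can act on.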


2.2 Why Do Program Evaluation?

People do program evaluations for a variety of reasons. Many
evaluate their programs because it is required of them by their
organization’s Board or their funders. Others recognize
evaluation as an important strategy for building support for
their programs and for that all-important reason: raising
money! But many others also engage in evaluation because it
is good practice: evaluation is a way to learn how well your
program is doing, whether it is accomplishing its goals, and to
better understand why or why not:

 Is our program being implemented the way we had
planned – are the key activities taking place as
expected? What kinds of problems or barriers are we
encountering? What steps can we take to address
those challenges and improve our operations?

 Is our program having the intended effects for
program participants? Are participants gaining the
knowledge, skills or attitudes that we had planned? If not, what can we learn about
why not – are there problems with the program design, how it is being
implemented, issues of staffing, time, cooperation with community partners, etc.?

 What is the nature of the program experience for our program's participants and
others involved in the program? What lessons can we learn about ways of
strengthening the program and, perhaps, improving program outcomes?

Why do evaluation?
 To clarify the goals of a program and define the specific outcomes that you are
trying to achieve
 To track how programs are being implemented in different ways
 To identify areas for program improvement
 To inform public policy
 To prove your case/tell your story to the world
 To help raise money
 To engage stakeholders in the process of program reflection and improvement
 To meet the requirements of a grant
 To celebrate accomplishments

In the end, most evaluations have the capacity to serve multiple purposes: to meet requirements,
document accomplishments, provide data that can be used to promote the program, and inform
program practice. To the extent that evaluation is seen as addressing multiple needs (and not just
meeting an outside requirement), it is often easier for program staff, managers, and other stakeholders
to see the value of investing the energy and time needed to get it done.


2.3 What are the Different Types of Program Evaluations?

A Vision for Evaluation

Much of what is presented in this toolkit reflects the approach to evaluation that is summarized in the
Evaluation Handbook created by the W.K. Kellogg Foundation for its grantees. The Handbook
emphasizes its vision of evaluation as a practical tool for program improvement as well as a means of
proving program results. Several key ideas inform that vision:

• Our vision for evaluation is rooted in the conviction that project evaluation and project
management are inextricably linked. In fact, we believe that “good evaluation” is nothing more
than “good thinking.”

• Effective evaluation is not an “event” that occurs at the end of a project, but is an ongoing process
which helps decision makers better understand the project; how it is impacting participants, partner
agencies, and the community; and how it is being influenced/impacted by both internal and external
factors.

• Thinking of evaluation in this way allows you to collect and analyze important data for decision
making throughout the life of a project: from assessing community needs prior to designing a
project, to making connections between project activities and intended outcomes, to making
mid-course changes in program design, to providing evidence to funders that yours is an effort worth
supporting.

• We also believe that evaluation should not be conducted simply to prove that a project worked,
but also to improve the way it works. Therefore, do not view evaluation only as an accountability
measuring stick imposed on projects, but rather as a management and learning tool for projects, for
[funders], and for practitioners in the field who can benefit from the experiences of other projects.

From the W.K. Kellogg Foundation, Evaluation Handbook



Just as there are many typologies to categorize and understand community service and service-learning
programs, so too, there are many typologies to categorize and understand program evaluation. Exhibit
1 provides one useful approach to understanding evaluations by illustrating two general types of
evaluation: process and outcomes evaluations.

Process Evaluations attempt to answer questions about the implementation of a program and what
aspects of the program are working or not, and why. Process Evaluations tend to focus on such things
as: describing the program context, documenting activity, and understanding the program’s processes.

 Describing context includes evaluation approaches designed to place the program
you are evaluating in context, in terms of understanding the organizational setting,
target group, or community needs and assets that your program will address.
Often, the context in which a program takes place, including prior efforts, strengths
and weaknesses of the organization, administrative support, and local culture can
have an effect on how effectively the program is implemented and operates. As
such, it is often important to understand context as well as implementation.

 Documenting activities provides basic descriptive information on the inputs and
outputs of the program. Inputs refer to the resources that a program requires to
operate (e.g. - people, space, time). Outputs refer to the direct products of a
service-learning program, such as the number of service hours that students
complete. While it is rarely enough for an evaluation to only document inputs and
outputs (for example, by counting hours), activity data is usually an important
starting point for describing who is being served and what the program is doing.

 Understanding processes describes how the program works – the ins and outs of
how it is operating. That is, how well is your program working? What are its goals?
Are the program activities taking place in the way you expected? Is the program
meeting the standards for a quality service-learning experience? What kinds of
challenges are you encountering that might affect program results? What lessons
are you learning about what works or what you might change in the program
design or operations? While you may use lots of numbers in describing context or
documenting activities, understanding the process often relies more on qualitative
data, such as information from interviews with staff and participants.
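The "documenting activities" idea above – counting basic inputs and outputs such as total service hours – can be sketched in a few lines. This is an illustrative sketch only; the record fields (participant, site, hours) are invented for the example and are not part of the toolkit.

```python
# Hypothetical sketch: summarizing program "outputs" (e.g., total service hours)
# from simple activity-log records. Field names are illustrative.
from collections import defaultdict

activity_log = [
    {"participant": "A", "site": "Park Cleanup", "hours": 3.0},
    {"participant": "B", "site": "Park Cleanup", "hours": 2.5},
    {"participant": "A", "site": "Food Pantry", "hours": 4.0},
]

total_hours = sum(r["hours"] for r in activity_log)   # program-level output
hours_by_site = defaultdict(float)                    # output broken out by site
for r in activity_log:
    hours_by_site[r["site"]] += r["hours"]

print(total_hours)          # 9.5
print(dict(hours_by_site))  # {'Park Cleanup': 5.5, 'Food Pantry': 4.0}
```

Even this simple tally answers the "documenting activity" questions: how many participants, how many hours, and where the service took place.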

Outcomes evaluation focuses on the results of your program. An outcomes evaluation asks what kinds
of changes are taking place as a result of your program's activities and how well you are meeting the
program's intended results. The two right-hand columns of Exhibit 1 are examples of outcomes
evaluations.

When assessing the results of a program and the changes taking place as a result of the program’s
activities, evaluations often distinguish between studies of program outcomes and program impacts.

 Outcomes evaluations look at changes in the knowledge, behavior, skills, or
functioning of the program's target participants, institutions, and/or community.
When you study program outcomes, you are basically asking “did the expected
changes take place?” For example, did service-learning participants show a change
in attitudes towards service or an increased interest in future volunteer service?
Was there a drop in the amount of litter in a park where a community clean-up
campaign took place?

 Impact evaluations take the question a step further, asking "did the observed
changes take place as a result of the program?" In a simple sense, impact
evaluations ask "did the program make a difference?" For example, can we prove
that an increased interest in civic engagement was a result of the service-learning
program, or did that change in attitudes take place because of some other event or
experience that influenced all of the young people in the community (for example,
another program at school, an exciting election, an event in the community, etc.)?

Generally, where outcome studies look at change over time, impact studies
compare the changes that took place among program participants to some kind of
comparison or control group of similar individuals or settings that were not part of
the program. The key question guiding an impact evaluation is “would these
changes have taken place in the absence of our program?” While most of us in the
service-learning field generally think of impact studies as focusing on impacts on
participants (for example, changes in civic attitudes or skills), impact studies can
also look at changes in institutional and community outcomes. For example, a
study might look at the impact of involvement by a school’s teaching staff in
Summer of Service on a school’s use of service-learning during the school year;
similarly, a study might examine the impact of a service-learning project at a senior
center on attitudes of elders in the community towards young people. In both
cases, we might look at places that service-learning was not taking place as points
of comparison.

Most evaluations will try to include a combination of process and outcomes evaluation in order to be
able to answer questions both about what happened and why. By combining process and
outcomes information, you are more likely to be able to make the connections between
program implementation and participant or community outcomes that are needed to fully understand
your evaluation results.

2.4 What Kinds of Data Can I Use in an Evaluation?

Most of the time, when people think about evaluation, they associate evaluations with the use of
surveys. And surveys are an important, widely used tool for evaluation. They are relatively
inexpensive to administer, less time-consuming than individual or group interviews, can be
designed to collect a broad range of data reliably, and provide a means of collecting data
systematically within a single program site or, using the same survey tools, across a large number of
sites. For all those reasons, the major tools provided in this SOS Evaluation Toolkit are surveys – for
participants, educators, and community partners.

At the same time, surveys represent only one method of collecting evaluation data. Process evaluations
often use qualitative data (data that is not readily "countable") to describe how a program is being
implemented. Qualitative data might include reviews of program records or reports; interviews with
program administrators, staff, or participants; focus group discussions, or program observations, as just
a few examples. While qualitative data may not be easily countable, you can still take care to collect it
through a systematic approach so that it reflects the program experience fairly. That might mean having
a plan to talk with multiple program staff or participants, chosen by lottery, for example, or to visit
multiple classrooms/program sites. Qualitative data is often used in outcomes studies as well. In some
cases, interviews, focus groups, or journal reflections might be used to provide a more in-depth
understanding of how a program is affecting participants; in some cases, those qualitative sources can
be used as quantitative data as well, by coding the responses and counting the frequency of different
ideas or issues. Again, it is important to think about how the qualitative data is being collected – are you
choosing a few good examples, or a more representative mix of people or materials?
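As a rough illustration of the coding idea just described – turning qualitative material into countable data by tagging each response with themes and counting frequencies – here is a minimal sketch. The themes, keywords, and quotes are all invented for the example; a real coding scheme would come from your own program goals.

```python
# Illustrative sketch: "coding" qualitative responses into countable themes.
# Themes, keywords, and quotes are invented for this example.
from collections import Counter

responses = [
    "I learned a lot about problems in my neighborhood",
    "working with my team was the best part",
    "I want to volunteer again next summer",
    "my team taught me to listen to other people",
]

codes = {
    "community awareness": ["neighborhood", "community", "problems"],
    "teamwork": ["team", "listen"],
    "future service": ["volunteer", "again"],
}

counts = Counter()
for text in responses:
    for code, keywords in codes.items():
        if any(k in text.lower() for k in keywords):
            counts[code] += 1

print(counts["teamwork"])  # 2
```

The resulting counts can then be reported alongside quantitative survey results.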
Exhibit 1: Types of Evaluation

Process/Implementation Evaluation

Describing Context
Sample Questions for Analysis:
 Who is my target group?
 What are the needs of my target group or community?
 What personal or collective strengths or assets exist within the target community?
Sample Data Sources/Methods:
 Public data sources (i.e., census, etc.)
 Organizational reports
 News clippings
 Traffic counts
 Existing databases or surveys
 Interviews with local leaders, program administrators, community partners

Documenting Activity
Sample Questions for Analysis:
 How many participants?
 What are the characteristics of participants?
 How many hours of service?
 What types of activities?
 What are the costs/funding for the program?
Sample Data Sources/Methods:
 Time sheets
 Attendance records
 Work products
 Activity logs
 Participant reports
 Enrollment/sign-up forms

Understanding Process
Sample Questions for Analysis:
 Was the program implemented as expected?
 What parts of the program worked well, and what parts need to be strengthened?
 Did the program staff have the resources and training they needed?
 Did the program's design and activities match the program's goals?
Sample Data Sources/Methods:
 Surveys of teachers and community partners as appropriate
 Interviews with stakeholders
 Focus groups
 Observations

Outcomes Evaluation

Assessing Outcomes
Sample Questions for Analysis:
 Did the expected changes in participant attitudes and behavior take place?
 Were there changes in attitudes, instructional strategies, etc. among
teachers/leaders during the program?
 What contributions did the program make to the community?
Sample Data Sources/Methods:
 Portfolios/projects
 Student self-report (essays, surveys, etc.)
 Pre/post surveys on student attitudes and behavior
 Grades/school record information
 Supervisor interviews or evaluations
 Focus groups
 Observations

Assessing Impact
Sample Questions for Analysis:
 What difference has the program made (relative to no program or a different
program)?
 Would the observed changes have taken place in the absence of the program?
Sample Data Sources/Methods:
 Generally, impact evaluations include pre/post assessments of participants
compared with a random assignment comparison group (an "Experimental Design"
study), or with a comparison group selected in other ways (a "Quasi-Experimental
Design").


Quantitative data is information that can be readily counted. Most of us are familiar with multiple
choice surveys where the responses can be tallied and translated into counts and percentages.
Quantitative data might also include information on numbers of participants, service hours, grades or
attendance data. The advantage of quantitative data is that it is often easier to collect across large
numbers of people, it uses a consistent format, and it provides very specific, consistent information. On
the other hand, you are often limited in the kinds of information you can collect in a quantitative
format. So, it is often useful to use a mix of quantitative and qualitative data – numbers and stories – in
order both to demonstrate results in an easy-to-understand form and to be able to explain the how and
the why of those results.
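Tallying multiple-choice survey answers into counts and percentages, as described above, takes only a few lines. This is a sketch; the responses below are made up for illustration.

```python
# Minimal sketch: tallying Likert-style survey answers into counts and
# percentages. Responses are invented for illustration.
from collections import Counter

answers = ["Strongly agree", "Agree", "Agree", "Neutral", "Agree",
           "Disagree", "Strongly agree", "Agree"]

counts = Counter(answers)
percentages = {choice: 100 * n / len(answers) for choice, n in counts.items()}

print(counts["Agree"])                 # 4
print(round(percentages["Agree"], 1))  # 50.0
```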


2.5 What Role Do Surveys Play in Program Evaluation?

Many people think of evaluation and surveys as the same thing. As the section above suggests, surveys
are an important part of many evaluations, but they are not the only way to do evaluations. For
example, evaluations can also include review of written materials, data collected from existing records
(such as grades or attendance records), observations of activities, interviews, and focus groups.

For the Summer of Service Evaluation Toolkit, we do focus on using surveys as a tool for evaluating your
program, and we provide several sets of surveys that we hope practitioners will use or adapt for their
programs. While we focus on surveys here, we want to emphasize that they are only a part of the
evaluation process. We want to encourage practitioners to think about other kinds of data they might
collect to help them evaluate their programs. The evaluation planning tools in Part Two should help you
think about other kinds of data that you might collect for your program.


2.6 Using Surveys to Assess Outcomes and Impacts

Once you decide that you are going to use surveys as a tool in evaluating your program, there are
several ways in which they can be used. Surveys can be given at one point in time (“post-tests” or “post-
only surveys") or two points in time ("pre/post surveys").[1] You can also give your surveys to just your
program participants (in order to assess outcomes – change over time), or you can include a
comparison or control group in your study to look at impacts (differences in the degree of change
between your participants and comparison group).

 A post-only survey is designed to be given only once at the end of the program, and
it generally assesses change by asking participants (or teachers, or community
partners) about their program experience and how their attitudes or skills or
behaviors changed as a result of their involvement in the program. Different
surveys use different formats, but a typical question might read: “As a result of this
program, I learned about issues in my community” (Strongly agree, agree, neither
agree nor disagree (neutral), disagree, strongly disagree), or "How much did you
learn about issues in your community as a result of this program?" (None, A Little, A
Lot).

[1] You can also do surveys at multiple points in time (longitudinal surveys), for example to track participants during and after their
program participation (pre/post/follow-up). While longitudinal survey designs can provide invaluable information about the longer-
term impacts of service-learning programs, we do not talk about them in this Toolkit.


 A pre/post survey design surveys participants twice: once at the beginning
(baseline) and again at the end of the program (post-program). A pre/post
approach allows you to estimate the changes that have taken place during the
program by comparing the answers given to the same question at the two points in
time. An example of a question on a pre/post survey design might be: “I know
about the issues that are important in my community.” (Strongly agree, agree,
neither agree nor disagree (neutral), disagree, strongly disagree). The evaluator
would then look to see if, on average, participants' answers to that question
changed from baseline (beginning) to post (end) of the program.
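The pre/post comparison just described can be sketched as follows, assuming Likert answers are converted to a 1-5 scale. The paired answers are invented for illustration.

```python
# Hedged sketch of a basic pre/post comparison: Likert answers converted to a
# 1-5 scale, then the average change computed. Data are invented.
scale = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
         "Agree": 4, "Strongly agree": 5}

# One (pre, post) pair of answers per participant, matched by participant.
paired_answers = [("Neutral", "Agree"),
                  ("Disagree", "Neutral"),
                  ("Agree", "Agree"),
                  ("Neutral", "Strongly agree")]

pre_scores = [scale[pre] for pre, _ in paired_answers]
post_scores = [scale[post] for _, post in paired_answers]

mean_pre = sum(pre_scores) / len(pre_scores)     # 3.0
mean_post = sum(post_scores) / len(post_scores)  # 4.0
print(mean_post - mean_pre)                      # 1.0 point average gain
```

Matching each participant's two surveys (rather than comparing unmatched group averages) is what makes the later statistical analysis possible.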

There are advantages and disadvantages to each of these approaches. Post-only surveys are the easiest
to administer and take the least amount of time, since you only have to survey participants once. In a
short-term program, using a post-only survey can avoid the “survey fatigue” that often occurs when
participants are asked to take several sets of surveys in a relatively short amount of time. On the other
hand, post-only surveys are seen by some as somewhat less reliable measures since they depend on an
accurate assessment of change by the participants themselves (do they remember what they felt at the
beginning of the program, for example). Also, since they represent only a single point in time, the
statistical analysis that can be done is more limited.

Pre/post surveys are often considered a more rigorous form of assessment, since they represent two
relatively independent assessments. As such, participants only have to report on their attitudes,
knowledge or skills at the time of the survey -- there are fewer “recall” issues (i.e., accurately
remembering back to baseline). Because pre/post surveys provide data over two points in time, one
can apply a more rigorous statistical analysis. For example, with outcomes measured at two points in
time, we can determine whether the differences between pre- and post-program results are
"statistically significant" – that is, whether they are large enough to represent a real change or simply a
random variation. Similarly, in studies using a comparison or control group, pre/post surveys allow for a
much more rigorous and sophisticated analysis.

The disadvantage of the "pre/post" approach is that it takes twice as much time to administer the
surveys, and in a relatively short program, the time involved may be disproportionate to the time in the
program as a whole (that is, staff and participants may feel that they are spending more time doing
surveys than providing service!). Pre/post surveys also tend to be more expensive, since there are two
rounds of surveys to collect, and require more time managing the process to ensure that programs
complete and return both sets of surveys.

Ultimately, the decision about what approach to use comes down to a question of the purposes of the
evaluation and the available resources – questions that you want to address as part of your evaluation
planning process (see Part Two). In general, if you are producing evaluation results for an external
audience, if you are interested in assessing impacts (with a comparison/control group), or if you are
interested in generating data that can be used for a more statistically sophisticated analysis, you may
want to consider using a pre/post design. If the program you operated is of relatively short duration, or
the results are primarily for internal use, or you are trying to get some initial feedback from participants
in a new program, a post-only survey may be the most effective way to gather data.



2.7 Assessing Impacts – Where does a Comparison Group Come in?

Knowing that a change in participant attitudes and skills or in other outcomes of interest has taken place
during a program is important. But, for many audiences, simply demonstrating improvements is not
enough to demonstrate the success of your program – you need to show that the change would not
have taken place in the absence of the program experience you provided. For a Summer of Service
program, for example, some might argue that the simple fact that young people are growing older might lead to
changes in their attitudes about the community, that another summer experience would provide the
same gains in civic skills, or that external events (such as a local election) were responsible for any
changes that you found. In some cases you might be able to simply make the argument that, given the
specific nature of the SOS experience, it is reasonable to associate any changes with the program
experience. But for a more skeptical audience, the traditional way to demonstrate that it is your
service-learning program that is causing the change is to compare the results for your participants with
a group of young people who are not participating in your program.

What a comparison or "control" group allows you to do is look at how your participants changed
versus how the comparison group members changed during the same period. The difference in the
amount of change between participants and non-participants is a measure of the impact of your
program:

    Participant Change - Comparison Group Change = Program Impact

There are two basic approaches to conducting an impact study. The first is called an experimental
design or "Randomized Control Trial" design. In an experimental design study, young people are
assigned to the participant or non-participant/control group randomly – much like the process used in
a medical experiment or a test of a new drug. The goal in this type of design is to eliminate any
unmeasured differences between participants and non-participants that might bias the results. For
programs like Summer of Service, the issue of bias is an important one to consider, because young
people who enroll in an SOS program may be more interested in civic engagement from the beginning,
and more receptive to being influenced by the program, than young people who decided to do
something else that summer. This is called "selection bias," and if you don't take it into account, it can
undercut the credibility of your evaluation.

What is a "Random Assignment Design" Study?

Increasingly, government agencies such as the U.S. Department of Education and some
foundations have made statements in support of a well-respected approach to program
evaluation called "random assignment design" or "Randomized Control Trial" studies.
Random assignment is based on the scientific method and refers to the process of
randomly selecting (i.e., by lottery) the people or groups who will participate in a program
(and the study) and the individuals who will not be allowed to participate and who will
serve as a "control" group. For example, one group of youth might be randomly assigned
to a summer service-learning program, while others would be assigned to a regular
summer camp or a non-service-learning summer program. We might expect (hypothesize)
to see a greater increase in civic commitment among those who participated in
service-learning than among those who did not.

Random assignment experiments represent the most rigorous form of research design
and have become the most widely accepted way to obtain evidence of a program's impact.
Their great strength is that, since individuals are randomly sorted into or out of the
program, they control for any unmeasured differences between those who would normally
have selected to participate or not (for example, because they have a prior interest or are
more motivated). At the same time, there is also widespread recognition among
professional evaluators that random assignment carries both practical and ethical
challenges. Many programs are reluctant to exclude anyone from participation, for
example, or have difficulty recruiting enough additional applicants to make random
assignment possible. Further, random assignment studies still demand proper tools and
instruments free of bias and careful monitoring. For these reasons, such studies can be
difficult and expensive. We do not recommend that practitioners attempt to create their
own random assignment control studies without professional researchers.

At the same time, experimental design studies raise a number of difficult issues for practitioners (and
for parents and young people) and, as a result, are often difficult and expensive to design and carry out
(see the box above for a discussion of random assignment studies). The alternative approach is called a
quasi-experimental or comparison group design study. In a quasi-experimental design study, a
comparison group of non-participants is identified and recruited, usually from among participants in
another class or program. The goal in developing a quasi-experimental design study is to try to find
comparison group members who are as similar as possible to the participants in terms of demographic
characteristics, academic background, interests, etc. The more similar the comparison students are to
those in your program, the more credible the comparison. A comparison group design is generally much
easier to put in place and more practical for smaller evaluations, but is not considered as reliable as
random assignment.
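As a purely illustrative sketch of the "lottery" idea behind random assignment (not a substitute for a professionally designed study, as the surrounding text cautions), the names below are invented:

```python
# Illustrative sketch only: a simple random "lottery" splitting applicants into
# program and control groups. Names are invented; real random-assignment
# studies require professional design.
import random

applicants = ["Ana", "Ben", "Cara", "Dev", "Eli", "Fay", "Gus", "Hana"]

rng = random.Random(42)  # fixed seed so the split is reproducible
shuffled = applicants[:]
rng.shuffle(shuffled)

program_group = shuffled[:len(shuffled) // 2]
control_group = shuffled[len(shuffled) // 2:]

print(len(program_group), len(control_group))  # 4 4
```

Because assignment ignores motivation, interest, and every other characteristic, any later difference between the two groups can more credibly be attributed to the program itself.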

While it is important to understand the basic differences between an outcomes evaluation (participants
only) and an impact study (participants and comparison or control groups) and between experimental
and quasi-experimental designs, we expect that most users of the Summer of Service Evaluation Toolkit
will be starting with a relatively simple participant-only (outcomes) evaluation. But it is useful to
understand the options and trade-offs so you can make informed decisions about the kinds of
evaluation you want to do now and in the future.
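The impact arithmetic described in this section – participant change minus comparison-group change – reduces to a few lines. All numbers here are hypothetical:

```python
# Sketch of the basic impact arithmetic: estimated impact is the participants'
# change minus the comparison group's change over the same period.
# All scores are hypothetical.
participant_pre, participant_post = 3.1, 3.9  # mean civic-attitude scores
comparison_pre, comparison_post = 3.0, 3.2

participant_change = participant_post - participant_pre  # about 0.8
comparison_change = comparison_post - comparison_pre     # about 0.2
impact = participant_change - comparison_change

print(round(impact, 2))  # 0.6
```

In this hypothetical case, only 0.6 of the participants' 0.8-point gain is attributable to the program; the rest would have happened anyway.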







2.8 A Few More Terms You May Encounter

This section has provided a brief introduction to some of the key concepts in program evaluation. We
realize that we have had to introduce some technical terminology. Nevertheless, we hope that this
provides you with a relatively easy-to-use resource to approach your program evaluation work
thoughtfully and confidently.

There are a few additional terms and concepts that you might want to be familiar with as you begin to
think and talk with others about evaluation.

 Reliability refers to the consistency or dependability of your measures or indicators.
It refers to the notion that if you were to implement the same evaluation tool
multiple times with the same people and under the same circumstances, it would
produce the same information and findings each time.

 Validity refers to the degree to which you can be sure that the data is actually
measuring what you hope to measure. For example, if a survey question is
confusing or hard to understand, it is possible that the young people who are
responding may think you are asking something different from what you had
intended. In that context the information gathered through the survey would not
be valid. One way to check validity is to have a group of young people or
practitioners review any surveys you use to make sure they are clear and easy to
understand.

Taken together, reliability and validity are a good way to think about your own tools. Are they as
reliable and valid as possible?

 Statistical Significance is a way to test the validity of a quantitative result. A
statistically significant finding is one that is large enough to be unlikely to have
occurred only by chance – that is, a statistically significant finding is likely to
represent a real difference and not just a random variation in survey results. In
general, a result (for example, the change on a measure from baseline to post-
program) is considered statistically significant if the probability of a change that
large happening at random is less than 5 in 100 times, or 5% (p < .05). A strongly or
highly significant finding refers to a finding that would occur randomly fewer than 1
in 100 times (p <.01). It is important to recognize that the likelihood that a
particular result will be statistically significant is determined by a number of factors,
including the size of the change and the number of people in your study. As a
result, studies of smaller programs may not generate many statistically significant
results. But where a program or its results are large enough, the fact that a result is
statistically significant helps to confirm that real changes are taking place during the
program.
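The idea of statistical significance can be illustrated with a simple paired test on pre/post scores, using only the standard library. The scores are invented, and the 2.262 cutoff is the standard t critical value for 9 degrees of freedom at p < .05; for real analyses, statistical software (or a statistician) is a safer bet.

```python
# Hedged sketch of a paired significance test on invented pre/post scores.
from statistics import mean, stdev
from math import sqrt

pre  = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]
post = [4, 3, 4, 4, 3, 3, 5, 4, 3, 4]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / sqrt(n))  # paired t statistic

# For n = 10 (9 degrees of freedom), |t| > 2.262 corresponds to p < .05.
print(t > 2.262)  # True: this (invented) change is statistically significant
```

Note how significance depends on both the size of the change and the number of participants (n); the same average gain in a much smaller group might not clear the threshold.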

The chapters that follow are designed to help you translate these general ideas about evaluation into a
solid, practical evaluation of your own Summer of Service program. Chapter Two walks you through
some of the steps you might want to follow in planning and carrying out your evaluation project. Part
Two provides you with tools and resources that you can use in putting your evaluation into action.



Chapter 3: Preparing for Your Program Evaluation

There is an old adage that “those who fail to plan, plan to fail.” It applies to program evaluation much as
it does to many other endeavors. As the first chapter suggests, there are a number of questions that
you need to answer in the course of planning your evaluation. Who is the audience for your evaluation?
What types of questions do you want answers to? What kind of data do you need to collect to answer
those questions? How do you plan to use the data that you collect? We believe that taking the time to
answer those questions up-front is a key step in developing an evaluation that is both credible and
useful to you and the others interested in your program. We think there are three key steps in that
process:

1. Pull together your Evaluation Team.
2. Build a Program Logic Model to define your program.
3. Develop an Evaluation Plan.

The first sections of this chapter, and the tools associated with them in the second part of this toolkit,
focus on the first two steps – creating an Evaluation Team and a Program Logic Model. The later
sections focus on the questions that guide the development of your evaluation plan.


3.1 Getting Started: Create an Evaluation Team

We believe that the first step in planning any evaluation is to create an “evaluation team” for your
project that includes the key “stakeholders” in the program and the evaluation. “Stakeholders” are
those people with an interest in your program or evaluation, and might include program leadership and
staff, Board members, funders, program participants, and others. Once gathered, the evaluation team
can help you to define your evaluation questions and begin to plan the steps you need to take for your
evaluation. By involving others in the evaluation process, you can help to make sure that you get
needed buy-in from key stakeholders, that you are asking the questions that key decision-makers want
answered, and that the evaluation addresses the most important elements of your program. This early
involvement will help to ensure that these same
stakeholders accept and appreciate the results
of your program evaluation later and that the
evaluation itself can be carried out effectively.

As suggested above, when you think about who
to invite to be part of the evaluation team, you
may want to consider both the decision-makers
from whom you want support, and the ‘worker
bees’ who will help you get the job done. You
should also be sure to think about how to
involve young people as part of the process –
program participants or alumni, members of a
youth advisory group, or other youth
representatives. They can help you think about
what is important to participants, as well as how
arrange the evaluation process so that young people are most likely to participate.

Question: What roles can young people play in an evaluation?

Answer: All of them!

Young people can be involved in:
 Planning and Design Team
 Instrument Design
 Data Collection (Surveys, Interviews, Site Observation, etc.)
 Data Analysis
 Reporting/Presentation/Discussion

Some programs have even used their program evaluation as a service-learning project!

In short, your evaluation team might include some of the following kinds of people:

 Youth participants and/or alumni
 Project Funders
 Project Staff and Administrators
 Community Leaders
 Collaborating Agencies
 Others with an Interest in Program Effectiveness


3.2 Using a Logic Model to Define Your Program

Once your evaluation team is in place, you need to clearly define the key elements of the program that
you are evaluating, including who you are serving, what activities or services you are providing, and the
outcomes you expect. A now-common approach is to develop a logic model. A logic
model is a picture or map of how your program works. It brings together in one place the basic
description of your program: who you are planning to serve, the theory and assumptions underlying
your program, key strategies, and expected outcomes. As such, it helps you to see how the pieces of
your program fit together and what processes and outcomes you might want to include in the
evaluation.


3.3 Why Use Logic Models?

There are a number of reasons why it makes sense to develop a logic model for your program (and to
continue to review and revise it as your program develops):

1. Improved Program Design. The process of creating a program logic model helps clarify your
thinking about the program, how it was intended to work, and what adaptations need to be
made once it is operational.

2. A Starting Point for Management Improvement and Evaluation. A logic model makes the
program design explicit so you can decide more systematically what pieces of the program to
study and what outcomes are important to track.

3. Understanding Complex Initiatives. In complex programs or initiatives, a logic model can lay
out interim outcomes, highlight assumptions, and make it easier to identify gaps in the thinking
about how program activities might lead to hoped-for outcomes.

4. Partnership Building. The process of developing a logic model requires stakeholders to work
together, to clarify the rationale for the program and the conditions for success. The model
becomes a focal point for discussion and a means of creating shared ownership and
understanding among the stakeholders.


3.4 How do I develop a logic model for my program?


Many people sit down and try to develop a program logic model by themselves. We think that is a
mistake – much of the value of creating a logic model comes from the discussion and exchange that
occurs during the process as people share their understanding of the program. Consequently, we
strongly recommend developing the logic model with your program evaluation team as a way of
building a common understanding of the program and buy-in for the evaluation.

It is important to note that there are a number of different formats or approaches for logic models,
with no single “correct” or “official” version. In this chapter, we are using a relatively simple version that
focuses on a few key elements: clarifying your mission, looking at who you are serving, and your
assumptions, strategies, outputs, outcomes, and long-term impact. Other versions may include a
section on inputs (i.e. resources that you need), divide outcomes into short-term or long-term, or drop
the section on ultimate impacts (which may feel like it duplicates your mission statement). The tools we
provide represent one model – look around and find an approach that is comfortable for you.

The basic approach to logic models used here asks for answers to the following questions:

 What is the mission or goal of your program? What are you trying to accomplish?
In the Summer of Service Logic Model at the beginning of this guide, the mission
includes building a sense of civic agency and academic engagement among young
people. Talking about this helps you to focus on your goals and on the ultimate
outcomes you want to measure.

 Who is your program for? Who are you trying to serve? Are you focused on
younger or older youth? On low-income or educationally disadvantaged young
people? Thinking about whom you are serving helps you consider whether there
are specific strategies or services that need to be part of your program design.

 What are the key assumptions driving your program (some people call this your
“theory of change”)? Basically, what ideas do you hold about what your target
population needs, what kinds of services you think are effective, and what needs
to be in place for your program to be effective? Your assumptions/theory may be
research-based (for example, a statement that reflection is a critical element in
successful service programs, based on the service-learning research) or based on
your practical experience (for example, that hands-on involvement of community
partners helps young people engage in an issue).

 What are the key activities or strategies that make up your program
approach? Are there specific steps, activities, or experiences that make up your
program experience or that ensure a “quality” program experience (for example, a
certain number of hours of service, contact with community partners, or specific
types of reflection)? If so, what are they? In the logic model in this guide, for
example, key activities include conducting research on a community issue, having
opportunities for leadership roles, and taking action on a community issue.

 What are the immediate outputs of these activities or strategies? As we discussed
in Chapter One, outputs might be simple measures of activity (hours of service) or
products (trails cleared, trees planted, etc.).


 What outcomes do you expect to achieve as a result of the program? What kinds
of measurable changes in knowledge, skills, attitudes, or behaviors do you expect to
take place among program participants? What changes in institutional supports or
in community practice might take place as a result of program activities? When
thinking about outcomes, it is often useful to distinguish short-term, intermediate-
term, and longer-term outcomes: what do you expect to see during the program, by
the end of the program, and sometime after the program is completed?

 What is the expected long-term impact of your program? In this case the term
“impact” is being used to ask how the world will be different over the long term.
What, in broad terms, are we trying to accomplish (World peace? A new generation
of citizens?). Most evaluations do not have a chance to measure these kinds of
long-term impacts, but it is helpful to have them in mind so you maintain a clear
sense of what you are ultimately trying to accomplish with your program.

It is worth noting that different people develop their logic models in different ways. Some groups like to
start with the mission and then work left to right – talking about whom they are serving, assumptions,
strategies and outcomes in that order. Others like to work back and forth – starting by talking about
who they are serving, but then jumping to the desired outcomes, then working back towards needed
services and the assumptions underlying their approach. Once again, there is no one “right” way to do
this. What is important is to take on the challenge in a way that works for you and your “team.”

Finally, when you are done, the elements of your logic model should fit together. For example, there
should be a clear relationship between the outcomes you hope to achieve and the services or activities
you have outlined for your program. If you have included an outcome, but there is no clear strategy
leading to that outcome, your logic model (and your program) is likely missing an important piece.

Exhibit 2 summarizes the basic elements of the program logic model and provides a format for
displaying the logic model information (a blank template is included in the “tools” section). The box
below includes some questions you might ask yourself once you have created a draft logic model.

Questions to Ask Yourself Once You Have a Draft Logic Model

 Is it reasonable to expect that the initiative’s planned strategies will lead to the
expected outcomes?
 Are all the outcomes supported by strategies?
 Are all target groups, strategies, and outcomes included?
 Are there sufficient resources to undertake the activities?
 Do all the components lead to one or more outcomes?
 Are the outcomes really outcomes, not activities?
 Are the outcomes reasonably measurable?
 Are all stakeholders in agreement about the logic model?

Adapted from Porteous et al., 2002




3.5 Using Your Logic Model to Plan Your Evaluation

Once you have completed your logic model, it can serve as a map or guide for your evaluation efforts:

 Is your program actually serving the young people you expected? You may want to
collect demographic information to demonstrate that you are serving your target
population.

 Your activities and strategies and your outputs columns also help point to aspects
of program implementation that you may want to measure. Are the program
activities in the logic model in fact taking place as expected, or are there variations
in how the program is being implemented? Are the expected activities taking place
at the expected level (for example, are participants engaged in at least 100 hours of
service)? Are practices that are considered essential to program quality taking
place? And so on.

 Your list of expected outcomes can help focus your discussion on what outcomes
you want to measure and how. You may need to make choices about what to
measure (some things are easier to assess than others, and we always want to
know more than we can reasonably assess), and by having the outcomes listed
together in one place, your evaluation team can talk about which are the highest
priority to include in the evaluation.

 Finally, your assumptions can also point to questions you may want to ask. Are
your assumptions correct about what is important in the program? You may want
to ask participants or staff as part of your evaluation about what they think is
making the program work.

Exhibit 2: Sample Logic Model Format

Program Name:
Mission:

Who: Who is your target population? Who will benefit from your work? Who needs
to be involved in your efforts?

Assumptions (Theory of Change): What assumptions or theories guide your work?
What do you know, think, and believe about why you expect your program to work?

Strategies: What mix of programs, services, and activities need to be in place to
achieve the desired outcomes? What kinds of experiences need to take place for
the program to be effective?

Outputs: What products or services were created and/or delivered by your
program? How many?

Outcomes: What reasonably measurable outcomes do you want to achieve through
your efforts – in terms of changes in participant knowledge, skills, attitudes,
behaviors; institutional or community outcomes?

Impacts: What long-term changes in the lives of the participants, participating
institutions, or communities do you hope to achieve?

In short, as you begin to think about what questions you want to answer and what data you
need to collect (next chapter), your logic model should help inform your answers.


3.6 Developing an Evaluation Plan

Once you have your evaluation team and logic model in place, it is time to get down to the nuts and
bolts of planning your evaluation. To help you build your evaluation plan, we believe that there are a
few key questions that you and your evaluation team need to answer:

1. What questions do you want to answer through your evaluation, and for whom?

2. What information/data do you need to answer these questions? What data do you
already collect? What data do you need to collect? Who do you need to collect the data
from? And what methods can you use to collect that data?

3. How will you analyze your data?

4. How will you use the results?

5. How can you make this do-able?


3.7 What Questions Do You Want to Ask?

While your logic model will help you think about the major elements of your program and some of the
kinds of data you want to collect, it is important to step back a bit and think about what are the basic
questions that you want to ask and answer through your program evaluation. What do you need to
know, and why?

The questions you ultimately decide to ask will depend on the purpose of the evaluation and its
audience, as well as on the time and resources you have available for evaluation (fewer resources may
mean concentrating on one or two key questions). Remember the different types of evaluation
discussed in Chapter One. Are you interested in understanding the context for your program? In
documenting activities? In the implementation and operation of the program, or in assessing outcomes
or impacts? If you are primarily doing the evaluation for internal use to improve program operations
(often called formative evaluation), you may focus on questions of context and implementation. If you
need to demonstrate results for outside funders, you may focus more heavily on assessing outcome or
impacts (often described as summative evaluation). In most circumstances, you are likely to want to
answer a mix of questions about both implementation and outcomes.

You can use the “Key Questions” tool in Part Two to think about what types of questions you might
want to ask about your program and who would want to know the answers to those questions. Exhibit
3 also outlines some sample questions that you might want to ask about a hypothetical summer service-
learning program.


Exhibit 3: Sample Summer of Service Evaluation Questions

Describing Context (Audience: Program leadership and staff)
- How does the organizational setting for my program affect operations and
outcomes? Does the program work better in a school or community-based setting?
- What kinds of social or cultural issues need to be taken into consideration in my
program design?

Documenting Activities (Audience: Program leadership and funders)
- How many hours of preparation and service are my participants engaged in? Are
they meeting the minimum program requirements?
- How many different projects have participants in this program completed over the
summer? What kinds of projects?

Understanding Process (Audience: Program leadership, staff, and funders)
- How carefully was our program model implemented at different sites? Did most
sites successfully implement all the key elements of the program?
- What kinds of challenges and barriers did staff and leadership encounter in
implementing the program this year?
- Did the central program office provide adequate training and support for project
staff as they worked with participants?

Assessing Outcomes and Impacts (Audience: Program leadership, staff, and funders)
- What was the impact of the program on the civic knowledge and skills of program
participants?
- Did the program help change community members’ attitudes (for example, about a
particular issue you worked on) or behavior (for example, by promoting recycling)?


3.8 What Kinds of Information Can You Use to Answer Your Questions?

Given the questions that you have outlined for your evaluation, what kinds of information/data will help
you answer them? If, for example, you want to document service hours, what are some simple ways in
which you can track and collect that information? If you are interested in showing changes in attitudes
and beliefs among your participants, what kinds of information would help you document those
changes?

What data do you already collect? When thinking about these questions, it is often useful to begin by
thinking about what you already collect. For example, does your program collect attendance data that
could be used to document hours? Are young people keeping time sheets as part of their work on the
program? If so, you do not need to create additional tools just for the evaluation. If you are interested
in attitudes and beliefs, you might consider whether there are program activities, such as reflection
exercises, journals, or participant portfolios, that may provide data on how participants think about
themselves and their community. One step towards making evaluation ‘do-able’ is to think about what
you already do and how to integrate evaluation into activities that are already part of your program.


What additional data do you need? Often, you are not already collecting the information you need. In
those cases, you need to think about the most appropriate way to collect that additional
information. For something like service hours, for example, you might develop an attendance sheet (if
you do not already use one), time sheets completed by individual participants, or some form of sign-up
sheet. If you are interested in showing changes in attitudes and beliefs among your participants, you
might use a mix of surveys, reflective essays, focus groups, or presentations. The “Data
Planning” worksheet in the Toolkit can help you think about what data sources you already have and
what additional data you might need to collect.

Who Should You Collect the Data From? In thinking about the questions you want to answer and the
data you want to collect, you also need to think about who you should collect the data from. Who is the
best source of information for a particular question – the program staff, participants, community
partners, parents? On the one hand, you might think about who is the most direct or credible source
of information on a particular topic. Who is most likely to be able to report credibly on changes in
participant attitudes, for example? On the other hand, you may also want to think about who can
provide information most easily. Is it easier, for example, to gather information about the service
projects carried out over the summer from the participants, program staff, or community partners? It
might be easiest to get information from program staff, but community partners might have a better
sense of whether the projects really made a difference in the community.

Often, there is more than one source of information that you can use: participants, program staff, and
parents, for example, might all be able to report on the program’s impacts on participant attitudes and
beliefs. Whenever possible, we recommend that you think about collecting and analyzing data from
multiple perspectives – don’t rely on just one type of source! There are several reasons to do this. First,
each source may have a different perspective on the activities or accomplishments of your program. To
the degree that each group – for example, participants, staff and parents – has a different
understanding about what has worked (or not) in your summer program, or whether the program has
made a difference, this is useful information for program improvement. At the same time, to the extent
that multiple sources report similar results, the credibility of your results is strengthened: if participants,
parents, and staff all agree that the program had an impact on attitudes, we can be more confident in
the results, and they are more likely to be accepted by outside audiences.

Sampling. Part of thinking about who you should collect the data from also includes deciding whether
to collect data from everyone in a particular group or from a sample. Generally, in a smaller
program you will collect data on all the participants for your evaluation. However, in a larger program, it
may not be feasible or cost-effective to collect data from all participants or stakeholders. If you have a
thousand participants, it could get very expensive to print and process surveys for every participant. In
those cases, you may want to collect a sample -- a smaller set of cases that is selected from a larger
pool and is used to draw conclusions about the entire pool or population. Ideally, you want to select a
sample that is representative (typical) of all of the participants or stakeholders. A ‘representative’
sample allows you to draw conclusions that apply to all participants or subgroups within your program.
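
If you do sample, drawing names at random is the simplest way to keep the sample representative. A minimal Python sketch (the roster of participant names here is hypothetical):

```python
import random

# Hypothetical roster of 1,000 program participants
participants = [f"participant_{i:04d}" for i in range(1, 1001)]

rng = random.Random(42)  # fixed seed so the same draw can be reproduced later
survey_sample = rng.sample(participants, 100)  # simple random sample, no repeats
print(len(survey_sample))  # 100
```

Keeping the seed fixed lets you document exactly which participants were selected for the evaluation record.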

The size of your sample is very important. In general, the larger the sample size, the more likely it is to
be representative and to allow you to detect smaller changes. In a small sample (for example, 30
participants), the “margin of error” is very large, so that you cannot be sure that small changes reflect a
real difference in outcomes. The larger the sample, the smaller the margin of error and the more likely
you get useful results. Typically, for example, a sample of 100 participants will provide a margin of error
of +/- 10%, while 300 participants or more will usually result in a margin of error of less than 5%. If you think
you are going to be doing sampling in your evaluation, it would be a good idea to consult with a
professional researcher or evaluator for guidance.
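
The rules of thumb above come from the standard margin-of-error formula for a survey proportion. A quick sketch (the function name is ours; it assumes simple random sampling and a 95% confidence level, so the exact figures differ slightly from rounded rules of thumb):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    # Margin of error for a survey proportion at ~95% confidence (z = 1.96).
    # p = 0.5 is the most conservative (largest) assumption; assumes a
    # simple random sample drawn from a much larger population.
    return z * math.sqrt(p * (1 - p) / n)

for n in (30, 100, 300, 1000):
    print(f"n = {n:4d}: +/- {margin_of_error(n):.1%}")
# n =   30: +/- 17.9%
# n =  100: +/- 9.8%
# n =  300: +/- 5.7%
# n = 1000: +/- 3.1%
```

Notice the diminishing returns: quadrupling the sample only halves the margin of error, which is one reason a professional can help you pick a cost-effective sample size.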

Do you need a comparison group? In Chapter One, we discussed the value of having a comparison
group. As part of your evaluation planning process, you and your evaluation team should discuss
whether you want to try to include a comparison group in your evaluation, and if so, how you will
identify or select the comparison group members. As noted earlier, the goal in selecting a comparison
group is to find a group of young people who are as similar to your program participants as possible, but
who are not participants in your program. In the case of a school-based program, for example, you might
select students in the same grade but in another class. In the case of the Summer of Service program,
you may want to think about participants in other summer programs operated by the same organization
or using the same facility.

There are a variety of practical challenges involved in including a comparison group in an evaluation,
particularly in the case of the summer program. One key issue is how to control for “selection bias” –
the fact that your summer of service participants may have volunteered for the program because they
are already interested in civic engagement. How do you find young people with a similar interest at
baseline? In practical terms, can you gain the cooperation of another group leader or program to supply
a comparison group, and what incentives would you need to offer to get comparison group members to
participate? As with the issue of sampling, if you are considering a comparison group as part of your
study, you may want to consult with an experienced researcher or evaluator to get some advice or
help.


3.9 What Methods Should I Use to Collect Data?

The final step in thinking about your data collection is deciding on a method. It is important to recognize
that, while surveys are often the first data collection tool we think of for evaluation, they are
by no means the only method worth building into your evaluation plan. This
Toolkit provides you with a set of survey tools that you can use to assess your Summer of Service
program. But you can supplement those surveys (or replace them) with other methods as well. Some
other common program evaluation methods include:

 Focus groups: Asking one or more small groups of people questions, or
facilitating a discussion about ideas, and recording their responses.

 Interviews: Asking young people to answer questions individually and recording
their responses (on tape and/or using notes).

 Site visits: Visiting and observing a program on one or more occasions and
recording your observations.

 Photographic documentation: Taking photographs of a program or event to be
coded and analyzed later according to specific themes or characteristics.

 Portfolio or product reviews: Reviewing materials generated by participants in a
program, including portfolios, journals, or presentations.


The important thing to remember in using any of these methods is to develop a strategy for using them
systematically, so that the data collected is representative and reliable. For focus groups, for example,
make sure you have a standard set of questions and a clear plan for selecting who will be in the groups.
For site visits, think about what kind of observation guide or rubric you can prepare, so you know ahead
of time what you need to look for in each program that you visit.

The decision about what kind of data collection method to use usually involves balancing several
considerations: what kind of question you are trying to answer; what kind of evidence your audience is
most likely to accept; how much time you and/or the program staff have to devote to data collection;
and the resources available for collecting and analyzing the data. Often, some kind of trade-off is
involved: for example, surveys generally cannot capture information on process or the richness of
participants’ experience, but are often substantially easier and cheaper to administer on a large scale
than focus groups or interviews. Portfolios may provide an excellent way of assessing student work, but
your funder may want to see impacts measured by more traditional assessment tools. In the end, it is
important to recognize that there is no single “right” answer. The best you can do is thoughtfully weigh
your options as you make your choice.


3.10 How Will You Analyze the Results?

It is important to give some thought, as part of your evaluation planning process, to how you want to
analyze and present your results, since the answers to those questions may influence your decisions
about the types of data you collect and your data collection methods. For example, are you (and your
external audiences) mostly interested in basic descriptive information – frequencies and basic crosstab
tables (for example, showing percentage increase by age or gender), or are you interested in more
sophisticated analyses that test whether changes are statistically significant? Are you primarily
interested in looking at your program participants or beneficiaries as a whole, or do you want to see
separate results for different groups (again, for example, by age, gender, race, different program
activities or locations, etc.) and be able to compare those results statistically?
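
A basic crosstab of the kind described needs nothing more than a tally of paired categories. A small Python sketch (the survey records below are hypothetical):

```python
from collections import Counter

# Hypothetical survey records: (age group, "did your civic interest increase?")
responses = [
    ("12-14", "yes"), ("12-14", "no"),  ("12-14", "yes"),
    ("15-17", "yes"), ("15-17", "yes"), ("15-17", "no"),
    ("15-17", "yes"), ("12-14", "no"),  ("12-14", "yes"),
]

crosstab = Counter(responses)  # counts each (age group, answer) pair
for group in ("12-14", "15-17"):
    yes = crosstab[(group, "yes")]
    total = yes + crosstab[(group, "no")]
    print(f"{group}: {yes}/{total} reported increased interest")
```

Comparing such subgroup counts statistically (for example, with a chi-square test) is where an experienced analyst becomes useful.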

The kinds of analyses you want have practical implications for your evaluation design. If, for example,
you want to be able to show that changes in attitude that took place among participants over the
summer are statistically significant (that is, unlikely to be just random variations), you will want to use a
pre/post survey design that allows you to conduct that type of analysis. If you want to be able to
compare results among groups, you will want to make sure your surveys or data collection tools collect
the information on gender or program participation that you need in order to capture those differences.
Finally, the more sophisticated the analysis, the more likely you are to need someone with experience
in statistical analysis to help with your evaluation and to think through your plans, so you do
not get to the end of the summer and find that you have forgotten to collect some important piece of
information.

In the end, by thinking about where you plan to end up with your evaluation (i.e., the analysis), you can
think more effectively about what you need to do, and whose help you need, in order to get there.





3.11 How Are You Going to Use the Results?

In the same way that it makes sense to think in advance about how you plan to analyze your data, it is
important to think through how you plan to use the results. Are you planning to use the evaluation to
inform program improvement? To build support among funders, program administrators, or policy-
makers? Are you expecting to share the results with other external audiences – for marketing,
recruitment, or fundraising purposes? As you begin to answer these questions, you may want to re-
think who the stakeholders are for your project, and possibly even add people to your evaluation team,
and you may want to revise or refine your evaluation questions. The key point here, as stated at the
beginning of the Toolkit, is that evaluation is intended to be a practical enterprise. The more you think
ahead of time about the uses of your evaluation, the better you can plan to make sure the evaluation
meets your needs.


3.12 How Can You Make It Do-Able?

Evaluations take time and energy and resources – three things that are in short supply for many service-
learning programs. You and your staff are busy, and evaluation often represents one more (often
dreaded) task. Because of this, as you plan your evaluation, you may want to think creatively about who
you can involve in the evaluation and how you can carry it out while limiting the burden on those who
are already working hard. Are there volunteers, partners, and even participants who can take on roles
in the evaluation process and help you make it happen as painlessly as possible? For example, think
about the following:

 What roles can participants play in the evaluation process? We suggested early on
that participants or interested youth can help with the planning and design. Are
there ways to have a team of participants help with the data collection process (for
example, managing the distribution and collection of surveys)? Can participants
help with the analysis (without violating the confidentiality of data)? Can they help
interpret and present results?

 Are there interested volunteers, parents, or community partners who can help you
manage the process or, if appropriately skilled, conduct the analysis?

 Are there college or university faculty or graduate students who might be able to
provide interns or advice, or who might want to involve their students as a way of
teaching about research and data collection?

 Finally, are there funders who might provide some resources targeted to
evaluation, since they are interested in the results?

The point here is to think about how you can make the best use of your available resources, and to think
creatively about who can help you do this particular job.

3.13 Final Thoughts as You Prepare and Plan Your Program Evaluation


The “Tools” section includes a final set of tools for developing your evaluation plan and building an
evaluation timeline. As you work through the planning process, we offer a few “final” thoughts on
evaluation planning:

 Involve your stakeholders in the process.

 Design evaluation to meet your needs. There is no one “right” approach. Your logic
model should help you define your program’s unique evaluation needs.

 Make effective use of the resources (people and information) that you have on
hand.

 Make evaluation a living, useful process – a “want to” instead of a “have to.”


The chapter that follows walks you through the process of implementing an evaluation,
using the tools in this SOS Evaluation Toolkit.

Chapter Four: Implementing Your Evaluation Using the SOS
Evaluation Toolkit

The SOS Evaluation Toolkit is designed to provide you with a basic set of tools that you can use in
evaluating your Summer of Service program. As outlined in the preceding chapters, our goal is to
provide you with some simple starting points for evaluation; but we encourage you to work through
your own planning process and to supplement the tools provided here with other forms of research and
data collection, particularly if your program includes goals or strategies that are different from the
“typical” or “average” Summer of Service program, or if you are interested in outcomes that are
different from those addressed by the tools provided here.

This chapter is designed to walk you through the basic evaluation process using the tools in the Summer
of Service Toolkit. The tools themselves are found in Part Two.

4.1 What is in the SOS Evaluation Toolkit?

The SOS Evaluation Toolkit includes several sets of participant surveys, as well as surveys for use with
staff and program partners. Together, they provide a basic set of tools for assessing program
implementation and program impacts on participants and the community (through the community
partners). The Toolkit also includes instructions and materials for implementing the surveys in your
program. Specifically, the “Tools” section includes:

 Evaluation planning tools, including a logic model worksheet and planning worksheets for
thinking about the questions you want to ask, data you need to collect, and the timeline and key
steps in your evaluation.

 Three sets of participant surveys – baseline and post-program surveys for use with program
participants and comparison group members in a “pre/post” evaluation design, plus a separate
“post-program only” or “retrospective” survey that can be used in those settings where a
separate baseline and post-program survey would be difficult to administer. The “post-only”
survey is designed for use with participants only (there is no comparison version).

 Instructions for administering the surveys – these include baseline and post-program
instructions for program staff who are administering the surveys.

 Sample parent permission forms – two versions: an “active” permission form, which requires
parental sign-off before participants can take the survey, and a “passive” notification that asks
parents to let you know if they do not want their children to participate in the survey.

 A survey for program staff/assessment tool for program implementation –
Forthcoming

 A community partner survey – an end-of-program survey to collect feedback from
community partners about their program experience and their assessment of
project impacts.



4.2 The Pre/Post SOS Participant Surveys

The core tools in the SOS Evaluation Toolkit are the participant surveys. The surveys
were created in consultation with a practitioner working group that developed the Summer of Service
logic model (at the end of Chapter One) and shared for comment with a broader network of Summer of
Service program practitioners and researchers.

The surveys themselves are designed to address the central activities and outcomes for Summer of
Service programs as defined in the SOS program logic model. In terms of program outcomes, those
include six broad participant outcomes:

 Understanding Context
 Civic Agency
 Initiative and Action
 Efficacy and Connection
 Future Civic Roles
 Academic Engagement and Success

The outcomes are assessed through a set of pre- and post-program questions, with groups of questions
around a single set of ideas (such as those related to civic agency) making up a “scale” used to measure
each outcome. Exhibit 4 at the end of this chapter provides a key that indicates which questions are
associated with which outcomes or “domains.”

In addition to the questions associated with the outcome measures, the surveys also collect basic
demographic information about participants (age, gender, race/ethnicity, prior service-learning
experience, etc.) that can be used in examining differences in program impacts for different groups.

The post-program surveys also include a set of questions about the program experience itself: whether
young people had a say in choosing their issue, learned about the causes and effects of the problems
they are addressing, had real responsibilities, had opportunities for reflection, etc. These questions can
be seen as providing a basic measure of the quality of the summer of service program experience, from
the perspective of program participants. The questions also line up closely with the emerging national
standards for quality service-learning programs.² The surveys also include a number of questions about
college awareness, reflecting a growing emphasis on the use of summer programs to promote
educational aspirations and to help young people think ahead to postsecondary education.

Finally, it is worth noting that the post-program surveys also include a set of “retrospective pre/post”
questions about civic skills: how well young people think they can find information, work in teams,
present information, etc. Rather than ask these in a traditional pre/post format on both the baseline
and post-program surveys, participants are asked to only answer these questions at the end of the
program – they are asked to think back and indicate how well they could do each task at the beginning
of the summer and “now.” The reason for this approach is that many young people are confident that
they can accomplish these tasks at the beginning of the program, but then learn how challenging it can
actually be to find information, present information to community members, etc. As a result, their
assessments of their skills are often much more modest at the end of the program. In effect, they have
changed the standard by which they assess themselves as a result of the program. Consequently, we
have found it makes sense to ask about skills only at the end of the program, when participants have a
clear idea of what is involved in each task.

² See the “K12 Service-Learning Standards for Quality Practice” (St. Paul, MN: National Youth Leadership Council). Available at: http://www.nylc.org.


4.3 Comparison Group Surveys

The Toolkit also includes a separate set of pre- and post-program comparison group surveys (and
associated instructions, permission forms, etc.). The comparison surveys are identical in content to the
participant surveys, but allow comparison group members to skip the questions about the quality of
their service experience (unless their summer program included service projects). Note that the
comparison group surveys are marked “Summer Program” surveys vs. the “Summer of Service” surveys
for program participants. It is often helpful to print the participant and comparison surveys on two
different colors of paper to help ensure that you are using the right survey with the appropriate group.


4.4 Post-Program Only Survey

As noted above, the Toolkit also includes a version of the participant survey that is designed to be
administered only at the end of the program. While the pre/post surveys are designed so that we can
compare participants’ answers at two different points of time to assess changes, the post-only version
asks respondents to assess the changes themselves by agreeing or disagreeing with statements such as:
“As a result of my summer of service experience, I learned that doing something that helps others is
important to me” or “My community is more important to me.” As noted in Chapter Two, post-only or
retrospective surveys are often viewed as less objective or rigorous than separate pre/post surveys, but
in those situations where it is not feasible to use a pre/post approach, they offer the opportunity to gain
valuable feedback on your program and a better understanding of how your participants assess their
own experience.


4.5 Community Partner Survey

The “other” outcome for Summer of Service programs, in addition to participant development, is the
provision of meaningful service to the community. The Community Partner Survey provides one tool for
documenting and assessing some of those service benefits. Designed to be distributed to community
partners working with your Summer of Service program, the survey asks about the partner experience
(whether staff and participants were well prepared), the role(s) played by the community partner, the
type of project undertaken, whether it addressed a real community need, and the impact of the
involvement in Summer of Service on the partner agency. A key measure of the success of the
partnership with community agencies is the extent to which they plan to continue their involvement in
the Summer of Service program – one of the questions on the survey.

Again, it is important to recognize that all of these surveys are intended to provide an easy-to-use
starting point for your own evaluation, but they are not likely to address all of the outcomes from your
program or all of the questions you want answered. Feel free to supplement the surveys with additional
questions, or with data drawn from additional sources, including program records, interviews, focus groups,
directed writing, performance evaluations, and the like.



4.6 Administering the Surveys: Protecting the Rights of Participants and Partners

“Evaluation deals with real people in real programs, often people in serious need of help. The results of
the evaluation may have real consequences for these programs and these people. Therefore, evaluation
has an obligation to pay even more attention to ethical questions than most other kinds of social science
research have to do.”
Carol Weiss, Evaluation

Even if you are not a professional evaluator, it is important to collect data and conduct your evaluation
in a way that meets basic professional and ethical standards. A key watchword for evaluators, as it is for
physicians, is to “do no harm.” As such, as you plan and conduct your evaluation, it is important to
consider how the data you collect is going to be used, who might be hurt by the release of data, and how to
protect the privacy of those involved in the evaluation process. Some key ethical concerns include:

 Honesty: Of course most of us recognize that honesty is an important ethical consideration in any
work that we do. It is especially important that you are clear and honest with all those involved in
the evaluation about its purpose and likely use of the data.

 Informed Consent: Informed consent refers to the idea that participating in a survey, evaluation
project, or any other kind of research study is voluntary. For that to be true, the people who are
involved in the study need to have enough information to make an informed decision about
whether to participate. If there are potential social or legal consequences to providing
information, participants need to be aware and have the freedom to decline participation. This is
especially important when working with groups of people who may be vulnerable due to power
differences in age, physical ability, position, or historical oppression and should always be a
consideration when administering surveys of students in schools. This is not to say that people
should not take part in studies like this, but it is to emphasize that they need to know the potential
consequences and be able to freely decide.

 Confidentiality and Anonymity: One of the key elements in both protecting the privacy of
participants and encouraging honest responses is ensuring the confidentiality and anonymity of
survey data. Confidentiality means you will take steps necessary to keep individuals’ information
or survey responses private and not share results in any way that would allow individuals’
responses to be identified. In terms of the Toolkit surveys, for example, we recommend that
participants be given envelopes in which they can seal their surveys so that program staff cannot
see their individual survey responses. It also means that when results are reported, care is taken
not to show information in ways that would point to particular respondents (for example, if there
were only a few participants in a particular demographic group, you might not want to show
results broken out for that group).

Anonymity means that you do not even collect the names or identifying information from people –
their responses are fully anonymous. In the case of the pre/post surveys, where there is a need to
be able to match surveys from the beginning to the end of the program, the Toolkit surveys use a
participant code made up of individuals’ initials and birthdates. Other approaches assign
participants an ID number that is used on the surveys, with evaluation staff using a master list to
make sure that participants enter the appropriate ID numbers on their surveys. In the case of the
post-only surveys, which do not require matching baseline and post-program results, the surveys
are completely anonymous – there is no identifying information on the surveys.
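
Once the survey data have been entered into a spreadsheet or a simple data file, matching baseline
and post-program responses by participant code is easy to automate. The sketch below is purely
illustrative – the code format, the names, and the scale scores are our assumptions, not part of the
Toolkit – but it shows the basic idea: participants who are missing either survey simply drop out of
the pre/post comparison.

```python
# Illustrative sketch (not part of the Toolkit): match baseline and
# post-program surveys by a participant code built from initials and
# birthdate, then compute each matched participant's pre/post change.

def make_code(first_name, last_name, birthdate):
    """Build a match code such as 'JS-1998-07-14' from initials and birthdate."""
    return f"{first_name[0].upper()}{last_name[0].upper()}-{birthdate}"

# Hypothetical scale scores (e.g., a 1-4 "civic agency" average), keyed by code.
baseline = {"JS-1998-07-14": 3.2, "AB-1999-03-02": 2.8, "CD-1998-11-30": 3.5}
post     = {"JS-1998-07-14": 3.6, "AB-1999-03-02": 3.4}  # one survey not returned

# Keep only participants with both a baseline and a post-program survey.
gains = {code: post[code] - baseline[code] for code in baseline if code in post}
print(f"Matched {len(gains)} of {len(baseline)} participants")
```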

The key point is that both confidentiality and anonymity are important in order to allow students
to report their honest responses, without fear of embarrassment or pressure – and both increase
the trustworthiness of your results.

 Professionalism and Competence: The SOS Evaluation Toolkit does not presume that you are
professional evaluators. However, the reliability and usefulness of the data you collect depends on
carrying out the key steps in the evaluation in a careful, thoughtful manner – following directions
and working to make sure that data is collected and reviewed carefully. Remember, evaluation is
defined as the “systematic” collection and use of data – in that regard, attention to detail and
follow-through is critically important.

 Reciprocity: Good evaluations are based in a sense of reciprocity and mutual respect among those
involved in the study. In terms of planning an evaluation, it means including those most affected
by the evaluation in the planning and design. In terms of data collection, it means that those who
provide data should have a chance to learn about and discuss the results. When you plan your
evaluation, it is worth thinking about how you can feed the results back to program staff and
participants so they can see how the information they provided is being used.


4.7 Administering the SOS Participant Surveys

By the time you are ready to use the survey tools in the Toolkit, you will have done much of the upfront
planning for your evaluation. You will have identified the questions you want to answer and the
outcomes that you are interested in assessing. You will have thought about the kinds of data you can
collect and the methods to be used to collect that data, including the surveys in this Toolkit. You will
have decided whether to include all of the participants in your program or a sample, and whether to
include a comparison group or not. As the evaluation coordinator for your program, then, you are ready
to move ahead with the data collection process.

Key steps in that process include:

1. Collecting parent permissions.

2. Administering surveys with participants.

3. Collecting participant surveys and encouraging a good response rate.

4. Distributing and collecting staff and community partner surveys.

Here is some “how to” advice for each step.

Collecting parent permissions. In most cases, once you have identified the participants or groups to be
included in your evaluation, the first step will be to distribute and collect parent permission forms. Even
though you may consider the evaluation an integral part of your summer program, it is a good idea to
make sure that parents are aware of what you are doing. As noted above, the Toolkit includes two
versions of the parent permission forms – an “active” permission form that requires parents to return a
signed form in order to include their child in the evaluation, and a “passive” consent that notifies
parents about the study and asks them to send in a form only if they do not want their child to
participate. Different organizations have different rules about which form to use, which may depend on
whether the contents of the surveys are seen as “risky” or likely to cause concern among parents.
Confer with your program leadership about which approach is appropriate. You can also feel free
to modify the permission forms to customize them for your program or audience.

Whichever approach you take, you want to make sure that you get the permission forms out to
parents in advance of the date on which you plan to conduct the survey. If you are using the “active”
version, make sure there is enough lead time so that you can send out reminders and encourage a high
level of response. You may also want to translate the permission forms into languages other than
English if your participants come from families where English is not the primary language. You should
have copies of the surveys available if parents want to review the survey before granting permission.
Finally, if you are using the “active” permission, you may want to consider some additional strategies
to promote a high return rate for the permission forms. One approach is to include the evaluation
permission forms with other types of forms and permissions that are collected at the beginning of the
Summer of Service programs (for example field trip permissions or insurance forms). That way, parents
can return all the materials needed for the program at the same time. You may also consider providing
some kinds of incentives for parents and/or participants to return the forms. One example is to enter all
of those who returned their forms into a drawing for a gift certificate; another approach is to promise
participants a pizza or ice cream party if more than 75% bring back their permission forms.

Administering the Surveys. The Toolkit includes most of the materials needed to administer the surveys
– the surveys themselves, a set of instructions for program staff, and a “survey cover sheet” to help you
keep track of which surveys are from which groups or classes. You may decide to administer the surveys
yourself, or to ask program staff or teachers to administer the surveys for their groups.

As the evaluation coordinator, one way you can make the survey process easier is to package the survey
materials together in “classroom sets” with a copy of the survey instructions, a survey cover sheet, and
enough copies of the surveys for all the participants in each group or class. We also strongly
recommend including an envelope for each participant. That allows participants to place their
completed survey in the envelope to protect the confidentiality of their responses.

If you are having someone else administer the surveys (teachers or group leaders), we recommend that
you take a few minutes to go over the survey and read through the instructions with the program staff
so they understand the process and the steps involved in administering the surveys. You want to make
sure that staff understand the purpose of the surveys; that they present the survey process positively to
program participants; that they understand the importance of protecting the privacy of participant
responses; and that they can help participants if they have questions about the survey process. As
noted in the survey instructions, if staff are concerned that participants will have trouble reading the
surveys, they can read them out loud to the group.

In particular, we hope that you will emphasize that program staff should not look at participant
responses to the surveys, as tempting as that might be. Having promised participants that the surveys
are confidential, we hope that staff will honor that commitment.


If you have decided to include a comparison group in your evaluation, we also recommend working
closely with staff at the comparison sites to make sure that the surveys are administered correctly.
The ideal is for you, as evaluation coordinator, to administer the surveys yourself to the comparison
group members. In that way, you can be sure that the instructions are followed and that the surveys are
properly presented to comparison group members. It also allows you to collect the surveys when they
are completed so that they are not misplaced by the comparison site staff.

If you are not able to administer the comparison surveys yourself, we recommend that you meet with
the staff at the comparison site who will be administering the surveys to go over the instructions with
them. You may want to find out when they plan to administer the survey so you can stop by to pick up
the completed survey soon afterwards.

This is another instance in which you may want to consider providing some kind of incentive or “thank
you” for the comparison site staff and participants, since they are often doing you (or your program
leaders) a favor by agreeing to take part in the study. A small gift certificate for staff, or pizza or ice
cream party for program participants can often make the evaluation process feel like a more positive
and rewarding experience!

Collecting the Surveys and Encouraging a Good Response Rate. Often the biggest challenge to a
successful evaluation is that of simply getting the surveys back, so it is worth spending a little time
thinking about how to make sure that the surveys you have distributed get completed and returned to
you in a timely fashion.

We strongly recommend that participant and comparison group surveys be administered to young
people as a group at the program sites. We do not recommend letting young people take the surveys
home – they are likely to be misplaced or forgotten and never be returned.

If the surveys are being administered by program staff (rather than by you), try to identify the day on
which the surveys will be administered and go and pick up the completed surveys soon afterwards. It is
our experience that the longer that completed surveys are kept around a program site, the more likely
they are to be forgotten or lost. If you need the surveys sent back to you, include a postage paid, self-
addressed envelope that program staff can use to return the surveys. The best approach is to use some
form of shipping that includes a tracking number in case the surveys are lost – priority mail, FedEx, or
UPS.

If you are using the pre- and post-program surveys, we recommend that you do not distribute both
sets of surveys to program staff at baseline – the post-program surveys will only get lost over the
course of the summer. Rather, we suggest distributing and collecting the baseline surveys, then doing
the same with the post-program surveys towards the end of the summer program. In that situation, it
will be important to make sure you know when the program is ending so that you can distribute the
post-program surveys in plenty of time to be completed.

There are no special tricks to making the survey process work well. In general, the two most important
steps you can take are (1) to keep in close touch with those who are administering the surveys for you;
and (2) be sure to say “Thank you” at every opportunity. Busy program staff are much more likely to pay
attention to doing evaluation work well if they feel that their time and efforts are being appreciated.


Distributing and Collecting Community Partner Surveys. The partner surveys provide an opportunity
to gain an additional perspective on your program, but also provide their own challenges in terms of
data collection.

As with your program participants, the most effective way to survey your partners is in a group setting –
an end-of-summer convening or a celebration. The easiest way to make sure that surveys are
completed is to administer them and collect the completed surveys while everyone is in the same room.

However, often it is not possible to get everyone together – staff are busy, often supervising
participants, and community partners also have other things to do. As such, you may have to distribute
surveys and ask partners to return them to you or someone else at the program. To make that easier,
you may want to include a stamped, pre-addressed envelope that partners can use to return the survey.
If you have access to SurveyMonkey or another online survey program, you may also want to create a
web-based version of the surveys. You can then email the request to complete the survey to partners,
track who is responding, and send follow-up reminders as needed. Finally, you may want to consider
offering incentives to those who respond, such as inclusion in a drawing for a gift certificate. It is also
often helpful to have a letter from the program director encouraging partners to complete the surveys
and reinforcing their value to the program.

Ultimately, in order to achieve a reasonably high response rate, it often takes several rounds of
reminders (also known as “nagging”). Don’t be discouraged – remember how you respond when asked
to take a survey – and continue with follow-up calls and emails until the surveys are done.
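
If you keep a simple list of which partners received a survey and which have responded, your
follow-up list is just the difference between the two. A minimal illustrative sketch (the partner
names are invented):

```python
# Hypothetical sketch: who still needs a reminder, and what is the response rate?
partners_surveyed = {"Food Bank", "Parks Dept", "Literacy Center", "Senior Center"}
responded = {"Parks Dept", "Senior Center"}

# Set difference gives the follow-up ("nagging") list.
needs_reminder = sorted(partners_surveyed - responded)
response_rate = len(responded) / len(partners_surveyed) * 100
print(f"Response rate: {response_rate:.0f}%; remind: {', '.join(needs_reminder)}")
```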


4.8 Final Thoughts on Data Collection

Data collection is often the slow, tedious part of evaluation (at least compared to planning or analysis
and use of the data). It takes patience and persistence to gather good quality, reliable data that you can
use to better understand your program.

This is also a good point to remind ourselves that, while the Toolkit provides surveys for your use,
surveys are not the only form of data collection that should be used in an evaluation. As you chase after
participants, staff, and partners to get their completed surveys back, think about whether you can also
organize some focus groups, review project reports and other materials, and/or conduct some
observations and interviews about the Summer of Service experience. That mix of “stories” and
numbers will ultimately give you the information you need to “prove” and “improve” your Summer of
Service program.

NOTE: We have included sample semi-structured interview questions for participants and staff that
were developed by Nicole Tysvaer, ICP Research Fellow, for ICP’s 2010 SOS Pilot Impact Evaluation.
You can find these at the end of Chapter 7.


Chapter Five: Analyzing and Using Your Data

What do you do once you have collected all those surveys? In order for your evaluation to be useful,
you need to analyze the data you have collected, including the survey data, and think about how you
want to present and use the results.

5.1 Analyzing Your Data

Cleaning, entering, and analyzing the surveys you have collected can be a challenge. While many larger
programs have staff with some experience in data management and analysis, few smaller programs
have that kind of expertise on staff. The best solution may be to consider partnering with faculty or
graduate students at a local university to help in conducting the analysis. In some cases, faculty
teaching research or statistics courses may be quite happy to have a real dataset that their students can
work on. In other cases, social work schools, public health programs, and other types of graduate
programs often have internships or practicum projects that involve working with local nonprofits. Some
high schools also have statistics classes and might be interested in an opportunity to take on a
community project. Finally, you might consider hiring a local evaluation consultant to help with the
analysis, particularly if yours is a larger or more complex program.

It is important to note that the analysis itself can range widely from the simple to the complex. At one
end of the spectrum is a simple count of the different responses (frequencies), so that you can see what
percentage of participants “Agree or Strongly Agree” with each set of statements at baseline and post-
program. A more sophisticated analysis might look at the percentage of participants that showed gains
from baseline to end-of-program, and an even more sophisticated approach will test to see if the
pre/post gains were large enough to be statistically significant. If you get to that point, it probably
makes sense to work with some kind of evaluation consultant.
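
The range of analyses described above can be sketched in a few lines. The example below is
illustrative only – the scores are invented, and a real analysis would read your entered survey
data – but it shows the progression from simple frequencies, to the share of participants showing
gains, to a paired t statistic (the mean gain divided by its standard error):

```python
# Illustrative sketch of the analyses described above, using invented
# scores on a 1-4 agreement scale (1 = Strongly Disagree, 4 = Strongly Agree).
import math
import statistics

pre  = [3, 2, 4, 3, 2, 3, 4, 2]
post = [4, 3, 4, 3, 3, 4, 4, 3]

# 1. Frequencies: share who "Agree or Strongly Agree" (scores of 3 or 4).
pct_agree_pre  = sum(s >= 3 for s in pre)  / len(pre)  * 100
pct_agree_post = sum(s >= 3 for s in post) / len(post) * 100

# 2. Share of participants showing a baseline-to-post gain.
diffs = [b - a for a, b in zip(pre, post)]
pct_gained = sum(d > 0 for d in diffs) / len(diffs) * 100

# 3. Paired t statistic: mean gain divided by its standard error.
t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(len(diffs)))

print(f"Agree pre: {pct_agree_pre:.0f}%  post: {pct_agree_post:.0f}%")
print(f"Showed gains: {pct_gained:.0f}%  paired t = {t:.2f}")
```

Whether a given t value is statistically significant depends on the size of your group, which is
exactly the point at which an evaluation consultant or statistics instructor can be most helpful.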


5.2 What Do You Do with the Results?

Once you do begin to get the results (and in fact, even before you see the results), you need to begin
thinking about who your audiences are and how you can present and interpret the data that you are
seeing. One tool to help you do that is the “Presenting the Results” tool in Chapter 6. Those
worksheets help you think about the audiences for your findings – funders, program leaders, board
members, other staff, etc. – and the kinds of information you need to make your case.

Whatever the audience, it is important to think about how to use your results. You can:

 Write a report
 Present at Board and staff meetings
 Use as part of your agency planning
 Host a conference to discuss your program and others
 Present at conferences
 Use to inform professional development

And so on. The point is to look at ways to make the data useful, and to make use of the data. You did
not go to all this trouble just to write a report that will sit on a shelf.
Summer of Service Evaluation Toolkit



Innovations in Civic Participation © 2011 52


5.3 Conclusion

The SOS Evaluation Toolkit has been built around several key assumptions. First, that to be effective,
evaluation takes planning and commitment. A good evaluation is planned; it doesn’t “just happen.”
Second, we also believe the most effective evaluations involve your stakeholders in this process. This
makes evaluation an effective use of resources (people and information) that you have at hand and
helps your evaluation become a living, useful process that you will ‘want to’ do, instead of ‘having to’
do. Third, evaluation should be useful: think about what questions you really want to answer and how
you can best use the data you collect. Finally, as your evaluation comes alive, remember that you
should design it to meet your needs – there is no one “right” approach. The tools in the SOS Evaluation
Toolkit provide a template, but you can adjust them to your own needs.







Chapter 6: Evaluation Planning and Reporting Tools


6.1 Logic Model Worksheet

6.2 Planning Your Evaluation: Key Questions

6.3 Data Planning Worksheet

6.4 Building an Evaluation Plan: Tasks, Roles, and Timelines

6.5 Presenting Your Results

6.6 Domains and Scales













6.1 LOGIC MODEL WORKSHEET

Program/Organization Name:

Program Mission:


The worksheet columns, with the guiding questions for each:

Who: Who is your target population? Who will benefit from your work? Who needs to be
involved in your efforts?

Assumptions (Theory of Change): What assumptions or theories guide your work? What do
you know, think, and believe about why you expect your program to work?

Strategies: What mix of programs, services, and activities need to be in place to achieve the
desired outcomes?

Outputs: What products or services were created and/or delivered by your program? How many?

Outcomes: What outcomes do you want to achieve through your efforts?

Impacts: What long-term changes in the lives of the participants, participating institutions, or
communities do you hope to achieve?

This tool was developed by the Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).
6.2 PLANNING YOUR EVALUATION: KEY QUESTIONS YOU WANT TO ANSWER

Worksheet columns: Type of Evaluation Question | Key Questions | Audience (Who Wants to Know)
Describing Context
(Contextual issues that
affect your program)


Documenting
Activities
(Number and
characteristics of
participants, number of
service hours, types of
services provided, etc.)



Understanding
Process
(Was the program
implemented as
expected, what worked
well, and what did not?)



Assessing
outcomes and
impacts
(Did changes take
place? What difference
did the program
make?)


This tool was developed by the Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).

6.3 DATA PLANNING WORKSHEET


Worksheet columns: Key Questions | Information/Data That I Already Collect | Additional
Data/Information That Would Answer the Question | How to Collect the Additional Information
(Tools/From Whom)

Example row:
Key question: Has the SOS program increased my participants’ knowledge of their local community?
Data already collected: Project planning materials that include descriptions of the local
community and community issues
Additional data/information: Participant survey or focus group responses to questions about
community issues
How to collect it: Pre/post participant survey; focus groups with a sample of program participants





This tool was developed by the Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).

6.4 Building an Evaluation Plan: Tasks, Roles, and Timeline

Worksheet columns: Task to Accomplish | Specific Steps | Roles | Deadline

Tasks (rows):
Evaluation Planning and Preparing
Data Collection
Data Analysis
Reporting and Using Results



This tool was developed by the Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).

6.5 Presenting Your Results – Telling a Story

Part of making effective use of your data is thinking about who needs to know about your
program, what you hope to accomplish through your report, what arguments you need to
make, and what data – from the surveys and other sources – you can use to build your
argument. This worksheet helps you think through those questions.


Who is your audience? Whom are you approaching for support?




What is your overall goal or outcome in presenting your evaluation findings?






What argument(s) do you need to make?







What data will you use to support the argument(s)?






Do you have a story to illustrate your point?






Have you taken any similar action in the past? If so, what worked or didn’t work? What was
convincing or not?

This tool was developed by the Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).
Presenting Your Results – To Whom and How?

Worksheet columns: Audience | What is your overall goal or outcome? (e.g., more publicity,
more funding) | What argument(s) do you need to make? | What data will you use to support
the argument(s)? | Do you have a story to illustrate your point? | Have you taken an action in
the past (e.g., submitted a proposal, made a presentation)? If yes, what worked about it? If
yes, what didn’t work? What would you add/change?

Media



Foundation



Federal or State
Grant/Program



Community Agency



Business



Higher Ed. Institutions



Other CS-L Programs



Local Government



School District



School Staff



Youth & Program
Alumni



Families & Citizens




Based on tools developed in the Making Knowledge Productive Toolkit, Brandeis University and the Massachusetts Department of Education.
This tool was developed by the Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).

6.6 Pre/Post Survey “Domains” or “Scales”

Questions 1-38 in the pre/post surveys measure seven major outcome areas, or “domains,”
drawn from the SOS program logic model. The key domains are:

 Civic Agency
 Educational Aspirations
 Sense of Belonging
 Understanding Context
 Civic Efficacy
 Future Civic Involvement
 Civic Knowledge

The table below indicates the questions associated with each domain.

Domain  Number  Item
Agency  2.  Doing something that helps others is important to me.
Agency  7.  I feel like I can stand up for what I think is right, even if my friends disagree.
Agency  16.  I think each person in the community should do what he or she can to solve community problems.
Agency  22.  It is my responsibility to help improve the community.
Agency  23.  I want to help other people, even if it is hard work.
Agency  27.  Helping other people is something I am personally responsible for.
Aspire  3.  It is important to me to do the best I can at school.
Aspire  10.  School is a waste of time.
Aspire  18.  Finishing high school is important for me.
Aspire  24.  I would like to quit school as soon as possible.
Aspire  37.  I am looking forward to going back to school in the Fall.
Aspire  38.  I really want to graduate from college.
Belong  5.  I feel like I am an important part of my community.
Belong  9.  Adults in my community value my opinion.
Belong  17.  My community is important to me.
Belong  36.  I know about other people who care about the same social issues that I do.
Context  4.  In order to solve problems in my community, it is important to be able to identify the good things in my community as well as the problems.
Context  13.  In order to solve problems in my community, it is important to understand how issues I care about affect others in my community, not just how they affect me.
Context  21.  In order to solve problems in my community, I need to understand the different points of view on an issue or problem.
Context  29.  In order to solve problems in my community, I need to understand how events and decisions outside my community might affect the problem that I am interested in solving.
Efficacy  8.  I believe that young people my age have enough influence to be able to impact community decisions.
Efficacy  11.  I know what resources are available to help me with a project in my school or community.
Efficacy  12.  I feel that most adults are supportive of young people’s efforts to work on school and community problems.
Efficacy  20.  I feel like I can make a difference in my community.
Efficacy  26.  I am aware of needs in my school or community.
Efficacy  28.  I know how to design and do a service project in my community.
Efficacy  31.  I am confident expressing my opinions in front of a group.
Efficacy  33.  I know how to lead a group project.
Efficacy  35.  I am committed to helping improve my community both now and later in life.
Future  14.  When I am an adult, I plan to be active in community organizations where I live.
Future  15.  When I grow up, I plan to work with a group to solve a problem in the community where I live.
Future  25.  When I am an adult, I want to be able to vote.
Future  32.  When I am an adult, I plan to get information about candidates and issues before voting in an election.
Know  1.  I know how to influence the decisions made by my local government.
Know  6.  I understand how public decisions are made in my community.
Know  19.  I understand the different kinds of services my town provides to people in my community.
Know  30.  When community issues or problems are being discussed, I usually have something to say.
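One common way to use these domains in analysis is to average each participant’s item ratings into a single score per domain, reverse-coding negatively worded items. The toolkit does not prescribe a scoring formula, so treat the sketch below – including the choice to reverse-code items 10 and 24 and the made-up responses – as an illustrative assumption.

```python
# Hypothetical sketch: average item ratings (1-5) into domain scores.
# Item numbers follow the domain table above; only three domains shown.
DOMAINS = {
    "Agency": [2, 7, 16, 22, 23, 27],
    "Aspire": [3, 10, 18, 24, 37, 38],
    "Belong": [5, 9, 17, 36],
}
# Negatively worded items ("School is a waste of time," "I would like to
# quit school...") flipped so that 5 is always the favorable end -- an
# assumption of this sketch, not a rule stated in the toolkit.
REVERSED = {10, 24}

def domain_scores(responses):
    """responses: {item number: rating, 1 = Strongly Disagree ... 5 = Strongly Agree}."""
    scores = {}
    for domain, items in DOMAINS.items():
        vals = [6 - responses[i] if i in REVERSED else responses[i]
                for i in items if i in responses]
        scores[domain] = round(sum(vals) / len(vals), 2) if vals else None
    return scores

# One participant's made-up answers to the Agency and Belong items.
answers = {2: 4, 7: 5, 16: 4, 22: 3, 23: 4, 27: 4, 5: 3, 9: 3, 17: 4, 36: 4}
print(domain_scores(answers))  # Agency 4.0, Belong 3.5, Aspire unanswered
```

Computing baseline and post-program domain scores this way gives you one pre/post pair per domain per participant, which feeds directly into the gain analyses described in Chapter 5.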

This tool was developed by the Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).


Chapter 7: Survey Materials


7.1 Pre/Post Participant Survey Materials

7.2 Pre/Post Comparison Group Survey Materials

7.3 Post-Only Participant Survey Materials

7.4 Community Partner Survey

7.5 Semi-structured Interview Questions














7.1 Pre/Post Participant Survey Materials























SAMPLE PARENT NOTIFICATION NOTICE – PASSIVE PERMISSION
(Service-Learning Participants)

Dear Parent or Guardian:

As part of our efforts to ensure that the [insert your program name] Summer of Service program is
providing participating youth with a quality summer experience, we are asking the young people in
our programs to complete a survey at the beginning and end of their summer experience.

The survey asks program participants about their experience in the summer of service program and
its impact on a variety of civic and school-related attitudes and skills. The survey was developed by
Innovations in Civic Participation (ICP) and the Center for Youth and Communities at Brandeis
University as part of a national effort to evaluate Summer of Service programs. Copies of the
complete Summer of Service survey are available upon request.

We want to assure you that all of the information collected through the survey will be kept
strictly confidential. It will only be used for the purposes of evaluating our SOS program and other
similar programs around the country. The surveys do not include participant names (students use a
‘code’ instead of their name on the survey), and participants are instructed to seal their completed
surveys in envelopes before handing them in to program staff. Participants may also skip any
questions in the survey that they are uncomfortable answering.

We are sending this notice home to inform you of the survey and to give you the opportunity to let us
know if you do not want your child to participate in the evaluation. We believe that the survey
provides valuable information that will help our ongoing efforts to improve our program and provide a
quality experience for area children. We want to encourage you, therefore, to allow your child to
participate.

If you DO NOT want your child to participate in this study, please let us know by completing the form
below and returning it to your child’s program staff within the next three days. If you are willing to
have your child participate you do not need to take any further action.

If you have any questions about the study, please feel free to call me at (___) ____________.

Thank you for your cooperation.


-------------------------------------------------------------------------------------------------------------------------


WITHDRAWAL OF PERMISSION
Please Sign and Return to a Member of our Program Staff Within Three Days if You DO NOT want
Your Child to be Included in this Evaluation

I do not want my child, ______________________, to participate in the Summer of Service evaluation
survey. Please take the necessary steps to make sure that my child is not included in the survey process.

Parent(s)/Guardian(s) Signature(s):__________________________________________________

Daytime Telephone: (_______) ______ - _________ Today’s Date: _____________
Area Code Number

SAMPLE PARENT PERMISSION NOTICE – ACTIVE PERMISSION
(Service-Learning Participants)

Dear Parent or Guardian:

As part of our efforts to ensure that the [insert your program name] Summer of Service program is
providing participating youth with a quality summer experience, we are asking the young people in
our programs to complete a survey at the beginning and end of their summer experience.

The survey asks students about their experience in the summer of service program and its impact on
a variety of civic and school-related attitudes and skills. The survey was developed by Innovations
in Civic Participation (ICP) and the Center for Youth and Communities at Brandeis University as part
of a national effort to evaluate Summer of Service programs. Copies of the complete Summer of
Service survey are available upon request.

We want to assure you that all of the information collected through the survey will be kept
strictly confidential. It will only be used for the purposes of evaluating our SOS program and other
similar programs around the country. The surveys do not include participant names (students use a
‘code’ instead of their name on the survey), and participants are instructed to seal their completed
surveys in envelopes before handing them in to program staff. Participants may also skip any
questions in the survey that they are uncomfortable answering.

We are writing to ask your permission to include your child in the survey process. We believe that
the survey provides valuable information that will help our ongoing efforts to improve our program
and provide a quality experience for area children. We want to encourage you, therefore, to allow
your child to participate.

If you agree to have your child participate in the survey, please complete the form below and
return it to a member of our program staff within the next three days. If you do not return the
permission form, we will not include your child in the survey.

If you have any questions about the study, please feel free to call me at (___) ____________.

Thank you for your cooperation.


-------------------------------------------------------------------------------------------------------------------------


SURVEY PERMISSION
Please Sign and Return to a Member of our Program Staff Within Three Days if You Give
Permission for Your Child to be Included in this Evaluation

I give permission for my child, ______________________, to participate in the Summer of Service
evaluation survey.

Parent(s)/Guardian(s) Signature(s):__________________________________________________

Daytime Telephone: (_______) ______ - _________ Today’s Date: _____________
Area Code Number



Summer of Service Evaluation
Baseline Survey Instructions


Thank you for your help administering the Summer of Service participant survey. The survey is part of
the ongoing effort by this program and others to document the impacts of service-learning programs and
to learn how to make them even more effective.
There are a few simple instructions for administering the survey with your service-learning participants.
1. Send the enclosed parent permission forms home at least a few days prior to administering
the surveys.
The forms are designed to make parents aware of the study and to allow them to “opt out” if they do
not want their children to participate. If your program is using the “Active” permission forms, parents
must return the form before their children take the survey. If you are using the parent “Notification”
version, only those participants whose parents do not want them to participate need to return signed
parent forms.

2. Please give the survey to your service-learning program participants before you start (or just
as you start) your service-learning activities for the summer.
The purpose of the baseline survey is to help us understand participant attitudes and ideas at the
beginning of the service-learning process (i.e., before they begin preparation and/or service). We will
be asking your students/participants to take a similar survey at the end of the program to see how
their ideas, attitudes and skills have changed.

3. Take a few minutes to introduce the survey to your participants and to encourage them to
answer the questions as honestly as they can.
Please emphasize that the survey is not a test – there are no ‘right’ or ‘wrong’ answers, and the
surveys do not measure whether someone is ‘good’ or ‘bad.’ The questions are designed to help us
understand how participants think and what they are learning through the program.

Also, please read through the cover page of the survey. Please remind participants that the surveys
are confidential: they will use a ‘code’ instead of their names (see below) and will place their
completed surveys in an envelope to assure confidentiality. The only people who will see their
answers are the researchers who compile the survey data.

Finally, please make sure that your participants understand that the survey is voluntary. Participants
can skip individual questions that they do not want to answer. Participants who do not want to
complete the survey at all may simply seal their blank survey in their envelope.

4. Explain how to fill in the bubbles on the surveys.
Point out the instruction box in the upper right corner of the first page of the survey, which
reads: “Instructions: For each question, make a solid mark that fills the oval completely.”
Participants should use a pencil or a blue or black ink pen and fill in the bubbles completely.


5. Make sure your participants complete the “code” at the beginning of the survey.
In order to be able to assess changes in participant attitudes during the summer, we need to be able to
match each participant’s baseline and end-of-program surveys. At the same time, we want to keep the
surveys anonymous (and confidential) so that participants will answer honestly.
To accomplish this, we are asking participants to create a “code” at the beginning of their surveys using their
initials, birth date, and gender. The code will let us match baseline and post surveys without participant
names. Please be sure your students/participants complete this section at the beginning of the survey. An
example of how the “code” section should be completed is provided at the end of these instructions.
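The matching that the participant “code” makes possible can be sketched as follows. The key format and record structure are illustrative assumptions (the toolkit does not specify how matching is implemented); the example values follow the sample code shown in these instructions.

```python
# Hypothetical sketch of anonymous pre/post matching via the participant
# "code" (initials + birth date + gender). Key format is an assumption.
def participant_code(initials, month, day, year, gender):
    """Build a match key from the code fields on the survey's first page."""
    return f"{initials.upper()}-{month:02d}{day:02d}{year}-{gender[0].upper()}"

def match_surveys(baseline, post):
    """baseline, post: {code: survey record}. Returns only the matched pairs."""
    return {code: (baseline[code], post[code])
            for code in baseline.keys() & post.keys()}

# Two post-program surveys, only one of which has a matching baseline.
pre = {participant_code("APM", 3, 31, 1994, "Boy"): {"q2": 3}}
after = {participant_code("APM", 3, 31, 1994, "Boy"): {"q2": 5},
         participant_code("JQD", 7, 2, 1995, "Girl"): {"q2": 4}}
print(match_surveys(pre, after))  # only the APM record matches
```

Because the key never contains a name, researchers can pair surveys without knowing who completed them, which is why complete and accurate code sections matter so much.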

6. Feel free to read the surveys aloud to your class/group if you think they will have difficulty reading
the survey on their own.
The surveys were designed with the advice of a workgroup of educators and should be at an appropriate
reading level for most middle and high school students. However, we want to encourage any educators with
concerns about the reading level to read the survey aloud for their students/participants.

7. Please remind your participants to complete the demographic information in the survey.
The participant demographic information is important in helping us see if there are differences in the
response to service-learning among different groups of participants – younger and older, male and female,
and participants with different racial or ethnic backgrounds. Consequently, we want to encourage
participants to complete this information as well as answering the main survey questions themselves.
However, we also recognize that some participants are reluctant to provide this information. If any
participant is uncomfortable or reluctant to provide any of the demographic information, he or she can leave
that information blank.

8. When participants have completed their surveys, please have them seal the surveys in an envelope
to assure the confidentiality of their answers.
Once again, we want to assure participants that their individual answers on the surveys will not be seen by
anyone who works directly with them in program. Having participants seal their surveys in an envelope
helps to ensure that confidentiality is maintained.

9. After the surveys are completed, you may want to use them as the basis for a discussion about how
surveys are used to collect information.
Surveys are an important tool for gathering information in community programs. You may want to ask
participants what they thought the survey was trying to learn, what they liked or disliked about the survey, or
what kinds of information they might want to use a survey to collect. Ideally, a survey like this can serve
multiple purposes – as a means of gathering information and as an opportunity to teach skills that are part of
your service-learning program.

10. Please complete the Survey Cover Sheet and return it with the surveys and any signed parent
notification letters to your program’s evaluation contact person.

The Survey Cover Sheet helps programs track which groups have completed surveys, making it possible to
report site by site results, and also helping to ensure that baseline and post-program surveys are properly
matched. Please make sure you complete the cover sheet and return it with the surveys.

If you have any questions, please get in touch with your Summer of Service evaluation contact before
administering the surveys.

THANK YOU for your help with this important process.



Sample Participant “Code”
As the first step in the survey process, we want you to create a personal
code. The code will let us keep track of your surveys without knowing your
name. The code has five parts: your initials, your birth date, whether you are
a boy or a girl, and your school and teacher.

A. What are the first initials of your first name, middle name, and last name?

Write your initials, then fill in the matching letter bubbles below.

[Letter bubble grid – example marks: First Name Initial “A,” Middle Name Initial “P,”
Last Name Initial “M”]

B. What is your birth date? Please fill in the circles to indicate
the month, the date, and the year in which you were born.

[Bubble grid – Month (Jan–Dec), Date (1–31), Year (1987–1999, Other);
example marks: Mar, 31, 1994]

C. Are you a boy or a girl?

O Boy (example mark)
O Girl


D. What is the name of your
school/program?

_____________________________________

E. What is the name of your teacher/program
leader?

____________________________________



Instructions:
For each question, make a solid mark that
fills the oval completely.



Summer of Service Evaluation
Baseline Survey Cover Sheet

Before administering survey, please complete this cover sheet and
place it in the survey return envelope.

The information on this page will help us know which groups and participants have
completed Summer of Service participant and comparison group surveys. It will also
help us get you the appropriate materials for the end-of-program survey later in the
summer.

1. Are these Summer of Service Participant Surveys or Comparison Group
Surveys? (Please check one)

O Summer of Service (service-learning) participant surveys
O Comparison group (not service-learning) surveys

2. Your contact information
Program Staff/Adult Leader Name:

Organization: ___________

Community (City/State):

Telephone (where you prefer to be reached):

Email:

3. Survey Information
Program/Group that completed surveys (for example, Joe’s Team)

Date Survey was Administered:

Number of Participants in Group:

Number of Completed Surveys:


Thank you for your help with the Summer of Service evaluation. If you have any
questions about the survey process, please contact your Summer of Service evaluation
coordinator.





Thank you for taking the time to complete this survey.

As part of this summer program, we want to learn more about how young people think about themselves
and their communities. To do that, we are asking young people who are involved in summer service-
learning programs to complete this survey. You will be asked to complete a similar survey later in the
summer.

In asking these questions, we want you to know that the information in this survey is confidential. No one
at your program site will see your answers. The only people who will see your survey are the researchers
who are analyzing the surveys, and they are not allowed to let anyone else know your answers. We do ask
you to create a code that will let us match your surveys together. But, again, the only people to see your
completed surveys will be the researchers analyzing the survey results.

We also want you to know that the survey is voluntary. If there are any individual questions on the survey
that you do not want to answer, you may skip those questions. Also, if you do not want to answer any of the
questions, you can leave the survey blank.

Even though the survey is voluntary, we hope that you will take the time to answer the questions and to be
as honest and thoughtful as you can. The information in the survey will help your summer program and
others like it improve the way they work with young people like you.

Thank you for your help. If you have any questions about the survey, please ask your teacher/program
leader.





Summer of Service Survey
Baseline Survey – Summer of Service Participants
This survey was developed by the
Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).



As the first step in the survey process, we want you to create a personal
code. The code will let us keep track of your surveys without knowing your
name. The code has five parts: your initials, your birth date, whether you are
a boy or a girl, and your school/program, and teacher/program leader.

A. What are the first initials of your first name, middle name, and last name?

Write your initials, then fill in the matching letter bubbles below.

[Letter bubble grid – one row of bubbles each for First Name Initial, Middle Name Initial,
Last Name Initial]

B. What is your birth date? Please fill in the circles to indicate
the month, the date, and the year in which you were born.

[Bubble grid – Month (Jan–Dec), Date (1–31), Year (1987–1999, Other)]

C. Are you a boy or a girl?

O Boy
O Girl


D. What is the name of your
school/program?

__________________________________

E. What is the name of your teacher/program
leader?

__________________________________


Please Leave Blank. For Administrative Use Only:



______ ______ ______ ______ ______ ______ ______
Program Site Number Survey Number
Instructions:
For each question, make a solid mark that
fills the oval completely.


About You and Your Community
The questions below ask about how you think about you and your community. Please answer the questions as
carefully and honestly as you can. There are no right or wrong answers. We just want to know how you think or feel.

For each statement below, please tell us if you Strongly Disagree, Disagree, Not Sure, Agree, or Strongly Agree.
Please be sure to fill in the circle completely.


Strongly
Disagree Disagree



Not Sure Agree
Strongly
Agree
1. I know how to influence the decisions made by my local
government.
¸ ¸ ¸ ¸ ¸
2. Doing something that helps others is important to me. ¸ ¸ ¸ ¸ ¸
3. It is important to me to do the best I can at school. ¸ ¸ ¸ ¸ ¸
4. I feel like I am an important contributor to my community. ¸ ¸ ¸ ¸ ¸
5. I understand how public decisions are made in my
community.
¸ ¸ ¸ ¸ ¸
6. I feel like I can stand up for what I think is right, even if my
friends disagree.
¸ ¸ ¸ ¸ ¸
7. After this summer, I plan to continue volunteering to help out
my community.
¸ ¸ ¸ ¸ ¸
8. Adults in my community value my opinion. ¸ ¸ ¸ ¸ ¸
9. School is a waste of time. ¸ ¸ ¸ ¸ ¸
10. I know who to ask for help to get something done in my
community.
¸ ¸ ¸ ¸ ¸
11. I feel that most adults are supportive of young people’s efforts
to work on school and community problems.
¸ ¸ ¸ ¸ ¸
12. When I grow up, I plan to work with a group to solve a
problem in the community where I live.
¸ ¸ ¸ ¸ ¸
13. My community is important to me. ¸ ¸ ¸ ¸ ¸
14. Finishing high school is important for me. ¸ ¸ ¸ ¸ ¸
15. I understand the different kinds of services my town provides
to people in my community.
¸ ¸ ¸ ¸ ¸
16. I believe that I can make a difference in my community. ¸ ¸ ¸ ¸ ¸
17. I want to help other people, even if it is hard work. ¸ ¸ ¸ ¸ ¸
18. It is my responsibility to help improve the community. ¸ ¸ ¸ ¸ ¸
19. I would like to quit school as soon as possible. ¸ ¸ ¸ ¸ ¸
20. When I am an adult, I want to be able to vote. ¸ ¸ ¸ ¸ ¸
21. Helping other people is something I am personally
responsible for.
¸ ¸ ¸ ¸ ¸
22. I know how to design and do a service project in my
community.
¸ ¸ ¸ ¸ ¸
23. By working with others in the community, I can help make
things better.
¸ ¸ ¸ ¸ ¸
24. When I am an adult, I plan to get information about
candidates and issues before voting in an election.
¸ ¸ ¸ ¸ ¸



25. I know other people who care about the same social issues that I do. O O O O O
26. I am looking forward to going back to school in the Fall. O O O O O
27. I really want to graduate from college. O O O O O

28. As things stand now, how far would you like to go in school? (Please check one response)

O Less than high school graduation
O High school graduation or GED only
O Attend a technical/vocational school
O Attend college
O Attend graduate school (for example, law school or medical school) after college


29. During the past school year, has anyone from your family, school, or from a community program done any of the
following?

Yes    No    Don’t Know
a. Talked to you about going to college? O O O
b. Taken you to visit a college? O O O
c. Talked with you about the kinds of courses you need to take so you can get into college? O O O
d. Explained how you could pay for college? O O O



30. How much do you feel you know about each of the following? For each item please indicate whether you know
Nothing at All, Very Little, Some, A Lot, All I Need.

Nothing at All    Very Little    Some    A Lot    All I Need
a. How to pay for college O O O O O
b. What high school courses you need to take to get into college O O O O O
c. How to apply for college O O O O O
d. Why you should go to college O O O O O

About You
Finally, we would like to ask a few questions about you.

31. How old are you? (Please fill in the circle for your age)
11   12   13   14   15   16   17   18   19   20   Other
O    O    O    O    O    O    O    O    O    O    O


32. What grade were you in this past school year? (Please fill in the circle for your grade last year)
5   6   7   8   9   10   11   12   Other
O   O   O   O   O   O    O    O    O


33. How would you describe your racial or ethnic background? (Please feel free to mark all the answers that apply.)

O Alaskan or Native American O Native Hawaiian or other Pacific Islander
O Asian O White
O Black or African-American O Hispanic/Latino(a)
O Other

34. Did you have any classes in the past year where you did a service project in your community as part of the class?
O Yes, I had one or more classes last year where we did a service project.
O No, I did not have any classes last year where we did a service project.


35. During the last school year, approximately how many hours did you spend each week volunteering/providing
community service (including service performed through your school)?

O 0 hours per week
O Less than 1 hour per week
O 1-3 hours per week
O 4-6 hours per week
O 7 or more hours per week

36. Did you participate in this Summer of Service program last year?
O Yes
O No


You are done! Thank you for helping with our survey!
Please do not fold the survey. Hand it to the appropriate person for collection.

Summer of Service Survey
End of Program Survey – Summer of Service Participants

Thank you for taking the time to complete this survey.

As you may remember, as part of this summer program we are asking participants to complete a set of
surveys so we can learn about how young people think about themselves and their communities. You may
have completed a similar survey earlier in the summer.

In asking these questions, we want you to know that the information in this survey is confidential. No one at
your program site will see your answers. When you are done with the survey you will seal it in an envelope
before you turn it in. The only people who will see your survey are the researchers who are analyzing the
surveys, and they are not allowed to let anyone else know your answers. We do ask you to create a code
that will let us match your surveys together. But, again, the only people to see your completed surveys will
be the researchers analyzing the survey results.

We also want you to know that the survey is voluntary. If there are any individual questions on the survey
that you do not want to answer, you may skip those questions. Also, if you do not want to answer any of the
questions, you can leave the survey blank. Just seal your blank survey in your envelope and turn it in with
the rest of your program or class.

Even though the survey is voluntary, we hope that you will take the time to answer the questions and to be
as honest and thoughtful as you can. The information in the survey will help your summer program and
others like it improve the way they work with young people like you.

Thank you for your help. If you have any questions about the survey, please ask your teacher/program
leader.




This survey was developed by the
Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).


As the first step in the survey process, we want you to create a personal
code. The code will let us keep track of your surveys without knowing your
name. The code has five parts: your initials, your birth date, whether you are
a boy or a girl, your school/program, and your teacher/program leader.

A. What are the first initials of your first name, middle name, and last name?

Write your initials on the lines below, then fill in the bubble for the matching
letter (A–Z) in each row.

First Name Initial:   ___   [A–Z letter bubbles]
Middle Name Initial:  ___   [A–Z letter bubbles]
Last Name Initial:    ___   [A–Z letter bubbles]

B. What is your birth date? Please fill in the circles to indicate
the month, the date, and the year in which you were born.

Month: O Jan   O Feb   O Mar   O Apr   O May   O Jun   O Jul   O Aug   O Sep   O Oct   O Nov   O Dec
Date:  O 1   O 2   O 3   O 4   O 5   O 6   O 7   O 8   O 9   O 10   O 11   O 12   O 13   O 14   O 15   O 16
       O 17   O 18   O 19   O 20   O 21   O 22   O 23   O 24   O 25   O 26   O 27   O 28   O 29   O 30   O 31
Year:  O 1987   O 1988   O 1989   O 1990   O 1991   O 1992   O 1993   O 1994   O 1995   O 1996   O 1997   O 1998   O 1999   O Other

C. Are you a boy or a girl?
O Boy
O Girl


D. What is the name of your
school/program?

__________________________________

E. What is the name of your teacher/program
leader?

__________________________________



Please Leave Blank. For Administrative Use Only:



______ ______ ______ ______ ______ ______ ______
Program Site Number Survey Number
Instructions:
For each question, make a solid mark that
fills the oval completely.

About You and Your Community
The questions below ask about how you think about you and your community. Please answer the questions as
carefully and honestly as you can. There are no right or wrong answers. We just want to know how you think or feel.

For each statement below, please tell us if you Strongly Disagree, Disagree, Uncertain, Agree, or Strongly Agree.
Please be sure to fill in the circle completely.


Strongly Disagree    Disagree    Uncertain    Agree    Strongly Agree

1. I know how to influence the decisions made by my local government. O O O O O
2. Doing something that helps others is important to me. O O O O O
3. It is important to me to do the best I can at school. O O O O O
4. I feel like I am an important contributor to my community. O O O O O
5. I understand how public decisions are made in my community. O O O O O
6. I feel like I can stand up for what I think is right, even if my friends disagree. O O O O O
7. After this summer, I plan to continue volunteering to help out my community. O O O O O
8. Adults in my community value my opinion. O O O O O
9. School is a waste of time. O O O O O
10. I know who to ask for help to get something done in my community. O O O O O
11. I feel that most adults are supportive of young people’s efforts to work on school and community problems. O O O O O
12. When I grow up, I plan to work with a group to solve a problem in the community where I live. O O O O O
13. My community is important to me. O O O O O
14. Finishing high school is important for me. O O O O O
15. I understand the different kinds of services my town provides to people in my community. O O O O O
16. I believe that I can make a difference in my community. O O O O O
17. I want to help other people, even if it is hard work. O O O O O
18. It is my responsibility to help improve the community. O O O O O
19. I would like to quit school as soon as possible. O O O O O
20. When I am an adult, I want to be able to vote. O O O O O
21. Helping other people is something I am personally responsible for. O O O O O
22. I know how to design and do a service project in my community. O O O O O
23. By working with others in the community, I can help make things better. O O O O O



24. When I am an adult, I plan to get information about candidates and issues before voting in an election. O O O O O
25. I know other people who care about the same social issues that I do. O O O O O
26. I am looking forward to going back to school in the Fall. O O O O O
27. I really want to graduate from college. O O O O O


28. As things stand now, how far would you like to go in school? (Please check one response)

O Less than high school graduation
O High school graduation or GED only
O Attend a technical/vocational school
O Attend college
O Attend graduate school (for example, law school or medical school) after college


29. During this summer, has anyone in your Summer of Service program done any of the following?

Yes    No    Don’t Know
a. Talked to you about going to college? O O O
b. Taken you to visit a college? O O O
c. Talked with you about the kinds of courses you need to take so you can get into college? O O O
d. Explained how you could pay for college? O O O


30. How much do you feel you know about each of the following? For each item please indicate whether you know
Nothing at All, Very Little, Some, A Lot, All I Need.

Nothing at All    Very Little    Some    A Lot    All I Need
a. How to pay for college O O O O O
b. What high school courses you need to take to get into college O O O O O
c. How to apply for college O O O O O
d. Why you should go to college O O O O O


Experience in the Community
31. We would also like to learn about your experience with any service-learning projects you were involved in this
summer through this program. For each of the following statements, please tell us if you feel that the statement is
Not True at All, Not Very True, Sort of True, or Very True for you.

Not True at All    Not Very True    Sort of True    Very True
a. I had a say in choosing the problem or issue that I worked on. O O O O
b. I had a chance to discuss or research the problem or issue before I took action. O O O O
c. I met with or worked with people or organizations in the community in order to learn more about the problem. O O O O
d. I learned about the causes and effects of the problem or issue we worked on as part of our project. O O O O
e. I had a chance to compare different solutions to a problem before deciding what kind of action to take. O O O O
f. I felt like the problem I worked on was important. O O O O
g. I felt like I had real responsibilities on my project. O O O O
h. I had a chance to talk or write about my experiences on our project. O O O O
i. My teacher/project leader talked about how our project related to the subjects we study in school. O O O O
j. I completed all the steps on my project that I had planned. O O O O
k. I felt like my project made a difference. O O O O
l. I presented and/or discussed the results of our project with one or more members of the community. O O O O
m. People in my school or community thought the work we did on our project was important. O O O O
n. We tried to find out whether our project made a difference. O O O O
o. I want to continue working on this issue, either on my own or with another class at school. O O O O

32. How would you rate your experience working on your service project (or projects) this summer?

O Poor O Fair O Good O Excellent


33. Would you be interested in participating in Summer of Service or a similar program next year?
O Yes
O No
O Maybe

34. We would also like to know about how well you think you can do some important tasks in your community. For
each of the following questions, please tell us how well you could do each type of task at the beginning of the
summer and now. Could you do it Not at all? A little? Pretty well? Or Very well?
For example, in the sample question below, we ask how well you could ‘Give a friend accurate directions to
the town hall.’ To answer, first fill in one of the circles on the left side of the page to tell us how
well you could give accurate directions at the beginning of the summer. Then fill in one of the
circles on the right side of the page to tell us how well you think you can give accurate directions now.
In the example below, we have filled in the circles indicating that you could give directions ‘a little’ at the
beginning of the summer and ‘pretty well’ now.
At the beginning of the summer        How well could you do each of the following?        Now
Not at All  A Little  Pretty Well  Very Well                              Not at All  A Little  Pretty Well  Very Well

O ● O O   a. Give a friend accurate directions to the town hall? (sample question)   O O ● O
O O O O   b. Identify needs or problems that are important to your community?   O O O O
O O O O   c. Use more than one source to gather information on a school or community problem (for example, newspapers, the Internet, people in government agencies or community organizations, etc.)?   O O O O
O O O O   d. Make phone calls or do interviews to gather information on a community problem?   O O O O
O O O O   e. Decide what is important to think about in choosing a community project?   O O O O
O O O O   f. Set up a timeline and action steps for a community project?   O O O O
O O O O   g. Identify people who need to be involved in a community project?   O O O O
O O O O   h. Manage your time so you can get all of the steps in a project done?   O O O O
O O O O   i. Look at different ways to solve a community problem to find the best solution?   O O O O
O O O O   j. Talk or present to people about a community issue that you care about?   O O O O
O O O O   k. Work on a team with other students to help solve a community problem?   O O O O
O O O O   l. Figure out whether or not a project made a difference?   O O O O





You are done! Thank you for helping with our survey!
Please do not fold the survey. Hand it to the appropriate person for collection.








7.2 Pre/Post Comparison Group Survey Materials


























SAMPLE PARENT NOTIFICATION NOTICE – PASSIVE PERMISSION
(Comparison Group Members)

Dear Parent or Guardian:

As part of a study of summer programs in your community, we are asking young people in several
area summer programs to complete a survey at the beginning and end of their summer experience.

The survey asks program participants about their experience in their summer program and its impact
on a variety of civic and school-related attitudes and skills. The survey was developed by
Innovations in Civic Participation (ICP) and the Center for Youth and Communities at Brandeis
University as part of a national effort to evaluate summer of service and other youth programs.
Copies of the complete Summer of Service survey are available upon request.

We want to assure you that all of the information collected through the survey will be kept
strictly confidential. It will be used only for the purposes of evaluating this summer program and
other similar programs around the country. The surveys do not include participant names (students
use a ‘code’ instead of their name on the survey), and participants are instructed to seal their
completed surveys in envelopes before handing them in to program staff. Participants may also skip
any questions in the survey that they are uncomfortable answering.

We are sending this notice home to inform you of the survey and to give you the opportunity to let us
know if you do not want your child to participate in the evaluation. We believe that the survey
provides valuable information that will help our ongoing efforts to improve summer programs and
provide a quality experience for area children. We want to encourage you, therefore, to allow your
child to participate.

If you DO NOT want your child to participate in this study, please let us know by completing the form
below and returning it to your child’s program staff within the next three days. If you are willing to
have your child participate you do not need to take any further action.

If you have any questions about the study, please feel free to call me at (___) ____________.

Thank you for your cooperation.


-------------------------------------------------------------------------------------------------------------------------


WITHDRAWAL OF PERMISSION
Please Sign and Return to a Member of our Program Staff Within Three Days if You DO NOT want
Your Child to be Included in this Evaluation

I do not want my child, ______________________, to participate in the summer program evaluation
survey. Please take the necessary steps to make sure that my child is not included in the survey process.

Parent(s)/Guardian(s) Signature(s):__________________________________________________

Daytime Telephone: (_______) ______ - _________ Today’s Date: _____________
Area Code Number

SAMPLE PARENT PERMISSION NOTICE – ACTIVE PERMISSION
(Comparison Group Members)

Dear Parent or Guardian:

As part of a study of summer programs in your community, we are asking young people in several
area summer programs to complete a survey at the beginning and end of their summer experience.

The survey asks students about their experience in their summer program and its impact on a
variety of civic and school-related attitudes and skills. The survey was developed by Innovations in
Civic Participation (ICP) and the Center for Youth and Communities at Brandeis University as part of
a national effort to evaluate Summer of Service and other youth programs. Copies of the complete
Summer of Service survey are available upon request.

We want to assure you that all of the information collected through the survey will be kept
strictly confidential. It will be used only for the purposes of evaluating this summer program and
other similar programs around the country. The surveys do not include participant names (students
use a ‘code’ instead of their name on the survey), and participants are instructed to seal their
completed surveys in envelopes before handing them in to program staff. Participants may also skip
any questions in the survey that they are uncomfortable answering.

We are writing to ask your permission to include your child in the survey process. We believe that
the survey provides valuable information that will help our ongoing efforts to improve summer
programs and provide a quality experience for area children. We want to encourage you, therefore,
to allow your child to participate.

If you agree to have your child participate in the survey, please complete the form below and
return it to a member of our program staff within the next three days. If you do not return the
permission form, we will not include your child in the survey.

If you have any questions about the study, please feel free to call me at (___) ____________.

Thank you for your cooperation.


-------------------------------------------------------------------------------------------------------------------------


SURVEY PERMISSION
Please Sign and Return to a Member of our Program Staff Within Three Days if You Give
Permission for Your Child to be Included in this Evaluation

I give permission for my child, ______________________, to participate in the summer program
evaluation survey.

Parent(s)/Guardian(s) Signature(s):__________________________________________________

Daytime Telephone: (_______) ______ - _________ Today’s Date: _____________
Area Code Number



Summer of Service Evaluation
Baseline Survey Instructions – Comparison Group


Thank you for your help administering the Summer of Service comparison group survey. The survey is
part of a national effort to document the impacts of service-learning programs and to learn how to make
them even more effective. For comparison group members, we are describing the survey as a study of
different summer programs in your community (rather than describing it as a “comparison survey”) in
hopes that we can encourage participants to take part in the survey and answer as seriously as possible.

There are a few simple instructions for administering the survey with your summer program participants.
1. Send the enclosed parent permission forms home at least a few days prior to administering
the surveys.
The forms are designed to make parents aware of the study and to allow them to “opt out” if they do
not want their children to participate. Again, the permission forms describe this as a study of different
types of summer programs. If your program is using the “Active” permission forms, parents must
return the form before their children take the survey. If you are using the parent “Notification”
version, only those participants whose parents do not want them to participate need to return signed
parent forms.

2. Please give the survey to your program participants at the beginning of your summer program
activities.
The purpose of the baseline survey is to help us understand participant attitudes and ideas at the
beginning of their summer program experience, so that we can compare them to those attitudes and
ideas at the end of the summer. We will be asking your participants to take a similar survey at the
end of the program to see how their ideas, attitudes and skills have changed.

3. Take a few minutes to introduce the survey to your participants and to encourage them to
answer the questions as honestly as they can.

Please emphasize that the survey is not a test – there are no ‘right’ or ‘wrong’ answers, and the
surveys do not measure whether someone is ‘good’ or ‘bad.’ The questions are designed to help us
understand how participants think and what they are learning through the program.

Also, please read through the cover page of the survey. Please remind participants that the surveys
are confidential: they will use a ‘code’ instead of their names (see below) and will place their
completed surveys in an envelope to assure confidentiality. The only people who will see their
answers are the researchers who compile the survey data.

Finally, please make sure that your participants understand that the survey is voluntary. Participants
can skip individual questions that they do not want to answer. Participants who do not want to
complete the survey at all may simply seal their blank survey in their envelope.

4. Explain how to fill in the bubbles on the surveys.
Point out the instruction box on the upper right corner
of the first page of their surveys (like the box to the
right). Participants should use a pencil or a blue or
black ink pen. They should fill in the bubbles
completely.
Instructions:
For each question, make a solid mark that
fills the oval completely.


5. Make sure your participants complete the “code” at the beginning of the survey.
In order to be able to assess changes in participant attitudes during the summer, we need to be able to
match each participant’s baseline and end-of-program surveys. At the same time, we want to keep the
surveys anonymous (and confidential) so that participants will answer honestly.
To accomplish this, we are asking participants to create a “code” at the beginning of their surveys using their
initials, birth date, and gender. The code will let us match baseline and post surveys without participant
names. Please be sure your students/participants complete this section at the beginning of the survey. An
example of how the “code” section should be completed is provided at the end of these instructions.
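For sites that enter their completed surveys into a spreadsheet or database, the matching described above can be sketched in a few lines of code. This is only an illustration under assumed field names (`initials`, `birth_date`, `gender`); the toolkit does not prescribe any particular software:

```python
# Illustrative sketch only: matching baseline and end-of-program surveys on
# the participant "code" (initials + birth date + gender). The record layout
# and field names here are hypothetical, not part of the toolkit.

def make_code(record):
    """Build the anonymous match key from one survey record."""
    return (record["initials"].upper(), record["birth_date"], record["gender"])

def match_surveys(baseline, post):
    """Pair each baseline record with the post record that shares its code."""
    post_by_code = {make_code(r): r for r in post}
    return [(r, post_by_code[make_code(r)])
            for r in baseline if make_code(r) in post_by_code]

# Example: one participant (initials A.P.M., born 3/31/1994) appears in both waves.
baseline = [{"initials": "apm", "birth_date": "1994-03-31", "gender": "M", "q1": 2}]
post = [{"initials": "APM", "birth_date": "1994-03-31", "gender": "M", "q1": 4}]
print(len(match_surveys(baseline, post)))  # prints 1
```

Because the key is built the same way for both waves, baseline and post records pair up without any names ever entering the data file, which is the point of the code described in this step.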

6. Feel free to read the surveys aloud to your class/group if you think they will have difficulty reading
the survey on their own.
The surveys were designed with the advice of a workgroup of educators and should be at an appropriate
reading level for most middle and high school students. However, we want to encourage any educators with
concerns about the reading level to read the survey aloud for their students/participants.

7. Please remind your participants to complete the demographic information in the survey.
The participant demographic information is important in helping us see if there are differences in the
response to service-learning among different groups of participants – younger and older, male and female,
and participants with different racial or ethnic backgrounds. Consequently, we want to encourage
participants to complete this information in addition to answering the main survey questions.
However, we also recognize that some participants are reluctant to provide this information. If any
participant is uncomfortable or reluctant to provide any of the demographic information, he or she can leave
that information blank.

8. When participants have completed their surveys, please have them seal the surveys in an envelope
to assure the confidentiality of their answers.
Once again, we want to assure participants that their individual answers on the surveys will not be seen by
anyone who works directly with them in the program. Having participants seal their surveys in an envelope
helps to ensure that confidentiality is maintained.

9. Please complete the Survey Cover Sheet and return it with the surveys and any signed parent
notification letters to your program’s evaluation contact person.

The Survey Cover Sheet helps programs track which groups have completed surveys, making it possible to
report site by site results, and also helping to ensure that baseline and post-program surveys are properly
matched. Please make sure you complete the cover sheet and return it with the surveys.


If you have any questions, please get in touch with your Summer of Service evaluation contact before
administering the surveys.

THANK YOU for your help with this important process.



Sample Participant “Code”
As the first step in the survey process, we want you to create a personal
code. The code will let us keep track of your surveys without knowing your
name. The code has five parts: your initials, your birth date, whether you are
a boy or a girl, your school, and your teacher.

A. What are the first initials of your first name, middle name, and last name?

Write your initials on the lines below, then fill in the bubble for the matching
letter (A–Z) in each row.

First Name Initial:   A   [A–Z letter bubbles; the bubble for ‘A’ is filled]
Middle Name Initial:  P   [A–Z letter bubbles; the bubble for ‘P’ is filled]
Last Name Initial:    M   [A–Z letter bubbles; the bubble for ‘M’ is filled]

B. What is your birth date? Please fill in the circles to indicate
the month, the date, and the year in which you were born.

Month: O Jan   O Feb   ● Mar   O Apr   O May   O Jun   O Jul   O Aug   O Sep   O Oct   O Nov   O Dec
Date:  O 1   O 2   O 3   O 4   O 5   O 6   O 7   O 8   O 9   O 10   O 11   O 12   O 13   O 14   O 15   O 16
       O 17   O 18   O 19   O 20   O 21   O 22   O 23   O 24   O 25   O 26   O 27   O 28   O 29   O 30   ● 31
Year:  O 1987   O 1988   O 1989   O 1990   O 1991   O 1992   O 1993   ● 1994   O 1995   O 1996   O 1997   O 1998   O 1999   O Other

C. Are you a boy or a girl?
● Boy
O Girl


D. What is the name of your
school/program?

_____________________________________

E. What is the name of your teacher/program
leader?

____________________________________



Instructions:
For each question, make a solid mark that
fills the oval completely.



Summer of Service Evaluation
Baseline Survey Cover Sheet

Please complete this cover sheet and return it to your Summer of
Service evaluation contact person with the completed participant
surveys from your group or class.

The information on this page will help us know which groups and participants have
completed Summer of Service participant and comparison group surveys. It will also
help us get you the appropriate materials for the end-of-program survey later in the
summer.

1. Are these Summer of Service Participant Surveys or Comparison Group
Surveys? (Please check one)

O Summer of Service (service-learning) participant surveys
O Comparison group (not service-learning) surveys

2. Your contact information
Program Staff/Adult Leader Name:

Organization: ___________

Community (City/State):

Telephone (where you prefer to be reached):

Email:

3. Survey Information
Program/Group that completed surveys (for example, Joe’s Team):

Date Survey was Administered:

Number of Participants in Group:

Number of Completed Surveys:


Thank you for your help with the Summer of Service evaluation. Please return the
surveys from your group to your Summer of Service evaluation contact person as soon
as the surveys have been completed. If you have any questions about the survey
process, please contact your Summer of Service evaluation coordinator.



Summer of Service Survey
Baseline Survey – Summer Programs


Thank you for taking the time to complete this survey.

As part of this summer program, we want to learn more about how young people think about themselves
and their communities. To do that, we are asking young people who are involved in a variety of summer
programs in your community to complete this survey. You will be asked to complete a similar survey later in
the summer.

In asking these questions, we want you to know that the information in this survey is confidential. No one
at your program site will see your answers. When you are done with the survey you will seal it in an
envelope before you turn it in. The only people who will see your survey are the researchers who are
analyzing the surveys, and they are not allowed to let anyone else know your answers. We do ask you to
create a code that will let us match your surveys together. But, again, the only people to see your
completed surveys will be the researchers analyzing the survey results.

We also want you to know that the survey is voluntary. If there are any individual questions on the survey
that you do not want to answer, you may skip those questions. Also, if you do not want to answer any of the
questions, you can leave the survey blank. Just seal your blank survey in your envelope and turn it in with
the rest of your program or class.

Even though the survey is voluntary, we hope that you will take the time to answer the questions and to be
as honest and thoughtful as you can. The information in the survey will help your summer program and
others like it improve the way they work with young people like you.

Thank you for your help. If you have any questions about the survey, please ask your teacher/program
leader.





This survey was developed by the
Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).


As the first step in the survey process, we want you to create a personal
code. The code will let us keep track of your surveys without knowing your
name. The code has five parts: your initials, your birth date, whether you are
a boy or a girl, your school/program, and your teacher/program leader.

A. What are the first initials of your first name, middle name, and last name?

Write your initials, then fill in the matching bubbles below.

First Name Initial:  ___  [A–Z bubble row]
Middle Name Initial: ___  [A–Z bubble row]
Last Name Initial:   ___  [A–Z bubble row]

B. What is your birth date? Please fill in the circles to indicate
the month, the date, and the year in which you were born.

C. Are you a boy or a girl?
[Bubble grid: fill in one circle each for Month (Jan–Dec), Date (1–31), and Year (1987–1999, or Other).]

O Boy
O Girl


D. What is the name of your
school/program?

__________________________________

E. What is the name of your teacher/program
leader?

__________________________________



Please Leave Blank. For Administrative Use Only:



______ ______ ______ ______ ______ ______ ______
Program Site Number Survey Number
Instructions:
For each question, make a solid mark that
fills the oval completely.


About You and Your Community
The questions below ask about how you think about yourself and your community. Please answer the questions as
carefully and honestly as you can. There are no right or wrong answers. We just want to know how you think or feel.

For each statement below, please tell us if you Strongly Disagree, Disagree, Uncertain, Agree, or Strongly Agree.
Please be sure to fill in the circle completely.


Strongly Disagree | Disagree | Uncertain | Agree | Strongly Agree

1. I know how to influence the decisions made by my local government. O O O O O
2. Doing something that helps others is important to me. O O O O O
3. It is important to me to do the best I can at school. O O O O O
4. I feel like I am an important contributor to my community. O O O O O
5. I understand how public decisions are made in my community. O O O O O
6. I feel like I can stand up for what I think is right, even if my friends disagree. O O O O O
7. After this summer, I plan to continue volunteering to help out my community. O O O O O
8. Adults in my community value my opinion. O O O O O
9. School is a waste of time. O O O O O
10. I know who to ask for help to get something done in my community. O O O O O
11. I feel that most adults are supportive of young people’s efforts to work on school and community problems. O O O O O
12. When I grow up, I plan to work with a group to solve a problem in the community where I live. O O O O O
13. My community is important to me. O O O O O
14. Finishing high school is important for me. O O O O O
15. I understand the different kinds of services my town provides to people in my community. O O O O O
16. I believe that I can make a difference in my community. O O O O O
17. I want to help other people, even if it is hard work. O O O O O
18. It is my responsibility to help improve the community. O O O O O
19. I would like to quit school as soon as possible. O O O O O
20. When I am an adult, I want to be able to vote. O O O O O
21. Helping other people is something I am personally responsible for. O O O O O
22. I know how to design and do a service project in my community. O O O O O



Strongly Disagree | Disagree | Uncertain | Agree | Strongly Agree

23. By working with others in the community, I can help make things better. O O O O O
24. When I am an adult, I plan to get information about candidates and issues before voting in an election. O O O O O
25. I know other people who care about the same social issues that I do. O O O O O
26. I am looking forward to going back to school in the Fall. O O O O O
27. I really want to graduate from college. O O O O O

28. As things stand now, how far would you like to go in school? (Please check one response)

O Less than high school graduation
O High school graduation or GED only
O Attend a technical/vocational school
O Attend college
O Attend graduate school (for example, law school or medical school) after college


29. During the past year, has anyone from your family, school, or from a community program done any of the
following?

Yes | No | Don’t Know

a. Talked to you about going to college? O O O
b. Taken you to visit a college? O O O
c. Talked with you about the kinds of courses you need to take so you can get into college? O O O
d. Explained how you could pay for college? O O O

30. How much do you feel you know about each of the following? For each item please indicate whether you know
Nothing at All, Very Little, Some, A Lot, or All I Need.

Nothing at All | Very Little | Some | A Lot | All I Need

a. How to pay for college O O O O O
b. What high school courses you need to take to get into college O O O O O
c. How to apply for college O O O O O
d. Why you should go to college O O O O O
About You
Finally, we would like to ask a few questions about you.

31. How old are you? (Please fill in the circle for your age)

11 12 13 14 15 16 17 18 19 20 Other
O O O O O O O O O O O


32. What grade were you in this past school year? (Please fill in the circle for your grade last year)
5 6 7 8 9 10 11 12 Other
O O O O O O O O O


33. How would you describe your racial or ethnic background? (Please feel free to mark all the answers that apply.)

O Alaskan or Native American O Native Hawaiian or other Pacific Islander
O Asian O White
O Black or African-American O Hispanic/Latino(a)
O Other


34. Did you have any classes in the past year where you did a service project in your community as part of the class?
O Yes, I had one or more classes last year where we did a service project.
O No, I did not have any classes last year where we did a service project.


35. During the last term/semester of school last year, approximately how many hours did you spend each week
volunteering/providing community service (including service performed through your school)?

O 0 hours per week
O Less than 1 hour per week
O 1-3 hours per week
O 4-6 hours per week
O 7 or more hours per week




You are done!! Thank you for helping with our survey!
Do not fold. Please hand to the appropriate person for collection.


Pilot Version – Summer 2010 NOT FOR DISTRIBUTION
Summer of Service Survey
End of Program Survey – Summer Programs

Thank you for taking the time to complete this survey.

As you may remember, as part of this summer program we are asking participants to complete a set of
surveys so we can learn about how young people think about themselves and their communities. You may
have completed a similar survey earlier in the summer.

In asking these questions, we want you to know that the information in this survey is confidential. No one at
your program site will see your answers. When you are done with the survey you will seal it in an envelope
before you turn it in. The only people who will see your survey are the researchers who are analyzing the
surveys, and they are not allowed to let anyone else know your answers. We do ask you to create a code
that will let us match your surveys together. But, again, the only people to see your completed surveys will
be the researchers analyzing the survey results.

We also want you to know that the survey is voluntary. If there are any individual questions on the survey
that you do not want to answer, you may skip those questions. Also, if you do not want to answer any of the
questions, you can leave the survey blank. Just seal your blank survey in your envelope and turn it in with
the rest of your program or class.

Even though the survey is voluntary, we hope that you will take the time to answer the questions and to be
as honest and thoughtful as you can. The information in the survey will help your summer program and
others like it improve the way they work with young people like you.

Thank you for your help. If you have any questions about the survey, please ask your teacher/program
leader.





This survey was developed by the
Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).


As the first step in the survey process, we want you to create a personal
code. The code will let us keep track of your surveys without knowing your
name. The code has five parts: your initials, your birth date, whether you are
a boy or a girl, your school/program, and your teacher/program leader.

A. What are the first initials of your first name, middle name, and last name?

Write your initials, then fill in the matching bubbles below.

First Name Initial:  ___  [A–Z bubble row]
Middle Name Initial: ___  [A–Z bubble row]
Last Name Initial:   ___  [A–Z bubble row]

B. What is your birth date? Please fill in the circles to indicate
the month, the date, and the year in which you were born.

C. Are you a boy or a girl?
[Bubble grid: fill in one circle each for Month (Jan–Dec), Date (1–31), and Year (1987–1999, or Other).]

O Boy
O Girl


D. What is the name of your
school/program?

__________________________________

E. What is the name of your teacher/program
leader?

__________________________________


Please Leave Blank. For Administrative Use Only:



______ ______ ______ ______ ______ ______ ______
Program Site Number Survey Number
Instructions:
For each question, make a solid mark that
fills the oval completely.


About You and Your Community
The questions below ask about how you think about yourself and your community. Please answer the questions as
carefully and honestly as you can. There are no right or wrong answers. We just want to know how you think or feel.
For each statement below, please tell us if you Strongly Disagree, Disagree, Uncertain, Agree, or Strongly Agree.
Please be sure to fill in the circle completely.


Strongly Disagree | Disagree | Uncertain | Agree | Strongly Agree

1. I know how to influence the decisions made by my local government. O O O O O
2. Doing something that helps others is important to me. O O O O O
3. It is important to me to do the best I can at school. O O O O O
4. I feel like I am an important contributor to my community. O O O O O
5. I understand how public decisions are made in my community. O O O O O
6. I feel like I can stand up for what I think is right, even if my friends disagree. O O O O O
7. After this summer, I plan to continue volunteering to help out my community. O O O O O
8. Adults in my community value my opinion. O O O O O
9. School is a waste of time. O O O O O
10. I know who to ask for help to get something done in my community. O O O O O
11. I feel that most adults are supportive of young people’s efforts to work on school and community problems. O O O O O
12. When I grow up, I plan to work with a group to solve a problem in the community where I live. O O O O O
13. My community is important to me. O O O O O
14. Finishing high school is important for me. O O O O O
15. I understand the different kinds of services my town provides to people in my community. O O O O O
16. I believe that I can make a difference in my community. O O O O O
17. I want to help other people, even if it is hard work. O O O O O
18. It is my responsibility to help improve the community. O O O O O
19. I would like to quit school as soon as possible. O O O O O
20. When I am an adult, I want to be able to vote. O O O O O
21. Helping other people is something I am personally responsible for. O O O O O
22. I know how to design and do a service project in my community. O O O O O
23. By working with others in the community, I can help make things better. O O O O O



Strongly Disagree | Disagree | Uncertain | Agree | Strongly Agree

24. When I am an adult, I plan to get information about candidates and issues before voting in an election. O O O O O
25. I know other people who care about the same social issues that I do. O O O O O
26. I am looking forward to going back to school in the Fall. O O O O O
27. I really want to graduate from college. O O O O O

28. As things stand now, how far would you like to go in school? (Please check one response)

O Less than high school graduation
O High school graduation or GED only
O Attend a technical/vocational school
O Attend college
O Attend graduate school (for example, law school or medical school) after college


29. During this summer, has anyone from your family, school, or from a community program done any of the
following?

Yes | No | Don’t Know

a. Talked to you about going to college? O O O
b. Taken you to visit a college? O O O
c. Talked with you about the kinds of courses you need to take so you can get into college? O O O
d. Explained how you could pay for college? O O O


30. How much do you feel you know about each of the following? For each item please indicate whether you know
Nothing at All, Very Little, Some, A Lot, or All I Need.

Nothing at All | Very Little | Some | A Lot | All I Need

a. How to pay for college O O O O O
b. What high school courses you need to take to get into college O O O O O
c. How to apply for college O O O O O
d. Why you should go to college O O O O O


Experience in the Community
We would also like to learn about your experience with any community service projects you were involved in this
summer through this program.

31. During this summer, did you work on any service or volunteer projects that tried to solve a problem in your
community?
O Yes (please answer question 32) O No (Skip to Question 34 on the last page of the survey)

32. For each of the following statements, please tell us if you feel that the statement is Not True at All, Not Very
True, Sort of True, or Very True for you in describing the experience you had with your community service
projects this summer.
Not True at All | Not Very True | Sort of True | Very True

a. I had a say in choosing the problem or issue that I worked on. O O O O
b. I had a chance to discuss or research the problem or issue before I took action. O O O O
c. I met with or worked with people or organizations in the community in order to learn more about the problem. O O O O
d. I learned about the causes and effects of the problem or issue we worked on as part of our project. O O O O
e. I had a chance to compare different solutions to a problem before deciding what kind of action to take. O O O O
f. I felt like the problem I worked on was important. O O O O
g. I felt like I had real responsibilities on my project. O O O O
h. I had a chance to talk or write about my experiences on our project. O O O O
i. My teacher/project leader talked about how our project related to the subjects we study in school. O O O O
j. I completed all the steps on my project that I had planned. O O O O
k. I felt like my project made a difference. O O O O
l. I presented and/or discussed the results of our project with one or more members of the community. O O O O
m. People in my school or community thought the work we did on our project was important. O O O O
n. We tried to find out whether our project made a difference. O O O O
o. I want to continue working on this issue, either on my own or with another class at school. O O O O

33. How would you rate your experience working on your service project (or projects) this summer?

O Poor O Fair O Good O Excellent


34. How would you rate your experience at your summer program as a whole this summer?

O Poor O Fair O Good O Excellent


35. We would also like to know how well you think you can do some important tasks in your community. For
each of the following questions, please tell us how well you could do each type of task at the beginning of the
summer and now. Could you do it Not at All? A Little? Pretty Well? Or Very Well?

For example, in the sample question below, we ask how well you could ‘Give a friend accurate directions to
the town hall.’ To answer, first fill in one of the circles on the left side of the page to tell us how well you
could give the right directions at the beginning of the summer. Then, fill in one of the circles on the right side
of the page to tell us how well you think you can give accurate directions now. In the example below, we
have filled in the circles indicating that you could give directions ‘a little’ at the beginning of the summer and
‘pretty well’ now.
At the beginning of the summer | How well could you do each of the following? | Now
(Circles on each side: Not at All / A Little / Pretty Well / Very Well)

O ● O O   a. Give a friend accurate directions to the town hall? (sample question)   O O ● O
O O O O   b. Identify needs or problems that are important to your community?   O O O O
O O O O   c. Use more than one source to gather information on a school or community problem (for example, newspapers, the Internet, people in government agencies or community organizations, etc.)?   O O O O
O O O O   d. Make phone calls or do interviews to gather information on a community problem?   O O O O
O O O O   e. Decide what is important to think about in choosing a community project?   O O O O
O O O O   f. Set up a timeline and action steps for a community project?   O O O O
O O O O   g. Identify people who need to be involved in a community project?   O O O O
O O O O   h. Manage your time so you can get all of the steps in a project done?   O O O O
O O O O   i. Look at different ways to solve a community problem to find the best solution?   O O O O
O O O O   j. Talk or present to people about a community issue that you care about?   O O O O
O O O O   k. Work on a team with other students to help solve a community problem?   O O O O
O O O O   l. Figure out whether or not a project made a difference?   O O O O


You are done!! Thank you for helping with our survey!
Please do not fold survey. Hand to appropriate person for collection.








7.3 Post Only Participant Survey Materials


























SAMPLE PARENT NOTIFICATION NOTICE – PASSIVE PERMISSION
(Service-Learning Participants)

Dear Parent or Guardian:

As part of our efforts to ensure that the [insert your program name] Summer of Service program is
providing participating youth with a quality summer experience, we are asking the young people in
our programs to complete a survey at the beginning and end of their summer experience.

The survey asks program participants about their experience in the summer of service program and
its impact on a variety of civic and school-related attitudes and skills. The survey was developed by
Innovations in Civic Participation (ICP) and the Center for Youth and Communities at Brandeis
University as part of a national effort to evaluate Summer of Service programs. Copies of the
complete Summer of Service survey are available upon request.

We want to assure you that all of the information collected through the survey will be kept
strictly confidential. It will be used only for the purposes of evaluating our SOS program and other
similar programs around the country. The surveys do not include participant names (students use a
‘code’ instead of their name on the survey), and participants are instructed to seal their completed
surveys in envelopes before handing them in to program staff. Participants may also skip any
questions in the survey that they are uncomfortable answering.

We are sending this notice home to inform you of the survey and to give you the opportunity to let us
know if you do not want your child to participate in the evaluation. We believe that the survey
provides valuable information that will help our ongoing efforts to improve our program and provide a
quality experience for area children. We want to encourage you, therefore, to allow your child to
participate.

If you DO NOT want your child to participate in this study, please let us know by completing the form
below and returning it to your child’s program staff within the next three days. If you are willing to
have your child participate you do not need to take any further action.

If you have any questions about the study, please feel free to call me at (___) ____________.

Thank you for your cooperation.


-------------------------------------------------------------------------------------------------------------------------


WITHDRAWAL OF PERMISSION
Please Sign and Return to a Member of our Program Staff Within Three Days if You DO NOT want
Your Child to be Included in this Evaluation

I do not want my child, ______________________, to participate in the Summer of Service evaluation
survey. Please take the necessary steps to make sure that my child is not included in the survey process.

Parent(s)/Guardian(s) Signature(s):__________________________________________________

Daytime Telephone: (_______) ______ - _________ Today’s Date: _____________
Area Code Number

SAMPLE PARENT PERMISSION NOTICE – ACTIVE PERMISSION
(Service-Learning Participants)

Dear Parent or Guardian:

As part of our efforts to ensure that the [insert your program name] Summer of Service program is
providing participating youth with a quality summer experience, we are asking the young people in
our programs to complete a survey at the beginning and end of their summer experience.

The survey asks students about their experience in the summer of service program and its impact on
a variety of civic and school-related attitudes and skills. The survey was developed by Innovations
in Civic Participation (ICP) and the Center for Youth and Communities at Brandeis University as part
of a national effort to evaluate Summer of Service programs. Copies of the complete Summer of
Service survey are available upon request.

We want to assure you that all of the information collected through the survey will be kept
strictly confidential. It will be used only for the purposes of evaluating our SOS program and other
similar programs around the country. The surveys do not include participant names (students use a
‘code’ instead of their name on the survey), and participants are instructed to seal their completed
surveys in envelopes before handing them in to program staff. Participants may also skip any
questions in the survey that they are uncomfortable answering.

We are writing to ask your permission to include your child in the survey process. We believe that
the survey provides valuable information that will help our ongoing efforts to improve our program
and provide a quality experience for area children. We want to encourage you, therefore, to allow
your child to participate.

If you agree to have your child participate in the survey, please complete the form below and
return it to a member of our program staff within the next three days. If you do not return the
permission form, we will not include your child in the survey.

If you have any questions about the study, please feel free to call me at (___) ____________.

Thank you for your cooperation.


-------------------------------------------------------------------------------------------------------------------------


SURVEY PERMISSION
Please Sign and Return to a Member of our Program Staff Within Three Days if You Give
Permission for Your Child to be Included in this Evaluation

I give permission for my child, ______________________, to participate in the Summer of Service
evaluation survey.

Parent(s)/Guardian(s) Signature(s):__________________________________________________

Daytime Telephone: (_______) ______ - _________ Today’s Date: _____________
Area Code Number



Summer of Service Evaluation
Post-Only Survey Instructions


Thank you for your help administering the Summer of Service participant survey. The survey is part of the
ongoing effort by this program and others to document the impacts of service-learning programs and to learn how
to make them even more effective.
There are a few simple instructions for administering the survey with your service-learning participants.

1. Send the enclosed parent permission forms home at least a few days prior to administering the
surveys.
The forms are designed to make parents aware of the study and to allow them to “opt out” if they do not want
their children to participate. If your program is using the “Active” permission forms, parents must return the
form before their children take the survey. If you are using the parent “Notification” version, only those
participants whose parents do not want them to participate need to return signed parent forms.

2. Please give the survey to your Summer of Service program participants at the end of your service-
learning program activities, but not as the final program activity.

The purpose of this survey is to help us see how participant attitudes, ideas and skills have changed since the
beginning of the program. As such, the survey should be completed toward the end of service-learning
activities in order to capture as much of the program experience as possible. However, we also want to make
sure that participants have time to complete the survey thoughtfully and that it is presented in a positive
context and environment. As a result, we want to encourage educators not to make the survey the last
activity of the program and to provide adequate time and encouragement for participants to respond.

3. Take a few minutes to introduce the survey to your participants and encourage them to answer the
questions as honestly as they can.

Please emphasize that the survey is not a test – there are no ‘right’ or ‘wrong’ answers, and the surveys do
not measure whether someone is ‘good’ or ‘bad.’ The questions are designed to help us understand how
participants think and what they are learning through the program.

Also, please read through the cover page of the survey. Please remind participants that the surveys are
confidential: we do not ask for their names and they will place their completed surveys in an envelope to
assure confidentiality. The only people who will see their answers are the researchers who compile the
survey data.

Finally, please make sure that your participants understand that the survey is voluntary. Participants can skip
individual questions that they do not want to answer. Participants who do not want to complete the survey at
all may simply seal their blank survey in their envelope.

4. Feel free to read the surveys aloud to your class/group if you think they will have difficulty reading the
survey on their own.

The surveys were designed with the advice of a workgroup of educators and should be at an appropriate
reading level for most middle and high school students. However, we want to encourage any educators with
concerns about the reading level to read the survey aloud for their students/participants.

In particular, you may want to walk participants through the example at the beginning of the skills
section of the survey (last page). In that section, participants are asked to assess their skills as of the
beginning of their service-learning program this year and at the end of the program by marking two sets of
responses – one on the left side of the page and one on the right. While we have tried to make this format as
clear as possible, some participants may need some additional explanation using the first question as an
example. Please feel free to help participants if they have questions about where or how to answer these
questions.

5. When participants have completed their surveys, please have them seal the surveys in an envelope to
assure the confidentiality of their answers.

Once again, we want to assure participants that their individual answers on the surveys will not be seen by
anyone who works directly with them in the program. Having participants seal their surveys in an envelope helps
to ensure that confidentiality is maintained.

6. After the surveys are completed, you may want to have a group discussion about how surveys are
used to collect information.

Ideally, an end-of-program survey can serve as an opportunity to reflect on the service-learning program
experience. You may want to ask participants what they have accomplished, what they have learned, and
what they might want to do to continue the service-learning process.

7. Please complete the Survey Cover Sheet and return it with the surveys to your program’s evaluation
contact person.

We want to emphasize that the Survey Cover Sheet is a critical part of the survey process. The cover sheet
lets us make sure that we know which surveys came from which sites, and the program information on the
cover sheet will allow us to look at how different types of programs and program experiences are related to
the outcomes for participants. We want to encourage educators to take the extra 2-3 minutes needed to fill
out the cover sheet and include it with your surveys.


If you have any questions, please get in touch with your Summer of Service evaluation contact before
administering the surveys.

As always, THANK YOU for your help with this important process.



Summer of Service Survey
End of Program Survey – Summer of Service Participants



Thank you for taking the time to complete this survey.

As part of the Summer of Service program we are asking you to complete a survey that will help us learn
about your program experience and what you have learned from the service projects that you took part in.

In asking these questions, we want you to know that the information in this survey is confidential. No one at
your program site will see your answers. When you are done with the survey you will seal it in an envelope
before you turn it in. The only people who will see your survey are the researchers who are analyzing the
surveys, and they are not allowed to let anyone else know your answers.

We also want you to know that the survey is voluntary. If there are any individual questions on the survey
that you do not want to answer, you may skip those questions. Also, if you do not want to answer any of the
questions, you can leave the survey blank. Just seal your blank survey in your envelope and turn it in with
the rest of your program or class.

Even though the survey is voluntary, we hope that you will take the time to answer the questions and to be
as honest and thoughtful as you can. The information in the survey will help your summer program and
others like it improve the way they work with young people like you.

Thank you for your help. If you have any questions about the survey, please ask your teacher/program
leader.



This survey was developed by the
Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).

Experience in the Community
1. We would also like to learn about your experience with the service projects you were involved in this summer
through this program. For each of the following statements, please tell us if you feel that the statement is Not True
at All, Not Very True, Sort of True, or Very True for you.

Not True at All    Not Very True    Sort of True    Very True
a. I had a say in choosing the problem or issue that I worked on.
b. I had a chance to discuss or research the problem or issue before I took
action.

c. I met with or worked with people or organizations in the community in
order to learn more about the problem.

d. I learned about the causes and effects of the problem or issue we worked
on as part of our project.

e. I had a chance to compare different solutions to a problem before deciding
what kind of action to take.

f. I felt like the problem I worked on was important.
g. I felt like I had real responsibilities on my project.
h. I had a chance to talk or write about my experiences on our project.
i. My teacher/project leader talked about how our project related to the
subjects we study in school.

j. I completed all the steps on my project that I had planned.
k. I felt like my project made a difference.
l. I presented and/or discussed the results of our project with one or more
members of the community.

m. People in my school or community thought the work we did on our project
was important.

n. We tried to find out whether our project made a difference.
o. I want to continue working on this issue, either on my own or with another
class at school.



2. How would you rate your experience working on your service project (or projects) this summer?

O Excellent O Good O Fair O Poor



3. Would you be interested in participating in Summer of Service or a similar program next year?
O Yes
O No
O Maybe

About You and Your Community
We would also like to know what you learned as a result of your summer of service experience, or if your summer
experience changed the way you think about yourself, your school or the community. Please answer the questions
below as carefully and honestly as you can. There are no right or wrong answers. We just want to know how you
think or feel.

For each statement below, please tell us if you Strongly Disagree, Disagree, Not Sure, Agree, or Strongly Agree.
Please be sure to fill in the circle completely.
As a result of my Summer of Service experience:

Strongly Disagree    Disagree    Not Sure    Agree    Strongly Agree
4. I learned how to influence the decisions made by my local
government.

5. I learned that doing something that helps others is important
to me.

6. It is important to me to do the best I can at school.
7. I feel like I am an important contributor to my community.
8. I understand how public decisions are made in my
community.

9. I now feel like I can stand up for what I think is right, even if
my friends disagree.

10. After this summer, I plan to continue volunteering to help out
my community.

11. I learned that adults in my community value my opinion.
12. I feel like school is a waste of time.
13. I now know who to ask for help to get something done in my
community.

14. I now feel that most adults are supportive of young people’s
efforts to work on school and community problems.

15. When I grow up, I plan to work with a group to solve a
problem in the community where I live.

16. My community is important to me.
17. Finishing high school is important for me.
18. I have a better understanding of the different kinds of services
my town provides to people in my community.

19. I believe that I can make a difference in my community.
20. I want to help other people, even if it is hard work.
21. I learned that it is my responsibility to help improve the
community.

22. I would like to quit school as soon as possible.
23. When I am an adult, I want to be able to vote.
24. I learned that helping other people is something I am
personally responsible for.

25. I learned how to design and do a service project in my
community.


As a result of my Summer of Service experience:

Strongly Disagree    Disagree    Not Sure    Agree    Strongly Agree
26. By working with others in the community, I can help make
things better.

27. When I am an adult, I plan to get information about
candidates and issues before voting in an election.

28. I now know other people who care about the same social
issues that I do.

29. I am looking forward to going back to school in the Fall.
30. I really want to graduate from college.

31. As a result of your summer service experience, are you more or less likely to do the following? For each item, please
mark whether it is Much Less Likely, A Little Less Likely, No Change, A Little More Likely, or Much More Likely.

Much Less Likely    A Little Less Likely    No Change    A Little More Likely    Much More Likely
a. Leave high school without graduating
b. Graduate high school or get a GED, but not go to
college

c. Attend a technical/vocational school
d. Attend college
e. Attend graduate school (for example, law school or
medical school)


32. During this summer, did anyone from your Summer of Service program do any of the following?

Yes    No    Don't Know
a. Talk to you about going to college?

b. Take you to visit a college?
c. Talk with you about the kinds of courses you need to take so you can
get into college?

d. Explain how you could pay for college?

33. As a result of your summer service experience, do you feel like you know more or less about the following aspects
of college planning? For each item, please indicate whether you know Much Less, A Little Less, No Change, A
Little More, or Much More.

Much Less    A Little Less    No Change    A Little More    Much More
a. How to pay for college

b. What high school courses you need to take to get into college
c. How to apply for college
d. Why you should go to college

34. We would also like to know about how well you think you can do some important tasks in your community. For
each of the following questions, please tell us how well you could do each type of task at the beginning of the
summer and now. Could you do it Not at all? A little? Pretty well? Or Very well?
For example, in the sample question below, we ask how well you could 'Give a friend accurate directions to
the town hall.' To answer, first fill in one of the circles on the left side of the page to tell us how well you
could give the right directions at the beginning of the summer. Then fill in one of the circles on the right
side of the page to tell us how well you think you can give accurate directions now. In the example below,
we have filled in the circles indicating that you could give directions 'a little' at the beginning of the summer
and 'pretty well' now.

How well could you do each of the following?

At the beginning of the summer:    Not at All    A Little    Pretty Well    Very Well
Now:                               Not at All    A Little    Pretty Well    Very Well

a. Give a friend accurate directions to the town hall?
(sample question)

b. Identify needs or problems that are important to
your community?


c. Use more than one source to gather information
on a school or community problem (for example,
newspapers, the Internet, people in government
agencies or community organizations, etc.)?


d. Make phone calls or do interviews to gather
information on a community problem?


e. Decide what is important to think about in
choosing a community project?


f. Set up a timeline and action steps for a community
project?


g. Identify people who need to be involved in a
community project?


h. Manage your time so you can get all of the steps
in a project done?


i. Look at different ways to solve a community
problem to find the best solution?


j. Talk or present to people about a community issue
that you care about?


k. Work on a team with other students to help solve a
community problem?


l. Figure out whether or not a project made a
difference?

About You
We would also like to ask a few questions about you.

35. How old are you? (Please fill in the circle for your age)
11 12 13 14 15 16 17 18 19 20 Other
O O O O O O O O O O O




36. What grade were you in this past school year? (Please fill in the circle for your grade last year)
5 6 7 8 9 10 11 12 Other
O O O O O O O O O


37. Are you a boy or a girl?

O Boy O Girl


38. How would you describe your racial or ethnic background? (Please feel free to mark all the answers that apply.)

O Alaskan or Native American O Native Hawaiian or other Pacific Islander
O Asian O White
O Black or African-American O Hispanic/Latino(a)
O Other


39. Did you have any classes in the past year where you did a service project in your community as part of the class?
O Yes, I had one or more classes last year where we did a service project.
O No, I did not have any classes last year where we did a service project.


40. During the last term/semester of school last year, approximately how many hours did you spend each week
volunteering/providing community service (including service performed through your school)?

O 0 hours per week
O Less than 1 hour per week
O 1-3 hours per week
O 4-6 hours per week
O 7 or more hours per week








You are done!! Thank you for helping with our survey!
Do not fold. Please hand to the appropriate person for collection.








7.4 Community Partner Survey


























Summer of Service Community Partner Survey

Thank you for taking the time to respond to the Summer of Service Community Partner Survey. The
information you provide will help to document and strengthen our Summer of Service programs.

Please note that the survey is voluntary, but we hope that you will take a few minutes to respond. The
survey is also confidential. Your responses will be seen only by the researchers analyzing the survey
results. No one at the programs you work with will see individual survey responses, and results will
be reported only as part of a larger group of responses.


A. ABOUT YOUR ORGANIZATION

1. Please tell us about your organization
Organization Name: _______________________________
City/Town: ______________________________________
State: ___

2. What program did you work with this summer? (if you worked with more than one program, please
complete a survey for each)
Program Name/Organization: _________________________________
City/Town: ____________________________________
State: ____

3. Have you or your organization ever been involved in a service-learning project before this summer
(with this organization or any other)?
O Yes
O No
O Don't Know

4. Which of the following best describes your organization? (please check one)
O Government agency (local, state or federal)
O Non-profit organization (Social Service Agency, Health Care, Community Development
Organization, etc.)
O Educational Institution (K-12 or higher education)
O Public Interest/Advocacy Organization (environmental organizations, voters group, etc.)
O Other (please describe) ___________________________________


B. ABOUT YOUR SERVICE-LEARNING PROJECT

5. Please briefly describe the service-learning project that you were involved in this year.
___________________________________________________________________
___________________________________________________________________
___________________________________________________________________


6. Which of the following best describes the topic or focus of the service-learning project? (please check
all that apply)
O Education (for example, tutoring, literacy, after-school, etc.)
O Health/Nutrition (hunger, elder care, prevention/awareness programs, etc.)
O Environmental (recycling, environmental awareness, conservation, community clean-up, etc.)
O Homeland Security (disaster preparation/relief, etc.)
O Public Safety (community watch, mediation, fire prevention, bike safety, etc.)
O Housing (housing rehabilitation, tenant assistance, etc.)
O Human Needs-General (adult day care, mentoring, culture/arts, clothing distribution, etc.)
O Community/Economic Development (community organizing, food production/distribution,
job/business development, etc.)
O Other (please describe) ___________________________________

7. How many young people worked with your organization?
Estimated number _____

8. What role did you play in working with the young people in the program? (Please check all that apply)
O Provided information about a community need or issue by phone/mail.
O Presented information about a community need or issue to program participants at a meeting
or workshop.
O Worked directly with one or more young people to provide information.
O Helped one or more youth/groups of youth design a project.
O Worked directly with program participants/youth on project implementation/project activities.
O Supervised one or more young people working with/providing services at our organization.
O Helped to evaluate summer of service projects and/or provided feedback on the service
provided by program participants.

9. How would you assess your experience working with the summer of service program on this service
project? For each of the following statements, please indicate if you Strongly Agree, Agree,
Disagree, Strongly Disagree, or Don’t Know.

Strongly Agree    Agree    Disagree    Strongly Disagree    Don't Know
Program staff clearly explained the goals
and purpose of the service-learning project.
O O O O O
I had a clear understanding of what was
expected of me/our organization.
O O O O O
The summer program staff had a good
understanding of my organization and the
resources we could provide.
O O O O O
The young people were well-prepared to
work with our organization on the project.
O O O O O
The program staff kept in touch with me to
make sure that the project stayed on track.
O O O O O
I developed a good relationship with the
program staff involved in the project.
O O O O O

I developed a good relationship with the
young people involved in the service-
learning project.
O O O O O
10. Overall, how would you assess the quality of the work done by students? Please use a scale from 1 to 5,
where 1 is unacceptable and 5 is the best possible quality.

O 1 (Unacceptable)
O 2
O 3
O 4
O 5 (Best)


C. ASSESSING PROJECT IMPACTS

11. Based on your knowledge and experience, how would you assess the impacts of this summer’s
project on the community? For each of the following, please indicate if you Strongly Agree, Agree,
Disagree, Strongly Disagree, or Don’t Know.

Strongly Agree    Agree    Disagree    Strongly Disagree    Don't Know
The service project was designed to
address an important local need or problem
in our community.
O O O O O
The service-learning project succeeded in
having an impact on the problem it was
designed to address (even if it did not
entirely solve the problem).
O O O O O
The service-learning project improved
community attitudes towards youth as
contributors to the community.
O O O O O


12. From your perspective, has your organization’s involvement in the summer of service project had any
of the following impacts on your organization? (Please check all that apply)

Yes No Don’t Know
Improved the quality of services/activities provided by your
organization.
O O O
Increased your organization’s capacity to take on new
projects/ expand services to new communities.
O O O
Increased visibility of your organization in the community. O O O
Increased your organization’s willingness to engage young
people in your work in the future.
O O O
Established a new relationship or expanded an existing
relationship between your organization and the organization
running the summer of service program.
O O O


13. Did you encounter any major barriers in working on the service-learning project with the summer of
service program?
O Yes

O No

14. If YES, please briefly describe. How were the problems/barriers resolved?
______________________________________________________________
______________________________________________________________
______________________________________________________________


15. Given the chance, would you want to work on summer of service projects in the future?
O Yes
O No
O Don’t Know


16. Do you have any other comments on your experience with the summer of service program this year?
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________


Thank you for taking the time to complete the survey.



This survey was developed by the
Center for Youth and Communities, Brandeis University, for Innovations in Civic Participation (May, 2010).








7.5 Semi-structured Interview Questions

























A. Youth Pre-Service Semi-Structured Interview Questions

These questions are designed to gather detailed, qualitative data related to participant growth in five of
the six SOS outcome areas: understanding context, civic agency, initiative and student action, efficacy,
and academic outcomes. Youth responses to each of the questions below should be probed for
additional information. The interview process takes about 4-5 minutes per youth participant. The
interviews should be conducted one-on-one in a somewhat private setting that is relatively free of
background noise. Please test your audio recording device before taping interviews to be sure that
sound levels are adequate.

1) Can you please say your name and what your grade will be in the fall?

2) Why did you sign up for SOS this summer? (Why are you here? If you're at a project site, then ask:
Why are you doing this project? Probe for understanding of context: Why is this project, e.g. picking up
trash at the beach, important?)

3) If you weren’t here participating in this program, what else would you be doing this summer?

4) Did you sign up yourself, or did someone else sign you up?

5) What do you hope to accomplish this summer? (What do you hope to learn?)

6) Have you participated in Summer of Service before, or is this your first time?

7) Have you done any other volunteering in the past? Ask for details about the project(s) and the
youth's role in volunteering. (If yes: Do you think what you did made a difference? In what way? What
did you learn from the project(s)?)

8) How do you define what it means to be a “leader”?

9) Do you think young people can be leaders in your community? (If no, why not? If yes, in what ways?
Are you a leader? How?)

10) Where do you see yourself in 10 years? (Probe for career, educational, and civic goals. If college
comes up, ask: Have you thought about where you might go to college? Ask if people have spoken with
them about what they need to do to get to college. If so: What have you been told about what you need
to do to make sure you get in?)







B. Youth Post Semi-Structured Interview Questions

See guidelines above in pre-service questions. The post interview process takes about 5-7 minutes per
youth participant.

1) Can you please say your name and what your grade will be in the fall?

2) Can you describe for me one (or more) of the service projects you were involved with this summer?
What were your role and responsibilities? Why did you get involved with ______? (Probe for understanding
of context: Why were these projects, e.g. picking up trash at the beach, important?)

3) Do you feel you made a difference this summer? In what ways?

4) Is there anything you felt you learned this summer by participating in this program?

5) Is there anything you learned that you think will help you with your school work this fall? OR Do you
think learning _______ will help you with your school work this fall?

6) Was the SOS experience similar to or different from volunteering you've done in the past? How so?

7) In your own words, how do you define what it means to be a "leader"?

8) Did you have any opportunities to be a leader this summer? (If yes, ask to describe leadership
experience.)

9) Would you like to participate in SOS next summer?

10) Do you think you will continue volunteering in the community during the school year? (Why or why
not? If yes, ask to describe the projects they plan to participate in.)

11) Would you recommend joining SOS to a friend?

12) What would you say if you were going to recruit somebody?

13) Where do you see yourself in 10 years? (Probe for career, educational and civic goals. What kind of
jobs are you interested in? Are you planning on going to college? Do you think that you will still be active
in the community as an adult?)

14) Has anybody in the program talked to you about the steps you need to take to get to college? If so,
what did they say?

15) Anything you recommend we change for next year?




C. Youth Post ONLY Semi-Structured Interview Questions

To be used if it is not possible to interview youth prior to or at the beginning of the service experience.

1) Can you please say your name and what your grade will be in the fall?

2) Why did you join SOS this summer?

3) Whose idea was it to sign up? (Or, who signed you up?)

4) If you weren’t here in this SOS program, what would you be doing with your time this summer?

5) Can you describe for me one (or more) of the service projects you were involved with this summer?
What were your role and responsibilities? Why did you get involved with ______? (Probe for understanding
of context: Why were these projects, e.g. picking up trash at the beach, important?)

6) Do you feel you made a difference this summer? In what ways?

7) Is there anything you felt you learned this summer by participating in this program?

8) Is there anything you learned that you think will help you with your school work this fall? OR Do you
think learning _______ will help you with your school work this fall?

9) Have you participated in Summer of Service before, or is this your first time?

10) Have you done any other volunteering in the past? Ask for details about the project(s) and the
youth's role in volunteering. (If yes: Do you think what you did made a difference? In what way? What
did you learn from the project(s)?)

11) Was the SOS experience similar to or different from volunteering you've done in the past? How so?

12) In your own words, how do you define what it means to be a "leader"?

13) Did you have any opportunities to be a leader this summer? (If yes, ask to describe leadership
experience.)

14) Would you like to participate in SOS next summer?

15) Do you think you will continue volunteering in the community during the school year? (Why or why
not? If yes, ask to describe the projects they plan to participate in.)

16) Would you recommend joining SOS to a friend?

17) What would you say if you were going to recruit somebody?


18) Where do you see yourself in 10 years? (Probe for career, educational and civic goals. What kind of
jobs are you interested in? Are you planning on going to college? Do you think that you will still be active
in the community as an adult?)
19) Has anybody in the program talked to you about the steps you need to take to get to college? If so,
what did they say?

20) Anything you recommend we change for next year?


D. Staff Pre-Service Semi-Structured Interview Questions


1) Please say your name and your position with the SOS program. Can you tell me about your
background – in terms of education and experience – that prepared you for this position?

1b) What is your role in the program? How do you accomplish these tasks? What is the staff-to-youth
ratio?

= = =
I’m going to ask you a series of questions about different programming components of SOS. If your
program included any of these components, then it would be helpful if you could give me at least one
concrete example.

2) To what extent did the adults in the program model any particular skills or behaviors for the youth?

3) Were there any opportunities to mentor the youth? Please describe.

4) Describe any teaching or training that helped to educate the youth regarding background knowledge
or context.

5) What about opportunities for the youth to enhance their team-building or collaboration skills?

6) Any specific aspects of the program geared towards leadership development / taking initiative?

7) Describe one or more examples from this summer of the youth being academically engaged (e.g.,
critical thinking skills, literacy, content-area knowledge in social studies, science, or math). To what extent
did this academic engagement align with curricular standards?




















E. Staff Post-Service Semi-Structured Interview Questions

1) Please say your name and your position with the SOS program.


= = =
I’m going to ask you a series of questions about different programming components of SOS. If your
program included any of these components, then it would be helpful if you could give me at least one
concrete example.

2) To what extent did the adults in the program model any particular skills or behaviors for the youth?

3) Were there any opportunities to mentor the youth? Please describe.

4) Describe any teaching or training that helped to educate the youth regarding background knowledge
or context.

5) What about opportunities for the youth to enhance their team-building or collaboration skills?

6) Any specific aspects of the program geared towards leadership development / taking initiative?

7) Describe one or more examples from this summer of the youth being academically engaged (e.g.,
critical thinking skills, literacy, content-area knowledge in social studies, science, or math). To what extent
did this academic engagement align with curricular standards?

8) How was Summer of Service similar or different from teaching in the classroom/other youth
programs? Is there anything unique or special about doing this program in the summer?

9) Did you learn anything new about teaching and/or service-learning that you will incorporate into
future programming or classroom-based curriculum?

10) Any changes that you would recommend for next year?















F. Staff Post ONLY Semi-Structured Interview Questions

To be used if it is not possible to interview staff prior to or at the beginning of the service experience.


1) Please say your name and your position with the SOS program. Can you tell me about your
background – in terms of education and experience – that prepared you for this position?


= = =
I’m going to ask you a series of questions about different programming components of SOS. If your
program included any of these components, then it would be helpful if you could give me at least one
concrete example.

2) To what extent did the adults in the program model any particular skills or behaviors for the youth?

3) Were there any opportunities to mentor (provide social-emotional support on a one-on-one level) the
youth?

4) Describe any teaching or training that helped to educate the youth regarding background knowledge
or context.

5) What about opportunities for the youth to enhance their team-building or collaboration skills?

6) Any specific aspects of the program geared towards leadership development / taking initiative?

7) Describe one or more examples from this summer of the youth being academically engaged (e.g.,
critical thinking skills, literacy, content-area knowledge in social studies, science, or math). To what extent
did this academic engagement align with curricular standards? What about opportunities to share
information on college access and college preparation?

8) How was Summer of Service similar or different from teaching in the classroom/other youth
programs? Is there anything unique or special about doing this program in the summer? Anything
unique or special about working with middle schoolers?

9) Did you learn anything new about teaching and/or service-learning that you will incorporate into
future programming or classroom-based curriculum?

10) Any changes that you would recommend for next year?


Acknowledgements
This toolkit is the product of a team effort spearheaded by Susan Stroud, Executive Director, and Jean
Manney, Director of Strategic Development, at Innovations in Civic Participation. We are also very
thankful for the editing and technical support provided by Reuven Dashevsky, Program and
Development Associate at ICP, and Marlin Payne, ICP’s 2010 Summer of Service Fellow. Funding
for development of the toolkit was provided by Lumina Foundation for Education. We are also grateful
for the input and feedback provided by the National Summer of Service Evaluation Workgroup members
Nicole Tysvaer, SOS research fellow with Innovations in Civic Participation and doctoral student at the
University of Michigan School of Education; Sue Root, National Youth Leadership Council; Lisa Bardwell,
Earth Force; Jon Schmidt, Service Learning Initiative, Chicago Public Schools; Ernest Morrell, UCLA IDEA
program; Adraine McKell, Volunteer Manatee, ManaTEENs program; Sadie Miller, Brandeis University;
Jos Truitt, 2009 Summer of Service Fellow. Finally, it is important to note that the Toolkit draws heavily
from and builds on earlier evaluation toolkits developed by staff at the Center for Youth and
Communities, most notably, the “Making Knowledge Productive” toolkit developed by Thomas Pineros
Shields, Lawrence Neil Bailis, and Alan Melchior for the Massachusetts Department of Education. That
earlier effort provided a terrific starting point for the current document.

About the Author

Alan Melchior is the Deputy Director and a Senior Fellow at the Center for Youth and Communities. He
brings more than twenty years of experience in managing a wide variety of policy,
evaluation, and technical assistance and training initiatives in the fields of youth, education, and
workforce development. Much of his work has focused on the evaluation of service-learning and
community service-related initiatives. In the past decade, he has led a number of major service-related
evaluation projects. These include the national evaluations of the Serve-America and the Learn and Serve
America School and Community-Based Programs for the Corporation for National and Community
Service (CNCS); studies of the institutionalization of service-learning for the CNCS and the W.K. Kellogg
Foundation; and evaluations of several national and regional service-learning programs. Other projects
have included evaluations of college access programs, school-to-career initiatives, and comprehensive
community partnerships, as well as studies of the uses of technology in youth and community-based
programs. Overall, Mr. Melchior brings a strong commitment to the use of research and evaluation as a
means of strengthening programs and services for young people.

About the Center for Youth and Communities
Since its inception in 1983, the Center for Youth and Communities (CYC) has established a reputation
as one of the nation's leading research, professional development, and policy organizations in
youth and community development. CYC is part of the Heller School for Social Policy and Management
at Brandeis University.
CYC’s ultimate goal is to "make knowledge productive" by connecting the knowledge gained from
scholarly research and practical experience in ways that help both policy makers and practitioners. This
blend of theory and practice provides CYC with a unique perspective and capacity. CYC works with a
wide variety of organizations, including federal, state and local governments, foundations, and nonprofit
organizations. In each case, the Center views practitioners and policy makers as partners in a practical
knowledge development effort in which both the community and the academy bring critical strengths,
and in which practical solutions to real-world issues are developed through a collaborative, mutually
respectful approach. Center staff have been conducting nationally recognized research on service-
learning and other youth programs for over two decades.

Center for Youth and Communities
Heller School for Social Policy and Management
Brandeis University
415 South Street – MS035
Waltham, MA 02454
Phone: 781-736-3770
Fax: 781-736-3773
cyc@brandeis.edu
http://heller.brandeis.edu/cyc/


Special Acknowledgment: 2011 Revisions to Surveys

Through a grant from the Learn and Serve America program of the Corporation for National and
Community Service, ICP had the opportunity to pilot the participant surveys with Summer of Service
programs in Summer 2010. Based on feedback and the experience of implementing the impact
evaluation across seven sites and approximately 600 students, improvements were made to the
participant surveys in early 2011. ICP’s Summer of Service Research Fellow, Nicole Tysvaer, conducted
the impact evaluation for the Learn and Serve America SOS programs, identified improvements, and
revised the surveys.

Nicole Tysvaer is a doctoral student at the University of Michigan School of Education. Tysvaer has more
than 15 years of experience working with youth service programs as a policy and program associate at
the Corporation for National Service, evaluation coordinator at JustServe AmeriCorps in Seattle, and as
the founding director of the Real Media Leadership Literacy Training program at Western High School in
Detroit. Her dissertation, Summer of Service and the Cultivation of Social Responsibility: What can we
accomplish in short, intensive service-learning programs?, will be published in 2012.


This toolkit was published with generous support
from Lumina Foundation for Education. Lumina
Foundation strives to help people achieve their
potential by expanding access and success in
education beyond high school.
© 2010 Innovations in Civic Participation
Summer of Service
Innovations in Civic Participation
1776 Massachusetts Avenue, NW
Suite 201
Washington, DC 20036
p: 202.775.0290 f: 202.355.9317
e: info@icicp.org
www.icicp.org
