
Evaluation Design for YMCA-YFS
Assessing the Outcomes of After-School Programs

Obaid Khan, Sarute Vithoontien


Table of Contents

Executive Summary
Introduction
About YMCA-YFS After-School Program
Literature Review
Logic Model
Proposed Evaluation Design Summary
Evaluation Question
Design Matrix
Data Collection
Data Analysis
Proposed Presentation and Utilization Plan
Conclusion

Table of Figures

Figure 1. Logic Model
Figure 2. Design Matrix
Figure 3. Possible Test Administration Dates
Figure 4. Demonstrative Food/Quantity/Calorie List
Figure 5. BMI Calculation
Figure 6. Codebook

Table of Appendices

Appendix 1: Sample YMCA-YFS academic pretest/posttest
Appendix 2: Exit Survey-CAQT Chi-Square data
Appendix 3: Nutrition Questionnaire Pre/Post Program
Appendix 4: Fitness Questionnaire Pre/Post Program

Executive Summary

Purpose:

The following report has been prepared for the YMCA Youth and Family Services (YMCA-YFS)
After-School Programs by George Washington University Master of International Development
Studies candidates Sarute Vithoontien and Obaid Khan. This report offers staff members a plan to
systematically evaluate students' academic, health (nutrition and fitness), and social interaction
outcomes resulting from after-school program participation. The evaluation design also provides
YMCA-YFS with alternative strategies for dealing with potential obstacles in the evaluation
process, as well as sample tests, surveys, codebooks, and data entry guides that may provide
guidance for synthesizing data. The report concludes by proposing strategies by which program
results may be presented to relevant stakeholders.

Evaluation Design:

The evaluators created the evaluation design by:


● Consulting with YMCA-YFS staff members, Kirsten Anderson (Clinical Director) and
Ashley Greenwood (Volunteer Director) about the program’s overarching goals, funding,
activities, and data collection methods
● Researching the YMCA program and impact evaluations of after-school programs as a
whole
● Designing and providing program staff with a statement of work (SOW) which included
methodologies and deadlines
● Creating a Logic Model outlining and identifying relationships between key elements
within the YMCA YFS After-School Programs (inputs, activities, outputs, outcomes, and
impact)
● Developing supplementary evaluation questions in order to test the program theory
presented in the logic model
● Developing a design matrix which identifies the relevant indicators and methodologies
required to answer evaluation questions

Evaluation Tools:

The evaluators propose several tools that may help program staff capture changes in outcomes
across all relevant dimensions (academic, diet, fitness, and social interaction):

1. Baseline and Endpoint Academic Test measuring arithmetic and vocabulary skills
2. Baseline and Endpoint Eating Habits Survey
3. Baseline and Endpoint Body Mass Index measurements
4. Baseline and Endpoint Fitness Questionnaire
5. Baseline and Endpoint Fitness test (Timed 20 meter dash)

Proposed Presentation and Utilization Plan:

● Annual Reports to the Current Funders


● Proposals to Potential Donors for Additional Funding
● Program Planning and Implementation
● YMCA-YFS Website
● YMCA-YFS Brochures

I. Introduction
The following report provides an evaluation plan for the academic program coordinated by
YMCA Youth and Family Services (YMCA-YFS). The plan includes tools and methodologies
for YFS to use in the evaluation of its after-school program. The report provides concrete
recommendations that may improve future evaluation design, data collection, data management
and analysis to inform the implementation efforts by the staff.

The report begins with an overview of YFS and its programs, including the afterschool program
recommended for evaluation. It then summarizes a systematic review of literature to document
relevant research conducted to examine programs similar to YFS and the evaluations conducted
to capture their impact. The report then describes the evaluation design process, which includes
creating a logic model, determining evaluation questions, constructing a design matrix, and
selecting appropriate data collection methods. Furthermore, the report outlines the proposed data
collection and analysis methods. The report concludes by providing recommendations for
utilizing and sharing new data, discusses potential limitations and strategies for mitigating them,
and outlines a sample work plan.

A team of evaluators from the Master of International Development Studies Program at the
George Washington University, Sarute Vithoontien and Obaid Khan, worked closely with YFS
to develop this evaluation plan. They obtained information from the Clinical Director, Ms.
Kirsten Andersen, about YFS goals and its after-school program components. The evaluation
team also consulted with the Volunteer Director, Ashley Greenwood, for ongoing program
details, existing data sets and data collection methods. At GWU, the evaluators worked under the
supervision of Program Evaluation Professor Dr. Donna Infeld.

This report is intended for internal use by YFS, and it may be reviewed by instructors and other
students in the evaluators’ graduate program. It will not be made available to the public without
consent of both the evaluation team and YFS.

II. About YMCA-YFS After-School Program

The YMCA After-School program[i] was created to enhance safety, health, social skills, and
academic performance. The after-school program serves students in kindergarten to middle
school (ages 3-13). Students who partake in program activities have an opportunity to discover
their interests and develop their talents.

The YMCA program goes beyond supplementing the academic skills taught in school. Program
staff help students build critical thinking skills, assist students with their homework, and promote
a healthy lifestyle through physical activities. Kids leave the program with enhanced social skills
and a renewed sense of confidence.

Currently, after-school programs operate[ii] in 11 branches across Washington, D.C. (3),
Maryland (4), and Northern Virginia (4). Each program provides services for varying age
groups (3-12, 4-12, 5-12). Some programs operate on-site at school campuses while others
operate off-site but provide transportation.

The evaluation plan in this report will focus on the Silver Spring, Maryland branch. The YMCA
Silver Spring branch operates in three locations: Carrol Avenue Quebec Terrace Community
Center (CAQT), Nob Hill Apartments, and North West Park (NWP) Apartments. In total, Silver
Spring YMCA employs 1 full-time staff at CAQT, 3 part-time staff at Nob Hill, and 2 part-time
staff at NWP. As of 2016, the total funding received was approximately $290,000, of which
$150,000 was allocated towards CAQT, while Nob Hill and NWP each received $70,000.

After-school programs at Silver Spring operate from September to May. The majority of children
that are served at this branch are in elementary school (grades 1-5). In total, the branch serves 70
children (each year the cohort varies as some students return while others do not) most of whom
are immigrants from Latin America or Africa. Registration is open to all children living within
the Silver Spring area.

The Silver Spring YMCA After-School Programs focus on enhancing academic performance
and physical health through academic counseling, the STEM (Science, Technology, Engineering,
and Mathematics) curriculum, and the HEPA (National Healthy Eating and Physical Activity
Standards) standards.

STEM (Context)[iii]: YMCA recognized that there will be a sizeable demand for careers related
to science, technology, engineering, and mathematics going forward. According to the U.S.
Department of Commerce, STEM related jobs will grow 1.5 times faster and pay 26% more
relative to jobs in other fields. By 2018, over 1 million STEM jobs will be left vacant. Fewer
than 15% of scientists and engineers are from a minority group (Hispanic, African
American, Native American) 1. Historically, most children within minority groups have not
pursued STEM careers because schools were insufficiently equipped (e.g. lack of funding,
qualified teaching staff) to provide STEM courses, and because of the lack of minority role
models in STEM fields. As such, the afterschool curriculum stimulates interest among children
in science and technology subjects.

STEM Activities: The STEM curriculum brings professionals in science and technology to
speak to children in afterschool programs. Topics and activities include designing lever catapults
and parachutes (Students are asked to build a prototype. Once catapults/parachutes are tested,
students are then encouraged to reflect on their initial design and make improvements. The

1
YMCA. "The Y: Science, Technology, Engineering and Math (STEM)." The Y, 2016.

exercise concludes with each student critiquing designs made by classmates.) Another activity is
chromatography where students learn about water solubility by testing how different ink and
dyes react in water. Another example is related to the fundamentals of (aircraft) flight. Students
learn the basic concepts of “lift”, “weight”, “thrust”, and “drag” through paper airplane design.
The YMCA After-School Program hopes that through STEM, children will strengthen their
problem-solving and critical thinking skills and come to see STEM as a career opportunity.

HEPA (Context): In the fight against childhood obesity, the Healthy Out-of-School Time
Coalition (HOST) conducted a survey in 2011 and collected data from 700 out-of-school
programs (e.g., before- and after-school programs, overnight camps, and day camps) across the
United States. The survey allowed the coalition to identify best practices in nutrition and
physical activity; as a result, the National Healthy Eating and Physical Activity (HEPA)
Standards were created for students in grades K-12 participating in out-of-school programs.

HEPA Standards[iv]:

YMCA abides by the following HEPA Standards to advocate healthy lifestyles:

1) Replacing cookies, candy, cake, and chips with fresh, frozen or canned fruits and vegetables
2) Serving water during snack times instead of soda, juices, and punch
3) Allocating a minimum of 30 minutes during out-of-school program times for physical
activity
4) Including a combination of bone- and muscle-strengthening exercises and cardio-respiratory
activities (intensity dependent on the student's age)

[i] http://www.ymcadc.org/page.cfm?p=82
[ii] http://www.ymcadc.org/page.cfm?p=82
[iii] http://www.ymcanyc.org/association/pages/stem-science.-technology.-engineering.-mathematics
[iv] http://www.ymca.net/news-releases/20110809-afterschool-standards.html

III. Literature Review

The literature below documents the findings from a series of after-school program evaluation
studies that capture the impact of programs on student performance and well-being.

Current State of After-School Programs


The Afterschool Alliance and the 21st Century Community Learning Centers (CCLC) program
have made tangible efforts to improve after-school programs. The Alliance is the only national
organization dedicated to advocating for more investments towards affordable and quality after-

school programs. The CCLC programs are rolled out by the U.S. Department of Education to
serve underprivileged children living in high-poverty areas and attending underfunded schools
which fail to meet state academic standards (in core subjects such as reading and arithmetic).
Approximately 190,000 students are currently enrolled in an after-school program in Virginia
while over 19,000 students participate in the federally-funded CCLC.

The focus of after-school programs encompasses both academic and social/emotional dimensions
of a student’s life. The University of Memphis conducted an after-school program evaluation of
Virginia’s 21st CCLC with a focus on programs geared towards improving student behavior 2.
Three out of the four selected programs reported success in their objective of improving
students’ motivation to learn, their ability to get along with others as well as school attendance
and classroom participation.

Improving engagement and conduct in school


In 2010, professors of psychology at the University of Chicago conducted a meta-analysis of 68
after-school studies and found that students in high-quality afterschool programs
attended school more often and showed improvements in their behavior compared to students not
enrolled in programs3. Another study by researchers at the University of California, Irvine, the
University of Wisconsin Madison and Policy Studies Associates, Inc. in 2007 with 35 quality
afterschool programs found that students who regularly participated in programs self-reported
improvements in their work habits, levels of persistence, and saw reductions in reports of
misconduct, such as skipping school4.

Increasing academic achievement


A 2015 evaluation of 21st CCLC programs by the U.S. Department of Education showed
that about one in three students made gains in their math and English grades5. In 2013,
Professor Vandell (UC Irvine) helped the Expanded Learning and Afterschool
Project evaluate outcomes associated with participation in afterschool programs.
The same study found that students regularly participating during the elementary school years
narrowed the math achievement gap at grade five between students from high-income and low-
income families6.

Immersing students in STEM

2
King, Margie. "CCLC Evaluation Surveys and Reports." VDOE :: Virginia Department of Education
Home, Center for Research in Education Policy-University of Memphis, 2015.
3
Durlak, Joseph A., and Roger P. Weissberg. "Afterschool Programs That Follow Evidence-Based
Practices to Promote Social and Emotional Development Are Effective." A Compendium on Expanded
Learning, 2010.
4
Kim Pierce & Deborah Lowe Vandell, Anamarie. "Participation in Out-of-School Settings and Student
Academic and Behavioral Outcomes." Expanding Learning, Oct. 2007
5
Afterschool Alliance. "Afterschool Fostering Student Success in Indiana." Afterschool Alliance, 2016.
6
Vandell, Deborah L. “Afterschool Programs: Longitudinal Findings”. UC Irvine School of Education.
Charles Stewart Mott Foundation, Oct. 2013.

STEM Programming is becoming widespread in schools and afterschool programs. In Virginia,
66 percent of parents report that their child has STEM learning opportunities in their afterschool
program and 65 percent of parents agree that afterschool programs can help children gain STEM-
related interests and skills 7.

Promoting health and wellness


Afterschool programs are also incorporating healthy activities and meals in their framework.
An Afterschool Alliance household survey found that 75 percent of parents in Virginia reported that
their child’s afterschool program serves healthy snacks and/or meals and 87 percent said that it
offers opportunities for physical activity8.

Supporting working families


Afterschool programs are also shown to positively affect the lives of the parents. Researchers
note that parents missing work due to lack of afterschool care cost businesses up to $300 billion
per year in decreased worker productivity. In Virginia, 77 percent of parents surveyed agree that
afterschool programs help working parents keep their jobs 9.

The literature identifies the after-school outcomes most commonly evaluated, which informs how
we operationalize outcomes for the YMCA-YFS context. The evaluation design builds on the
findings above to expand on the intended outcomes proposed by program staff.

IV. Logic Model

Logic Model: A logic model is a visual depiction of relationships between resources, activities,
outputs, and outcomes of any program. Conceptualizing a logic model allows stakeholders to
collectively identify elements within the program that may be amended or changed in order to
improve program effectiveness. Logic models can be used in the planning and/or implementation
phase of a program and to focus its evaluation.

The evaluators collaborated with YMCA-YFS staff to create a logic model (Figure 1) for the
YMCA After-School Program:

● The YMCA After-School program inputs include the program staff (1 full-time, 6 part-
time) and funding of approximately $290,000 per year (CAQT: $150,000; Nob Hill: $70,000;
NWP: $70,000)

7
Afterschool Alliance. "Evaluations Backgrounder: A Summary of Formal Evaluations of Afterschool
programs." Afterschool Alliance, 2015,
8
Afterschool Alliance. "The 2014 America After 3PM: Afterschool Programs in Demand." Afterschool
Alliance, 2014
9
Brandeis University. "After-School Worries" Tough on Parents, Bad for Business." Catalyst |
Catalyst.org, 2006

● Program activities encompass vocabulary exercises, multiplication table exercises,
STEM-related activities, and the adherence to HEPA standards
● Program outputs include improved vocabulary, stronger quantitative and critical-thinking
skills, healthy nutrition, and enhanced physical fitness
● Program outcomes focus on a heightened sense of self-confidence, greater discipline, an
interest in STEM careers, and more time spent engaging in physical activities
● All of the aforementioned elements are intended to bring about the following long-term
impacts: improved grades at school, an increased likelihood of pursuing STEM-related
careers, and a healthy lifestyle

The logic model also includes contextual factors which are not under the program’s control and
may impact program effectiveness. Examples include school and community environment,
family income, race, household composition (family size, number of parents, number of
dependents), and student attributes.

The evaluators designed the logic model within the confines of YMCA-YFS After-School
program’s goals; as a result, “activities” revolve around HEPA standards and the STEM
curriculum. However, this logic model can be modified for YMCA’s other youth development
and social services programs as well.

The evaluators recommend that the YMCA-YFS After-School program review and modify this
logic model annually in response to potential changes in funding and/or program goals.

Figure 1. Logic Model

V. Proposed Evaluation Design Summary


Because YMCA-YFS After-School programs are centered on improving youth academics
(STEM), health (HEPA), and social skills, we propose pretest/posttest designs in order to capture
changes in all three dimensions. Academic performance for participants should be measured at
baseline (youth’s initiation into the program) and at endpoint (youth’s graduation from the
program). Proxies for youth academic performance could be school grades from the most recent
semester. If school grades are not available, scores from a YMCA-YFS designed and
administered test (refer to Appendix 1) may be recorded at baseline and endpoint.

Youth health encompasses both nutrition and physical fitness. With regard to nutrition, YMCA-
YFS staff may record youth Body Mass Index (BMI) at baseline and endpoint. We acknowledge
that BMI calculations only provide a broad overview of a child’s nutritional habits (calculations
do not take into account weight from muscle mass and bone density), therefore we also propose a
questionnaire asking students about their eating habits. Questionnaires should be disseminated
and responses recorded at the time a child enters the program and again when the child
graduates. The same procedure should be applied when measuring changes in youth’s physical
fitness (pretest/posttest questionnaires asking about their exercise habits to be applied at baseline
and endpoint). We have also taken into account the possibility that students may misreport their

fitness performance and recommend a pretest/posttest fitness challenge as an alternative strategy
(refer to Figure 2 for additional information).

Lastly, changes in social interaction may be measured via staff observation sheets. Topics
covered include youth peer interaction, ability to collaborate with youths especially those from
other backgrounds, ability to accomplish goals, and global and cultural awareness. A
questionnaire asking participants to self-report on the same topics as those in the staff observation
sheets, and a satisfaction/exit survey filled out by student participants, are also recommended. We
acknowledge it may not be possible to clearly ascertain the level of interaction, collaboration,
cultural awareness, etc. at baseline; therefore, we recommend that observation sheets be filled out
over the first week of the program. Baseline questionnaires by youths should also be completed
at the end of the first week. Finally, staff observation sheets and youth questionnaires (self-
reports) should be completed again at the program’s endpoint in tandem with the satisfaction/exit
survey.

The following sections will present the program’s key and secondary evaluation questions and a
design matrix.

VI. Evaluation Question
The evaluation plan is designed to help YMCA-YFS After-School Programs address the core
question: Is there an improvement in academic performance, diet and nutrition, and an
increase in positive social interactions with peers and staff members among students who
participate in the after-school program on a consistent basis?

In order to answer the key question, several subsidiary questions must be answered:
1) What are the levels of academic performance among students pre and post exposure to
the YMCA-YFS After-School Programs?
2) Are there any dietary changes pre and post exposure to the YMCA-YFS After-School
Programs?
3) Are there any changes to levels of physical activity pre and post exposure to the YMCA-
YFS After-School Programs?
4) How comfortable do students feel interacting with program staff and peers at the
beginning versus post program?

VII. Design Matrix


A design matrix is a framework that allows evaluators to organize the evaluation questions and
match each with observable indicators, the data collection methods needed to answer it, and the
limitations of the chosen strategies. The matrix is a useful tool for systematically making
decisions about appropriate data collection and analysis methods for each research question. The
design matrix below is tailored specifically for the YMCA-YFS After-School programs, with a
focus on measuring changes in academic performance, nutrition, physical fitness, and social
interaction. Column 1 lists the evaluation questions identified in the section above, column 2 lists
verifiable/observable indicators that program staff may refer to when implementing a specific
data collection strategy, and column 3 presents potential data collection methods/sources
necessary to answer their respective evaluation questions. Column 4 acknowledges the potential
limitations of each indicator/methodology and provides mitigation strategies.

Figure 2. Design Matrix


Question 1: What are the levels of academic performance among students prior to their entry into the YMCA-YFS After-School Programs?
Measure/Indicator: Students' school grades prior to entry into the after-school program (baseline grades); YMCA-YFS designed pre-test covering only vocabulary and quantitative skills (baseline scores).
Methodologies/Data Sources: School report card; YFS pre-test scores.
Possible Limitations and Recommended Mitigation Strategies: Schools/parents may not permit the release of academic records. Mitigation strategy: YFS pre-test scores may be used as the sole source for baseline grades.

Question 2: What are the levels of academic performance among students after graduating from the YMCA-YFS After-School Programs?
Measure/Indicator: Students' school grades after graduating from the after-school program (note: the endpoint grades do not have to coincide with graduation; the time may be chosen arbitrarily but must be the same for all students to avoid time-related contextual factors interfering with the results).
Methodologies/Data Sources: School report card; YFS graduation-test scores.
Possible Limitations and Recommended Mitigation Strategies: Schools/parents may not permit the release of academic records. Mitigation strategy: YFS graduation-test scores may be used as the only source for endpoint grades.

Question 3: Are there any dietary (nutritional) changes pre and post exposure to the YMCA-YFS After-School Programs?
Measure/Indicator: Student self-report of eating habits; student body mass index (BMI) pre and post exposure to the after-school program.
Methodologies/Data Sources: Questionnaire asking students what they eat on a daily basis; BMI, calculated by dividing an individual's weight in kilograms (kg) by the square of their height.
Possible Limitations and Recommended Mitigation Strategies: Students may not accurately recall and/or may misreport their eating habits; BMI calculations do not take into account muscle mass and bone density (e.g., a child may be considered overweight even though the excess weight is healthy muscle mass). Mitigation strategy: use both methodologies in tandem.

Question 4: Are there any changes to levels of physical activity pre and post exposure to the YMCA-YFS After-School Programs?
Measure/Indicator: Fitness pre and post program.
Methodologies/Data Sources: Questionnaire asking students how often they exercise or engage in sports/recreational activities; the questionnaire may also ask students to self-report their fitness levels.
Possible Limitations and Recommended Mitigation Strategies: Students may not accurately recall and/or may misreport their physical activity, and may overestimate their fitness level. Mitigation strategy: pre/post program fitness tests may be implemented (e.g., a track-and-field event format: timed 20 meter dash, timed push-up contest).

Question 5: How comfortable do students feel interacting with peers and program staff?
Measure/Indicator: The level of student interaction with program staff and peers within the first week of the program and at program graduation.
Methodologies/Data Sources: Staff observation sheets on youth peer and staff interaction, ability to collaborate with youths from other backgrounds, youths' ability to accomplish goals, and youth global and cultural awareness; participant questionnaire asking youth to self-report on the same topics as the staff observation sheets; participant exit/satisfaction survey asking for feedback about staff and activities.
Possible Limitations and Recommended Mitigation Strategies: Defining what constitutes an "interaction" may be difficult. Mitigation strategy: staff should have a clear and uniform definition of "interaction" (e.g., "a verbal exchange including responses to questions and inquiries, excluding yes/no answers and greetings"). (We acknowledge that this mitigation strategy is already being employed by program staff.)

VIII. Data Collection

The instruments and techniques for data collection are crucial in evaluating a set of outcomes.
Data collection methods must be feasible and appropriate to answer the defined evaluation
questions. The following section addresses YMCA-YFS’s current data collection instruments,
their benefits, areas of improvement and our proposed data collection strategies going forward.

YMCA’s Current Data Collection Instruments:


1) Staff observation sheets on youth peer interaction, ability to collaborate with other youths
who are different than they are, youth’s ability to accomplish goals, and youth global and
cultural awareness
2) Participant questionnaire asking youth to self report on the same topics in staff
observation sheets
3) Participant exit/satisfaction survey asking for feedback about staff and activities

(Note: because data from staff observation sheets and participant questionnaires were not
available to us, this section will only address benefits derived from the implementation of
exit/satisfaction surveys).

Exit/Satisfaction surveys cover the following topics:


1) “I learn new things”
2) “I feel safe at the activities”
3) “I like the staff here”
4) “Staff tell me when I do a good job”
5) “Staff listen to what I have to say”
6) “Staff treat all kids nicely”

Response options are as follows: Yes, Kind Of, Not Really, and No

Benefits of Current Data Collection Instruments:


The exit/satisfaction survey allows program staff to capture the frequency of each type of
response and allows them to draw associations between responses through a chi-square test.
More specifically, it enables program staff to adapt their methods of interaction with kids in
order to improve learning among the various age groups within the program.

Although this report is an evaluation design, we offer a demonstration of this data collection and
analysis strategy based on the data provided to us by program staff. We found that among
younger kids (K-2nd grade, CAQT 2016), there was a relatively strong and statistically
significant association (Chi-Square value = 16) between learning new things and staff treating
the kids nicely. The association was weaker between staff telling the kids when they do a good
job and the kids learning new things (Chi-Square value = 9.8, not statistically significant).
Among older kids (3rd-8th grade, CAQT 2015), we found a relatively strong and statistically
significant association (Chi-Square value = 55.04) between kids learning new things and staff
asking kids to plan and lead activities. The association was weaker between kids learning new
things and the activities being fun (Chi-Square value = 34.18, statistically significant). Refer to
Appendix 2 for detailed results.
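For staff who prefer to run this analysis outside of SPSS, the short sketch below shows how the same chi-square test of association could be computed in Python with the scipy library. It uses the K-2nd grade crosstab from Appendix 2 ("Staff treats all kids nicely" by "I learn new things"); the unlabeled response category in that output is treated here as its own row and column, and the script is illustrative rather than part of the existing YMCA-YFS toolkit.

```python
# Minimal sketch: chi-square test of association for two exit-survey items,
# using the K-2nd grade crosstab counts from Appendix 2. The unlabeled
# response category in the SPSS output is kept as its own row/column.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: "Staff treats all kids nicely"; columns: "I learn new things"
# Category order: (blank), "Kind of", "Yes"
observed = np.array([
    [2, 0, 0],
    [0, 1, 1],
    [1, 0, 10],
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"Pearson chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")  # roughly 16, p around 0.003

# Note: most expected cell counts are below 5 (Appendix 2 flags this as well),
# so the chi-square approximation is weak at this sample size (n = 15).
```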

Proposed Data Collection Strategies:

Sample Size Recommendations: In order to ensure validity of findings, the sample size for each
group (e.g., K-2nd grade, 3rd grade-8th grade) should be at least 30, the minimum commonly
assumed for statistical validity10. We acknowledge that, given funding and staff availability
constraints, the program cannot simply take on more kids for the sole purpose of achieving
validity in its evaluation findings. We therefore recommend that the proposed data collection
strategies be applied in all 3 program locations (CAQT, NWP, and Nob Hill), and if possible, in
all of the YMCA-YFS After-School program branches in the Washington D.C., Maryland, and
Northern Virginia area. This will increase the likelihood that sample size requirements are met.

10
Hogg, Robert V, and Elliot A. Tanis. Probability and Statistical Inference. Macmillan, 1977.

Collecting Academic Performance Data: We propose two strategies which the YMCA After-
School Program could use in order to collect academic performance data.

Strategy 1: The first is asking students who join the program to bring with them their most
recent school report card. The grades shown in the report card will serve as the student’s baseline
academic performance. The student may present another school report card at the time he/she
graduates from the after-school program (endpoint). The difference between baseline and
endpoint grades may represent the program’s effect on the student’s academic performance.

Potential Obstacles: We acknowledge that there are several complications with only using
school report cards to capture changes in academic performance. The first is derived from the
possibility that the time at which the school disseminates report cards to the students may not be
aligned with each student’s program entry/exit schedule. This is not a problem for baseline report
cards because students can bring their most recent copy, however it is a problem for endpoint
report cards. The second obstacle is the possibility that parents may not be comfortable with
releasing their child’s report card information to the program.

Strategy 2: Taking these obstacles into consideration, we propose a second strategy to obtain
academic performance data. By using an in-house academic performance test (10 quantitative
questions and 10 vocabulary questions) administered by the YMCA YFS After-School program
(Refer to Appendix 1), the scheduling and confidentiality issues are mitigated. The After-School
Program could administer a baseline test (pretest) at the program’s commencement and again at
its conclusion. This strategy can also accommodate students who join the program in the
middle of its progression. In this case, the test may be administered when the child enters the
program and again when he/she leaves. According to program staff, the YMCA YFS After-
School program starts in September and continues until May. Below, Figure 3 illustrates the
months in which pre/post tests could be administered (Note: this schedule is also recommended
for all pre/post tests proposed in the evaluation design). Furthermore, if program staff so
chooses, midpoint performance tests may also be administered. In this case, December and
January would be ideal for midpoint data collection.

Figure 3. Possible Test Administration Dates

September/October: Academic Performance Pre-Test (Baseline)
December/January: Mid-Program Academic Performance Test (Midline, optional)
April/May: Academic Performance Post-Test (Endpoint)

Ideal Strategy: Ideally, we recommend using strategies 1 and 2 in tandem (program staff may
observe treatment effects in both school and after-school program contexts); however, should the
aforementioned obstacles arise, strategy 2 is the preferred choice.

Collecting Nutrition/Diet Data:

Strategy 1: In order to capture students’ changes in nutrition and diet pre and post exposure to
the YMCA YFS After-School Program, we propose administering a questionnaire/survey asking
students about their eating habits in the last 24 hours (Refer to Appendix 3). The questionnaire
has six questions asking students what meals and drinks they have for breakfast, lunch, and
dinner. The questionnaire concludes by asking students whether and what they consume in-
between meals (snack time). These are open-ended questions allowing students to identify
exactly what they consume on a daily basis. The questionnaire should be administered at
program commencement and conclusion (or when a student decides to opt out of the program).
To provide the program staff with the option of recording the changes in total calories consumed
by each student on a daily basis at baseline and endpoint, a list of common foods and their
calorie count (Figure 4) is provided below for demonstrative purposes.

Figure 4. Demonstrative Food/Quantity/Calorie List11


Food Description                    Quantity      Calories
APPLE JUICE, CANNED                 1 cup         115
BAGELS, PLAIN                       1 bagel       200
BANANAS                             1 banana      105
CHEESEBURGER, REGULAR               1 sandwich    300
CHOCOLATE MILK, LOWFAT 1%           1 cup         160
CORN FLAKES, KELLOGG'S              1 oz          110
EGGS, COOKED, FRIED                 1 egg         90
ICE CREAM, VANILLA, SOFT SERVE      1 cup         375
YOGURT, W/ LOWFAT MILK, PLAIN       8 oz          145
COLA, REGULAR                       12 fl oz      160

11
http://www.invive.com/calorie.html
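If program staff choose to tally approximate daily calories from the questionnaire responses, a simple lookup against a list like Figure 4 is sufficient. The sketch below (Python, purely illustrative) assumes responses have already been matched to entries in such a list; the food names, serving labels, and the example student responses are hypothetical.

```python
# Illustrative sketch: tallying approximate daily calories from questionnaire
# responses using a Figure 4-style lookup. Entries and responses are placeholders.
CALORIES_PER_SERVING = {
    "apple juice, canned (1 cup)": 115,
    "bagel, plain (1 bagel)": 200,
    "banana (1 banana)": 105,
    "cheeseburger, regular (1 sandwich)": 300,
    "chocolate milk, lowfat 1% (1 cup)": 160,
}

def total_calories(items):
    """items: list of (food, servings) pairs taken from one student's responses."""
    return sum(CALORIES_PER_SERVING[food] * servings for food, servings in items)

# Example: one student's reported breakfast and snack
breakfast = [("bagel, plain (1 bagel)", 1), ("chocolate milk, lowfat 1% (1 cup)", 1)]
snack = [("banana (1 banana)", 2)]
print(total_calories(breakfast + snack))  # 570
```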

Potential Obstacles: Students may not recall and/or misreport their eating habits.

Strategy 2:
The second strategy program staff may use to measure changes in nutrition and diet is a Body
Mass Index (BMI) calculation12. BMI relates an individual's weight to their height: it is
calculated by dividing the individual's weight (kg) by the square of their height (meters). A
calculation is shown below in Figure 5:

Figure 5. BMI Calculation

For demonstrative purposes, Person A's BMI will be based on a weight of 50 kilograms and a
height of 1.6 meters:

BMI = weight (kg)/height^2


BMI = 50/1.6^2
BMI = 50/2.56
BMI = 19.53

(Note13: BMI < 18.5 = underweight; 18.5-24.9 = normal weight; 25-29.9 = overweight; BMI ≥ 30 = obese)

In this example, a BMI of 19.53 indicates that the child's weight is within the normal range.
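Because the Figure 5 calculation will be repeated for every student at baseline and endpoint, it may be convenient to automate it. The sketch below (Python, illustrative only) reproduces the example above using the adult cut-offs cited in the note; in practice, children's BMI is often interpreted against age- and sex-specific growth percentiles, so these categories should be read as rough guides.

```python
# Minimal sketch of the Figure 5 calculation, using the adult cut-offs cited above.
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal weight"
    if value < 30:
        return "overweight"
    return "obese"

b = bmi(50, 1.6)
print(round(b, 2), bmi_category(b))  # 19.53 normal weight
```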

Potential Obstacle: Although BMI calculations allow program staff to quickly estimate a child’s
nutrition and diet, it does not take into account weight derived from bone density and muscle
mass (e.g. a student athlete who is perfectly healthy and has relatively more muscle mass than his
peers may register as “overweight” using a BMI calculation).

Ideal Strategy: We acknowledge that both strategies have benefits and drawbacks. To maximize
accuracy in data collection, we recommend using strategies 1 and 2 in tandem.

Collecting Student Fitness Data:

12 http://extoxnet.orst.edu/faqs/dietcancer/web2/twohowto.html
13 https://www.nhlbi.nih.gov/health/educational/lose_wt/BMI/bmi-m.htm

18
Strategy 1: Program staff may collect data on students’ level of fitness pre and post program by
administering a questionnaire asking each student questions regarding their daily physical
activities (Refer to Appendix 4 for the questionnaire). The questionnaire should be administered
at program commencement and again at the program’s conclusion.

Potential Obstacles: Students may not fully recall and/or misreport their level of interest and
engagement in physical activities.

Strategy 2: Program staff could organize an on-site cardiovascular fitness test in the form of a
timed 20 meter dash. Staff may record each student’s times at program commencement and
again at the program’s conclusion. Differences in times will indicate whether the student has
become more/less fit. In order to make the event more fun, program staff may incorporate other
fitness tests as well, e.g., a push-up contest (staff would record the number of push-ups executed
within a given time period, e.g., 1 minute).

Collecting Social Interaction Data:

Strategies: We acknowledge that the program staff already employs three data collection
methodologies to capture students' social interaction data: (1) staff observation sheets recording
youth peer and staff interaction, youths' ability to collaborate with other youths who are of
different backgrounds, youths' ability to accomplish goals, and their global and cultural
awareness; (2) participant questionnaires asking youths to self-report the same information as the
staff observation sheets; and (3) an exit/satisfaction survey asking for feedback about staff and
activities. In this regard, we recognize that program staff has a robust strategy to
capture youth social interaction. Program staff is currently utilizing all three methodologies in
tandem.

Potential Obstacle: It is difficult to define what an “interaction” entails.

Note: Staff should have a clear and uniform definition for interaction. A potential definition
could be “a verbal exchange including responses to questions and inquiries, excluding “yes/no”
answers and greetings”. Furthermore, we acknowledge that it is not possible to ascertain the level
of interaction, collaboration, and cultural awareness immediately at baseline, therefore, we
recommend that baseline observation sheets on relevant interactions be filled out over the first
week of the program. Baseline (self-report) questionnaires for youths should also be completed
by the end of the first week. Staff observations sheets and participant questionnaires should be
completed again at the program’s conclusion in tandem with the exit/satisfaction survey to
capture changes in social interaction.

IX. Data Analysis

YMCA-YFS’s Current Data Analysis Strategy
The current strategy is focused on using descriptive statistics, like percentages, to analyze data
on exit surveys. The documents shared with evaluators show the following analytical strategies:
1) Academic performance analysis of students in 1st to 5th grade and in 6th to 8th grade
separately for Math, Science, Social Studies and Reading on a quarterly basis.
2) Exit Surveys with analysis for students from kindergarten to 8th grade.

Benefits of Current Data Analysis Strategy


The existing strategy of using exit surveys to analyze students' feedback about the staff and
activities provides a broad overview of the performance and perceptions of after-school
program students. The staff can extract findings such as the overall satisfaction of students with
respect to the staff performance and activities offered at the program. With the quarterly
academic performance analysis, the staff can get an overview of the performance trend for the
cohort.

Proposed Data Analysis Strategies:

Recommendations for Organizing Data for Analysis

Strategy: We propose utilizing a codebook to systematically conceptualize questions, variable


names, type of variables and response options. The codebook is developed in tandem with the
data collection instruments to summarize how the data can be collected, entered and analyzed.
For an in-depth analysis of an integrated dataset, the first step is to complete the codebook as a
guide to the overall data and how it can be analyzed. We have designed the codebook (attached
with the report) to demonstrate how it can be utilized for subsequent data entry and analysis. A
snapshot of codebook is provided below for illustration (Figure 6):

Figure 6: Codebook

Question No.: 9
Question: What is the vocabulary score at baseline?
Data Type: Numeric
Options: 1=1, 2=2, 3=3, 4=4, 5=5, 6=6, 7=7, 8=8, 9=9, 10=10
Appearance: Single Select
Label Name: vocscr_t1
Required: Yes

Question No.: 12
Question: Are the activities fun?
Data Type: Numeric
Options: 1=Yes, 2=No, 3=Kind of, 4=Not Really
Appearance: Single Select
Label Name: fun_activities
Required: Yes

In the figure above, the first two fields give the question number and the evaluation question,
respectively. The third field, 'Data Type', records the type of variable the responses to the
question aim to capture; for example, the answer to Question 9 falls between 1 and 10, which is a
numeric variable. The fourth field lists the response options for close-ended questions; for
example, if a student is asked 'Do you like the staff?', the options would be Yes, No, Kind of, and
Not Really. The fifth field, 'Appearance', describes the way data is entered in the spreadsheet:
responses to open-ended questions are entered as free text, while for close-ended questions the
response is entered as 'single select' when only one of the given options may be chosen. The
sixth field, 'Label Name', gives the variable name corresponding to the evaluation question as it
is entered in the spreadsheet; the variable names for baseline, midline, and endpoint vocabulary
scores are vocscr_t1, vocscr_t2, and vocscr_t3, respectively. The seventh field, 'Required',
indicates whether a response must be recorded and entered.
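One way to keep data entry consistent with the codebook is to store each codebook row in a small machine-readable structure and validate entries against it before analysis. The sketch below (Python) is illustrative only; it mirrors the two rows shown in Figure 6, and the structure and function names are our assumptions rather than an existing YMCA-YFS tool.

```python
# Illustrative sketch: representing Figure 6 codebook rows as records and
# validating a data-entry value against them before analysis.
CODEBOOK = {
    "vocscr_t1": {
        "question": "What is the vocabulary score at baseline?",
        "data_type": "numeric",
        "options": list(range(1, 11)),  # 1-10
        "appearance": "single select",
        "required": True,
    },
    "fun_activities": {
        "question": "Are the activities fun?",
        "data_type": "numeric",
        "options": {1: "Yes", 2: "No", 3: "Kind of", 4: "Not Really"},
        "appearance": "single select",
        "required": True,
    },
}

def validate(label, value):
    """Return True if the entered value is allowed for this codebook label."""
    entry = CODEBOOK[label]
    if value is None:
        return not entry["required"]
    return value in entry["options"]

print(validate("vocscr_t1", 7))       # True
print(validate("fun_activities", 5))  # False: 5 is not a coded option
```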

Recommendations for Integrating Data Sets

Strategy: We propose utilizing a Master Spreadsheet (attached with the report) to integrate data
on grades, participation rates, exit surveys, physical fitness and nutrition/diet that is
disaggregated by unique student IDs. This allows for comprehensive analysis of inter-related
outcomes, such as the academic performance, physical fitness and nutrition outcomes. Measures
of association between categorical variables (e.g. Chi-square test) and continuous variables (e.g.
correlation) can be used to ascertain the program effectiveness across the academic and non-
academic dimensions.
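As an illustration of the kind of integration the Master Spreadsheet is intended to support, the sketch below merges two hypothetical data sets on a shared student ID using the pandas library and then computes a simple cross-dimension association. All column names and values are placeholders.

```python
# Illustrative sketch: merging exit-survey and test-score records on a shared
# student ID with pandas. All column names and values are hypothetical.
import pandas as pd

scores = pd.DataFrame({
    "student_id": ["S01", "S02", "S03"],
    "vocscr_t1": [4, 6, 5],
    "vocscr_t3": [7, 8, 6],
})
exit_survey = pd.DataFrame({
    "student_id": ["S01", "S02", "S03"],
    "fun_activities": [1, 3, 1],  # coded per the codebook
})

merged = scores.merge(exit_survey, on="student_id", how="inner")
print(merged)

# Once merged, associations across dimensions can be examined, e.g. the
# correlation between vocabulary gains and survey responses.
merged["voc_gain"] = merged["vocscr_t3"] - merged["vocscr_t1"]
print(merged[["voc_gain", "fun_activities"]].corr())
```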

Capturing Program Effectiveness

Strategy: As proposed in the evaluation design, the pretest/posttest method can capture the
impact of the after-school program on the intended outcomes. To this end, we recommend using
the baseline, midpoint (optional), and endpoint scores to measure average changes in test scores,
vocabulary scores, math scores, nutrition (calories, BMI), and physical fitness (exercise). An
example of measuring the average changes in vocabulary scores due to the program (derived
from codebook) is as follows:

n = number of participants, vocscr_t1 = baseline vocabulary score, vocscr_t2 = midline vocabulary
score, vocscr_t3 = endpoint vocabulary score (scores are summed over all n participants in the
formulas below)

Impact of the program on vocabulary scores (baseline to midpoint) = (Σ vocscr_t2)/n - (Σ vocscr_t1)/n
Impact of the program on vocabulary scores (midpoint to endpoint) = (Σ vocscr_t3)/n - (Σ vocscr_t2)/n
Impact of the program on vocabulary scores (baseline to endpoint) = (Σ vocscr_t3)/n - (Σ vocscr_t1)/n
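These averages can be computed directly from the integrated spreadsheet with a few lines of code. The sketch below (Python, illustrative) uses hypothetical scores stored under the codebook labels and adds a paired t-test as one simple way to gauge whether the average baseline-to-endpoint change is statistically distinguishable from zero; the t-test is our suggestion rather than part of the current YMCA-YFS analysis.

```python
# Illustrative sketch: average change in vocabulary scores from baseline to
# endpoint, plus a paired t-test. Scores are hypothetical; labels follow the
# codebook (vocscr_t1 = baseline, vocscr_t3 = endpoint).
from scipy import stats

vocscr_t1 = [4, 5, 3, 6, 5, 4, 7, 5]  # baseline scores (one per student)
vocscr_t3 = [6, 7, 5, 7, 6, 6, 8, 6]  # endpoint scores (same students, same order)

n = len(vocscr_t1)
avg_change = sum(vocscr_t3) / n - sum(vocscr_t1) / n
print(f"Average baseline-to-endpoint change: {avg_change:.2f} points")

t_stat, p_value = stats.ttest_rel(vocscr_t3, vocscr_t1)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```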

Overall Recommendations for Data Collection and Analysis

Areas for Improvement in Current Data Collection Instruments:

Baseline and Endpoint Data


Current data collection methods only allow evaluators to assess associations between specific
outcomes, as mentioned in the section above. However, the data collection instruments cannot be
used to capture the impact of the afterschool program as there is no baseline and endpoint
instrument to measure the changes in the intended outcomes.

To truly capture the impact of the program along academic and non-academic dimensions, the
data has to be recorded for the outcomes of interest before and after program participation. For
example, the exit surveys should be coupled with the entry surveys to record the changes in the
perception of participants about the activities and staff. Midpoint data can also be recorded to
add more observations in determining the trend.

Self-Reported Data
Existing data collection instruments do not have the mechanisms to mitigate problems with self-
reported data. For example, if the students misreport their perception of the physical activities,
then there should be a way to mitigate the problem through another instrument. One way might
be to introduce a fitness test to corroborate the self-reported data on physical fitness. To
demonstrate the impact of the program, the intended outcomes must be measured at least before
and after the program, and the self-reported data should be supplemented with a measure to
strengthen the internal validity of the findings.

Sample Size for Surveys


Small sample size for the exit surveys is another weakness in the data collection methods. At
most, the sample size for an exit survey was only 26 out of the 70 students in the cohort. To
increase the strength of the analysis, a larger sample size is needed. One target is a sample size of
30, which is the bare minimum for assuming a normal distribution and conducting subsequent
statistical analysis.

Areas of Improvement in the Data Analysis Strategy

Inferential Statistics
The current analysis is centered on descriptive statistics that capture the respective proportions of
students with different grades and feedback. For example, the pie chart in the exit survey report
represents the percentage of students who answered 'yes', 'kind of', 'no', or 'not really' to each
question about staff, such as 'Staff cares about me'. Analytic depth can be improved with
inferential statistics that measure the association between, for example, staff caring about a
student and the student's academic performance and/or participation rates.

Sub-Group Analysis
The datasets are inconsistent in that not every dataset is disaggregated by unique student
ID. Data from exit surveys, participation rates, grades, and other data sources therefore cannot be
merged for a comprehensive analysis of the after-school program's effectiveness. Furthermore, the
fragmented data set impedes sub-group analysis across academic and non-academic dimensions.
For example, does the association between academic performance and staff feedback vary with
the gender and/or the grade of the student? This type of sub-group analysis can help capture the
impact of the program on different categories of students and inform a variety of activities
targeted at different groups of students to achieve the intended outcomes.
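If the integrated data set records gender and grade alongside the unique student ID, the sub-group comparison described above reduces to grouping the change scores. The sketch below (Python/pandas) is illustrative; the column names and data are hypothetical.

```python
# Illustrative sketch: sub-group analysis of vocabulary gains by gender and
# grade using pandas. Column names and data are hypothetical placeholders.
import pandas as pd

df = pd.DataFrame({
    "student_id": ["S01", "S02", "S03", "S04", "S05", "S06"],
    "gender":     ["F",   "M",   "F",   "M",   "F",   "M"],
    "grade":      [2,      2,     4,     4,     5,     5],
    "vocscr_t1":  [3,      4,     5,     6,     5,     4],
    "vocscr_t3":  [6,      5,     7,     7,     6,     6],
})

df["voc_gain"] = df["vocscr_t3"] - df["vocscr_t1"]

# Average gain by gender, and by gender within grade
print(df.groupby("gender")["voc_gain"].mean())
print(df.groupby(["grade", "gender"])["voc_gain"].mean())
```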

The following section provides overall recommendations that apply to both the data collection
and analysis.

Consistent Data Entry


Data sets for exit surveys do not have a codebook to ensure a uniform data entry method across
sites. A codebook is a complete list of all data, showing the name of each variable, the values
the variable takes, and a complete description of how that variable is operationalized. The
absence of a codebook can lead to non-uniform data entry across sites and personnel and can
pose problems during data cleaning and analysis. Hence, there is a need for a guiding tool and
frame of reference, like the codebook, to facilitate consistency in data entry for effective
analysis.

Fragmented Datasets
Integrated data sets are important for comprehensive evaluation of the program. Current
instruments record data in different files and formats (Microsoft Word and
Microsoft Excel). Fragmented data sources make analysis difficult and waste the evaluability
potential of the recorded data. For example, the disaggregated attendance rates are recorded in
spreadsheets with every month’s attendance rate corresponding to an individual student
identification code. However, the raw data for exit surveys in the spreadsheets do not correspond
to the individual student IDs. Hence, it is not possible to merge the datasets for analysis and
evaluation.

X. Proposed Presentation and Utilization Plan
The tables provided in the report and the master spreadsheet attached can be utilized to
synthesize data and generate output to demonstrate the impact associated with the program to
funders, potential funders and other stakeholders. The codebook and data entry guide can be used
as reference tools to inform ongoing evaluation of the program. The pretest-post-test design with
baseline, midpoint and endpoint data can be used to present overall changes in outcomes that can
be associated with the program's effectiveness. To make implementing the evaluation plan as simple
as possible, the proposed data collection, entry and analytical tools complement the existing
practices and output data shared with the evaluators. The following section addresses potential
threats to validity in the proposed pretest post-test design that might confound the causal link
between the program and outcomes.

Threats to Validity

● Student History: The program accepts students from various ethnic, socio-economic
and family backgrounds. For example, the number of parents in the family could
affect the amount of time allocated towards child-rearing/supervision which, in turn,
could be associated with academic performance and the likelihood of delinquency.
● Maturation: The duration of each program cycle is one year, which could provide
sufficient time for students to mature. Changes in measured outcomes could be
attributed to student maturity rather than program effectiveness.
● Measurement Effect: When students know that their performance is being gauged,
their behavior might change. For example, students could answer questions in a way
that they perceive to be pleasing to the staff.
● Regression to the Mean: The students who participate in the afterschool programs
are more likely to underperform academically relative to their peers. Therefore, the
observed changes in academic outcomes for these students may be larger than for
peers who are initially better equipped for academic challenges.

The evaluators propose that YMCA-YFS utilize the proposed data collection and analysis plans
in the following ways:
● Annual Reports to the Current Funders. Annual reports should include data on output
(such as the average change in math scores, vocabulary scores, grades, BMI, caloric
consumption) to demonstrate how the program has utilized funds to achieve the intended
outcomes.

● Proposals to Potential Donors for Additional Funding. The proposed evaluation output
and highlights from the results of annual report can be appended to the proposals to
demonstrate program effectiveness. YMCA-YFS can present last year’s successes as
evidence for program expansion and additional funds required to do so.
● Program Planning and Implementation. Data obtained from defined outputs and
feedback on staff and activities can help inform the strengths and areas of improvement
in program implementation. Evidence-based program planning and implementation can
help foster a culture of organizational learning.
● YMCA-YFS Website (http://yfs.ymcadc.org/). A results section can be added to the news
section on the website to highlight program successes and key findings from the
evaluation.
● YMCA-YFS Brochure. The brochure can be released on annual basis to highlight the
program successes, plans for program expansion and contact details for donations.

XI. Conclusion

The evaluation plan proposed in this report is designed to help YMCA-YFS systematize the
process of measuring the intended outcomes of its academic, health (nutrition and fitness), and
social interaction components. We have provided the organization with concrete
recommendations to enhance its data collection, management, and analysis tools, coupled with
communication methods for reporting the evaluation findings effectively to current and potential
funders and the general public. We have also shared guiding tools, such as the integrated
spreadsheet and codebook, to build on the organization's analytic depth in measuring overall
program effectiveness. We have proposed multiple recommendations and mitigation strategies
for data collection so that the organization can pursue the most feasible and cost-effective
techniques. We recommend that the organization seek guidance on any aspect of the plan that
needs further clarification.

Appendix 1: Sample YMCA-YFS academic pretest/posttest

YMCA-YFS Academic Pre-Test

Quantitative Section14:

1) A teacher bought candy for each of her students. She has 40 students in her class; if
she gave 15 pieces of candy away, how many does she have left?
a) 28
b) 25
c) 17
d) 35

2) Sam was counting by 5’s. He stopped at 90. What number comes next?
a) 95
b) 85
c) 100
d) 86

3) Mr. Johnson's 3rd graders were learning about number sequence patterns. They
counted 64, 68, 72, 76, 80. What number comes next and what is the difference between
each number?
a) 5, 90
b) 3, 83
c) 4, 84
d) 6, 86

4) Mrs. Smith bought 386 pencils. She lost 29 of them. How many does she have left?
a) 370
b) 347
c) 368
d) 357

14
Soft Schools. "3rd Grade Word Problems Quiz." Free Math Worksheets, Free Phonics Worksheets,
Math Games and Online Activities and Quizzes, 2016.

5) Samantha is ordering dinner in a restaurant. She can have steak or a hamburger. On
the side, she may choose, french fries, mashed potatoes, salad, or soup. How many different
combinations can Samantha order for dinner?
a) 4
b) 6
c) 8
d) 10

6) James is going to buy a new car. He may choose from a sedan, limousine, or a van. The
available colors are blue, black, green, white, and red. How many combinations does James
have to choose from?
a) 20
b) 17
c) 15
d) 24

7) A hotel is 3 stories tall. There are 30 offices on the first floor, 6 more offices on the
second floor than on the first floor, and 4 more offices on the third floor than on the second
floor. How many offices are there?
a) 106
b) 115
c) 103
d) 150

8) Cindy has 3 stacks of cement blocks. The first stack is 15 blocks tall. The second stack is
4 blocks taller than the first stack. The third stack is 2 blocks shorter than the second stack.
How many blocks did Cindy use in total?
a) 51
b) 40
c) 33
d) 60

9) Calvin had 252 cupcakes. His sister ate half of them. How many cupcakes does Calvin
have left?
a) 114
b) 126
c) 130
d) 174

10) A store sells boxes of baseball cards. Each box has 5 packets. Each packet has 5 cards.
How many cards are there in 3 boxes?

a) 35
b) 65
c) 25
d) 75

Vocabulary Section15:

1) In which book would you expect to find a world map?


a) Dictionary
b) Cook Book
c) Phone Book
d) Atlas

2) Thank you for making me _____ of the problem.


a) Nice
b) Aware
c) Gaze
d) Listen

3) What is a hen?
a) Reptile
b) Fish
c) Female Bird
d) Lion

4) Can you tell me your _____?


a) Picture
b) Concerns
c) Pet
d) Music

5) I want to ____ the colors together.


a) Burn
b) Extract
c) Blend
d) Erase

6) What is the synonym for “call”?


a) Summon

15
Soft Schools. "3rd Grade Vocabulary." Free Math Worksheets, Free Phonics Worksheets, Math Games
and Online Activities and Quizzes, 2016

b) Dismiss
c) Release
d) Draw

7) What is the synonym for “strong”?


a) Wobbly
b) Durable
c) Rickety
d) Insecure

8) What is the antonym for "depart"?


a) Leave
b) Exit
c) Arrive
d) Recess

9) What is the antonym for "happy"?


a) Depressed
b) Ecstatic
c) Joyful
d) Blissful

10) What do you call a person who writes news articles professionally?
a) Engineer
b) Preacher
c) Journalist
d) Singer

Answer Key:
Quantitative Section
1) B
2) A
3) C
4) D
5) C
6) C
7) A
8) A
9) B
10) D

Vocabulary Section
1) D
2) B
3) C
4) B
5) C
6) A
7) B
8) C
9) A
10) C

Appendix 2 Exit Survey-CAQT Chi-Square data:

Chi-Square Data: Exit Survey, CAQT K through 2nd Grade (FY 2016)

Q3A - Staff treats all kids nicely * Q1B - I learn new things Crosstabulation
(counts; the unlabeled row/column reflects an unlabeled response category in the source data)

Q3A \ Q1B        (blank)   Kind of   Yes   Total
(blank)             2         0        0      2
Kind of             0         1        1      2
Yes                 1         0       10     11
Total               3         1       11     15

Chi-Square Tests
Pearson Chi-Square: value 16.003, df 4, asymp. sig. (2-sided) .003
Likelihood Ratio: value 12.422, df 4, asymp. sig. (2-sided) .014
N of valid cases: 15
Note: 8 cells (88.9%) have expected count less than 5. The minimum expected count is .13.

Q2B - Staff tell me when I do a good job * Q1B - I learn new things Crosstabulation

Q2B \ Q1B        (blank)   Kind of   Yes   Total
(blank)             2         0        0      2
Kind of             0         0        2      2
Not Really          0         0        1      1
Yes                 1         1        8     10
Total               3         1       11     15

Chi-Square Tests
Pearson Chi-Square: value 9.818, df 6, asymp. sig. (2-sided) .133
Likelihood Ratio: value 9.115, df 6, asymp. sig. (2-sided) .167
N of valid cases: 15
Note: 11 cells (91.7%) have expected count less than 5. The minimum expected count is .07.

Chi-Square Data: Exit Surveys, CAQT 3rd through 8th Grade (FY 2015)

Q2J - Staff asks me to plan, choose, or lead activities * Q1C - I learn new things Crosstabulation

Q2J \ Q1C        (blank)   Kind of   Not Really   Yes   Total
(blank)             1         0          0          0      1
Kind of             0         0          1          0      1
Not Really          0         1          0          1      2
Yes                 0         2          0         20     22
Total               1         3          1         21     26

Chi-Square Tests
Pearson Chi-Square: value 55.039, df 9, asymp. sig. (2-sided) .000
Likelihood Ratio: value 18.783, df 9, asymp. sig. (2-sided) .027
N of valid cases: 26
Note: 15 cells (93.8%) have expected count less than 5. The minimum expected count is .04.

Q1B - The activities are fun * Q1C - I learn new things Crosstabulation

Q1B \ Q1C        (blank)   Kind of   Not Really   Yes   Total
(blank)             1         0          0          0      1
Kind of             0         0          1          2      3
Yes                 0         3          0         19     22
Total               1         3          1         21     26

Chi-Square Tests
Pearson Chi-Square: value 34.179, df 6, asymp. sig. (2-sided) .000
Likelihood Ratio: value 13.615, df 6, asymp. sig. (2-sided) .034
N of valid cases: 26
Note: 11 cells (91.7%) have expected count less than 5. The minimum expected count is .04.

Appendix 3 Nutrition Questionnaire Pre/Post Program 16

1) What have you eaten for breakfast in the last 24 hours? Please specify the quantity, e.g., 1
bowl/1 plate
2) What did you drink for breakfast in the last 24 hours? Please specify the number of
cups/bottles
3) What have you eaten for lunch in the last 24 hours? Please specify the quantity e.g. 1
bowl/1 plate
4) What did you drink for lunch in the last 24 hours? Please specify the number of
cups/bottles
5) What have you eaten for dinner in the last 24 hours? Please specify the quantity e.g. 1
bowl/1plate
6) What did you drink for dinner in the last 24 hours? Please specify the number of
cups/bottles
7) Do you usually have a snack in between meals? If yes, please specify the snack and the
quantity (e.g. 1 protein bar, 1 bag of chips, a cup of yogurt)

Appendix 4 Fitness Questionnaire Pre/Post Program17

1) Does your school have time allocated for recess on a daily basis? If yes, how long per
day?
2) What activities do you do during recess?
3) Do you participate in a school sports team or a local sports team outside of school during
your free time? If yes, how much time do you spend practicing per week? If the answer to
the first question is “no” please do not answer the second part of this question

16
Arthur Family Health. "Nutrition Quiz . Arthur." PBS KIDS, 2016
17
Arthur Family Health. "Fitness Quiz . Arthur." PBS KIDS, 2016

4) If your answer to question 3 is “no” what activities do you engage in during your free
time?
5) Do you participate in a physical education class at school on a daily basis? If yes, how
much time is allocated for P.E. class?
6) Do you enjoy your P.E. class?
