
Departmental Evaluation In Higher Education
Jennifer S. Riggleman
Research Paper submitted for
CI 676: Program Evaluation
at Marshall University
in partial fulfillment of the
requirements for the degree of
Doctor of Education
in
Educational Leadership
Ron Childress, Ed.D., Professor
Graduate School of Education and Professional Development
South Charleston, West Virginia 2012
Keywords: Higher education, assessment
Copyright 2013 by Jennifer Riggleman















Abstract

The purpose of the paper is to provide insight into departmental program evaluation in
higher education. The paper investigates Sport Science Departmental evaluation, and programs
using accreditation as evaluation standards. During the investigation into program evaluation,
the research showed that there is no standard template for how to perform these evaluations and
that there is flexibility within the evaluation process. Program evaluation is used to make
curricular, pedagogical, facility, and faculty adjustment based on the results of the evaluation.

Table of Contents
Departmental Evaluation In Higher Education ............................................................................... 1
Abstract ........................................................................................................................................... 2
Table of Contents ............................................................................................................................ 3
Introduction and Theoretical Framework ....................................................................................... 4
Statement of the Problem ................................................................................................................ 6
Purpose of the Study ....................................................................................................................... 8
Questions ........................................................................................................................................ 8
Delimitations ................................................................................................................................... 9
Significance of Study ...................................................................................................................... 9
Methods ........................................................................................................................................ 10
Literature Review ......................................................................................................................... 10
Streamlining the Process on a Global Scale ............................................................................. 10
Program Evaluation in West Virginia ....................................................................................... 11
Sport Science Programs ............................................................................................................ 12
Accreditation ............................................................................................................................. 15
Program Review ....................................................................................................................... 16
Conclusion .................................................................................................................................... 17
Discussion and Implications ......................................................................................................... 18
Implications for Further Research ................................................................................................ 18
References ..................................................................................................................................... 19




Introduction and Theoretical Framework

“The purpose of assessment is to produce feedback to the department, school/college, or
administrative unit on the performance of its curriculum, learning process, and/or services,
thereby allowing each unit to improve its programs. It is not an evaluation of individual students
or of individual faculty or staff” (Outcomes Assessment, 2011). This type of feedback is the
reason individual departments in higher education need to conduct annual program evaluations.
This method of assessment gives a department an overall perspective on how its program is
performing. Departmental evaluation indicates whether students are showing increased
knowledge of the curricular content and whether departmental goals are being met by
quantifying measurable learning objectives. Departmental evaluation provides data for reflection
to determine what is working and what is not. Each department can then make the appropriate
changes to address identified gaps. Evaluation is essential to determining departmental
effectiveness.
The Sport Science Department at Davis and Elkins College (D&E) has changed and
evolved over the years. Currently there are three majors within this department: Physical
Education, Exercise Science, and Sport Management. The 83 students are relatively evenly
distributed among the three majors. The ultimate departmental goal is to have all 83 majors
complete their coursework, graduate, and find jobs within their fields. Another desired outcome
is to mold and shape their character and to develop responsibility, professionalism, and maturity
during their four years at D&E.
“The quality of teaching and learning can be seen from the perspective of goals, the
process deployed to achieve these, and how far they are achieved” (Mohr, 1995). That
perspective encapsulates the basis for departmental program evaluation in higher

education. There are multiple reasons to assess the quality of teaching and learning, and
program evaluation is a critical way to do that assessment and then compare the results to
program outcomes.
Evaluation has multiple purposes (Bloom, 2013). Program evaluation results can provide
a basis for educated decisions about entering a particular school. Bloom suggests that programs
do not necessarily get a “grade” but the results of their program evaluation are used to gauge
student achievement and for program improvement. Bloom also believes public knowledge of
departmental evaluation outcomes can positively affect programs, curriculum, and staffing, and
has the potential to hold those programs accountable when potential students use past program
performance to make decisions about entry into the program.


Statement of the Problem

“Let us define a problem relative to a given program as some predicted condition that
will be unsatisfactory without the intervention of the program and satisfactory, or at least more
acceptable, given the program’s intervention” (Mohr, 1995). Viewed in terms of this definition,
program evaluation ultimately strives to determine the impact of the program on its participants.
The process begins by determining who is involved and at what level those participants begin. It
also determines how to quantify where students finish, how they got there, and what the desired
outcomes are, so that departments can identify whether their programs really do make a
difference as intended. Are students changed by their involvement in the programs? Do
departments see changes or met expectations in their students that cannot be explained except
through the completion of requirements in a given department? Those are the relevant questions
in program evaluation.
The primary departmental problem in higher education is this: are students learning
quality content that they can take into the workplace? How does a department work with
individual students to make sure that they are advised well, take all required content, and apply
that content appropriately so they will be ready to function effectively in the workplace?
The Sport Science Department at D&E has a threefold purpose:
“(1) To provide leadership and facilities for a regular program of instruction and
participation in physical activity that will improve the understanding and skill level of all
students in a variety of physical activities suitable for both present needs and for lifetime
participation; assist individuals in gaining better understanding of the general principles and

concepts involved in the fundamentals of effective movement patterns; encourage individuals to
improve and maintain physical fitness,
(2) To prepare students for teaching careers in physical education, or for careers in such
related areas as coaching, youth work, various health fields, sport and athletic programs, and
management of sport and athletic related businesses, and:
(3) To provide working knowledge of the physiology of exercise and exercise testing and
prescription so that graduates in Exercise Science may pursue employment in public and
privately owned wellness and fitness centers, corporate fitness facilities, and clinical therapeutic
programs, and be well prepared to continue their formal education at the graduate level” (The
College Catalog, 2012). Departmental purposes (2) and (3) deal specifically with the majors.
The first purpose addresses the general education goal that the department meets: every student,
regardless of major, is required to complete 2 credit hours of physical education.
The program evaluation needs to address both general education and the content for
majors. One departmental goal is that if the general education component properly instructs and
encourages students through physical activity and health content, then students will understand
the value of maintaining lifelong fitness. If the department uses a current and applicable
curriculum, then students will be prepared for employment in health occupations, as teachers,
coaches, or in sport and athletic business management positions. If current technologies and
information are presented to Exercise Science students, then they will be able to pursue graduate
education or employment in fitness or clinical tracks of the discipline.


Purpose of the Study

The primary purpose of this study was to compare evaluation criteria used in the Sport
Science Department at Davis and Elkins College with criteria from other higher education
evaluation models in assessing teaching, learning, growth, and achievement of objectives as
identified by the institution. The focus was on evaluations tied to institutional objectives. The
study investigated learning achieved by the student, application of knowledge by the student,
content delivery by professors, and analysis of departmental goals. Quality program evaluation
is being scrutinized at the institutional level as well as for the purposes of accreditation;
therefore, careful analysis of how it is executed becomes extremely important. This study
examined the overall evaluation of exercise science departments, including curriculum and
pedagogy.


Questions
Two primary questions will be addressed in this study.
1. How is departmental program evaluation conducted in institutions of higher education?
2. How do results of program evaluation affect continuous improvement among
departments?





Delimitations
The study was limited to institutions of higher learning, with a focus on the differences
between higher education institutions.

Significance of Study

This research will add to existing knowledge of program evaluation and could potentially
influence current methods used to evaluate specific programs and certainly the Sport Science
Department at Davis and Elkins College. This study will be of value to professional peers who
are all required to do program evaluation for accreditation purposes. The potential findings of
the study may impact curricular decisions within the department, specific topics covered within
individual courses, student advisement, and policy development to streamline educational
pathways of students. Elder, Pujol, and Barnes (2003), in their study of 235 institutions with
undergraduate Exercise Science programs, found that the programs used various common
categories to develop program evaluations. These common standards were developed by
professional organizations, and the programs evaluated their curricula accordingly. Elder, Pujol,
and Barnes also found that most programs did not focus enough attention on health promotion,
and many did not require field experience.



Methods
This study used sources from journals and databases as well as college websites to
compile data related to how program evaluation is conducted at different institutions.
Information was also gathered from the institutional websites as to what changes the program
evaluation prompted in specific departments.

Literature Review
This review of the literature examined departmental program evaluation at various
institutions of higher education. The review examined how selected West Virginia colleges
conduct evaluation, how accreditation plays a role in evaluation, and what evaluation looks like
at selected institutions across the country. The results of program evaluation were examined to
determine how the findings are used.
Streamlining the Process on a Global Scale
Effective program evaluation looks at a specific program and bases evaluation on some
criterion, so conclusions can be formed, and standards, or levels of requirements and conditions
that need to be met, can be identified for accreditation purposes (Hamalainen, 2003). Program
evaluation is conducted to assess the quality of teaching and learning within an institution.
Hamalainen has even suggested that there is a need for a global “Common Core” of evaluation
criteria and standards for institutions of higher education. This “Common Core” could
streamline the accreditation process from one country to another. Leathwood and Phillips (2000)
agree with Hamalainen and state that program evaluation needs to be more standardized due to
accreditation, global competition, and accountability of programs.

Program Evaluation in West Virginia
Fairmont State College, located in Fairmont, WV, has posted findings of the program
evaluation for its BSN nursing program (Program Review, 2007). Evaluation of their
students’ success includes performance evaluations on standardized national exams, oral
presentations, student evaluations from external sources, participation, and scoring a “C” or
better in all coursework. Goals and objectives were established based on national standards.
Curricular changes were noted based on student input from previous years. The program was
compared with similar programs within the state. Performance of graduates was assessed based
on a survey sent out to employers of their graduates 6 months after employment. The
departmental results were positive and showed success in the outcomes measured. There was
little discussion of changes, future development, or use of assessment measures.
Blue Ridge Community and Technical College (BRCTC) established an assessment
committee to develop methods for assessing student learning and effectiveness of college
services for students. In responding to the Higher Learning Commission, BRCTC has spent a
great deal of time looking at individual programs and assessing outcomes for general education
and degree courses (Chapter 2 - Progress since the last visitation, 2007). Schools that function
under the West Virginia Higher Education Policy Commission (WVHEPC), such as Glenville
State College, must have every program evaluated by the WVHEPC (Glenville State College
Policies: Academic Policy 26). These evaluations are based on viability, adequacy, necessity,
and consistency with the institutional mission. This review is completed by each degree-granting
program as a self-study conducted every 5 years. The self-study is then presented to the Program
Advisory Board for further review and recommendations.
The Exercise Science degree program at Marshall University has undergone changes in
the recent past (Annual Assessment Report, 2009). It is now a stand-alone degree with a new

mission statement. These changes were enacted in response to guidelines established by the
Commission on the Accreditation of Allied Health Education Programs and the Committee on
Accreditation for the Exercise Sciences. The program lists eight program objectives and five
student learning outcomes. Each of these objectives and outcomes is assessed based on student
performance on multiple assignments in specified classes as well as clinical assessments.
Internal and external evaluation reports are prepared on faculty and the program. Surveys are
sent to employers one year following the employment of graduates. Marshall’s report provided
significant structure for its evaluation strategies but did not report findings or changes made as a
result of the findings.
Sport Science Programs
The University of Minnesota at Duluth bases its program evaluation on the fact that its
students pass National Strength and Conditioning Association and American College of Sports
Medicine certification exams, or pass tests such as the MCAT (Medical College Admission Test)
or GRE (Graduate Record Examinations), which allows them to compete successfully for entry
into graduate and professional programs nationwide (Department of Health, Physical Education,
and Exercise Science, 2013). Thus the program is evaluated based solely on the success or
failure of its graduates in their endeavors following completion of coursework. The content has
to have been covered in order for graduates to succeed on these national exams, and the
curriculum can be changed or modified based on discrepancies identified between the exam
content and the curricular content.

The South Dakota State University (SDSU) Exercise Science Department aims to prepare
students in the cognitive, affective, and psychomotor domains of learning so that they will be

able to assist others in championing healthy, active lifestyles (South Dakota State Exercise
Science Program, 2012). The program is accredited through the Commission on Accreditation
of Allied Health Education Programs and is the only program in the state to have such
accreditation. Each student is given the opportunity to sit for the American College of Sports
Medicine (ACSM) Health and Fitness Instructor exam during the senior year. SDSU has a pass
rate of 95%, considerably above the national average of 58%. Five program goals and four
program outcomes are listed in the assessment. SDSU’s assessment report is a good resource on
the program itself, but it does not describe how program evaluation or assessment is conducted
to ascertain whether the program goals and outcomes are met.
Central Connecticut State University’s Physical Education and Human Performance
departmental evaluation states five principles that guide the program and eight learning outcomes
(Department of Physical Education and Human Performance Annual Report, 2011-2012).
principles and outcomes include the program providing coursework and experiences to create
competent physical education teachers. CCSU states that significant curricular changes are being
made to accommodate new State of Connecticut requirements stemming from recent NCATE
revisions. Sections of courses are being added to adjust to the increasing number of students in
the Exercise Science major. CCSU’s evaluation addresses faculty accomplishments for the year
and reports where the department stands on goals established in the previous and current years.
Western Kentucky University conducts academic program review every 6 years for every
institutional department (Academic Program Review, 2013). The WKU evaluation design aims
to give each department an opportunity to analyze enrollment trends, student success, and
resourcing. The goal of WKU’s evaluation is continuous improvement that adds to

institutional effectiveness. Pacific University of Oregon evaluates its education department by
evaluating its program, its faculty, and its student candidates. The review states that this is
conducted in multiple ways, and the data are then sent to the school’s Assessment Coordinator
and its NCATE coordinator (Part III: Evidence for Meeting Each Standard, 2007). Programs are
evaluated by candidates, mentors, principals or employers, and alumni. Current responses show
95% satisfaction in the education program.
Coastal Carolina University’s Exercise and Sport program lists seven student learning
outcomes that students are expected to have mastered upon completion of the program (Exercise
and Sport Science Major, 2012). The program’s mission statement is to equip students with the
strategies and skills needed in the workplace, including assessing physical activity and exercise
programs, promoting healthy lifestyles, and enhancing quality of life.
Montclair State University does not have individual programs conduct evaluation;
instead, the institution evaluates its entire curriculum. The University has multiple accrediting
bodies for individual programs, and the institution is accredited by the Commission on Higher
Education of the Middle States Association of Colleges and Schools. Its Exercise Science
program is accredited by the Commission on Accreditation of Allied Health Education
Programs, and its Physical Education program by the National Council for Accreditation of
Teacher Education (NCATE). Program review standards function to maintain specific
accreditation (Annual Institutional Program Report, 2012).
Lehman College lists four primary goals for its Exercise Science students. Specific
learning objectives quantifying the goals are listed for each one. The goals are specific, but there
is no information in the report stating how the goals were evaluated (Exercise Science, 2010).

Chatham University’s Exercise Science program lists seven areas of knowledge that its
students should master by graduation (Exercise Science Learning Outcomes, 2013).
Achievement of each of these objectives is further defined by the required classes in the
curriculum. For example, Exercise Science 101 meets the Writing, Oral Communication, and
Critical Thinking outcomes.
Winston-Salem State University defines six learning outcomes for successful completion
of its exercise science major (Major Program In Exercise Science, 2012). These objectives
include gaining, demonstrating, and applying knowledge about different categories of the
discipline. Iowa State University’s College of Human Sciences identifies areas such as
biomechanics, physiology, fitness, psychology, medical exercise, and health as objective
categories for majors to accomplish (Exercise Science, 2013). Field experience is also required
and assessed following the completion of 320 hours. Nathan Weiss Graduate College prepares
students for employment in professional areas such as wellness centers, hospitals, professional
athletics, sport management, and school districts, among others (Program Description, 2013).
Each student must meet the objectives designed by the department as well as maintain a 3.0
GPA.
Accreditation

Florida Gulf Coast University’s Exercise Science, Athletic Training, and Physical
Therapy programs are all accredited, so program evaluation is assessed through the accreditation
process (Department of Physical Therapy and Human Performance, 2012).
Exercise Science is recognized by the National Strength and Conditioning Association. Athletic
training is accredited by the Commission on Accreditation of Athletic Training Education, and
the doctoral Physical Therapy Program is accredited through the Commission on Accreditation
in Physical Therapy Education.

Boise State University (BSU) is accredited through the Northwest Commission on
Colleges and Universities (Accreditation, 2013). Several of its individual programs are
accredited by discipline-specific organizations, so program review is conducted according to
those criteria. The Exercise Science major is accredited by the Commission on Accreditation of
Allied Health Education Programs and BSU’s Athletic Training major is accredited by the
Commission on Accreditation of Athletic Training Education.

Program Review
At the University of New Hampshire, program review is conducted on individual
departments once every 10 years at the undergraduate level. Self-evaluations are given during
the interim, and external reviews are performed periodically (Program Review, 2011).
The University of Puget Sound adopted a program evaluation procedure in which each
department is evaluated every five years (Department and Program Curriculum Review, 2006).
The evaluation includes assessment of curricula, pedagogy, ongoing assessment, quality, and
integrity. Each department can design its own evaluation but must address 11 review questions
designed by the University. These questions address the learning outcomes that the University
has identified as evaluation criteria.
Program review requirements for the University of West Florida follow a prescribed
University template. The template includes eight core requirements including such things as
strengths and weaknesses, changes since the previous review, and recommendations or actions as
a result of the review. The assessment committee then evaluates the review according to a
matrix assessing (1) Mission Fit, (2) Educational Quality, (3) Assessment Planning, (4)
Operational Quality, (5) Strategic Planning, (6) Faculty Quality, (7) Cost Recovery, (8)

Enrollment History, and (9) Market Projections. The department is given a score based on the
results of the review by the committee (Program Review Final Report Template, 2011).
SUNY Fredonia’s program evaluation occurs once every 5 years. Each department is
required to complete a self-study and an external evaluation (Guide for Periodic Evaluation of
Academic Program, 2011). The self-study must include departmental assessment of curriculum,
students, faculty, resources, and recommendations.

Conclusion
Program evaluation for specific departments in higher education takes many forms.
Some institutions evaluate annually; others, every 5 to 10 years. Evaluation addresses criteria
such as learning outcomes, objectives, faculty, course pedagogy, and goals and
recommendations. Program evaluation does not follow a specific pattern or universal template.
Most institutions provide broad, general topics that must be addressed, but the format differs
from one college to another.
Departmental program evaluation is often defined by accreditation standards. Each
professional organization that accredits programs has a unique set of standards that dictates the
evaluation process. When a department does not meet the expectations of the accrediting body,
it will not receive accreditation. Those requirements become the program evaluation for
departments seeking accreditation.
The most significant finding is that departmental program evaluation criteria vary from
institution to institution. There are common standards, but no specific template for Exercise
Science or any other discipline was found. Regardless of the evaluation plan that was
developed, programs reported using the results of their evaluation to enact changes in their

program. They used the findings to make curriculum changes, to address pedagogical changes,
to assess facility and resource needs, and to properly equip faculty. Many programs also assess
faculty contributions throughout the academic year.
Departments are free to design their own assessment as long as accreditation is not
involved in the process. The presence of accreditation standards results in a more prescribed
process. The results of departmental program evaluation are beneficial to prompt changes that
make the programs more efficient, effective, and accountable.

Discussion and Implications

There are varied types of departmental evaluation. Programs that are not seeking
accreditation have the freedom to develop their own evaluative standards, though they must still
adhere to university policies and work within discipline-specific criteria to establish
accountability. Successful program evaluation also provides departments with goals and areas
toward which they can work. The criteria differ from institution to institution; however, the
emphasis is on the results and the implications of those criteria. Therefore, for departments to be
accountable to the results of program evaluation, the evaluation has to be properly developed
first.

Implications for Further Research

Based on the findings of this research, it would be beneficial to identify any universal
standards for evaluating programs. Having such standards would allow students looking into

programs to know that they were comparing apples to apples. They would also be able to
evaluate overall scores at different schools. The same criteria that apply to one institution would
apply to all institutions. Institutions are evaluating programs, but most are not publishing the
results. Institutions of comparable size could be compared in terms of their program evaluation.

















References

Department and Program Curriculum Review. (2006). Retrieved March 28, 2013, from
University of Puget Sound: http://www.pugetsound.edu/academics/faculty--staff-
resources/curriculum-guidelines--forms/curriculum-review/

Chapter 2 - Progress since the last visitation. (2007). Retrieved March 29, 2013, from Blue
Ridge Community and Technical College: http://www.blueridgectc.edu/wordpress/wp-
content/uploads/2012/08/4-CHAPTER2.pdf
Part III: Evidence for Meeting Each Standard. (2007). Retrieved March 28, 2013, from Pacific
University of Oregon: http://www.pacificu.edu/coe/ncate/standard2element2.cfm
Program Review. (2007). Retrieved March 30, 2013, from Fairmont State:
http://www.fairmontstate.edu/files/bog_fsu/programreviews/2007_bs_nursing.pdf
Annual Assessment Report . (2009). Retrieved March 30, 2013, from Marshall
University/Assessment:
http://www.marshall.edu/assessment/AssessmentReports2009/BS_ES(SOK)_AR_09.pdf
Exercise Science. (2010, February). Retrieved March 27, 2013, from Lehman College:
http://www.lehman.edu/research/assessment/documents/Exercise_GO_000.pdf
Guide for Periodic Evaluation of Academic Program. (2011). Retrieved April 1, 2013, from
SUNY Fredonia:
http://www.fredonia.edu/search/results/?cx=001498624467528633936%3Agt_dejxqt40&
cof=FORID%3A11&q=program+review&sa=Search
Outcomes Assessment. (2011). Retrieved from University of Wisconsin-Madison:
http://www.provost.wisc.edu/assessment/manual/manual1.html
Program Review. (2011). Retrieved March 28, 2013, from University of New Hampshire:
http://unh.edu/institutional-research/sites/unh.edu.institutional-
research/files/schedule%20new%20format%203-13.pdf
Program Review Final Report Template. (2011). Retrieved March 30, 2013, from University of
West Florida: http://uwf.edu/academic/programs/review.cfm
Department of Physical Education and Human Performance Annual Report. (2011-2012).
Retrieved March 30, 2013, from Central Connecticut State University:
http://www.ccsu.edu/uploaded/departments/AdministrativeDepartments/Institutional_Res
earch_and_Assessment/Annual_Reports/2011-2012/Shared/PEdHP2012.pdf
Annual Institutional Program Report. (2012). Retrieved March 28, 2013, from
http://www.montclair.edu/oit/institutionalresearch/AssessmentDocs/Excellence/MSU%2
0Institutional%20Profile%202012.pdf

Department of Physical Therapy and Human Performance. (2012). Retrieved March 29, 2013,
from Florida Gulf Coast University: http://www.fgcu.edu/CHPSW/PT/index.html
Exercise and Sport Science Major. (2012). Retrieved March 30, 2013, from Coastal Carolina
University: http://www.coastal.edu/hkss/EXSS_major.pdf
Major Program In Exercise Science. (2012). Retrieved April 1, 2013, from Winston-Salem State
University: http://www.wssu.edu/sehp/departments-and-
programs/hpss/documents/exercise-science-catalog.pdf
South Dakota State Exercise Science Program. (2012). Retrieved March 29, 2013, from South
Dakota State Undergraduate Programs: http://www.sdstate.edu/hns/undergrad-
program/upload/Exercise-Science-Manual.pdf
The College Catalog. (2012). Davis and Elkins College: The College Catalog 2012-2013. Elkins,
WV, USA.
Academic Program Review. (2013, March 7). Retrieved March 29, 2013, from Western
Kentucky University: http://www.wku.edu/academicaffairs/ee/program_review.php
Accreditation. (2013). Retrieved March 30, 2013, from Boise State University:
http://registrar.boisestate.edu/catalogs/online/accreditation.shtml
Department of Health, Physical Education, and Exercise Science. (2013, January 18). Retrieved
March 30, 2013, from University of Minnesota at Duluth:
http://www.d.umn.edu/hper/majors/exercise_science/index.html
Exercise Science. (2013). Retrieved April 1, 2013, from Iowa State University:
http://www.kin.hs.iastate.edu/programs/kinesiology-health/exercise-science/
Exercise Science Learning Outcomes. (2013). Retrieved March 30, 2013, from Chatham
University:
http://www.chatham.edu/academics/programs/undergraduate/exercise/outcomes.cfm
Program Description. (2013). Retrieved April 1, 2013, from Nathan Weiss Graduate College:
http://grad.kean.edu/masters-programs/exercise-science
Bloom, M. (2013, January 22). Five Things to Learn From Ohio’s New Teacher Preparation
Program Evaluations. Retrieved from State Impact: Eye on Education:
http://stateimpact.npr.org/ohio/2013/01/22/five-things-to-learn-from-ohios-new-teacher-
preparation-program-evaluations/

Elder, C., Pujol, T., & Barnes, J. (2003). An analysis of undergraduate exercise science
programs: exercise science curriculum survey. Journal of Strength and Conditioning
Research, 17(3), 536-540.
Glenville State College Policies: Academic Policy 26. (n.d.). Retrieved March 29, 2013, from
Glenville State College: http://www.glenville.edu/docs/BOG_policy_026.pdf
Hamalainen, K. (2003). Common Standards for Programme Evaluation and Accreditation?
European Journal of Education, 38(3), 291-300.
Leathwood, C., & Phillips, D. (2000). Developing curriculum evaluation research in higher
education: Process, politics, and practicalities. Higher Education, 40, 313-330.
Mohr, L. (1995). Impact Analysis for Program Evaluation. Thousand Oaks, CA: SAGE
Publishing.