MEDT 8480
Overall Comments:
(For the purposes of the summaries below, relevance levels of Hardly and Fairly are deemed not relevant; relevance
levels of Clearly and Highly are deemed relevant.)
Utility - Of the eight Utility standards, U1, U2, U3, U4, and U5 were relevant, and most of these were partially met. The
evaluator background description supports the credibility of each of the evaluators. However, there is some confusion
as to who exactly is a stakeholder in this evaluation plan as well as confusion as to who the client is. Including a logic
model should address some of these questions and will also provide a greater understanding as to the activities planned
to support this evaluation effort. Determining specifically who will be using the evaluation will make it possible to
ensure that the purpose of the evaluation correlates to stakeholder needs and specifies the value placed on the
program by each stakeholder group.
Feasibility - Of the four Feasibility standards, F1, F2, and F4 were relevant and were partially or fully met. Some
adjustments are needed. For F1, we suggest including additional information on how data will be recorded, analyzed,
and included in the evaluation document for audience review. F2 focuses on effective program operations; however, the
lack of a logic model creates gaps in the audience's knowledge of the GaVS program. Program operation information is
necessary for the audience to determine the practicality of the evaluation procedures.
Propriety - Of the seven Propriety standards, P1-P5 were relevant and were either partially or fully met. Some
adjustments are needed. P1 focuses on being responsive to stakeholders, and we feel additional stakeholders should be
added to the one listed in the evaluation. Standard P4 could be more appropriately addressed by reexamining the
method mentioned for score comparison: actual scores and overall scores discussed in an interview may not be
sufficiently comparable. We also suggest clarifying the term "impact" in your final conclusions to detail more
specifically for the audience what the evaluators are analyzing (what will be considered impactful or not?).
Accuracy - Of the eight Accuracy standards, four were not met, but these were also low in relevance. Of the four that
were relevant, three were partially met and one was not met at all. It is reasonable to mark standards A7 and A8 as not
met, considering this metaevaluation is being conducted before the actual evaluation takes place; however, additional
work is clearly needed on standards A3 and A5. A3 does not clearly outline who has requested the evaluation and who
will ultimately use the findings for the benefit of the program. We suggest that the evaluators establish that the
data be organized and presented for a student audience in such a way that it could be used as a recruitment tool to
bring additional students to enroll in GA Virtual School. Furthermore, additional work is needed in the Data and
Information section of the evaluation plan to improve adherence to A5. For example, that section states how
information will be collected, but no appendices show what questions will be used or how they will be formatted. We
suggest not only adding these to the Data and Information section, but also mentioning how data will be reported to
stakeholders, whether through tables in the appendix or graphs within the evaluation report.
Evaluation Accountability - All three Evaluation Accountability standards were ranked highly relevant to the
evaluation, and all were either partially or fully met. Few additions are needed. E2 and E3 were met by the current
metaevaluation process. The E1 standard of documenting procedures, data, and outcomes is only vaguely addressed, as
no survey questions are provided and it is unclear how the data and outcomes will be documented, compared/analyzed,
and distributed. Also, more information on who will be using the information is needed in the purpose section.
Overall, the evaluation plan shows merit for determining how the Georgia Virtual School program compares to a
traditional classroom setting, as well as for providing insight into the course offerings of Georgia Virtual School and
the reasons why students may choose to take courses online. However, several details have been left out, including the
program logic model and survey instruments, which would be useful in critiquing the effectiveness of the evaluation
plan. The main criticism of the plan is a lack of organizational clarity as to who is asking for the evaluation and how
this information will be used in the future. In addition, the analysis of collected data could benefit from statistical
analysis beyond calculating means and making numerical comparisons. Further refinement of the sampling techniques
may be required depending on the target audience of the evaluation report. The basic plan for a quality evaluation is in
place, but many more details need to be included and/or expanded upon.
Additional Feedback:
The Logic Model is neither included nor described in the Description section. Including this model could help the
audience better understand the overall evaluation of the GaVS program. The overall formatting of the paper is neat
and clear; however, we suggest referring to the APA handbook for formatting requirements such as the cover page, page
numbers, etc. This will add a professional look to your hard work. We also suggest double-checking some of the
grammar and sentence structure, specifically in the Evaluation Purpose section, to create a smoother reading
experience for the audience.
STANDARDS | STANDARD STATEMENTS | RELEVANCE TO THE EVALUATION (CHECK ONE: HARDLY / FAIRLY / CLEARLY / HIGHLY) | QUALITATIVE FEEDBACK/COMMENTS
UTILITY
U1 Evaluator Credibility | Evaluations should be conducted by qualified people who establish and maintain credibility in the evaluation context. | Relevance: [X] | Fully Met - Your description of each evaluator in the Background section of the paper thoroughly examines the credibility of each evaluator by listing prior education, prior educational work experience, and current employment in the educational field.
E2 Internal Metaevaluation | Evaluators should use these and other applicable standards to examine the accountability of the evaluation design, procedures employed, information collected, and outcomes. | Relevance: [X] | Fully Met - The metaevaluation is being performed by four credible persons who will review the evaluation according to each standard, and the evaluators will then employ the suggestions to better the quality of the overall evaluation.
E3 External Metaevaluation | Program evaluation sponsors, clients, evaluators, and other stakeholders should encourage the conduct of external metaevaluations using these and other applicable standards. | Relevance: [X] | Fully Met - The metaevaluation is being performed by four credible persons who will review the evaluation according to each standard.