
Qualitative Metaevaluation Form Using The Program Evaluation Standards, 3rd Edition

MEDT 8480

Evaluator: Group 10: Emma Hoerger, Doug Curtright, Jacqueline Harris
Metaevaluators: JB Campbell, Dane Forlenza, Tammy Revelle, Megan Standley (4/9/17)

Overall Comments:
(For the purposes of the below summaries, relevance levels of Hardly and Fairly are deemed as not relevant. Relevance
levels of Clearly and Highly are deemed as relevant.)

Utility - Of the eight Utility standards, U1, U2, U3, U4, and U5 were relevant, and most of these were partially met. The
evaluator background description supports the credibility of each of the evaluators. However, there is some confusion
as to who exactly is a stakeholder in this evaluation plan as well as confusion as to who the client is. Including a logic
model should address some of these questions and will also provide a greater understanding as to the activities planned
to support this evaluation effort. Determining specifically who will be using the evaluation will make it possible to
ensure that the purpose of the evaluation correlates to stakeholder needs and specifies the value placed on the
program by each stakeholder group.

Feasibility - Of the four Feasibility standards, F1, F2, and F4 were relevant and were partially or fully met. Some
adjustments are needed. F1 suggests the inclusion of additional information on how data will be recorded, analyzed,
and included in the evaluation document for audience review. F2 focuses on effective program operations; however, the lack
of a logic model creates gaps in the audience's knowledge of the GaVS program. Program operation information is necessary
for the audience to determine the practicality of the evaluation procedures.

Propriety - Of the seven Propriety standards, P1-P5 were relevant and were either partially or fully met. Some
adjustments are needed. P1 focuses on being responsive to stakeholders, and we feel additional stakeholders should be
added to the one listed in the evaluation. Standard P4 could be more appropriately covered by checking the method
mentioned for score comparison: comparing actual scores to overall scores discussed in an interview may not be
sufficiently comparable. We also suggest clarifying the term "impact" in regard to your final conclusions to more
specifically detail to the audience what the evaluators are analyzing (what will be considered impactful or not?).

Accuracy - Of the eight Accuracy standards, four were not met, but these were also low in relevance. Of the four that were
relevant, three were partially met and one was not met at all. It is reasonable to mark standards A7 and A8 as not being met
considering this metaevaluation is being conducted before the actual evaluation takes place; however, additional work
is clearly needed on standards A3 and A5. A3 does not clearly outline who has requested the evaluation and who
ultimately will be using the findings for the benefit of the program. We suggest that the evaluators establish that the
data be organized and presented for a student audience in such a way that it could be used as a recruitment tool to
bring additional students to enroll in GA Virtual School. Furthermore, additional work needs to be done in
the Data and Information section of the evaluation plan to improve adherence to A5. For example, that section
states how information will be collected, but there are no appendices showing what questions will be used and how
they will be formatted. We suggest not only adding these to the data and information section, but also
mentioning how data will be reported to stakeholders, such as through tables in the appendix or graphs within the
evaluation report.
Evaluation Accountability - Of the three Evaluation Accountability standards, all three were ranked highly relevant to
the evaluation, and all were either partially or fully met. Few additions are needed. E2 and E3 were met due to the current
metaevaluation process. The E1 standard of documenting procedures, data, and outcomes is vague, as there are no survey
questions provided and it is unclear how the data and outcomes will be documented, compared/analyzed, and
distributed. Also, more information on who will be using the information is needed in the purpose section.

Overall, the evaluation plan shows merit for determining how the Georgia Virtual School program compares to a
traditional classroom setting as well as providing insight into the course offerings of Georgia Virtual School and the
reasons why students may choose to take courses online. However, several details have been left out, including the
program logic model and survey instruments, which would be useful in critiquing the effectiveness of the evaluation
plan. The main criticism of the plan is a lack of organizational structure as to who is asking for the evaluation and how
this information will be used in the future. In addition, the analysis of the collected data could benefit from further statistical
analysis beyond calculating means and making numerical comparisons. Further refinement of sampling techniques could be
required depending upon the target audience of the evaluation report. The basic plan for a quality evaluation is in place,
but many more details need to be included and/or expanded on.

Additional Feedback:

The Logic Model is neither included, nor described in the Description section. The inclusion of this model could help the
audience to better understand the overall evaluation of the GaVS program. The overall formatting of the paper is neat
and clear; however, we suggest referring to the APA handbook for formatting requirements such as cover page, page
numbers, etc. This will add a professional look to your hard work. Also, we suggest double-checking some of the
grammar and sentence structure, specifically in the Evaluation Purpose section, in order to create a smoother reading
experience for the audience.
STANDARDS   STANDARD STATEMENTS   RELEVANCE TO THE EVALUATION (CHECK ONE: HARDLY / FAIRLY / CLEARLY / HIGHLY)   QUALITATIVE FEEDBACK/COMMENTS
UTILITY
U1 Evaluator Evaluations should be conducted by Fully Met
Credibility qualified people who establish and
maintain credibility in the evaluation Your description of each evaluator in the Background section of the
context. X paper thoroughly examines the credibility of each evaluator by listing
prior education, prior educational work experience, and current
employment in the educational field.

U2 Attention to Evaluations should devote attention to Partially Met


Stakeholders the full range of individuals and groups
invested in the program and affected In the Background section of the plan, the only stakeholders mentioned
by its evaluation. are the students participating in GaVS. We feel there are others who
X should be included in the "full range of individuals...invested...and
affected," such as GaVS administrators, teachers, etc.

U3 Negotiated Evaluation purposes should be Partially Met


Purposes identified and continually negotiated
based on the needs of stakeholders. The evaluators explain clearly that the evaluation's purpose is to
X uncover both the strengths and weaknesses of the program and to
ultimately determine the effectiveness of the program's original goals.
This ultimately is a solid summative evaluation question, but the
evaluators never specify what the original program goals of Georgia
Virtual School were in the first place or what the stakeholder needs are.
Perhaps in order to fully meet this standard the evaluators should
include the original mission or vision statements of GA Virtual Schools.
There is no original program goal listed therefore it is not clear if the
goal is being met.

If the evaluation plan is meant to determine whether to continue using GAVS for a
specific school or system, then this should be stated.

U4 Explicit Values Evaluations should clarify and specify Partially Met


the individual and cultural values
underpinning purposes, processes, and X In the evaluation purpose section, it is stated that the evaluation will
judgments. provide stakeholders with key information about the effectiveness of
GAVS. Without seeing the survey that students will complete, it is
difficult to determine what value the stakeholders place on this
program. The same holds true for administrators who use GAVS. It will
be important to consider the value placed on the program by both
groups of stakeholders in order to determine if the program truly meets
the needs of the stakeholders.

U5 Relevant Evaluation information should serve Partially Met


Information the identified and emergent needs of
stakeholders. X The evaluation questions are clearly stated in the purpose of the
evaluation section. The information gained from these questions will be
useful in evaluating the program, but is this the information students
want to know about GAVS? This type of data seems best suited to serve
administrators who may be wavering on using the program in their
schools. What information will students want to know about GAVS?

U6 Meaningful Evaluations should construct activities, Not Met


Processes and descriptions, and judgments in ways X
Products that encourage participants to The only activities mentioned in the plan so far are found in the
rediscover, reinterpret, or revise their methodology section and reference the surveys completed by both the
understandings and behaviors. administrator and students of GAVS. These surveys may not have any
bearing on the operations of the program, and most likely will not
change participant behaviors in any way. One possible solution would
be to have students determine how GAVS has impacted their education
to date and what the consequences would be if GAVS was not an
option.
U7 Timely and Evaluations should attend to the X Partially Met
Appropriate continuing information needs of their
Communicating multiple audiences. This evaluation plan, as presented in the program description, does not
and Reporting seem time sensitive and has limited need of communication between
evaluators and student stakeholders. However, if additional
stakeholders were brought into the evaluation plan, then this standard
would be of more importance. The school system officials would need
timely updates about the progress of the evaluation in order to
determine when they could make a decision.

U8 Concern for Evaluations should promote X Not Met


Consequences responsible and adaptive use while
and Influence guarding against unintended negative The greatest concern or unintended consequence that could arise from
consequences and misuse. this evaluation is a negative view of GAVS course offerings by a school
or system, which could then lead to a local ban on these courses. This
would only be a local decision and would not impact GAVS as an entity,
but could have a negative impact on graduation rates. To guard against
this, attention should be given to the academic type of student that is
enrolling in GAVS and the reasons behind their decision.

STANDARDS   STANDARD STATEMENTS   RELEVANCE TO THE EVALUATION (CHECK ONE: HARDLY / FAIRLY / CLEARLY / HIGHLY)   QUALITATIVE FEEDBACK/COMMENTS
FEASIBILITY
F1 Project Evaluations should use effective Partially Met
Management project management strategies.
X The Methodology section clearly states effective and efficient project
management strategies such as obtaining data from the GaDOE which
will be timely, using an online survey tool which will be timely and
accurate, and using a random sampling technique which will be effective
and appropriate for data comparison. Little information is provided on
how surveys will be distributed and data recorded and analyzed.

F2 Practical Evaluation procedures should be X Partially Met


Procedures practical and responsive to the way
the program operates. Reasons as to why the program was put into place can be found in the
background description; however, there is no clear understanding as to
how the actual program operates. A description of such would be useful
for the audience to know if the evaluation procedures are practical.

F3 Contextual Evaluations should recognize, monitor, Not Met


Viability and balance the cultural and political
interests and needs of individuals and X This is unclear because there is no mention of the cultural or political
groups. interests. It is made clear that student participation in the survey is
totally voluntary and that students will not be penalized for non-participation.

F4 Resource Use Evaluations should use resources Met


effectively and efficiently. X
Resources mentioned in Methodology are used effectively and
efficiently; however, you might consider surveying Georgia traditional
middle schools in addition to Georgia high schools as both grade levels
are part of the program.
STANDARDS   STANDARD STATEMENTS   RELEVANCE TO THE EVALUATION (CHECK ONE: HARDLY / FAIRLY / CLEARLY / HIGHLY)   QUALITATIVE FEEDBACK/COMMENTS
PROPRIETY
P1 Responsive and Evaluations should be responsive to Partially Met
Inclusive stakeholders and their communities.
Orientation The evaluators explain in their methodology section how data will be
obtained from a GA Virtual School administrator regarding EOCT pass
rates, from 20 random high school administrators regarding course
X availability, and from a random sample of 100 GaVS students for the
third evaluation question. This demonstrates a vast array of
stakeholders, but the evaluators list "All Georgia Virtual School
students" as the key stakeholders under the evaluation question
section earlier in the evaluation plan. The evaluators may want to
include that key stakeholders will also include the GA Virtual School
Administrators as well as the parents of students who attend GA Virtual
School. The evaluators state that GaVS includes students in grades 6-12;
however, the evaluation questions and data only cover students in grades 9-
12. It might be suggested that students in grades 6-8 be included in the surveys,
pass/fail data, and course availability data.

P2 Formal Evaluation agreements should be Fully Met


Agreements negotiated to make obligations explicit
and take into account the needs, The evaluators have clearly outlined in Methodology how they will be
expectations, and cultural contexts of X using open record information for data needed for their three
clients and other stakeholders. evaluation questions, and how they will go about obtaining consent for
information that isn't open record. For the first evaluation question
concerning EOCT pass rates, the evaluators recognized that state pass
rate information is available through the Department of Education's
website. Administrators through GA Virtual School are required to
adhere to quality guidelines per regulatory agencies such as SREB and
iNACOL, so the evaluators have done due diligence in ensuring that the
test scores obtained directly through GA Virtual School are reported
with integrity and in full compliance. When obtaining survey results
from a random sampling of GA Virtual School students, the evaluators
have ensured that informed consent is obtained from the
parents/guardians prior to any surveys and questionnaires. The
evaluators have also made sure to let stakeholders know that
participation in surveys, or lack thereof, will not negatively affect their
students' academic performance at Georgia Virtual School.

P3 Human Rights Evaluations should be designed and Fully Met


and Respect conducted to protect human and legal
rights and maintain the dignity of The evaluators have outlined under the standards/benchmarks section
participants and other stakeholders. of their evaluation plan the program standards per the Southern
X Regional Educational Board that each of their evaluation questions
addressed in an effort to show how the questions seek to verify
standards of the program and not target or persecute a certain
individual or group of stakeholders. The evaluation questions put into
place are meant solely to help assess curriculum standards for
program performance, student performance, and faculty professional
development.

P4 Clarity and Evaluations should be understandable Partially Met


Fairness and fair in addressing stakeholder
needs and purposes. X The evaluators mention in their methodology that the first evaluation
question will compare the EOCT pass scores from students around the
state to the students in GA Virtual School. They mention that the basis
of this comparison will be state pass rates for certain EOCT tests to a
"few open format questions pertinent to the issue of EOCT pass scores
for GAVS students." This isn't a fair comparison, as the evaluators are
attempting to compare quantitative data (state EOCT scores) to the
answers on a questionnaire from a GA Virtual School administrator.
Perhaps instead of answers to a questionnaire, a fairer comparison
would be two sets of quantitative data: the mean passing
percentage of the state and the percentage of GA Virtual School
students who passed.
P5 Transparency Evaluations should provide complete Partially Met
and Disclosure descriptions of findings, limitations,
and conclusions to all stakeholders, In the sampling section of their evaluation plan, the evaluators have
unless doing so would violate legal and X been transparent in stating that all sampling of high schools and
proprietary obligations. students will be random and that each student and school has an equal
chance of being selected to participate in the evaluation. The evaluators
technically cannot give a final evaluation report yet; however, they may
want to be more transparent than just saying, "answers to the above
questions will provide quantitative and qualitative data to determine
the impact that GVS is having on the students." "Impact," while general
enough to protect the evaluators, doesn't provide enough transparency
to stakeholders on exactly what the term "impact" means. Perhaps re-
phrase to include that "impact" means an education equivalent to one
that would be received in a traditional school.

P6 Conflicts of Evaluations should openly and Fully Met


Interest honestly identify and address real or
perceived conflicts of interest that may Looking at the background information of the evaluators there are no
compromise the evaluation. X apparent conflicts and none of the evaluators are teachers at GA Virtual
School, so from the evaluation side there doesn't appear to be a conflict
of interest as all of the evaluators are accredited teachers working in a
public school system in Georgia. There may be a conflict of interest with
the fact that GA Virtual School administrators are responsible for
supplying the data necessary to answer the first evaluation question;
however, they are bound to the quality guidelines per regulatory
agencies.

P7 Fiscal Evaluations should account for all Not Met


Responsibility expended resources and comply with
sound fiscal procedures and processes. X There is no mention of being fiscally responsible for the evaluation, but
that's because there are no costs associated with the evaluation effort. Data is
either collected through open records or obtained through free services
and applications such as Survey Monkey.

STANDARDS   STANDARD STATEMENTS   RELEVANCE TO THE EVALUATION (CHECK ONE: HARDLY / FAIRLY / CLEARLY / HIGHLY)   QUALITATIVE FEEDBACK/COMMENTS
ACCURACY
A1 Justified Evaluation conclusions and decisions Not Met
Conclusions should be explicitly justified in the
and Decisions cultures and contexts where they have Conclusions cannot be examined as the evaluation has not yet been
consequences. X completed. Evaluation decisions are not justified according to cultural
context, as that justification is not needed here.

A2 Valid Evaluation information should serve Partially Met


Information the intended purposes and support
valid interpretations. X In the Analysis section, the data that is being collected should serve the
intended purpose to determine whether GaVS is effective through the
comparison of students' pass rates on EOCTs, the course offerings, and
reasons for attendance. The only issue is that the evaluation is only
focusing on a portion of the program (high school), and middle schools
are not taken into consideration.

A3 Reliable Evaluation procedures should yield Not Met


Information sufficiently dependable and consistent
information for the intended uses. X It is difficult to determine if there is dependable and consistent
information for intended uses since it is unclear as to who will be using
the findings from this evaluation. How are the findings intended to be
used? It is unclear as to who has commissioned this evaluation - GaVS
will not be the ones using the report ultimately and are not the ones
who have asked for it, although a separate report detailing the benefits
of GaVS intended for a student audience could be useful.
A4 Explicit Evaluations should document Partially Met
Program and programs and their contexts with
Context appropriate detail and scope for the In the Background section, the GaVS program purpose is outlined along
Descriptions evaluation purposes. X with a brief description. Without a Logic Model to reference, the details
of the program are unknown.

A5 Information Evaluations should employ systematic Not Met


Management information collection, review,
verification, and storage methods. X The Data and Instrumentation section clearly states how information will
be collected, but there are no appendices showing what questions will
be used and what format (Likert scale, open-ended, multiple-choice?).
The review of data in the Analysis section tells what is being compared,
but not how they are being compared (what range/standard deviation
of scores will be considered comparable between GaVS students & state
students?). No storage methods are mentioned; perhaps discuss how data
will be reported (as a table in the appendix? as a graph within the
evaluation report?).

A6 Sound Designs Evaluations should employ technically X Partially Met


and Analyses adequate designs and analyses that
are appropriate for the evaluation The design and analysis of the evaluation questions found in the
purposes. methodology section seem appropriate for the intended information.
However, if the evaluation plan was to determine whether to continue
using GAVS for a specific school or system, then sampling techniques
should be modified to stratified sampling instead of random. The
collected data would be most beneficial if it pertained to school
systems of comparable size and resources. In addition, what standard is
going to be used with the Likert-scale data to determine if schedule
flexibility is a factor in GAVS usage?

A7 Explicit Evaluation reasoning leading from Not Met


Evaluation information and analyses to findings,
Reasoning interpretations, conclusions and X Since the actual evaluation has yet to be completed and we are simply
judgments should be clearly and reviewing the evaluation plan, the findings, interpretations, conclusions
completely documented. and judgements cannot be documented at this time.

A8 Communication Evaluation communications should Not Met
and Reporting have adequate scope and guard
against misconceptions, biases, X No mention is made of evaluation communication or reporting. We suggest
distortions, and errors. mentioning who will receive final reports and whether they will be
differentiated for different audiences. Since the report will be written by credible
evaluators, bias should not be present and errors should be low.

STANDARDS   STANDARD STATEMENTS   RELEVANCE TO THE EVALUATION (CHECK ONE: HARDLY / FAIRLY / CLEARLY / HIGHLY)   QUALITATIVE FEEDBACK/COMMENTS
EVALUATION ACCOUNTABILITY
E1 Evaluation Evaluations should fully document Partially Met
Documentation their negotiated purposes and
implemented designs, procedures, The evaluation plan details the purpose of the evaluation in the
data, and outcomes. X Background section (but not who will be using the information), the
questions that are being used, procedures to collect the data as well as
types of data, and the anticipated methods of data analysis. As for the
results of your questions, is there a standard of success required or is
this just a comparison between GAVS and traditional schools?

E2 Internal Meta Evaluators should use these and other Fully Met
Evaluation applicable standards to examine the
accountability of the evaluation The metaevaluation is being performed by four credible persons who
design, procedures employed, X will review the evaluation according to each standard and the
information collected, and outcomes. evaluators will then employ suggestions to better the quality of the
overall evaluation.
E3 External Program evaluation sponsors, clients, Fully Met
Metaevaluation evaluators, and other stakeholders
should encourage the conduct of The metaevaluation is being performed by four credible persons who
external metaevaluations using these X will review the evaluation according to each standard.
and other applicable standards.
