Monitoring and Evaluation of Health Services
Dr. Rasha Salama, PhD Public Health
Faculty of Medicine, Suez Canal University, Egypt
Presentation Outline
Monitoring and evaluation of health services:
• Monitoring: definition and concept
• Evaluation: definition, types and concept
• Evaluation process: FIVE phases
• Designs and methods of evaluation
• Challenges
• Monitoring versus evaluation
Monitoring and Evaluation (M&E)
Monitoring: a planned, systematic process of observation that closely follows a course of activities and compares what is happening with what is expected to happen.
Evaluation: a process that assesses an achievement against preset criteria. It has a variety of purposes and follows distinct methodologies (process, outcome, performance, etc.).
Evaluation
• A systematic process to determine the extent to which service needs and results have been or are being achieved, and to analyse the reasons for any discrepancy.
• Attempts to measure a programme's relevance, efficiency and effectiveness. It measures whether, and to what extent, the programme's inputs and services are improving the quality of people's lives.

Monitoring
• The periodic collection and review of information on programme implementation, coverage and use, for comparison with implementation plans.
• Identifies shortcomings before it is too late.
• Provides elements of analysis as to why progress fell short of expectations.
• Open to modifying original plans during implementation.
Comparison between Monitoring and Evaluation
Evaluation
Evaluation can focus on:
• Projects: normally consist of a set of activities undertaken to achieve specific objectives within a given budget and time period.
• Programs: organized sets of projects or services concerned with a particular sector or geographic region.
• Services: based on a permanent structure, with the goal of becoming national in coverage (e.g. health services), whereas programmes are usually limited in time or area.
• Processes: organizational operations of a continuous and supporting nature (e.g. personnel management, information systems, operations).
• Conditions: particular characteristics or states of being of persons or things (e.g. disease, nutritional status, literacy, income level).
Evaluation may focus on different aspects of a service or program:
• Inputs: resources provided for an activity, including cash, supplies, personnel, equipment and training.
• Processes: transform inputs into outputs.
• Outputs: the specific products or services that an activity is expected to deliver as a result of receiving the inputs.
• Effectiveness: a service is effective if it “works”, i.e. it delivers outputs in accordance with its objectives.
• Efficiency: a service is efficient or cost-effective if effectiveness is achieved at the lowest practical cost.
• Outcomes: peoples' responses to a programme and how they are doing things differently as a result of it; short-term effects related to objectives.
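The efficiency idea above can be sketched numerically. A minimal illustration with made-up clinic names and figures (nothing here comes from the lecture): cost per unit of output as a crude efficiency comparison between two units delivering the same service.

```python
# Hypothetical clinics and figures, for illustration only.
clinics = {
    "clinic_a": {"cost": 120_000, "patients_treated": 4_000},
    "clinic_b": {"cost": 90_000, "patients_treated": 3_600},
}

def cost_per_output(cost: float, outputs: int) -> float:
    """Cost per unit of output: lower means more efficient, all else equal."""
    return cost / outputs

for name, d in clinics.items():
    print(name, cost_per_output(d["cost"], d["patients_treated"]))
# clinic_a spends 30.0 per patient, clinic_b 25.0 -> clinic_b is the more efficient unit.
```

In practice the comparison only holds if the outputs are of comparable quality; cost per output says nothing about effectiveness on its own.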
So what do you think?
• Prospective evaluation determines what ought to happen (and why).
• Retrospective evaluation determines what actually happened (and why).
Evaluation Matrix
The broadest and most common classification of evaluation identifies two kinds:
• Formative evaluation: evaluation of the components and activities of a program other than its outcomes (structure and process evaluation).
• Summative evaluation: evaluation of the degree to which a program has achieved its desired outcomes, and the degree to which any other outcomes (positive or negative) have resulted from the program.
Components of Comprehensive Evaluation
Evaluation Designs

The evaluation process has FIVE phases:
A: Planning the Evaluation
B: Selecting the Appropriate Evaluation Methods
C: Collecting and Analysing Information
D: Reporting Findings
E: Implementing Evaluation Recommendations
Phase A: Planning the Evaluation
• Determine the purpose of the evaluation.
• Decide on the type of evaluation.
• Decide on who conducts the evaluation (evaluation team).
• Review existing information in programme documents, including monitoring information.
• List the relevant information sources.
• Provide background information on the history and current status of the programme being evaluated, including: how it works (its objectives, strategies and management process), the policy environment, and economic and financial feasibility.
Which Indicators?
▫ Quantitative: rates, proportions, percentages (e.g., with a common denominator such as population)
▫ Qualitative: “yes” or “no”
• Validity: measure what we mean to measure.
• Reliability: can be collected consistently by different data collectors.
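As a sketch of how the quantitative indicators above are computed (rates and percentages over a population denominator), with hypothetical district figures that are not from the lecture:

```python
# Hypothetical district figures, for illustration only.
def rate_per_1000(events: int, population: int) -> float:
    """Events per 1,000 population — a common quantitative indicator."""
    return 1000 * events / population

def coverage_pct(reached: int, target: int) -> float:
    """Proportion of the target population reached, as a percentage."""
    return 100 * reached / target

print(rate_per_1000(45, 30_000))    # 1.5 cases per 1,000 population
print(coverage_pct(8_200, 10_000))  # 82.0 % coverage
```

A valid indicator here is one whose numerator and denominator actually capture the concept being measured; a reliable one is one that different data collectors would compute identically from the same records.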
Qualitative tools:
There are five frequently used data collection processes in qualitative evaluation (more than one method can be used):
1. Unobtrusive seeing, involving an observer who is not seen by those who are observed;
2. Participant observation, involving an observer who does not take part in an activity but is seen by the activity's participants;
3. Interviewing, involving a more active role for the evaluator, because she/he poses questions to the respondent, usually on a one-on-one basis;
4. Group-based data collection processes, such as focus groups; and
5. Content analysis, which involves reviewing documents and transcripts to identify patterns within the material.
Quantitative tools:
• Surveys/questionnaires;
• Registries;
• Activity logs;
• Administrative records;
• Patient/client charts;
• Registration forms;
• Case studies;
• Attendance sheets.
Pretesting or piloting of the data collection tools is recommended before full-scale use.
Other monitoring and evaluation methods:
• Biophysical measurements
• Most significant change method
• Cost-benefit analysis
• Sketch mapping
• GIS mapping
• Transects
• Seasonal calendars
• Impact flow diagram (cause-effect diagram)
• Problem and objectives tree
• Systems (inputs-outputs) diagram
• Institutional linkage diagram (Venn/Chapati diagram)
• Monitoring and evaluation wheel (spider web)
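Of the methods listed, cost-benefit analysis is the most directly computable. A minimal sketch, assuming hypothetical yearly cost and benefit streams and a 5% discount rate (none of these figures come from the lecture): a benefit-cost ratio above 1 suggests discounted benefits exceed discounted costs.

```python
# Hypothetical programme cash flows for years 0..3, for illustration only.
benefits = [0, 40_000, 60_000, 60_000]
costs = [50_000, 10_000, 10_000, 10_000]
DISCOUNT_RATE = 0.05

def npv(flows, rate):
    """Net present value: each year's flow discounted back to year 0."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

bcr = npv(benefits, DISCOUNT_RATE) / npv(costs, DISCOUNT_RATE)
print(round(bcr, 2))  # ratio > 1 in this made-up example
```

The choice of discount rate and of which benefits are monetized drives the result, which is why cost-benefit findings should be reported with those assumptions stated.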
Spider Web Method:
This method is a visual index developed to
identify the kind of indicators/criteria that can
be used to monitor change over the program
period. This would present a ‘before’ and ‘after’
program/project situation. It is commonly used
in participatory evaluation.
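A minimal sketch of the spider-web idea, using invented indicator names and 0-5 scores (all values hypothetical): each indicator is one spoke of the wheel, and comparing the 'before' and 'after' score on each spoke shows where the programme changed most.

```python
import math

# Hypothetical indicators and participatory scores (0-5), before vs. after.
indicators = ["access", "quality", "coverage", "staffing", "supplies", "referral"]
before = [2, 1, 3, 2, 1, 2]
after = [4, 3, 4, 3, 3, 4]

def spoke_xy(score: float, i: int, n: int) -> tuple:
    """Place a score on spoke i of n, for drawing the web as a polygon."""
    angle = 2 * math.pi * i / n
    return (score * math.cos(angle), score * math.sin(angle))

# Change per indicator: where did the programme move most over the period?
change = {name: a - b for name, b, a in zip(indicators, before, after)}
print(change)
```

Plotting the two polygons (one per score set) gives the familiar before/after web; the `change` dictionary alone already answers the monitoring question of which criteria improved.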
Phase D: Reporting Findings