
SUMMARY

PLANNING, MONITORING
AND EVALUATION

Vesper H. Chisumpa


vchisumpa@yahoo.com

December, 2010
WHAT IS PLANNING?
Planning
 An everyday activity (as old as humankind)
 The process of setting goals, developing strategies, and outlining tasks and schedules to accomplish those goals
 A future-oriented activity
 Changes according to the situation
 Has a time-reference period

Examples
 Daily schedules
 Work plans
 Action/Activity plans
 M and E plans
 Strategic plans
 Development plans (e.g. National, Provincial, District)
Planning: Questions
 What will be done? (WHAT: the Objectives)
 Why do we want to do it? (WHY: the Expected Outputs)
 How is it to be done? (HOW: the Activities)
 When will it be done? (WHEN: the Time Reference Period)
 Who will do it? (WHO: the Persons/Organizations Responsible)
 Where will it be done? (WHERE: the Location)
Link between Planning and
Monitoring and Evaluation
 Planning is the base/foundation on which monitoring and evaluation of programmes is conducted.
 To monitor and evaluate, we need clearly defined goals, objectives, activities, time-frames, etc.
What is MONITORING AND
EVALUATION?
What is Monitoring?
Monitoring: definition
Monitoring is a continuous process that aims to provide the main stakeholders of a project, programme or policy with early indications of the quality, quantity and timeliness of progress towards delivering intended results.
What is being monitored?
Measures of achievement aligned to objectives in the plan or strategy of the initiative (whether project, programme, policy or other):
 INPUTS (finance, resources, in-kind)
 ACTIVITIES (planned, completed)
 OUTPUTS (directly, or by proxy through indicators)
 OUTCOMES (through indicators)
Why monitor?
Monitoring is important for several reasons, but
primarily:
 What gets monitored is more likely to get done.
 If you don’t monitor performance, you can’t tell
success from failure.
 If you can’t see success, you can’t reward it.
 If you can’t recognise failure, you can’t correct
it.
 If you can’t demonstrate results, you can’t
sustain support for your actions.
What is Evaluation?
Evaluation: definition
 An evaluation is an assessment, as systematic and impartial as possible, of a project, programme or policy.
 It focuses on expected and achieved accomplishments.
 It aims at determining the relevance, impact, effectiveness, efficiency and sustainability of the interventions and contributions of the organization.
Why evaluate?
 To understand why, and the extent to which, intended and unintended results are achieved, and their impact on stakeholders.
 An important source of evidence on the achievement of results and institutional performance, and thus one basis for corporate accountability.
 An important contributor to building knowledge and organizational learning.
Types of Evaluation
 Front-end analysis (baseline evaluation): conducted prior to the implementation of the programme.
 Evaluability assessment: a preliminary evaluation conducted prior to a more formal evaluation.
 Formative evaluation: a developmental, process-oriented evaluation which provides guidance to a programme's activities.
 Mid-term evaluation: assessment of the effectiveness and efficiency of the programme when it is halfway through the planned period.
 Impact (or outcome) evaluation: focuses on the programme's effectiveness, assessing the results of the programme.
 Summative evaluation: final assessment done at the end of a project.
 Evaluation of evaluation: includes external reviews of evaluations, re-analysis of evaluation data, and meta-analysis of several evaluations.
How do monitoring and
evaluation differ?
They are different, but interrelated functions, as
they both contribute knowledge as a basis for
accountability and enhanced performance.
 Monitoring is an internal, repetitive operations and management function. Evaluation is often external and periodic (a snapshot), goes into greater depth, and asks different questions.
 Monitoring asks "Are we doing things right?", while evaluation also asks "Are we doing the right things?" and "Are there better ways of achieving the results?"
Summary comparison of M and E

Monitoring:
 The routine process of data collection and measurement of progress toward program objectives.
 Involves counting what we are doing.
 Involves routinely looking at the quality of our services.
 Is a continuous process.

Evaluation:
 The use of social research methods to systematically investigate a program's effectiveness.
 Requires study design.
 Sometimes requires a control or comparison group.
 Involves measurements over time.
 Involves special studies.
 Is periodic.
Graphic Illustration of M and E

[Figure: a timeline from intervention to outcome; monitoring tracks the intervention as it is implemented, while evaluation assesses the link from the intervention to its outcome.]
What is the Purpose of M and E ?

1. To collect information to make informed, evidence-based decisions.
2. To make mid-course adjustments and refine project activities.
3. To demonstrate progress and explain unique challenges to stakeholders, funders, and partners.
4. To expand the knowledge base (basic research).
5. To create records of past and present performance for future assessment and institutional memory.
Leadership in M and E

"A leader gives people a reason to persist and a direction toward which to persist."
 Leaders cope with change (soft skills): set directions, mobilize people, innovate, inspire.
 Managers cope with complexity (hard skills): plan and budget, organize, implement, monitor and evaluate.
 A combination of both is needed.
Leading the M and E Environment

 Have accurate and complete information
 Be open to alternative perspectives
 Reflect critically on presuppositions and their consequences
 Provide equal opportunity to participate
 Accept an informed, objective, and rational consensus
GOALS
What is a goal?
 Goal: a broad statement of a desired, long-term outcome of the program.
 Example: the goal of training is to improve the knowledge and skills in monitoring and evaluation of programme staff.

GOALS:
 Present the desired outcomes, accomplishments, results or purpose sought (not the process).
 Capture broad changes in conditions, answering the "so what" question.
 Often reflect behavior, attitude, or economic change and show how our activities contribute toward a larger development impact.
 Usually reflect a result achieved in an intermediate or long time period.
What are objectives?
Objectives: statements of desired, specific, realistic, and
measurable program results. They should be SMART!!
 Specific: identifies concrete events or actions that will take
place.
 Measurable: quantifies the amount of resources, activity, or
change to be expended and achieved.
 Appropriate: logically relates to the overall problem
statement and desired effects of the program.
 Realistic: Provides a realistic dimension that can be achieved
with the available resources and plans for implementation.
 Time-based: specifies a time within which the objective
will be achieved.
Example:
To provide two workshops on planning, monitoring and evaluation to
50 programme staff as measured by workshop reports by Dec 31,
2010.
M & E Frameworks
Conceptual framework (interaction of various factors):
 Program management: determines which factors the program will influence.
 Basis for monitoring and evaluation: no, but can help to explain results.

Results framework (logically linked program objectives):
 Program management: shows the causal relationship between program objectives.
 Basis for monitoring and evaluation: yes, at the objective level.

Logic model (logically linked inputs, processes, outputs, and outcomes):
 Program management: shows the causal relationship between inputs and the objectives.
 Basis for monitoring and evaluation: yes, at all stages of the program, from inputs to process to outputs to outcomes/objectives.
Conceptual Frameworks

Definition: a diagram that identifies and illustrates the relationships between all relevant systemic, organizational, individual, or other salient factors that may influence program/project operation and the successful achievement of program or project goals.

Purpose for M&E:
 To show where the program fits into the wider context
 To clarify assumptions about causal relationships
 To suggest causal pathways
Causes of Malnutrition in Society (example conceptual framework)

 Outcomes: child malnutrition, death and disability
 Immediate causes (individual level): inadequate dietary intake; disease
 Underlying causes (household/family level): insufficient access to food; inadequate maternal & child care practices; poor water/sanitation & inadequate health services
 Macro-level causes (societal level): quantity & quality of actual resources (human, economic and organizational) and the way they are controlled; potential resources (environment, technology, people)

Source: UNICEF, State of the World's Children, 1998.


Results Framework
Diagrams that identify steps or levels of results and
illustrate the causal relationships linking all levels of a
program’s objectives in an organizational chart style.

Why Use a Results Framework?
 Provides a clarified focus on the causal relationships that connect incremental achievement of results to the comprehensive program impact.
 Clarifies project/program mechanics and the relationships between factors, suggesting ways and means of objectively measuring the achievement of desired ends.
Example of Results Framework

Goal: Improved health status and/or decreased fertility
 Objective: Improved use of health/FP services and/or appropriate practices
  IR1: Access/Availability
   IR1.1: Commodities/Facilities
   IR1.2: Equity
  IR2: Quality
   IR2.2: Provider Performance
   IR2.3: Training/Supervision
   IR2.4: Information System
  IR3: Sustainability
   IR3.1: Policy
   IR3.2: Health Care Finance
   IR3.3: Private Sector
  IR4: Demand
   IR4.1: Attitude
   IR4.2: Knowledge
   IR4.3: Community Support
Logical Frameworks
(LogFrame Matrix)
• These are 4x4 matrices commonly used as a tool for planning, monitoring and evaluation of projects. They are applied widely to show the logical flow of linkages between the project's means and ends: its inputs, outputs, outcomes and impacts.
• Summarizes what the project intends to do and how
• Summarizes key assumptions
• Summarizes outputs and outcomes that will be monitored and evaluated
LogFrame Matrix
• In strict terms, it appears as a four-by-four matrix
• It is read from the bottom up
• We have control at the input, activity and output level

The four columns of the matrix ask:
 Narrative (A): What does the project want to achieve?
 OVIs, Objectively Verifiable Indicators (B): How can we tell if we have achieved it?
 MOVs, Means of Verification (C): Where can we get information that will tell us this?
 Assumptions (D): What else must happen if it is to succeed?
Logic Models
Definition:
 Diagrams that identify and illustrate the linear relationships flowing from program inputs through processes and outputs to outcomes.
 Inputs or resources affect processes or activities, which produce immediate results or outputs, ultimately leading to longer-term or broader results, or outcomes.

Purpose:
 Provides a streamlined interpretation of the planned use of resources and desired ends
 Clarifies project/program assumptions about linear relationships between key factors relevant to desired ends
 Other terms used: M&E frameworks, indicator matrices
Logic Model: Family Planning Activity

 INPUT: human and financial resources; demand for FP in the community
 PROCESS: educate men and women about the advantages of modern methods; distribute FP methods in the community; train program staff in providing FP information and methods
 OUTPUT: sessions held in the community about family planning; FP methods distributed in communities; clinic staff trained in FP method counseling
 OUTCOME: 1. increased access to contraceptive methods; 2. increased access to FP counseling; 3. increased number of new users of modern methods; 4. increased male participation in FP decisions
 IMPACT: increased contraceptive prevalence; increased method use; increased interest in FP
INDICATORS
What is an Indicator?
 A variable that measures one aspect of a program/project or outcome.
 A statistical value providing an indication of the condition, or direction over time, of the performance of a defined process or the achievement of a defined outcome.
 An indicator provides evidence that a certain condition exists or that certain results have or have not been achieved. Indicators enable decision-makers to assess progress towards the achievement of intended outputs, outcomes, goals, and objectives.
 An indicator is a variable that measures one aspect of a program or project that is directly related to the program's objectives. An appropriate set of indicators includes at least one indicator for each significant aspect of the program or project.
What Indicators are:
 Clues, Signs and Markers
 Can be quantitative or qualitative
 Quantitative indicators are numeric and are presented as numbers or percentages.
 Qualitative indicators are descriptive observations and can be used to supplement the numbers and percentages provided by quantitative indicators.
 Need to distinguish between core and non-core indicators.
Characteristics of Good
Indicators
 VALID: accurate measure of a behavior, practice, or task
 RELIABLE: consistently measurable in the same way by different observers
 PRECISE: operationally defined in clear terms
 MEASURABLE: quantifiable using available tools and methods
 TIMELY: provides a measurement at time intervals relevant and appropriate in terms of program goals and activities
 PROGRAMMATICALLY IMPORTANT: linked to an impact or to achieving the objectives that are needed for impact
Factors to Consider When
Selecting Indicators
 Logical – are they linked to the framework?
 Programmatic needs – do they get you the information you need for decision-making?
 Resources – can you afford to collect the data?
 External requirements – do you need them for government, donor, or headquarters reporting?
 Data availability – can you get the data you need for both numerator and denominator?
 Standardized indicators – can you compare across programs/countries? Is there a "gold standard" for this indicator?
Each Indicator Includes:
 Description/definition
 Timing of indicator measurement
 • Activities (processes/inputs) or results (outcomes/impact)
 Calculation of the measure
 • Counts (number of teachers trained and posted; number of trees planted)
 • Percentages, rates, ratios (contraceptive prevalence rate; % of planted trees surviving)
 Purpose of the indicator
 Data source and disaggregation
 Frequency of data collection
 Strengths and weaknesses
Indicator Reference Sheet

Definition: detailed documentation for each indicator
 Basic information
 Description
 Plans for data collection
 Plans for data analysis, reporting, and review
 Data quality issues
 Performance data table (baseline and targets)
Population-Based Indicator Reference Sheet

Example: Indicator Reference Sheet
Core indicator relating to a key result (objective); required by projects conducting surveys.

Indicator Description
Indicator: Contraceptive Prevalence Rate (Met Need)
Definition: Percentage of women of reproductive age (WRA, 15–49) who are married or in union and are using (or whose partner is using) a modern method of family planning.
Unit of Measure: Percentage
How to Calculate:
 NUMERATOR: Number of women age 15–49 who are married or in union, who are not pregnant (or unsure), AND who report using (or whose partner is using) a modern method of family planning.
 DENOMINATOR: Total number of women age 15–49 who are married or in union included in the survey.
 Indicator = (Numerator / Denominator) * 100 (a worked sketch follows this sheet).
Important Background Characteristics to Consider When Assessing This Indicator: At minimum, you are encouraged to examine this indicator by age group. Modern methods: the following methods are usually counted as modern methods in the indicator: female sterilization (tubal ligation), male sterilization (vasectomy), pills, IUD, injections, implants (NORPLANT), condoms, female condom, diaphragm, foam/jelly, and LAM. The Standard Days Method (SDM) is also in the process of being designated as a modern method of FP.
What It Measures: Population coverage of current family planning use.
Important Notes: The conventional indicator is limited to women who are married/in union.

Data Collection
Data Source: Population-based household survey (Flexible Fund Family Planning Survey)
Data Collection Method: To obtain the CU/CPR, refer to the Flexible Fund Family Planning Survey for the model questions required to construct the indicator. The most recent version is found on the website: www.childsurvival.com
Frequency/Timing of Data Collection: Baseline, midterm (if applicable), and final evaluation
Proposed Data Use: The overall purpose of this indicator is to assess whether all your project activities are contributing to the ultimate objective of increased contraceptive use. If the CU or CPR is lower than expected, consider all the possible factors that contribute to end use (e.g., knowledge/interest, quality of care, access).
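To make the CPR calculation above concrete, here is a minimal Python sketch. The record field names (age, married_or_in_union, pregnant, using_modern_method) are hypothetical stand-ins for whatever the actual survey instrument captures.

```python
# Minimal sketch of the CPR calculation defined in the reference sheet.
# The field names below are hypothetical, not from the Flexible Fund survey.
def contraceptive_prevalence_rate(records):
    """CPR = (numerator / denominator) * 100, per the reference sheet."""
    # Denominator: women 15-49 who are married or in union.
    denom = [r for r in records
             if 15 <= r["age"] <= 49 and r["married_or_in_union"]]
    # Numerator: of those, women not pregnant (or unsure) who report
    # using (or whose partner is using) a modern method.
    num = [r for r in denom
           if r["pregnant"] in (False, "unsure") and r["using_modern_method"]]
    if not denom:
        return None  # no eligible women surveyed
    return 100 * len(num) / len(denom)

# Example with three invented survey records:
records = [
    {"age": 27, "married_or_in_union": True,  "pregnant": False,
     "using_modern_method": True},
    {"age": 34, "married_or_in_union": True,  "pregnant": True,
     "using_modern_method": False},
    {"age": 19, "married_or_in_union": False, "pregnant": False,
     "using_modern_method": False},
]
print(contraceptive_prevalence_rate(records))  # 50.0 (1 of 2 eligible women)
```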
Common pitfalls in indicator
selection
• Indicators not linked to program activities
• Using outputs as outcomes
• Poorly defined indicators
• Data needed for indicator is unavailable
• Indicator does not accurately represent intended
outcome
• Too many indicators
Indicator Pyramid

The number of indicators decreases as you move up from district to global level, and increases as you move down.

 Global: compare countries; overview of the world-wide situation
 National/Sub-national: assess the effectiveness of the response; reflect the goals/objectives of the national/sub-national response
 District, Facility, Community: identify progress, problems, and challenges
Indicator Pyramid

Moving up the results chain, there are fewer indicators and less programme control:

 Impact
 Outcomes
 Outputs
 Inputs/Activities
Key Questions to Ask When
Selecting Indicators
 Do they meet programmatic needs?
 Will they give you useful information for decision
making?
 Are they feasible considering time, money, and
staffing?
 Do they match external requirements?
 Are the data available? How accurate are the data?
 Are they standard indicators used across projects,
programs, countries?
 Are you collecting the information appropriately?
Link between Logical Frame and
M and E
Logical frame hierarchy / Type of M&E activity / Indicators:
 Goal: ex-post evaluation; impact indicators
 Purpose: program review; outcome indicators
 Objectives: periodic and final evaluation; outcome indicators
 Outputs: monitoring and periodic evaluation; output indicators
 Activities/Inputs: monitoring; output indicators
M and E Framework and
Illustrative Data Types

 Assessment & Planning: situation analysis; response analysis; stakeholder needs; resource analysis; collaboration plans
 Input (resources): staff; funds; materials; facilities; supplies
 Activities (interventions, services): trainings; services; education; treatments; interventions
 Output (immediate effects): # staff trained; # condoms distributed; # test kits distributed; # clients served; # tests conducted
 Outcomes (intermediate effects): provider behavior; risk behavior; service use; behavior; clinical outcomes
 Impact (long-term effects): HIV incidence/prevalence; social norms; STI incidence/prevalence; AIDS morbidity/mortality; economic impact; quality of life

These stages draw, respectively, on program development data, program-based data, and population-based biological, behavioral and social data.

In addition to monitoring these data types, selected programs conduct enhanced process and outcome evaluations.
M and E Questions and Approaches

Are we doing them on a large enough scale?
 OUTCOME & IMPACT MONITORING: Are collective efforts being implemented on a large enough scale to impact the situation (coverage, impact)? Surveys and surveillance.

Are we doing them right?
 OUTCOMES: Are interventions working/making a difference? Outcome evaluation studies.
 OUTPUTS: Are we implementing the program as planned? Outputs monitoring.
 ACTIVITIES: What are we doing? Process M&E, quality assessment.

Are we doing the right things?
 INPUTS: What interventions and resources are needed? Needs, resource and response analysis; input monitoring.
 What interventions can work (efficiency and effectiveness)? Special studies, operational research, formative research.
 What are the contributing factors? Determinants research.
 PROBLEM IDENTIFICATION: What is the problem? Situation analysis.

Adapted from CDC
M and E Plan Outline
1. Introduction
2. Context
3. Overview of the program goals and objectives
4. Main Content
 a. Monitoring and Evaluation matrix
 b. Results Framework or Logframe
 c. Performance table (baseline, targets, results)
 d. Description of data sources
 e. Data flow
 f. Data quality
 g. Data Analysis Plan
 h. Dissemination and use of results; timeline
5. Staffing and Technical Assistance needs
6. Budget
7. Annual Activity Plan
8. Appendices (sample tools and questionnaires)
Monitoring and Evaluation
Matrix
Strategic objective (e.g. education, health, agriculture)
Intermediate result (strategies of intervention)

For each row of the matrix (process/activities and inputs, outputs, outcomes, objective), record:
 Indicators (with definitions, e.g. of the outcome indicators)
 Sources of data and data collection methods
 Frequency of data collection
 Responsible person(s)/team
Data Sources
Data Sources: Concepts and
Definitions
 Method refers to the scientific design or approach to a monitoring, evaluation, or research activity, such as data collection.
 Tool refers to the instrument used to record the information that will be gathered through a particular method.
 Data triangulation refers to the simultaneous use of multiple evaluation methods and information sources to study the same topic.
Quantitative Methods of
Data Collection

 Quantitative methods are those that generally rely on structured or standardized approaches to collect and analyze numerical data.
 Almost any evaluation or research question can be investigated using quantitative methods, because most phenomena can be measured numerically.
Qualitative Methods of Data
Collection

Qualitative methods are those that generally rely on a variety of semi-structured or open-ended methods to produce in-depth, descriptive information.
Qualitative vs Quantitative Methods
Qualitative methods:
 Use observations and words as raw data
 Use open-ended questions
 Ask "how?" and "why?"
 Collect data using interviews, observation and written documents
 Are case-oriented
 Do not attempt to generalize results
 Use purposeful sampling
 Use small sample sizes
 Validity revolves around the interviewer's/observer's training and competence

Quantitative methods:
 Provide data as numbers
 Use closed-ended questions
 Ask "how many?"
 Collect data using surveys
 Are population-oriented
 Attempt to generalize results
 Use probability sampling
 Use large sample sizes
 Validity depends on instrument development
Types of Data

1. Numeric (quantitative)
2. Non-numeric (qualitative)

When data are processed, they provide useful information for analysis and dissemination.
Data Collection Tools

Some common monitoring and evaluation tools include:
 Semi-structured and structured questionnaires
 In-depth interviews
 Focus group discussions
 Sign-in (registration) logs
 Registration (enrollment, intake) forms
 Checklists
 Program activity forms
 Logs and tally sheets
 Patient charts
Types of Data Sources
 Population Census
 Population Based Surveys
 Participatory Observations/Assessments
 Field Visits
 Cohort Studies
 Program Data/Reports
 Management information systems (MIS)
 Special Studies

Types of Qualitative Data

 Text: documents (reports, meeting minutes, e-mails, diaries)
 Narrative: scripts, field notes, memos, speeches
 Visual: video, diagrams, charts, photographs, images
Qualitative Data Collection
Methods
1. Interview
 - Interview guide: focus group discussion or key personnel
2. Observation
 - Participant: spend time with the subjects
 - Non-participant: observe the routines and interactions among the subjects
Types of Qualitative Data
Analysis Methods
1. Thematic analysis – identifying emerging themes
2. Content analysis – answering research questions by systematically categorizing content
3. Discourse analysis – patterns, structures and language used in speech and the written word

Sources of information:
 o a transcript from an interview/FGD
 o a series of written answers on an open-ended questionnaire
 o field notes
 o memos written by the researcher

A minimal theme-tallying sketch follows below.
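As a concrete illustration of thematic-analysis output, the sketch below tallies hand-assigned theme codes across transcript segments. The codes and source labels are invented for illustration only.

```python
from collections import Counter

# Minimal sketch of a thematic-analysis tally: each transcript segment has
# been hand-coded with one or more themes (codes invented for illustration).
# Counting code frequencies is one simple way to see which themes are
# "emerging" across interviews and focus group discussions.
coded_segments = [
    {"source": "FGD-1", "codes": ["access", "cost"]},
    {"source": "FGD-1", "codes": ["quality of care"]},
    {"source": "KII-3", "codes": ["access", "staff attitudes"]},
]

theme_counts = Counter(code for seg in coded_segments for code in seg["codes"])
for theme, n in theme_counts.most_common():
    print(f"{theme}: mentioned in {n} coded segment(s)")
```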
Data Quality
 Data quality is important because the quality of the data determines the usefulness of the results.
 The data you collect are meaningful only if they are of the highest possible quality.
 Data quality must be monitored at every step of the process, and that monitoring should not depend on only one person.
Data Quality Issues
Here are some data quality issues to consider:
 Coverage: Will the data cover all of the elements
of interest?
 Completeness: Is there a complete set of data for
each element of interest?
 Accuracy: Have the instruments been tested to
ensure validity and reliability of the data?
 Frequency: Are the data collected as frequently as
needed?
 Reporting Schedule: Do the available data reflect
the time periods of interest?
 Accessibility: Are the data needed
collectable/retrievable?
 Power: Is the sample size big enough to provide a
stable estimate or detect change?
Data Quality Criteria
1. Validity – definitional issues, proxy measures, inclusions/exclusions, data sources.

2. Reliability – consistency in collection methodologies, collection instruments, sampling frameworks, personnel issues, and analysis and manipulation methodologies.

3. Timeliness – the relationship between the time of collection, collation and reporting and the relevance of the data for decision-making processes.

4. Precision – accuracy (a measure of bias) and precision (a measure of error). Is the margin of error in the data less than the expected change the project was designed to effect? Are there transcription or manipulation errors?

5. Integrity – freedom from personal manipulation, technological failures, and lack of audit verification and validation. Are the data free from 'untruth' introduced by either human or technical means, willfully or unconsciously?
How do you ensure data quality?
 Developing clear goals, objectives, indicators, and research questions
 Planning for data collection and analysis
 Pre-testing methods/tools
 Training staff in monitoring and evaluation and data collection
 Creating ownership of and belief in data collection among responsible staff
 Incorporating data quality checks at all stages (a sketch of automated checks follows this list):
 Are forms complete?
 Are answers clearly written?
 Are answers consistent?
 Are figures tallied correctly?
 Checking data quality regularly
 Taking steps to address identified errors
 Documenting any changes and improving the data collection system as necessary
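Some of the completeness and consistency checks above can be partly automated. A minimal sketch, assuming hypothetical field names rather than any real form design:

```python
# Minimal sketch of automated completeness/consistency checks on collected
# records. The field names are hypothetical; real checks would mirror the
# actual form design.
REQUIRED_FIELDS = ["respondent_id", "age", "sex", "date_of_visit"]

def check_record(record):
    """Return a list of data-quality problems found in one form/record."""
    problems = []
    # Completeness: every required field filled in.
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            problems.append(f"missing {field}")
    # Consistency: a simple cross-field rule as an example.
    if record.get("sex") == "male" and record.get("pregnant") is True:
        problems.append("inconsistent: male respondent recorded as pregnant")
    # Range check: plausible age.
    age = record.get("age")
    if age is not None and not (0 <= age <= 120):
        problems.append(f"implausible age: {age}")
    return problems

record = {"respondent_id": "A-012", "age": 34, "sex": "female",
          "date_of_visit": "", "pregnant": False}
print(check_record(record))  # ['missing date_of_visit']
```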
Data Quality Audits

 Verification
 Validation
 Self-assessment
 Internal audit
 External audit

Sampling
Probability sampling:
 1. Simple random sampling
 2. Stratified random sampling
 3. Systematic sampling
 4. Cluster sampling

Non-probability sampling:
 Availability sampling
 Quota sampling
 Purposive sampling
 Snowball sampling

Population <-> Sample
 - statistical inference
 - representativeness
 - sampling frame

A minimal sketch of two probability designs follows below.
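The two simplest probability designs above can be sketched with Python's standard library; the sampling frame here is an invented list of household IDs.

```python
import random

# Minimal sketch of two probability designs from the list above.
# The "sampling frame" is just a list of unit IDs (invented for illustration).
frame = [f"household-{i:03d}" for i in range(1, 201)]  # 200 units

# Simple random sampling: every unit has an equal chance of selection.
srs = random.sample(frame, k=20)

# Systematic sampling: a random start, then every k-th unit.
k = len(frame) // 20            # sampling interval
start = random.randrange(k)     # random start within the first interval
systematic = frame[start::k]

print(len(srs), len(systematic))  # 20 20
```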
What is data analysis?

 The process of bringing order, structure and meaning to the mass of collected data.
 Analysis does not mean using a complicated computer analysis package.
 It means taking the data that you collect and looking at them in comparison to the questions that you need to answer.
What is the purpose of
data analysis?

 To check whether we are achieving program objectives
 To summarize data and generate new objectives
 To answer a research problem or question
Types of Data Analysis

1. Univariate analysis – one variable
2. Bivariate analysis – two variables
3. Multivariate analysis – more than two variables

Measurement Scales
Nominal, Ordinal, Interval/Ratio
Measures of Location (mean, median, mode)

Using babies' birth weights as an example:
 Mean – the average weight is 3.4 kg
 Median – 50% of the babies weigh less than 3.3 kg
 Mode – the most frequently occurring weight
(A statement such as "95% of the babies weigh between 2.8 and 4.3 kg" describes the spread of the data rather than its location; see Measures of Variation below.)
Measures of Variation
 It is important to determine not only the location (e.g. the mean), but also the variation within the data.
 There are several ways to specify the variation in the data, but the most common are:
 Variance
 Standard deviation (e.g. 0.5 kg in the birth-weight example)

A short computational sketch of these summary statistics follows below.
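A minimal sketch computing the location and variation measures above with Python's statistics module, on invented birth weights:

```python
import statistics

# Minimal sketch of the location and variation measures above, on invented
# birth weights in kg (the data are illustrative only).
weights = [2.9, 3.1, 3.3, 3.3, 3.4, 3.6, 3.8, 4.0]

print("mean:", statistics.mean(weights))          # average weight
print("median:", statistics.median(weights))      # 50% weigh less than this
print("mode:", statistics.mode(weights))          # most frequent value: 3.3
print("variance:", statistics.variance(weights))  # sample variance
print("std dev:", statistics.stdev(weights))      # sample standard deviation
```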
Moments of a Distribution

Statistical Power

The statistical power of an analysis is determined by the following:

(a) The level of significance to be used
(b) The variability of the data (as measured, for example, by its standard deviation)
(c) The size of the difference in the population that is to be detected
(d) The size of the samples

A minimal sample-size sketch based on these four quantities follows below.
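These four quantities combine in standard sample-size formulas. Below is a minimal sketch of the usual normal-approximation formula for comparing two group means; the numbers plugged in are illustrative, not from the slides.

```python
from statistics import NormalDist

# Minimal sketch linking the four determinants of power listed above to the
# standard two-group sample-size formula (normal approximation for comparing
# two means): n per group = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2
def n_per_group(alpha, power, sigma, delta):
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # (a) significance level
    z_beta = NormalDist().inv_cdf(power)           # desired power (1 - beta)
    return 2 * ((z_alpha + z_beta) * sigma / delta) ** 2

# (b) variability: sd = 0.5 kg; (c) difference to detect: 0.2 kg;
# (d) sample size is what we solve for, at alpha = 0.05 and 80% power.
print(round(n_per_group(alpha=0.05, power=0.80, sigma=0.5, delta=0.2)))  # ~98
```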
M&E BUS

THANK YOU!!
