
MONITORING AND EVALUATION:

METHODS, PROCESS

MIGOM
PARNAVA
KARTIKEY

MODERATOR: Dr. M. Upadhyay

Presentation Outline
What is Monitoring & Evaluation

Why do we need Monitoring & Evaluation

Guiding principles

Who to involve

Key issues and identifying information we need

How to collect the information

Analysing and using the information

Communicating the data


• MONITORING
• The continuous, ongoing collection and review of information on
programme implementation, coverage and use for comparison with
implementation plans

• EVALUATION
• A systematic process to determine the extent to which service needs and
results have been or are being achieved and analyse the reasons for any
discrepancy

Attribute | Monitoring | Evaluation
Main focus | Collecting data on progress | Assessing data at critical stages of the process
Sense of completion | Sense of progress | Sense of achievement
Time focus | Present | Past-future
Main question | What is happening now to reach our goal? | Have we achieved our goal?
Attention level | Details | Big picture
Inspires | Motivation | Creativity
Periodicity | Continuous throughout the whole process | Intermittent; at the beginning or end of significant milestones
Supports | Implementation of a plan | Designing the next planning cycle
Skills required | Management | Leadership
Output processing | Progress indicators need to be closely monitored by a few people | Evaluation results need to be discussed, processed and interpreted by all stakeholders
MONITORING
Continuous follow-up of activities to ensure that they are proceeding according to plan:

Monitor inputs → Monitor process → Monitor outputs
EVALUATION
• Strategy: What was planned?
• Outcome: What was achieved?
• Functioning: How was it achieved?
Monitoring + Evaluation together assess Effectiveness, Efficiency and Impact.
MONITORING AND EVALUATION (M&E)
• Key functions:
• To improve the performance of those responsible for implementing health services
• To determine whether a service/program is accomplishing its goals
• To identify program weaknesses and strengths, areas of the program that need revision, and areas that meet or exceed expectations
NEED OR PURPOSE
• To check whether we are heading in the right direction
• To identify problems in planning and implementation

Monitoring and evaluation sits at the centre of the planning cycle:
Problem identification and cause analysis → Goal and objective setting → Developing the implementation plan → Informing stakeholders
GUIDING PRINCIPLES
• Focused and feasible

• Timely

• Useable

• Credible, valid and reliable

• Sensitive

• Ethical
WHO NEEDS AND USES M&E INFORMATION?
• To improve programme implementation: Managers
• To inform and improve future programmes: Donors, Governments, Technocrats
• To inform stakeholders: Donors, Governments, Communities, Beneficiaries
WHO CONDUCTS M & E?
• Program implementers
• Stakeholders
• Beneficiaries
WHAT TO MONITOR?

PROCESS
1. Assessment of the process of program delivery
2. Understanding how a program works and how it produces results
3. How participants are recruited and maintained
4. How resources are acquired and used
5. Barriers and problems encountered

E.g. number of ASHAs selected by due process (NRHM)
WHAT TO MONITOR?

PROGRESS
• Plan: is the program proceeding as per the plan?
• Compliance: how well does programme implementation comply with the program plan?
• Budget: keeping track of ongoing activities, supplies and equipment, and money spent in relation to the budget allocation
• Delivery: assessment of program delivery

E.g. percentage of anganwadi workers given the 5-day training program at district level (NHM)
HOW TO CARRY OUT M & E
Key features:
1. Program framework: analyse and systematically lay out program elements
2. Identify key elements to monitor and evaluate
3. Determine and describe the measures to be used for monitoring and evaluation
4. Develop the M & E framework and action plans, including data collection and analysis, reporting and dissemination of findings
M & E QUESTIONS
• What is being done?

• By whom?

• Target population?

• When?

• How much?

• How often?

• Additional outputs?

• Resources used?
MONITORING AT DIFFERENT LEVELS

1. Top level: mainly ensures achievement of impact and provision of inputs; the major concern is to devise strategy and allocate resources
2. Middle level: concerned with getting the desired output from the inputs utilized; needs to exercise supervision, provide support and take timely corrective action
3. Lower/operational level: supervises actual operations to ensure that planned activities are carried out as per schedule
FOCUS OF EVALUATION
• Projects

• Programs

• Services

• Processes

• Conditions

ASPECTS OF PROGRAMME

• Inputs

• Processes

• Outputs

• Outcomes

• Impacts

The logic model (mnemonic: C-A-T-SO-LO):
• Components (C), i.e. inputs (what was invested): Staff, Volunteers, Time, Money, Materials, Equipment
• Activities (A), outputs (what was done): Workshops, Meetings, Counselling, Facilitation, Assessments, Training, Recruitment
• Target groups (T), outputs (who was reached): Participants, Patients, Clients, Citizens
• Short-term Outcomes (SO), learning and action: Awareness, Knowledge, Attitude, Skills, Motivations, Action, Behaviour, Decisions
• Long-term Outcomes (LO), ultimate impact: Improved health outcomes; Social, Economic and Environmental impact; Disease prevalence
REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM

INPUT INDICATORS
• Availability of trained faculty
• Availability of funds under IEC/training heads
• Availability of printed material for handouts, venue
REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM

PROCESS INDICATORS
• Number of sensitization meetings
• Trainings conducted
• Number of sputum samples sent to the laboratory from lower centres
• Number of sputum samples examined under microscopy
REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM

OUTPUT INDICATORS
• Number of smear-positive PTB diagnosed
• Number of registered TB patients with non-HIV status
REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM

OUTCOME INDICATORS
• Increase in case notification rates
• Improvement in treatment success rates
REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM

IMPACT INDICATORS
• Reduction in TB prevalence
• Reduction in TB incidence rates
• Reduced number of deaths
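The indicator slides above lend themselves to simple arithmetic. As an illustration only (the function names and counts below are hypothetical, not RNTCP data), outcome indicators such as case notification and treatment success rates are derived from programme counts roughly as follows:

```python
# Illustrative sketch: hypothetical counts, not actual RNTCP data.

def case_notification_rate(notified_cases: int, population: int) -> float:
    """TB cases notified per 100,000 population (an outcome indicator)."""
    return notified_cases / population * 100_000

def treatment_success_rate(cured: int, completed: int, cohort: int) -> float:
    """Cured + treatment completed, as a percentage of the registered cohort."""
    return (cured + completed) / cohort * 100

# Made-up district figures
print(f"{case_notification_rate(1_250, 1_000_000):.0f} per 100,000 notified")
print(f"{treatment_success_rate(820, 90, 1_000):.1f}% treatment success")
```

Impact indicators (prevalence, incidence, mortality) follow the same per-100,000 pattern, but are computed from population-level surveillance data rather than programme cohorts.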
TYPES OF EVALUATION
• Retrospective Evaluation:
• when programs have been functioning for some time.

• Prospective Evaluations:
• when a new program within a service is being introduced

Retrospective evaluation determines what actually happened (and why).

Prospective evaluation determines what ought to happen (and why).
Contd.
• Formative evaluation
Evaluation of components and activities of a program other than their
outcomes. (Structure and Process Evaluation)

INPUT & PROCESSES

• Summative evaluation
Evaluation of the degree to which a program has achieved its desired
outcomes, and the degree to which any other outcomes (positive or
negative) have resulted from the program.

OUTPUT, OUTCOMES & IMPACTS

WHO CONDUCTS EVALUATION?

• Internal evaluation (self-evaluation), in which people within a program sponsor, conduct and control the evaluation
E.g. DTO, MO-TC, CMU in NIPCCD

• External evaluation, in which someone from beyond the program acts as the evaluator and controls the evaluation
E.g. WHO, IIPS
IE Methodology

Selection of districts (per quarter, by state population):
• Up to 30 million: 2 districts
• 30-100 million: 3 districts
• >100 million: 3-4 districts

Selection of TB Units/DMCs:
• The DMC at the DTC
• 2 DMCs that are examining a higher number of presumptive TB cases
• 2 selected randomly from the remaining DMCs

Selection of patients:
• In the 2 DMCs with low case load: 4 NSP patients + 1 previously treated case each (5 × 2 = 10)
• In the remaining DMCs: 4 NSP + 1 each of relapse, treatment after default and failure + 1 TB/HIV patient + 1 DR-TB patient (27)
• 2 pediatric patients undergoing treatment
• Total: 36-39 patients
INTERNAL EVALUATION

Advantages:
• Knows the implementing organisation, its programme and operations
• Understands and can interpret behaviour and attitudes of members of the organisation
• May possess important informal information
• Known to staff, so less threat of anxiety or disruption
• More easily accepts and promotes use of evaluation results
• Less costly
• Doesn't require time-consuming recruitment negotiations
• Contributes to strengthening national evaluation capability

Disadvantages:
• May lack objectivity and thus reduce credibility of findings
• Tends to accept the position of the organisation
• Usually too busy to participate fully
• Part of the authority structure and may be constrained by organizational role conflict
• May not be sufficiently knowledgeable or experienced to design and implement an evaluation
• May not have special subject-matter expertise
EXTERNAL EVALUATION

Advantages:
• May be more objective and find it easier to formulate recommendations
• May be free from organizational bias
• May offer new perspectives and additional insights
• May have greater evaluation skills and expertise in conducting an evaluation
• May provide greater technical expertise
• Able to dedicate him/herself full time to the evaluation
• Can serve as an arbitrator or facilitator between parties
• Can bring the organization into contact with additional technical resources

Disadvantages:
• May tend to produce overly theoretical evaluation results
• May be perceived as an adversary, arousing unnecessary anxiety
• May be costly
• Requires more time for contract negotiations, orientation and monitoring
GUIDELINES FOR EVALUATION (FIVE PHASES)

A: Planning the Evaluation

B: Selecting Appropriate Evaluation Methods

C: Collecting and Analysing Information

D: Reporting Findings

E: Implementing Evaluation recommendations

PHASE A: PLANNING THE EVALUATION
• Determine the purpose of the evaluation.
• Decide on the type of evaluation.
• Decide on who conducts the evaluation (evaluation team).
• Review existing information in programme documents, including monitoring information.
• List the relevant information sources.
• Describe the programme.
• E.g. RNTCP
• Purpose: to find out the inadequacies in programme implementation
• Type: internal evaluation
• Who conducts: DTO/MO-IC
• Review of previous visit reports
• Relevant information sources: referral slips, laboratory request form, tuberculosis treatment card, TB identity card, referral form for treatment of DR-TB
• Registers: tuberculosis laboratory register, culture and DST laboratory register, tuberculosis notification register, second-line TB treatment register
Phase B: Selecting Appropriate Evaluation Methods
• Identify evaluation goals and objectives.
• Formulate evaluation questions and sub-questions.
• Decide on the appropriate evaluation design.
• Identify measurement standards.
• Identify measurement indicators.
• Develop an evaluation schedule.
• Develop a budget for the evaluation.

Sample evaluation questions

Program clients:
• Does this program provide us with high quality service?
• Are some clients provided with better services than other clients? If so, why?

Program staff:
• Does this program provide our clients with high quality service?
• Should staff make any changes in how they perform their work, as individuals and as a team, to improve program processes and outcomes?

Program managers:
• Does this program provide our clients with high quality service?
• Are there ways managers can improve or change their activities to improve program processes and outcomes?

Funding bodies:
• Does this program provide its clients with high quality service?
• Is the program cost-effective?
• Should they make changes in how they fund this program or in the level of funding to the program?
Formative assessment:

Evaluation Area | Evaluation Question | Examples of Specific Measurable Indicators
Staff supply | Is staff supply sufficient? | Staff-to-client ratios
Service utilization | What are the program's usage levels? | Percentage of utilization
Accessibility of services | How do members of the target population perceive service availability? | Percentage of the target population who are aware of the program in their area; percentage of the "aware" target population who know how to access the service
Client satisfaction | How satisfied are clients? | Percentage of clients who report being satisfied with the service received
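Most of the formative indicators above are simple percentages or ratios. A minimal sketch with made-up survey counts (all figures below are hypothetical, purely to show the calculations):

```python
# Hypothetical counts, for illustration only.

def pct(part: float, whole: float) -> float:
    """Express part as a percentage of whole."""
    return part / whole * 100

target_population = 4_000
aware = 3_000        # aware of the program in their area
know_access = 1_800  # of the "aware", know how to access the service

print(f"Aware of program: {pct(aware, target_population):.1f}%")
print(f"'Aware' who know how to access: {pct(know_access, aware):.1f}%")

# Staff-to-client ratio, reported as 1:n
staff, clients = 12, 300
print(f"Staff-to-client ratio: 1:{clients // staff}")
```

Note that the second percentage uses the "aware" group, not the whole target population, as its denominator; choosing the right denominator is the main source of error with cascade indicators like this.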
Summative assessment:

Evaluation Area | Evaluation Question | Examples of Specific Measurable Indicators
Changes in behaviour | Have risk factors for cardiac disease changed? | Compare the proportion of respondents who reported increased physical activity
Morbidity/Mortality | Has lung cancer mortality decreased by 10%? Has there been a reduction in the rate of low birth weight babies? | Age-standardized lung cancer mortality rates for males and females; compare annual rates of low birth weight babies over a five-year period
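The age-standardized mortality rates in the summative table are usually computed by direct standardization: weight each age group's observed rate by a standard population's age distribution, so that rates are comparable across areas or years despite different age structures. A sketch with invented figures (the age bands, counts and weights are all hypothetical):

```python
# Direct age standardization (hypothetical figures, illustration only).
# Each tuple: (deaths, local population, standard-population weight)
age_groups = [
    (10,  50_000, 0.40),  # e.g. 0-39 years
    (60,  30_000, 0.35),  # e.g. 40-59 years
    (150, 20_000, 0.25),  # e.g. 60+ years
]

# Weighted sum of age-specific rates, expressed per 100,000
asr = sum(d / p * w for d, p, w in age_groups) * 100_000

# Crude rate ignores age structure: total deaths over total population
crude = sum(d for d, _, _ in age_groups) / sum(p for _, p, _ in age_groups) * 100_000

print(f"Crude rate: {crude:.1f} per 100,000")
print(f"Age-standardized rate: {asr:.1f} per 100,000")
```

Because both years (or areas) are projected onto the same standard weights, a change in the ASR reflects real mortality change rather than population ageing.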
PHASE C: COLLECTING AND ANALYSING
INFORMATION
• Develop data collection instruments

• Pre-test data collection instruments

• Undertake data collection activities

• Analyse data

• Interpret the data

Pretesting or piloting

The pilot test should involve:
• Testing the instrument
• Testing the way the instrument is administered
• Testing the way the responses are recorded
• Testing supportive documents/procedures
GATHERING OF QUALITATIVE AND QUANTITATIVE
INFORMATION: INSTRUMENTS

Quantitative tools:
• Surveys/questionnaires

• Registries

• Activity logs

• Administrative records

• Patient/client charts

• Registration forms

• Case studies

• Attendance sheets
Other monitoring and evaluation methods:

• Biophysical measurements
• Cost-benefit analysis
• Sketch mapping
• GIS mapping
• Transects
• Seasonal calendars
• Most significant change method
• Impact flow diagram (cause-effect diagram)
• Institutional linkage diagram (Venn/Chapati diagram)
• Problem and objectives tree
• Systems (inputs-outputs) diagram
• Monitoring and evaluation wheel (spider web)
Impact flow diagram (cause-effect diagram)
PHASE D: REPORTING FINDINGS

• Write the evaluation report.
• Decide on the method of sharing the evaluation results and on communication strategies.
• Share the draft report with stakeholders, revise as needed, and follow up.
• Disseminate the evaluation report.
• Checklist, e.g.:
• Resources:
  • Is at least one trained Medical Officer (MO) available in the health facility?
  • Is a full-time trained Lab Technician (LT) available for sputum microscopy?
  • Have provisions been made for sputum collection when the LT is absent?
• Review of forms and registers:
  • Are the lab forms for sputum exams filled correctly?
  • Is the lab register filled correctly and completely?
  • Are results up to date?
• Exit interviews of at least 2 patients undergoing sputum microscopy:
  • Do the patients know how to cough out good quality sputum properly?
EVALUATION REPORT: SUGGESTED OUTLINE
• Title page
• Table of contents
• Acknowledgements
• List of acronyms
• Executive summary
• Introduction
• Findings and conclusions
• Lessons learned
• Recommendations
• Annexures
RNTCP supervisory register
• Recommendation
• Name of the health facility visited
• Name and designation of the supervisor filling this form
• Date and time
• Observations on actions taken based on the previous visit
• Key observations:
  • Politico-administrative commitment and resource management
  • Diagnosis
  • Drugs and laboratory consumables
  • DOT and follow-up
  • TB-HIV activities
  • Records and reports
  • ACSM activities
  • DOTS Plus
  • Findings of home visit
PHASE E: IMPLEMENTING EVALUATION RECOMMENDATIONS
• Develop a new/revised implementation plan in partnership with stakeholders.
• Monitor the implementation of evaluation recommendations and report regularly on the implementation progress.
• Plan the next evaluation.
National Health Mission

• The National Health Mission (NHM) encompasses its two Sub-Missions, the National Rural Health Mission (NRHM) and the newly launched National Urban Health Mission (NUHM)
• The main programmatic components include Health System Strengthening in rural and urban areas, Reproductive-Maternal-Neonatal-Child and Adolescent Health (RMNCH+A), and Communicable and Non-Communicable Diseases
• The NHM envisages achievement of universal access to equitable, affordable & quality health care services that are accountable and responsive to people's needs
• Mission Steering Group
• Empowered Programme Committee
• Common Review Mission
• Joint Review Mission
• International Advisory Panel
MISSION STEERING GROUP
• Highest policy-making and steering institution under NHM
• Provides broad policy direction to the mission
• Advises the Empowered Programme Committee of the mission on policies and operations
• Exercises the main programme and governance functions for the health sector
• Chairperson: Union Minister of Health & Family Welfare
• Fully empowered to approve financial norms in respect of all schemes and components which are part of NHM
COMMON REVIEW MISSION
• Set up as part of the Mission Steering Group's mandate of review and concurrent evaluation
• The first appraisal was conducted in November 2007
• The task of the NRHM CRM was to assess the progress of the NRHM on 24 parameters
• 52 members
• The ninth CRM was held from 30th October to 6th November 2015
Suggested outline of community based monitoring activity
• The monitoring committee at each level reviews and collates the records coming from all the committees dealing with units immediately below it
• It also appoints a small sub-team, drawn from its NGO and PRI representatives, which visits a small sample of units under its purview and reviews the conditions there
• This enables the committee not just to rely on reports but to have a first-hand assessment of conditions in its area
• The monitoring committee sends a periodic report (quarterly for Village, PHC, Block and District levels; six-monthly for State level) to the next higher level committee

Tools for monitoring:
• Format for village health register, village health calendar
• Guideline for information to be collected in village group discussion
• Schedule of ASHA interview
• Interview format for MO PHC / CHC
• Format for exit interview (PHC / CHC)
• Documentation of testimony of denial of health care
Level-wise agencies and activities (quarterly at all levels except State, which reports six-monthly):

• Village: Village Health and Sanitation Committee (VHSC). Reviews the village health register and village health calendar; reviews performance of ANM, MPW and ASHA; sends a brief 3-monthly report to the PHC committee
• PHC: PHC Monitoring and Planning Committee. Reviews and collates reports from all VHSCs; NGO/PRI sub-team conducts FGDs in 3 sample villages under the PHC; visits the PHC, reviews records and discusses with RKS members; sends a brief 3-monthly report to the Block committee
• Block (CHC): Block Monitoring and Planning Committee. Reviews and collates reports from all PHCs; NGO/PRI sub-team conducts an FGD in one PHC and interviews the MO; visits the CHC, reviews records and discusses with RKS members; sends a brief 3-monthly report to the District committee
• District: District Monitoring and Planning Committee. Reviews and collates reports from all Block committees; NGO/PRI sub-team conducts an FGD in one PHC and interviews the MO; visits the CHC, reviews records and discusses with RKS members; sends a brief 3-monthly report to the State committee
• State: State Monitoring and Planning Committee. Reviews and collates reports with its NGO/PRI members; sends six-monthly reports to NHM/Union Health Ministry
Suggested framework to organise information

Village level:
• Main issues for monitoring: ANM/MPW services, ASHA activities
• Reference documents: village health plan, citizens' charter, NHM schemes
• Who: VHSC (incl. ASHA, ANM)
• When: quarterly
• Tools: standard agenda items, village health register (VHR), village health calendar (VHC), ANM/MPW records, village FGDs, interviews of beneficiaries

PHC level:
• Main issues for monitoring: overview of village-level monitoring; staffing, supplies and services availability at the PHC; quality of care at the PHC
• Reference documents: PHC health plan, charter of citizens' health rights at the PHC
• Who: PHC monitoring and planning committee (PRI members etc.), PHC RKS members
• When: quarterly
• Tools: standard agenda items, reports from VHSCs, records of select village FGDs, interview of the MO PHC, exit interviews of PHC patients
Block level:
• Main issues for monitoring: overview of health services in the block; staffing, supplies and service availability at the CHC; quality of care at the CHC from people's perspective
• Reference documents: CHC health plan, charter of citizens' health rights at the CHC
• Who: CHC monitoring and planning committee (incl. PRI members etc.), CHC RKS members, facilitation by nodal NGO/CBO
• When: quarterly
• Tools: standard agenda items for the CHC committee meeting, reports from PHC committees, records of visits to select PHCs, interview of the MO in charge of the CHC, report of the district health mission

District level:
• Main issues for monitoring: overview of all public health services in the district (except services provided by municipal bodies), state-specific health schemes; quality of care at the district hospital and sub-divisional hospitals
• Reference documents: district health plan, charter of citizens' health rights at the district level
• Who: district health monitoring and planning committee, public hearing facilitator team
• When: 6-monthly
• Tools: standard agenda items for the district committee meeting, reports from block health committees, records of visits to select sub-divisional hospitals/CHCs
State level:
• Main issues for monitoring: all issues of rural public health services/NHM in the state, including state-specific health schemes
• Reference documents: state health plan and state PIP; NHRC recommendations and the state government component of the NHRC national action plan; all NHM schemes (ASHA, JSY, untied funds expenditure); IPHS and functioning of facilities at various levels; national health programmes and the family planning insurance scheme; PPP and related regulations; state health budget and expenditure
• Who: state health monitoring and planning committee; state people's rural health watch
• When: six-monthly committee meetings; annual independent reports/public meetings; report/citizens' report by civil society groups; public meeting of the state mission with civil society representatives
• Tools: reports from district health committees, records of visits to select districts, report of the state health mission, reports of district public hearings, independent reports
CONCLUSION:
WHY MONITORING AND EVALUATION

• M&E should be part of the design of a program
• Ensures systematic reporting and accountability
• Communicates results
• Measures efficiency and effectiveness
• Provides information for improved decision making
• Ensures effective allocation of resources
• Promotes continuous learning and improvement
Thank you
