
Monitoring and Evaluation Training Guidelines

For

Health Insurance System

April 2022
Session overview and timeline

Time, duration, topic(s) and training methods:

• 8:00-8:30 (30 minutes): Welcome and Introduction. Method: Introductory facilitator presentation.

• 8:30-10:00 (1½ hours): Why M&E? Methods: Facilitator presentation; group discussion.

• 10:00-12:00 (2 hours): M&E for the Health Insurance system. Methods: Facilitator presentations; group discussions.

• 12:00-1:00 (1 hour): Lunch break.

• 1:00-3:00 (2 hours): Indicators; project-level M&E: M&E targets and matrix for the current strategic year. Methods: Facilitator presentations; group discussions; group activity.
2
Methods and remarks:

• PowerPoint presentations – in progress; exercises and questions to be prepared.

• Brainstorming, interactive discussion on different topics, and group exercises – datasets on claims, membership and contribution required.

• Daily recap and discussion.

3
Training Methodology

•The sessions of this five-day training workshop are designed based on the principles of adult learning
and participatory approaches. Methodologies include lectures accompanied by PowerPoint slides, group
discussions, group work, activities and presentations, brainstorming and mapping exercises and games
for session linkages.

•Module 1: Introduction to Monitoring and Evaluation (M&E) in General
• At the end of this training, participants will demonstrate capability in:

• Explaining the objectives of monitoring and evaluation for the health insurance system
• Defining monitoring, evaluation, efficiency, effectiveness and the monitoring cycle
• Identifying the questions that a good monitoring system should answer
• The national M&E guidelines

4
Session one

• Session One: Introductory Activity and Training Objectives

• 8:30-9:00 (30 minutes): Introductory Activity and Training Objectives. Methods: Group activity; facilitator presentation.

•Materials
• Slips of paper, 1 for each participant, each with a different M&E term written on it
• Copy of training workshop schedule
• LCD Projector
• PowerPoint presentation

5
• Facilitator Note: This session is the “icebreaker.” During this time participants and
facilitator(s) will introduce themselves and describe their relevant experience, current position
and organization.

• In addition to “getting to know you,” the main purpose of this introductory activity is
for participants to become familiar with terms related to monitoring and evaluation and to
complaints and grievance handling, as well as for the facilitator(s) to get a brief
assessment of participants’ understanding of these terms.

6
Introductory Activity

•Introductory Activity
•Welcome participants and invite them to take a seat anywhere they’d like; as you are welcoming
them, provide each participant with a slip of paper with an M&E-related term on it.

•Once all participants have entered and taken their seats, say, “Welcome to this monitoring and
evaluation training workshop. As you can see, I have given each of you a piece of paper with a word
written on it; I also have one myself. We are going to go around the room, and I would like each
person to introduce himself/herself, state his/her job title and name the organization he/she is
representing. After you have finished, I would like you to define the word on your piece of paper, to
the best of your ability, and describe how this word relates to monitoring and evaluation. I’ll begin.
My name is (name). I am the (job title) for (organization). My term is (read term aloud). This word
means (define) and relates to M&E because (explain).”

7
Introduction

• The Ethiopian Health Insurance Service (EHIS), the former Health Insurance Agency,
has been implementing a community-based health insurance (CBHI) program and is currently
embarking on rolling out a social health insurance program. As of 2020/21, the CBHI
program covered more than 800 woredas and provided financial protection for
healthcare to more than 40 million beneficiaries.

• As part of its Ten-Year Strategic Plan, the EHIS aims to achieve universal
health insurance coverage by 2030. Strengthening the monitoring and evaluation
system is one of the core initiatives of the EHIS for the successful implementation
of planned activities over the ten-year period.

8
Continue….

•Facilitator Note: It is important to emphasize to participants that this is not a test; they are
not expected to have “right” definitions, nor are they expected to be able to “correctly”
link their specific term to monitoring and evaluation. Remind participants that this is
just a fun exercise to allow them to become comfortable with monitoring and evaluation
terminology while introducing each other to the rest of the group.

• Once this activity is complete, ask participants to try to remember their word, how
they have defined it and how they linked it to monitoring and evaluation; request that they take
note of how their ideas about this word and its relation to monitoring and evaluation
may or may not change throughout the course of this session.

9
Methodology

• This training aims to build the capacity of staff in the health insurance system on
M&E and the complaints management system. Therefore, the training will be delivered
in different phases.
• The first phase will include participants from RHBs, zones, woredas, contracted health
facilities and EHIS branches. Subsequent phases will cascade the training from
woredas down to kebeles.

• These cascaded trainings will be managed by those who took the TOT and will be
mentored by staff from EHIS headquarters.

10
Training overview

•Training Overview
•Briefly orient participants on the sessions and schedule of this training workshop. If
you have made copies of the schedule, distribute them to participants. In place of
individual copies, you may paste a copy of the training schedule in a visible part of the
training room.

•Training Goals and Objectives
•Introduce the goal and training objectives of this workshop.
• Facilitator Note: Goals and objectives should be exhibited in a PowerPoint presentation or a large flip
chart

11
Continue….

• The goal of this training workshop is to orient and sensitize EHIS project partners
and implementing stakeholders on the fundamental aspects, systems and processes
of monitoring and evaluation.
•The objectives of this monitoring and evaluation training workshop are as follows:
(1) To highlight the importance and use of M&E for development
(2) To introduce the national M&E plan
(3) To provide conceptual clarity on monitoring and evaluation
(4) To introduce the framework for monitoring and evaluation of EHIS interventions
(5) To introduce indicators and share available strategic information for EHIS
(6) To sensitize on data quality and strengthen project level M&E functions
(7) To generate commitments and an action plan for M&E

12
Introduction ……

• As part of strengthening the M&E system of health insurance programs, the EHIS has
developed a Monitoring and Evaluation Manual which will guide the overall implementation
of monitoring, evaluation, learning, research, and data management activities by
stakeholders in the health insurance system.

• The successful implementation of the new M&E manual among the respective stakeholders
is contingent upon the capacity of the staff working on M&E, data management, and
quality assurance.

13
Continue…..

• Facilitator Note: Ask participants to keep the objectives in mind throughout the sessions. Do they feel the
objectives are being met?

•Session Conclusion
•Wrap up this session by saying, “So we are all here to receive training on monitoring
and evaluation, but why do we need monitoring and evaluation? What purpose does it
serve?”

14
Introduction…

• Consequently, it has been found essential to provide training of trainers (TOT) to the
respective staff so that they will be equipped with the knowledge and skills required to
effectively implement the M&E manual, improve data quality, and properly utilize
information for decision-making. The staff who receive the TOT will then cascade
it to others under their authority.

15
Module one session two

Why M&E?

8:30-10:00 (1½ hours): Why M&E? Methods: Group discussion; facilitator presentation.

•Materials
• LCD Projector
• PowerPoint presentation
•Group Discussion
• Begin this session by asking participants, “What did you do when you got up this morning?”

16
Continue…..

• Facilitator Note: The purpose of this exercise is to relate monitoring and evaluation to a
mirror; monitoring and evaluation is the “mirror” of our work.
• If no one mentions looking in a mirror, ask leading questions, such as, “What did you do after
you got dressed?” or “Did you shave/put on make-up this morning?
• If yes, did you do so while looking in a mirror?” Elaborate on this point by asking, “How
would it be to shave/put on make-up without looking in the mirror?

• What is the difference between shaving/putting on make-up with and without looking in the
mirror?” Relate this to monitoring and evaluation; with M&E we can reflect on our project
and its process, just as we use a mirror to see our own reflection when we shave/put on make-
up – it allows us to see what we’re doing and reduces the chance of error.

17
Continue …..

•Facilitator Presentation
•Exhibit PowerPoint presentation. The following topics should be included in this
presentation:
(1)M&E as a tool to address emerging challenges
(2)M&E as a National priority for Health Insurance System
(3)M&E as an element in the process of managing for impact
• This presentation is intended to illustrate how M&E contributes to EHIS programs on
both the regional and national scale.
•Session Conclusion
• Conclude this session by discussing how M&E functions as an element in managing for impact.

18
Monitoring definition

•Monitoring Definition
• Monitoring is the routine process of data collection and measurement of progress toward program objectives.
• It is a continuous form of reflection involving examining what we are doing and routinely looking at the
quality of our services.
• Monitoring is “an internal process undertaken by management carried out to assess progress at regular
intervals throughout the life of a project or program.”

• Monitoring cycle
•Monitoring is a four-step cyclical process where each step works to inform the other.
• Firstly, a monitoring plan must be prepared; this means creating a system for monitoring the progress of your
project.
• The next step is to collect and analyze data.

•For example, if your organization was running a health service utilization program, one way
to monitor its progress would be to collect and analyze data on how many beneficiaries
utilize health services at health facilities.
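
•As a minimal illustration of this step of the monitoring cycle, the sketch below counts beneficiary visits per reporting month from a small set of claim records and compares them with a planned target. The record fields, values and target are hypothetical examples for the exercise, not part of the EHIS datasets; in practice the same calculation would be run on the claims, membership and contribution datasets noted in the session overview.

```python
from collections import Counter
from datetime import date

# Hypothetical claim records; in practice these would come from the scheme's claims dataset.
claims = [
    {"beneficiary_id": "B001", "visit_date": date(2022, 1, 14), "facility": "HC-01"},
    {"beneficiary_id": "B002", "visit_date": date(2022, 1, 20), "facility": "HC-01"},
    {"beneficiary_id": "B001", "visit_date": date(2022, 2, 3),  "facility": "HOSP-1"},
]

# Step 2 of the monitoring cycle: collect and analyze the data --
# here, count visits per reporting month.
visits_per_month = Counter(c["visit_date"].strftime("%Y-%m") for c in claims)

monthly_target = 2  # illustrative planned number of visits per month
for month, visits in sorted(visits_per_month.items()):
    status = "on track" if visits >= monthly_target else "below target"
    print(f"{month}: {visits} visit(s) - {status}")
```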

19
Monitoring cycle continued….

•If your monitoring plan has been successful, you should be able to answer the following
questions:
(1)Are we on the way to our planned objective(s)?
(2)To what extent are planned activities being implemented?
(3)Are project activities being carried out correctly, on time and within budget?
(4)How well are services being provided?
(5)What services are we providing to whom, when, how often, for how long and in what
context?
(6)Are the objectives and targets reasonable?
(7)Do we need to take corrective action?
• Facilitator Note: It is recommended that you display a few examples at this juncture to
reiterate certain points about monitoring.

20
Evaluation Definition

•Evaluation Definition
• Evaluation is concerned with the relevance, effectiveness, efficiency, impact and sustainability of what has been done.
• It is the process of identifying and reflecting on the effects of a process, thus far, and judging the value of it.
• It utilizes social research methods involving measurements over time, study design and special studies, to systematically
investigate a program’s effectiveness.
• Sometimes evaluation requires a control or comparison group

•Effectiveness and Efficiency Definition

•Differentiating between effectiveness and efficiency, for the purpose of M&E, is parallel to distinguishing between
monitoring and evaluation. Although both words are quite similar, they are the result of two different functions.
• Effectiveness is a function of good design.
• Efficiency is a function of good management.

•In this context, effectiveness is synonymous with evaluation in that they are both concerned with impact and
effects.

•Efficiency, on the other hand, is the equivalent of monitoring, as they involve outputs, activities and inputs.

21
Differentiating between Monitoring and Evaluation

Monitoring:
• Concerned with efficiency of resource deployment
• Measures project inputs, activities and outputs
• Structured, top-down methods (reporting)
• Regularly scheduled reporting processes
• Emphasis on decision making
• Asks “are we doing the thing right?”
• Absolutist view (“are we doing what we said we would?”)
• Project team accountable to project management
• Provides the starting point for evaluation

Evaluation:
• Concerned with effectiveness of strategy
• Measures project effects and impact
• Participatory, bottom-up methods (inquiry)
• Periodically scheduled inquiring processes
• Emphasis on organizational learning
• Asks “are we doing the right thing?”
• Philosophical view (“is what we’re doing actually worth doing?”)
• Project management accountable to stakeholders, especially beneficiaries

22
Session Conclusion

•Session Conclusion
•Conclude this session by informing participants that monitoring and evaluation systems must be in place in
order for organizations to demonstrate accountability for the past, enhance management decision-
making in the present and to promote organizational learning for the future.

•Without effective monitoring and evaluation systems in place there would be limited learning from both
successes and failures resulting in lost opportunities to know how to handle a mistake the next time it occurs or
benefit from effective approaches or successes.

•The recent history of much development and aid work shows that there has been limited institutional learning
despite huge efforts and expenditures. There are too many examples of old mistakes being repeated despite
serious attempts at quality planning!

•By monitoring and evaluating our programs we reduce the risk of program duplication and error, and
increase accountability.

23
Monitoring and Evaluation function

• Monitoring and Evaluation Functions


•M&E functions to achieve five main things - FALAM
•F – Follow-up
•A – Accountability
•L – Learning
•A – Action
•M – Management

•Let’s discuss some of these in detail.


•Accountability is generally recognized as being responsible to higher levels, such as donors
and funding agencies. While M&E does accomplish this, it also allows for downward accountability,
enabling transparency of accounts and plans to beneficiaries and stakeholders.

• M&E functions as a learning tool to improve strategies. It allows recognition of both positive
and negative outcomes, facilitating the ability to recognize areas of weakness as well as aspects
of success.

24
M&E function continue…..

•In this sense M&E facilitates action. Learning and feedback generated by critical reflection are applied to programs
and projects to improve actions.

•This action, in turn, influences day to day management and decision making. The conclusions drawn by monitoring and
evaluation procedures determine processes of leading, planning and organizing staff as well as controlling activities,
people and other resources in order to achieve particular objectives.

•Benefits and Expectations of Monitoring and Evaluation


•There are many expected benefits of M&E which improve program design and implementation as well as organizational
cohesiveness. Implementing an M&E program in your organization will result in:
1) Evidence-based programming
2) Management efficiency
3) Documentation and determination of what works
4) Early warning in programs for corrective action
5) Information to managers for day-to-day decisions
6) Participation, understanding and consensus
7) Accountability
8) Advocacy and transparency
9) Demonstration of success and formulation of policies
10) Wise allocation of funds
25
M&E function continue…..

•Monitoring and Evaluation Expectations


•There are several attributes which are expected from an M&E plan; these include:
• Innovation
• Reach or coverage
• Cost efficiency
• Replicability
• Public/stakeholder opinion
• Quality
• Sustainability
• Impact

26
EHIS Indicators and selection criteria
10:00-11:00 (1 hour): Key EHIS indicators and definitions. Method: Facilitator presentation.

27
National core indicator

•National Core Indicators

•Use the e-copy of the National Indicator Definitions to review the 27 National Core
Indicators; these are:
• Input indicators – measure elements such as people, money, equipment, policies, etc.
• Process indicators – measure trainings, logistics, management, etc.
• Output indicators – measure services, service usage, knowledge, etc.
• Outcome indicators – measure behavior change, etc.
• Impact (national commitment and action) indicators

28
EHIS Indicator….

•Materials
•LCD Projector
•PowerPoint Presentation
• Facilitator Note: It is important to keep the atmosphere exciting and keep participants’
energy high. The next two sessions will have your participants listening to you lecture
and present for 2 hours! Remember to talk to them, not at them. Engage them;
encourage questions and participation.
•Exhibit the PowerPoint presentation on the Fundamentals of M&E; it should cover the following points:
• Defining indicator selection criteria
• Overview of the 27 Core National Indicators
• Defining EHIS evaluation indicators
• Explaining indicator linkage with monitoring and evaluation
• Evaluating EHIS indicators based on effectiveness
• The focus of EHIS indicators

29
EHIS Indicator….Continue

• Facilitator Note: This presentation should include photographs and illustrations to
draw analogies from. It is recommended that the first slide of your PowerPoint
presentation be a picture.

30
EHIS Indicator….Continue

• Indicators and indicator selection processes

• Indicator Definition

• EHIS log frame


• EHIS Indicator linkage with M&E
• Group exercise

31
EHIS Logical Framework

• Module 1 – Session Five: Indicators, Project-Level M&E and the M&E Matrix


•Materials
•LCD Projector
•PowerPoint presentations
•National Indicator Checklist
•Pens/pencils
•Exhibit the PowerPoint presentation on Indicators. It should cover the following main points:
• Defining indicators and the different types of indicators
• How and why indicators are established
•Indicator Definition
• An indicator is a measurable statement of program objectives and activities.
• It is a variable, measure or criterion that measures one aspect of a program/project.
• Simply stated, an indicator verifies whether an intended change actually occurred.

32
CONTINUE….

•Indicators are developed for two reasons:
1) To measure attainment of inputs, processes, outputs, effects and impacts related to the project design
hierarchy.
2) To answer key questions in the evaluation of projects and programs.

•Types of Indicators
• Numeric indicators are expressed as counts, percentages, ratios, proportions, rates or averages.
•i.e., enrollment rates, mortality rates, case finding rate, etc.

• Non-numeric indicators are expressed in words; sometimes they are also referred to as qualitative or
categorical indicators.
•i.e., selection and training of volunteers, committee formations, etc.
• Input indicators measure the inputs which go into the design and implementation of a program
• i.e., number of training materials, staff members and infrastructure

33
Continue…

• Process indicators measure the activities which are being conducted to achieve the outputs and
outcomes
•i.e., number of training workshops conducted or number of site visits.

• Output indicators are used to measure the product of the immediate inputs of project activities; they
describe the goods and services produced by program activities.
•i.e., number of staff members trained or the number of IEC materials distributed
• Outcome indicators are used to measure the effect of an intervention on systems or behaviors
resulting from the achievement of an immediate goal after a specified interval of time; commonly
for short-term assessment, such as one or two years after a program or project has been
implemented.
•i.e., the number of clinics which meet new quality standards

• Impact indicators measure the actual change in conditions of key problems or unmet needs which
are linked to the ultimate program goal; the impacts which program/project activities have had on a
population over a longer period of time.
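
• For the group exercise later in this session, the indicator levels above can be illustrated with a simple lookup, as in the sketch below; the indicator names are illustrative examples drawn from this section, not the official national indicator checklist.

```python
# Illustrative mapping of example indicators to their hierarchical level,
# mirroring the definitions above (input, process, output, outcome, impact).
INDICATOR_LEVELS = {
    "Number of training materials available": "input",
    "Number of training workshops conducted": "process",
    "Number of staff members trained": "output",
    "Number of clinics meeting new quality standards": "outcome",
    "Change in financial protection for beneficiaries": "impact",
}

def classify(indicator: str) -> str:
    """Return the hierarchical level of a known indicator, or 'unclassified'."""
    return INDICATOR_LEVELS.get(indicator, "unclassified")

print(classify("Number of staff members trained"))  # -> output
```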

34
Measuring Indicators

•Measuring Indicators
•Explain how to create a numeric indicator, i.e., a percentage. Explain that you must first have a numerator and a
denominator, and then calculate (numerator ÷ denominator) × 100.

•Facilitator Note: Demonstrate this on a blackboard/flipchart. Create one or two examples to ensure participants
understand the process of generating a numeric indicator.
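
•To make the numerator and denominator concrete, the sketch below computes an illustrative membership renewal rate of the kind tracked in the CBHI reporting forms later in this guide; the figures are invented for demonstration.

```python
def percentage_indicator(numerator: int, denominator: int) -> float:
    """Numeric indicator expressed as a percentage: numerator / denominator * 100."""
    if denominator == 0:
        raise ValueError("Denominator must be greater than zero")
    return numerator / denominator * 100

# Illustrative example: CBHI membership renewal rate.
members_renewed = 450          # numerator: members who renewed this period
members_due_for_renewal = 600  # denominator: members whose membership expired
rate = percentage_indicator(members_renewed, members_due_for_renewal)
print(f"Renewal rate: {rate:.1f}%")  # -> Renewal rate: 75.0%
```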

•Remind participants that when measuring indicators, it is important to consider where and how the information
will be obtained; these two factors can greatly influence data.
•Developing Indicators
•Indicators and the information utilized to realize them must be attached to each level of the program planning
hierarchy; impact, goal, objectives, intermediate results, activities (inputs, outputs, processes) and outcomes.

• When developing indicators, the five core evaluation questions must be answered:
1) Relevance
2) Efficiency
3) Effectiveness
4) Impact
5) Sustainability
35
Measuring indicator continue……

•The information system should be varied in order to capture cross-cutting issues and to incorporate views of
stakeholders and beneficiaries; this contributes to participation and transparency.

• Facilitator Note: Emphasize the importance of taking note of both unintentional positive and negative
outcomes and impacts of the program to participants
•Group Exercise
• This exercise is intended for groups of two. Once everyone has paired off, indicator checklists should be
distributed
•Request that each pair work together to identify the appropriate hierarchical level of the listed indicators. Twenty
minutes will be provided for participants to complete this activity.

•Facilitator Note: This activity has been designed to assess participants’ knowledge regarding indicators

•In some circumstances indicators may be applicable to more than one level; inform participants of this so they are
aware that they can check more than one level.

•When the allocated time has expired, review the indicators and their levels with participants. Explain why each
indicator is located in its particular position.

•Allow time for questions.

36
Data Quality Improvement

• Session Two: Data Quality

8:30-10:00 (1½ hours): Data Quality and Feedback Systems. Methods: Facilitator presentation; group activity.

37
Data Quality

•Materials
•LCD Projector
•PowerPoint presentation
•White Board
•Markers
• Facilitator Presentation
•Exhibit the PowerPoint presentation on data quality. The presentation should cover the following main
points:
• Defining data and data quality
• Criteria for measuring data quality
• Criteria for establishing good data
• Steps for verifying data
• Emphasize that data quality is a continuous commitment! In order to maintain data quality there
needs to be training and re-training on reporting systems and indicator definitions for new staff
members. The data need to be used to improve actions, and narrative documentation must be kept
to support quantitative accomplishments.
38
Data definition

•Data Definition
• Factual information used as a basis of reasoning, discussion or calculation
• Must be recorded; if it is not written down, it is not data

•Data Quality Definition
• How well the information collected represents the program activities
• Refers to the worth/accuracy of the information collected
• Data that reflects true performance
• Focuses on ensuring that data management process is of a high standard

•Measuring Quality Data
•The data is only as good as its quality. The quality of data is measured against six criteria:
1) Validity – data which clearly, directly and adequately represents the result that was intended to be
measured

39
Data quality continue…..

2) Reliability – if the process were repeated over and over again it would yield the same result
3) Integrity – data have been protected from deliberate bias or manipulation for political/personal reasons
4) Precision – data are accurate with sufficient detail
5) Timeliness – data are current and information is available on time
6) Confidentiality – clients are assured that their data will be maintained according to national and/or international standards for data

•Good quality data can be recognized by checking the FACTS
•F – Formatted Properly
•A – Accurate
•C – Complete
•T – Timely
•S – Segmented properly

•Always do a FACTS check on your data to ensure it maintains quality!
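
•As an optional illustration for facilitators, the sketch below shows how a FACTS-style check could be applied to a single report record; the field names, values and due date are assumptions made for the example, not part of the EHIS reporting tools.

```python
from datetime import date

# Hypothetical monthly report record; field names are illustrative only.
report = {
    "newly_enrolled_male": 120,
    "newly_enrolled_female": 130,
    "newly_enrolled_total": 250,
    "submission_date": date(2022, 4, 5),
}

def facts_check(rec: dict, due: date) -> dict:
    """A simple FACTS-style check: Formatted, Accurate, Complete, Timely, Segmented."""
    numeric_fields = ["newly_enrolled_male", "newly_enrolled_female", "newly_enrolled_total"]
    return {
        "Formatted": all(isinstance(rec.get(f), int) for f in numeric_fields),
        "Accurate": rec.get("newly_enrolled_male", 0) + rec.get("newly_enrolled_female", 0)
                    == rec.get("newly_enrolled_total"),
        "Complete": all(f in rec for f in numeric_fields + ["submission_date"]),
        "Timely": rec.get("submission_date", due) <= due,
        "Segmented": all(f in rec for f in ["newly_enrolled_male", "newly_enrolled_female"]),
    }

for criterion, passed in facts_check(report, due=date(2022, 4, 10)).items():
    print(f"{criterion}: {'OK' if passed else 'CHECK'}")
```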



•Why?

40
Continue…

•Facilitator Presentation
•How is it that we can verify data? With RATS:
• R – Raw data (i.e., forms)
• A – Aggregated data (i.e., tables)
• T – Trends (i.e., consistent changes over time)
• S – Segmented data (i.e., split by sex, age, etc.)
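
•To illustrate the R, A and S of RATS, the sketch below moves from raw membership records to an aggregated table segmented by sex and payment status; the record layout is assumed for the example.

```python
from collections import defaultdict

# Raw data (R): individual registration forms, here as simple dictionaries.
raw_records = [
    {"member_id": "M01", "sex": "female", "status": "paying"},
    {"member_id": "M02", "sex": "male",   "status": "non-paying"},
    {"member_id": "M03", "sex": "female", "status": "paying"},
]

# Aggregated (A) and segmented (S) data: counts split by sex and payment status.
table = defaultdict(int)
for rec in raw_records:
    table[(rec["sex"], rec["status"])] += 1

for (sex, status), count in sorted(table.items()):
    print(f"{sex:6s} {status:10s} {count}")
# Trends (T) would come from comparing such tables across reporting periods.
```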

•Following the conclusion of this module, participants should have a working understanding of:
• Quality data
• The components of good data
• The actions required for quality data improvement
• Verifying data
Facilitator Note: After conducting a brief Q&A with participants regarding the day’s sessions, select one group to
prepare a brief presentation on the sessions covered, to be presented the following morning.
Inform the group who will be undertaking this task that the presentation should be a maximum of 10 minutes.

41
Materials – Facilitator

•Materials – Facilitator
• LCD Projector
• Previously prepared PowerPoint presentations

• Markers
• Pens/pencils
• Handout: Slips of paper, one for every participant, each with one monitoring and
evaluation related word written on it
• Handout:
• Handout: National Indicator Checklist, EHIS Reporting and Recording tools

42
Materials – Participants

• Handout: National Indicator Checklist, EHIS Reporting and Recording tools

• Pens/pencils
• Handout
• M&E, Complaints and Grievance Handling Manual
• Laptop

43
• Data Analysis – 1½ days allowed

•Data Analysis practical exercise
44
Reporting flow/Reporting in the HIS

•8:30-10:00 (1½ hours): Facilitator presentation (reporting flow presentation)
• ½ hour: Discussion on reporting flow

•Materials – Facilitator
• LCD Projector
• Previously prepared PowerPoint presentations
• Markers
• Pens/pencils
• Handout: Slips of paper, one for every participant, each with one monitoring and evaluation related
word written on it
• Handout:
• Handout: EHIS reporting flow handout

•8:30-10:00 (1½ hours): Data Quality and Feedback Systems. Methods: Facilitator presentation; group activity.

45
REPORTING AND RECORDING TOOLS


Facilitator presentation – 1 day

•Discussion on reporting flow

•Materials – Facilitator
• LCD Projector
• Previously prepared PowerPoint presentations
• Markers
• Pens/pencils
• Handout: Slips of paper, one for every participant, each with one monitoring and
evaluation related word written on it
• Handout:
• Handout: EHIS Recording and Reporting tools handout

46
Reporting and Recording Tools

• Detailed practical exercise on member and provider raw data – ½ day

47
Monthly Kebele CBHI Section Activity Summary Report

CBHI Membership (in the reporting period; report Male, Female and Total for each row):
• Number of CBHI members newly enrolled – Paying / Non-paying / Total
• Family members newly registered (including HH heads) – Paying / Non-paying / Total
• Number of CBHI members renewing their membership – Paying / Non-paying / Total
• Total family members with membership renewal (including HH heads) – Paying / Non-paying / Total
• IDs provided out of total registered members – Paying / Non-paying / Total

CBHI Contribution Collection (in the reporting period):
• Expected money to be collected from registered members
• Contribution from paying members (actually collected)
• Amount deposited to bank
• Receipt voucher (RV): # of receipts unreturned from previous years; # of receipts received this year; # of receipts returned in this month
Quarterly Woreda Scheme Health Insurance Activity Summary Report

Region: _____  Name of Zone/Sub-City: _____  Name of Woreda/Scheme: _____
Reporting period: _____  Submission date: _____

CBHI Membership (report Male, Female, Total and Remark for each row):
• Number of CBHI members newly enrolled – Paying / Non-paying / Total
• Family members newly registered (including HH heads) – Paying / Non-paying / Total
• Number of CBHI members renewing their membership – Paying / Non-paying / Total
• Total family members with membership renewal (including HH heads) – Paying / Non-paying / Total
• IDs provided out of total registered members – Paying / Non-paying / Total

CBHI Revenue Mobilization (in the reporting period; with Remark):
• Total contribution collected from paying HHs
• Expected money to be collected from registered members
• Amount deposited to bank
• Receipts voucher (RV): # of receipts unreturned from previous years; # of receipts distributed; # of receipts returned
• Targeted subsidy: expected from region; deposited by region; expected from woreda; deposited by woreda
• Other sources

Health Service Utilization (report Male, Female, Total and Remark):
• Total number of visits by health insurance beneficiaries
• Number of claims submitted (Paying / Non-paying / Total) for: OPD visits; IPD visits; Total
• Amount of claims submitted (in Ethiopian Birr, with Remark) for: card; lab; drugs; imaging; surgery/procedure; bed including food; other services; total amount claimed
• Total length of stay in days (for IPD)

Claim Management (request) (report HC, Hospital, Total and Remark):
• Total claims – number; amount in Birr
• Claims rejected – number; amount of money deducted due to claim audit; amount of money added due to claim audit
• OOP reimbursement – number; amount in Birr
• Claims from 3rd party – number; amount in Birr

CBHI Expense (in the reporting period; with Remark):
• Total amount paid for health centers
• Total amount paid for hospital(s)
• Total OOP reimbursement
• Total amount paid for 3rd party
• CBHI expense by type of visit: amount paid for OPD visits; amount paid for IPD visits
• Amount paid to higher-level pool
• Amount deposited to higher-level pooling: regional pool; zonal pool; total
• Any other expenses
• CBHI fund current balance at scheme level (bank statement)
• Amount payable

Medical Audit Status (report HC, Hospital, Private Clinic, Pharmacy, Total):
• Total no. of HFs clinically audited
• Total # of claims audited

Health care providers contracted (report HC, Hospital, Private Clinic, Pharmacy, Total):
• Total # of HFs contracted
• Number of accredited HFs out of contracted

Referral (Number, Remark):
• For further treatment and investigation
• Due to stock out of services

Patient outcomes (Number):
• Discharged/cured; death; referred; others

General Assembly and Board meetings, and related (Number, Remark):
• # of General Assembly meetings conducted in the reporting period
• # of CBHI Board meetings conducted in the reporting period
• # of community-facility-provider forums conducted in the reporting period
• # of times the CBHI scheme was provided with supportive supervision (SS): by EHIA HQ/Branch; by health system; by partners; total

Complaints Management (report HC, Hospital, Total and Remark):
• Number of health facilities implementing complaint handling mechanisms
• # of complaints
• Main reasons for complaints (# of complaints registered, # addressed, Remark): receptions related; shortage of drugs and supplies; shortage of lab and imaging services; others; total

Number of individuals trained on health insurance (Male, Female, Total):
• Political leaders; community leaders; woreda CBHI coordinators/staff; health extension workers (HEWs); others; total

Human Resources of the Woreda CBHI Scheme (report Health Professional, IT, Accountant, Others):
• Total # of staff at the beginning of the fiscal year (report only at the beginning of the year)
• Number of new staff recruited
• Total # of staff resigned
• Total # of staff at the end of the fiscal year (report only at the end of the year)
Quarterly EHIA Branch Health Insurance Activity Summary Report

Region: _____  Name of Branch: _____  Reporting period: _____  Submission date: _____

Reporting EHIA Branch Profile (report Existing/previously reported, Newly included in the reporting period, Total and Remark):
• Number of functional CBHI woredas under the branch
• Number of woredas initiating CBHI under the branch
• Functional woreda scheme population
• No. of eligible households in functional CBHI woredas

CBHI Membership (only for functional woredas; report Male, Female, Total and Remark):
• Number of CBHI members newly enrolled – Paying / Non-paying / Total
• Family members newly registered (including HH heads) – Paying / Non-paying / Total
• Number of CBHI members renewing their membership – Paying / Non-paying / Total
• Total family members with membership renewal (including HH heads) – Paying / Non-paying / Total
• IDs provided out of total registered members – Paying / Non-paying / Total

CBHI Revenue Mobilization (in the reporting period; with Remark):
• Total contribution collected from paying HHs
• Expected money to be collected from registered members (actually collected)
• Amount deposited to bank
• Receipts voucher (RV): # of receipts unreturned from previous years; # of receipts distributed; # of receipts returned
• Targeted subsidy: expected from region; deposited by region; expected from woreda; deposited by woreda
• Other sources

Health Service Utilization (report Male, Female, Total and Remark):
• Total number of visits by health insurance beneficiaries
• Number of claims submitted (Paying / Non-paying / Total) for: OPD visits; IPD visits; Total
• Amount of claims submitted (in Ethiopian Birr, with Remark) for: card; lab; drugs; imaging; surgery/procedure; bed including food; other services; total amount claimed
• Total length of stay in days (for IPD)

Claim Management (request) (report HC, Hospital, Total and Remark):
• Total claims – number; amount in Birr
• Claims rejected – number; amount of money deducted due to claim audit; amount of money added due to claim audit
• OOP reimbursement – number; amount in Birr
• Claims from 3rd party – number; amount in Birr

CBHI Expense (in the reporting period; with Remark):
• Total amount paid for health centers
• Total amount paid for hospital(s)
• Total OOP reimbursement
• Total amount paid for 3rd party
• CBHI expense by type of visit: amount paid for OPD visits; amount paid for IPD visits
• Expense due to higher-level pool
• Amount deposited to higher-level pooling: regional pool; zonal pool; total
• Any other expenses
• CBHI fund current balance at scheme level (bank statement)
• Amount payable

Medical Audit Status (report HC, Hospital, Private Clinic, Pharmacy, Total):
• Total no. of HFs clinically audited
• Total # of claims audited

Health care providers contracted (report HC, Hospital, Private Clinic, Pharmacy, Total):
• Total # of HFs contracted
• Number of accredited HFs out of contracted

Referral (Number, Remark):
• For further treatment and investigation
• Due to stock out of services

Patient outcomes (Number):
• Discharged/cured; death; referred; others

General Assembly and Board meetings, and related (Number, Remark):
• Number of General Assembly meetings conducted in the reporting period
• Number of CBHI Board meetings conducted in the reporting period
• Number of community-facility-provider platforms conducted in the reporting period
• Number of zonal review meetings conducted
• # of times the CBHI scheme was provided with supportive supervision (SS): by EHIA HQ/Branch; by health system; by partners; total

Complaints Management (report HC, Hospital, Total and Remark):
• Number of health facilities implementing complaint handling mechanisms
• # of complaints
• Main reasons for complaints (# of complaints registered, # addressed, Remark): receptions related; shortage of drugs and supplies; shortage of lab and imaging services; others; total

CBHI Financial Management and Audit System (Done / In process / Not started, with Remark):
• # of woredas with last year’s budget closing accomplished
• # of woredas that audited last year’s budget
• Audit findings (deficit amount and refunded; # of woredas, amount of money in ETB, Remark): deficit/discrepancy reported; deficit/discrepancy refunded

Operational Budget Utilization/Branch (in the reporting period; with Remark):
• Amount of money allocated
• Amount of money utilized

Capacity Building (EHIA Branch Office only; report Male, Female, Total and Remark):
• Number of staff who received capacity-building training
• Number of staff who got further educational opportunities

Number of individuals trained on health insurance (Male, Female, Total, Remark):
• Political leaders; community leaders; woreda CBHI coordinators/staff; health extension workers (HEWs); others; total

Human Resources (report Woreda scheme level, Region/Zone, EHIA Branch, Total):
• Total # of staff at the beginning of the fiscal year (report only at the beginning of the year)
• Number of new staff recruited
• Total # of staff resigned
• Total # of staff at the end of the fiscal year (report only at the end of the year)

Data Quality (in the quarter; with Remark):
• # of CBHI woreda schemes that submitted the expected report
• # of CBHI woreda schemes that submitted the expected report timely
• # of CBHI woreda schemes assessed for data quality improvement by the EHIA Branch
Agenda

• Training Implementation Plan
• Participants

64
THANK YOU!
