
Programme evaluation is essentially an information-gathering and interpreting activity that attempts to answer a set of questions about a programme's performance and effectiveness.
When it comes to designing an evaluation, the evaluator has to consider the specific questions being addressed by the evaluation and the audience for the answers.
 This will influence the selection of evaluation design, data sources and methods of data collection.
 To conduct evaluation research, we must be able to operationalise, observe and recognise the presence or absence of what is under study (Babbie, 1998:336).
Collecting Data
 Evidence helps provide answers to your research questions and
hypotheses.
 To get these answers, you engage in the step of collecting or gathering
data.
 Collecting data means identifying and selecting individuals for a
study, obtaining their permission to study them, and gathering
information by asking people questions or observing their behaviors.
 Of paramount concern in this process is the need to obtain accurate
data from individuals and places. This step will produce a collection
of numbers (test scores, frequency of behaviors) or words (responses,
opinions, quotes).
Analyzing and Interpreting the Data

During or immediately after data collection, you need to make sense of the information supplied by individuals in the study. Analysis consists of “taking the data apart” to determine individual responses and then “putting it together” to summarize it.
Analyzing and interpreting the data involves drawing conclusions about it; representing it in tables, figures, and pictures to summarize it; and explaining the conclusions in words to provide answers to your research questions. You report analysis and interpretation in sections of a report usually titled Results, Findings, or Discussion.
DATA

 Data refers to the presentation of information in a formalised manner suitable for communication, interpretation and processing.
 The term raw data refers to unprocessed information.
 Information refers to knowledge that is communicated.
Monitoring
 The continuous and systematic collection and analysis of information (data) in relation to a program or project that is able to provide management and key stakeholders with an indication as to the extent of progress against stated goals and objectives (Markiewicz, 2014).

LSU
• Monitoring refers to the continuous assessment of project
implementation and first impact through the process of
data collection and analysis, reporting and use of
information. (Gosparini, 2003)

 The DAC defines monitoring as “a management
function which uses methodological collection of data
to determine whether the material and financial
resources are sufficient, whether the people in charge
have the necessary technical and personal
qualification, whether activities conform to work-
plans, and whether the work-plan has been achieved
and has produced the original objectives”.

Evaluation
 The rigorous, scientifically-based collection and analysis of information about program/intervention activities, characteristics, and outcomes that determine the merit or worth of the program/intervention.

 Monitoring and evaluation are two different
exercises, each having its own focus, instruments
and methodologies. They have in common that
both exercises collect data on the performance of
the project.

 Both are essentially data collecting exercises.

Approaches to data collection
 There are two main approaches to the data collection process: qualitative and quantitative.
 There are two main types of information produced by the data collection process: qualitative and quantitative. The most obvious difference between the two is that quantitative data is numerical data (e.g., amounts, proportions) and qualitative data is information which can best be described in words or diagrams and pictures (e.g., descriptions of events, observed behaviours, direct quotations, maps).
• Many fields of the social sciences have been engaged for years, if not decades, in what has come to be known as the ‘quantitative-qualitative’ debate.
• The debate concerns itself with the question of which of the approaches is better suited to record social phenomena, and to what degree the two should, and can, be integrated. In recent times, the debate has shifted considerably towards a broad mainstream calling for sensible integration of quantitative and qualitative approaches, very much along the lines of Mechanic (1989, p. 154), who maintains “the strong view that research questions should dictate methodology” and particularly endorses “combining the advantages of a survey (its scope and its sampling opportunities) with the smaller qualitative study.”
Most information systems within projects require the collection of both quantitative
and qualitative data. Projects need qualitative data about the nature of results (e.g.,
beneficial or harmful effects, intended or unintended impacts). Projects also need
quantitative data (e.g., about the distribution or intensity of the results) to ensure
the accuracy and representativeness of the analysis.

Qualitative data may be collected through open-ended questions in self-administered questionnaires, individual interviews, focus group discussions, or through observations during fieldwork. Data requested in open-ended questions include respondents’ opinions on a certain issue, reasons for a certain behaviour, and descriptions of certain procedures, practices or beliefs/knowledge with which the evaluator is not familiar.
Qualitative and Quantitative data

 Quantitative data are obviously needed when a number, rate, or proportion related to the target population must be estimated or a variable such as crop production must be measured.
 Qualitative data are needed when the attitudes, beliefs, and perceptions of the target population must be known in order to understand its reactions and responses to project services.
 Both quantitative and qualitative data collection may employ similar approaches, such as interviews or observations.
 However, quantitative approaches use more closed-ended
approaches in which the researcher identifies set response
categories (e.g., strongly agree, strongly disagree, and so forth),
whereas qualitative approaches use more open-ended
approaches in which the inquirer asks general questions of
participants, and the participants shape the response
possibilities (e.g., in an interview with a teacher, a qualitative
researcher might ask: What does professional development
mean to you?).
 A quantitative researcher typically has taken some courses or
training in measurement, statistics, and quantitative data
collection, such as experiments, correlational designs, or survey
techniques.
 Qualitative researchers need experience in field studies in
which they practice gathering information in a setting and
learning the skills of observing or interviewing individuals.
 Coursework or experience in analyzing text data is helpful, as
well as in research designs such as grounded theory,
ethnography, or narrative research.
 Some individuals have experience and training in approaches
to research that combine both quantitative and qualitative
methods, such as mixed methods research or action research.
If you:                                     Then use this approach:
want to do statistical analysis             Quantitative
want to be precise
know exactly what you want to measure
want to cover a large group

want anecdotes or in-depth information      Qualitative
are not sure what you want to measure
do not need to quantify
 The problem, evaluation design, evaluation theory/tradition, the evaluation questions, and the literature reviews help to steer the researcher toward either the quantitative or qualitative track.
 These, in turn, inform the specific research design to be used and the procedures involved, such as sampling, data collection instruments or protocols, data analysis, and the final interpretation of results.
Research/Evaluation designs are the specific procedures
involved in the research process: data collection, data analysis,
and report writing.
 Survey Designs
 In one form of quantitative research/evaluation, you may not want to test an activity or materials, or may not be interested in the association among variables.
 Instead, you seek to describe trends in a large population of
individuals. In this case, a survey is a good procedure to use.
 Survey designs are procedures in quantitative research in
which you administer a survey or questionnaire to a small group
of people (called the sample) to identify trends in attitudes,
opinions, behaviors, or characteristics of a large group of
people (called the population).
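The logic of generalising from a sample to a population can be illustrated with a short sketch. This is not from the source: the data, function name, and sample size below are all hypothetical, and the normal-approximation interval is just one common way to express sampling uncertainty.

```python
import math

def proportion_estimate(responses, value, z=1.96):
    # Share of the sample giving `value`, with a normal-approximation
    # 95% confidence interval for the corresponding population proportion.
    n = len(responses)
    p = sum(1 for r in responses if r == value) / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p, (p - margin, p + margin)

# Hypothetical sample of 200 yes/no survey answers
sample = ["yes"] * 120 + ["no"] * 80
p, (low, high) = proportion_estimate(sample, "yes")
print(round(p, 2))  # 0.6
```

The point of the interval is the survey-design idea itself: the sample proportion describes the sample exactly, but only estimates the large population's trend within a margin of error that shrinks as the sample grows.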
Correlational Designs
 Correlational designs are procedures in quantitative research in which investigators
measure the degree of association (or relation) between two or more variables using
the statistical procedure of correlational analysis.
 This degree of association, expressed as a number, indicates whether the two
variables are related or whether one can predict another.
 To accomplish this, you study a single group of individuals rather than two or more
groups as in an experiment.
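The correlational analysis referred to above is, in its simplest form, Pearson's correlation coefficient. A minimal sketch with hypothetical data for a single group (the variable names are illustrative, not from the source):

```python
import math

def pearson_r(xs, ys):
    # Pearson's r: the covariance of the two variables divided by the
    # product of their standard deviations; ranges from -1 to +1.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical measurements from one group of five participants
hours_of_training = [1, 2, 3, 4, 5]
test_scores = [52, 55, 61, 64, 70]
r = pearson_r(hours_of_training, test_scores)
print(round(r, 3))  # close to +1: a strong positive association
```

A value near +1 or -1 suggests one variable can predict the other; a value near 0 suggests no linear relation, which is the "degree of association, expressed as a number" the design produces.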
 Experimental Designs
 Some quantitative researchers/evaluators seek to test whether
an intervention makes a difference for individuals.
 Experimental research procedures are ideally suited for this.
 Experimental designs (also called intervention studies or
group comparison studies) are procedures in quantitative
research in which the investigator determines whether an
activity or materials make a difference in results for
participants.
 You assess this impact by giving one group one set of
activities (called an intervention) and withholding the set
from another group.
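On the assumption that impact is read off by comparing the two groups' outcomes, the core computation can be sketched as a simple difference in group means. The scores below are hypothetical, and a real experimental analysis would add a significance test on top of this.

```python
import statistics

def mean_difference(intervention, control):
    # Simplest two-group comparison: average outcome of the group that
    # received the intervention minus that of the group it was withheld from.
    return statistics.mean(intervention) - statistics.mean(control)

# Hypothetical post-test scores for the two groups
intervention_scores = [72, 75, 71, 78, 74]
control_scores = [65, 68, 66, 70, 64]
print(round(mean_difference(intervention_scores, control_scores), 1))  # 7.4
```

A positive difference is consistent with the activity or materials making a difference for participants, though the experimental design's random assignment is what licenses that causal reading.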
Mixed Methods Designs
 You decide to collect both quantitative data (i.e., quantifiable
data) and qualitative data (i.e., text or images).
 The core argument for a mixed methods design is that the
combination of both forms of data provides a better
understanding of a research problem than either quantitative or
qualitative data by itself. Mixed methods designs are
procedures for collecting, analyzing, and mixing both
quantitative and qualitative data in a single study or in a
multiphase series of studies.
 In this process, you need to decide on the emphasis you will
give to each form of data (priority), which form of data you will
collect first (concurrent or sequential), how you will “mix” the
data (integrating or connecting), and whether you will use
theory to guide the study (e.g., advocacy or social science
theory).
Action Research Designs
 Like mixed methods research, action research designs often
utilize both quantitative and qualitative data, but they focus
more on procedures useful in addressing practical problems in
projects, interventions, schools, hospitals, etc.
 Action research designs are systematic procedures used to gather quantitative and qualitative data to address improvements in a programme or setting, for example an educational setting, teaching practice, and student learning.
 In some action research designs, you seek to address and solve a local, practical problem.
 In other studies, your objective might be to empower, transform, and emancipate individuals in the settings.
Grounded Theory Designs
• Instead of studying a single group, you might examine a
number of individuals who have all experienced an action,
interaction, or process.
• Grounded theory designs are systematic, qualitative
procedures that researchers/evaluators use to generate a
general explanation (grounded in the views of participants,
called a grounded theory) that explains a process, action, or
interaction among people.
• The procedures for developing this theory include primarily
collecting interview data, developing and relating categories (or
themes) of information, and composing a figure or visual model
that portrays the general explanation.
• In this way, the explanation is “grounded” in the data from
participants. From this explanation, you construct predictive
statements about the experiences of individuals.
 Narrative Research Designs
 You may not be interested in describing and interpreting group
behavior or ideas, or in developing an explanation grounded in
the experiences of many individuals.
 Instead, you wish to tell the stories of one or two
individuals. Narrative research designs are qualitative
procedures in which researchers describe the lives of
individuals, collect and tell stories about these
individuals’ lives, and write narratives about their
experiences.
 In education, these stories often relate to school
classroom experiences or activities in schools.
 Ethnographic Designs
 You may be interested in studying one group of individuals, in
examining them in the setting where they live and work, and in
developing a portrait of how they interact.
 An ethnographic study is well suited for this purpose.
 Ethnographic designs are qualitative procedures for describing,
analyzing, and interpreting a cultural group’s shared patterns of
behavior, beliefs, and language that develop over time. In
ethnography, the researcher provides a detailed picture of the
culture-sharing group, drawing on various sources of
information.
 The ethnographer also describes the group within its setting,
explores themes or issues that develop over time as the group
interacts, and details a portrait of the group.
Types of data

 PRIMARY
 SECONDARY
(MAKE A DISTINCTION BETWEEN THE TWO)

What data are required depends on a number of factors:
 The needs of the client;
 The timing of the monitoring/evaluation in the project cycle;
 The nature of the project;
 The purpose of the monitoring or evaluation.

For data collection, the general rule is:
 Use available data if you can. (It's faster, less expensive, and easier than generating new data.)

However, find out how earlier evaluators:
− collected the data
− defined the variables
− ensured accuracy of the data.

If you must collect original data THEN:
− establish procedures and follow them
(protocol)
− maintain accurate records of definitions and
coding
− pre-test, pre-test, pre-test
− verify accuracy of coding, data input.

DESIGN M&E PLAN

Utilise logic model/theory of change/key evaluation questions

Identify appropriate sources of evidence and data collection methods

Collect data on implementation/outcomes

Analyse data and produce evaluation report

SOURCE: Goremucheche R. (2016) Data collection and data management for evaluation. CREST PowerPoint presentation
Main considerations in data collection

 The evaluator has to consider the specific questions being addressed by the evaluation and the audience for the answers.
 This will influence the selection of evaluation design, data sources and methods of data collection.
 There are two main approaches to collecting data: quantitative and qualitative.

Category                 Examples
Observation              Systematic observation under controlled experimental/laboratory conditions
                         Participant observation in natural field settings
Self-reporting           Personal and group face-to-face interviewing
                         Telephone interviewing
                         Mail and electronic surveys
Archival/documentary     Historical documents/diaries/letters/speeches/literary texts/narratives/
sources                  official memoranda/business plans/annual reports/medical records/etc.
Physical sources         Blood samples/cell tissue/chemical compounds/materials, etc.
EVALUATION THEORIES

 Evaluation theories describe and prescribe what evaluators do or should do when conducting evaluations.
 They specify such things as evaluation purposes, users, and uses; who participates in the evaluation process and to what extent; general activities or strategies; method choices; and the roles and responsibilities of the evaluator, among others (Fournier, 1995; Smith, 1993). Largely, such theories are normative in origin and have been derived from practice, rather than being theories that are put into practice (Chelimsky, 1998).
Data collection methods and evaluation traditions

 Additionally, specifying program theory has recently been put forth as an essential competency for program evaluators (Stevahn, King, Ghere, & Minnema, 2005).
 Stufflebeam and Shinkfield (2007) note, “. . . if evaluators do not apply evaluation theory . . . then it is important to ask why they do not. Perhaps the approaches are not sufficiently articulated for general use, or the practitioners are not competent to carry them out, or the approaches lack convincing evidence that their use produces the needed evaluation results” (p. 62).
Ethics in data collection

What is ethical behavior?
 “…a set of moral principles, rules, or standards governing a person or a profession” (London School of Hygiene and Tropical Medicine).
 “Not knowing what constitutes best practice is incompetence. Knowing what best practice is, but not knowing how to achieve it, may be inexperience. Knowingly not following best practices, when one knows how to achieve it, is unethical” (Smith, 2002: 23).
ETHICS RELATED TO DATA COLLECTION

7 Ethical Principles
1. Ensure Confidentiality and/or Anonymity
2. Get Informed Consent
3. Do No Harm
4. Build Rapport, Not Friendship
5. Minimise Intrusiveness
6. Avoid Inappropriate Behaviour
7. Avoid Setting High Expectations
Anonymity: an evaluation condition in which researchers do not request any identifying materials that could link the persons from whom they collect data with the collected data.

Confidentiality: an evaluation condition in which researchers collect data in such a way that only they are able to identify the persons responding to interviews, surveys, observations or document reviews.
Strategies to Ensure Confidentiality

Use no names in field and personal notes
Protect raw and processed data, e.g. with passwords
Exercise caution when sharing data (share only with the team and those who understand the ethics)
Train fieldworkers/researchers in ethics
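One way to implement the "no names in notes" strategy is to replace identities with stable pseudonyms before notes are written up. The sketch below is an assumed workflow rather than a prescribed tool: the salt value, names, and function name are hypothetical, and the salt must itself be stored separately under its own protection, or the pseudonyms offer little confidentiality.

```python
import hashlib

def pseudonymise(name, salt):
    # Derive a stable pseudonym from a salted SHA-256 hash: the same
    # person always maps to the same code, but the code cannot be
    # linked back to the person without the separately stored salt.
    digest = hashlib.sha256((salt + name).encode("utf-8")).hexdigest()
    return "P-" + digest[:8]

salt = "keep-this-secret"  # hypothetical; never stored with the data
note = (pseudonymise("Jane Doe", salt), "attended all focus group sessions")
print(note[0])  # a stable code of the form P-xxxxxxxx
```

Because the mapping is deterministic, a researcher holding the salt can still connect repeat interviews with the same participant, which preserves the analytic value of the data while keeping real names out of field notes.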
Ethical Principle 2: Get Informed Consent

“Informed consent is not a consent form or a legal document; it is a communication and decision process” (Sieber, 1994, p. 5).

1. Explicit act, i.e. written or verbal agreement
2. Based on full understanding of the evaluation study (purpose, procedures, potential risks and benefits)
3. Should be voluntary and not coerced
4. Renegotiable – the participant is fully aware of the right to discontinue participation
Ethical Principle 3: Do No Harm (1)

“…avoid inflicting physical or psychological injury on others, and wherever possible, to protect them from exposure to the risk of harm” (Morris, 2008: 5).

Where harm is unavoidable:
Minimise harm
Ensure that direct or indirect benefits compensate for the harm
Exercise sensitivity when interviewing participants, e.g. survivors of traumatic events such as domestic violence
Ethical Principle 3: Do No Harm (2)

Is the perceived benefit of getting the data worth any risk to the safety of the data collector and the participant? Consider:
Post-conflict situations
Current conflict and emergency situations
Areas where crime is rife
Areas where violence is rife
Cultures that have certain beliefs about accessing information from females in the community
Ethical Principle 5: Minimise Intrusiveness

Because of the iterative nature of qualitative data collection, you may need to get more data from a participant to verify or clarify information, and you may become intrusive:

Personal time, e.g. when interviews go over the scheduled time
Personal space, e.g. where interviews are conducted in the home
Ethical Principle 6: Avoid Inappropriate behaviour

If you feel you are getting too close to your participants – you
probably are…
Ethical Principle 7: Avoid Setting High Expectations

Some evaluators may stir up high expectations among participants during data collection, through:
Incentives for participation
Inappropriate phrasing of the purpose of the evaluation
Discussion of the benefits of participating in the study
E.g. evaluations in camps for displaced individuals
E.g. evaluations in programmes aimed at improving livelihoods or poverty alleviation
OECD-DAC evaluation criteria
 1. Relevance
 2. Coherence
 3. Effectiveness
 4. Efficiency
 5. Impact
 6. Sustainability
