DIPLOMA IN PUBLIC MANAGEMENT
Monitoring and Evaluation
Contact details:
Regenesys School of Public Management
Tel: +27 (11) 669-5000
Fax: +27 (11) 669-5001
E-mail: info@regenesys.co.za
www.regenesys.co.za
This study guide highlights key focus areas for you as a student. Because the field of study in question is so
vast, it is critical that you consult additional literature.
All rights reserved. No part of this publication may be reproduced, stored in or introduced into a retrieval
system, or transmitted, in any form, or by any means (electronic, mechanical, photocopying, recording or
otherwise) without written permission of the publisher. Any person who does any unauthorised act in relation
to this publication may be liable for criminal prosecution and civil claims for damages.
CONTENTS
1. WELCOME TO REGENESYS
2. TEACHING AND LEARNING METHODOLOGY
2.1 PRINCIPLES FOR RESPONSIBLE MANAGEMENT EDUCATION
2.2 REGENESYS’ INTEGRATED LEADERSHIP AND MANAGEMENT MODEL
2.3 DEVELOPING REGENESYS GRADUATE ATTRIBUTES
3. KEY TO ICONS
4. STUDY MATERIAL
5. PRESCRIBED AND RECOMMENDED RESOURCES
5.1 ARTICLES
5.2 LEGISLATION
5.3 ACCESSING JOURNAL ARTICLES AND OTHER ONLINE LINKS
5.4 ADDITIONAL SOURCES TO CONSULT
6. GROUND RULES AND EXPECTATIONS
6.1 EXPECTATIONS
6.2 GROUND RULES
7. INTRODUCTION
7.1 LEARNING OUTCOMES
7.2 AN INTRODUCTION TO MONITORING AND EVALUATION IN THE PUBLIC SECTOR
7.2.1 DEFINING MONITORING AND EVALUATION
7.2.2 CONCEPTS AND TERMINOLOGY
7.2.3 THE PURPOSE OF MONITORING AND EVALUATION
7.2.4 MONITORING AND EVALUATION MODELS AND TECHNIQUES
7.2.5 M&E STAKEHOLDERS IN THE GOVERNMENT SPHERE
7.2.6 CONCLUSION
7.2.7 KEY POINTS
7.3 THE LEGAL CONTEXT FOR MONITORING AND EVALUATION
7.3.1 INTRODUCTION TO LEGISLATION
7.3.2 CONSTITUTION OF THE REPUBLIC OF SOUTH AFRICA, 1996
7.3.3 THE PUBLIC SERVICE ACT, 1994, AS AMENDED
7.3.4 THE PUBLIC FINANCE MANAGEMENT ACT (PFMA), AS AMENDED
7.3.5 THE MUNICIPAL FINANCE MANAGEMENT ACT
7.3.6 TREASURY REGULATIONS
7.3.7 GOVERNMENT’S REVENUE AND EXPENDITURE STRATEGY
7.3.8 DIVISION OF REVENUE ACT (DORA)
7.3.9 MONITORING AND EVALUATION AND POLICY MANAGEMENT
7.3.10 CONCLUSION
7.3.11 KEY POINTS
7.4 THE MONITORING AND EVALUATION PROCESS
7.4.1 PLACEMENT IN THE GOVERNMENT’S PLANNING CYCLE
7.4.2 INTERGOVERNMENTAL RELATIONS AND THE LOCAL GOVERNMENT FISCAL FRAMEWORK
7.4.3 MONITORING AND EVALUATION SYSTEM
7.4.4 EVALUATION PERSPECTIVES
7.4.5 NATIONAL EVALUATION PLAN 2013/2014
7.4.6 M&E AS PART OF OTHER MANAGEMENT FUNCTIONS
7.4.7 CONCLUSION
7.4.8 KEY POINTS
7.5 IMPLEMENT A MONITORING AND EVALUATION SYSTEM
7.5.1 INTRODUCTION
7.5.2 STEP 1: EXAMINE THE CONTEXT AND CURRENT STATE
7.5.3 STEP 2: ADMINISTRATIVE INFORMATION SYSTEMS AND DATA SETS
7.5.4 STEP 3: LIST INDICATORS, TARGETS AND BASELINES
7.5.5 STEP 4: GROUP INDICATORS BY POLICY OBJECTIVE
7.5.6 STEP 5: IF POLICY OBJECTIVES HAVE NO INDICATORS
7.5.7 STEP 6: REVIEW LINK BETWEEN INPUTS, OUTPUTS, OUTCOMES AND IMPACTS, AND IDENTIFY CAUSAL RELATIONSHIPS AND LINKS
7.5.8 STEP 7: REPORTING APPROACH
7.5.9 STEP 8: EVALUATE APPROACH
7.5.10 STEP 9: CAPACITY BUILDING PLAN
7.5.11 STEP 10: COMMUNICATION PLAN
7.5.12 MONITORING AND EVALUATION SUCCESS FACTORS
7.5.13 CONCLUSION
7.5.14 KEY POINTS
8. REFERENCES
9. GLOSSARY OF KEY TERMS AND ABBREVIATIONS
9.1 KEY TERMINOLOGY
9.2 ABBREVIATIONS
10. VERSION CONTROL
List of Tables
TABLE 1: MONITORING AND EVALUATION TERMINOLOGY
TABLE 2: DIFFERENCES BETWEEN MONITORING AND EVALUATION
TABLE 3: DIFFERENT TYPES OF RESULTS-BASED MANAGEMENT
TABLE 4: PURPOSE OF MONITORING AND EVALUATION
TABLE 5: COMPONENTS OF THE LOGIC MODEL
TABLE 6: BALANCED SCORECARD PERSPECTIVES
TABLE 7: DIVISION OF M&E RESPONSIBILITIES BETWEEN THE PSC AND THE DPME
TABLE 8: EXAMPLE OF CONSTITUTIONAL VALUES AND MONITORING AND EVALUATION INDICATORS
TABLE 9: SAMPLE TREASURY REGULATIONS
TABLE 10: LEGISLATION THAT ORGANISES INTERGOVERNMENTAL RELATIONS
TABLE 11: STEPS IN THE MONITORING AND EVALUATION PROCESS
TABLE 12: SITUATION ANALYSIS
TABLE 13: PRIMARY DATA COLLECTION METHODS
TABLE 14: PERFORMANCE INDICATORS
TABLE 15: DIFFERENT REPORTS
TABLE 16: COMPONENTS OF AN EVALUATION REPORT
List of Figures
FIGURE 1: LOGIC MODEL
FIGURE 2: GOVERNMENT'S OUTCOMES-BASED APPROACH
FIGURE 3: KEY STAKEHOLDERS IN TERMS OF DPSA MONITORING AND EVALUATION
FIGURE 4: POLICY LIFE CYCLE
FIGURE 5: MONITORING AND EVALUATION IN THE PLANNING CYCLE
FIGURE 6: THE MONITORING AND EVALUATION PROCESS
FIGURE 7: ACHIEVING OUTCOMES USING THE GOVERNMENT-WIDE MONITORING AND EVALUATION SYSTEM
FIGURE 8: THE PLANNING CYCLE
FIGURE 9: EXAMPLE OF A THEORY OF CHANGE MODEL FOR THE CITY OF JOHANNESBURG
FIGURE 10: EXAMPLE OF A PROBLEM TREE ANALYSIS
FIGURE 11: EXAMPLE OF APPLICATION OF THE OUTCOMES APPROACH TO MONITORING AND EVALUATION
1. WELCOME TO REGENESYS
“Have a vision. Think big. Dream, persevere, and your vision will become a reality.
Awaken your potential, knowing that everything you need is within you.”
Dr. Marko Saravanja
At Regenesys we help individuals and organisations achieve their personal and organisational goals
by enhancing their management and leadership potential. Our learning programmes are designed
to transform and inspire your mind, heart and soul, helping you to develop the knowledge, skills,
positive values, attitudes and behaviours required for success.
Since its inception in 1998, Regenesys has educated more than 100 000 students, based in highly
reputable local and international corporations across more than 160 countries, and is now one of the
fastest-growing institutions of management and leadership development in the world. Our
ISO 9001:2008 certification bears testimony to quality management systems that meet international
standards. We are also accredited by the Council on Higher Education.
At Regenesys you will be taught by business experts, entrepreneurs and academics who are inspired
by their passion for human development. You will be at a place where business and government
leaders meet, network, share their experience and develop business relationships.
We will help you awaken your potential and to realise that everything you need to succeed is within
you. And we will be with you every step of the way.
Areas of Expertise
Regenesys uses an interactive teaching and learning methodology that encourages self-reflection
and promotes independent and critical thinking. Key to our approach is an understanding of adult
learning principles, which recognise the maturity and experience of participants, and the way that
adult students need to learn.
At the core of this is the integration of new knowledge and skills into existing knowledge structures,
as well as the importance of seeing the relevance of all learning via immediate application in the
workplace. Practical exercises are used to create a simulated management experience to ensure
that the conceptual knowledge and practical skills acquired can be directly applied within the work
environment of the participants. The activities may include scenarios, case studies, self-reflection,
problem solving and planning tasks.
Our courses are developed to cover all essential aspects of the training comprehensively in a user-
friendly and interactive format. Our subject matter experts have extensive experience in
management education, training and development.
Regenesys upholds the UN Global Compact’s Principles for Responsible Management Education
(PRME, 2014:1).
This course will draw on a model developed by Regenesys Management, demonstrating how the
external environment, the levels of an organisation, the team and the components of an individual
are interrelated in a dynamic and systemic way. The success of an individual depends on his or her
self-awareness, knowledge, and ability to manage these interdependent forces, stakeholders and
processes.
The degree of synergy and alignment between the goals and objectives of the organisation, the team
and the individual determines the success or failure of an organisation. It is therefore imperative
that each organisation ensures that team and individual goals and objectives are aligned with the
organisation’s strategies (vision, mission, goals and objectives, etc), structures (organogram,
decision-making structure, etc), systems (HR, finance, communication, administration, information,
etc) and culture (values, level of openness, democracy, caring, etc). An effective work environment
is characterised by the alignment of organisational systems, strategies, structures and culture, and
by people who operate synergistically.
[Figure: The quadruple bottom line. People (quality of life), Planet (stewardship) and Profit/Prosperity (value creation) overlap in bearable, equitable and viable relationships, with Purpose underpinning all three and sustainability at their intersection.]
As a public manager you have the capacity to bring about real change. As much as organisations
are shaped by their environment, their actions influence the environment. You can contribute to
sustainable change by managing responsibly.
Getting a qualification is not, on its own, enough to prepare you for the rapidly changing world of
work, where Industry 4.0 and 5.0 are rendering many professions obsolete. We will work with you
throughout your studies to help you develop the critical attributes needed to navigate this new
world order, along with the skills and knowledge you need to excel in any environment.
Think differently
To think differently, you must be intellectually curious, analytical, and open-minded yet constructively
critical, with the mental agility to think across disciplines, contexts and domains to solve complex
problems and find innovative ways of doing things. Be imaginative but rational. We will systematically
help you cultivate higher-order thinking: the kind of thinking that recognises and makes sense of
patterns others miss, and that facilitates unique linkages and solutions.
Both well-informed and knowledgeable, you must be committed to sound research, taking a
multidisciplinary and metacognitive approach to problem-solving, and able to recognise and put
aside personal bias, basing decisions on evidence. This will prepare you to take calculated risks.
This ties back to the overarching P in the quadruple bottom line: purpose. Purpose-driven, you put
sustainability at the heart of your organisation. Emotionally and spiritually intelligent, you should be
self-aware, understand the interconnectedness of all things, and act ethically and with integrity. As
an ideal graduate, you will be a service-oriented agent of change.
Harness diversity
You will appreciate the value of individual differences. Socially intelligent, collaborative and a skilled
communicator, you should be able to facilitate connections to build, empower and manage high-
functioning teams with diverse skills and personalities, and support them in assuming
responsibilities.
Professional comportment
With a confident and inspiring presence, you are utterly professional yet accessible: deliberate,
determined, disciplined and focused. You will model your values and hold yourself accountable,
and you will have the resilience and grit to keep going in the face of adversity.
Your glocal outlook underpins your ability to operate and compete ethically and profitably as a
responsible global citizen in a borderless world. Your multicultural awareness and wide-ranging
interest in current affairs enable you to recognise and respond to local cultures and needs without
losing sight of the global picture.
The next few sections contain practical information that will help you do just that.
These resources provide a starting point for your studies. You are
expected to make good use of your textbooks, the additional
resources provided via online links, and wider reading that you, as a
higher education student, will source yourself.
5.1 ARTICLES
• Adato, M. (2011). Combining quantitative and qualitative methods for program monitoring and evaluation: why are mixed-method designs best? http://documents.worldbank.org/curated/en/633721468349812987/pdf/643860BRI0Mixe00Box0361535B0PUBLIC0.pdf (accessed 18 June 2020).
• Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S. (2012). Establishing a national M&E system in South Africa. The World Bank Special Series on The Nuts & Bolts of Monitoring and Evaluation Systems, 21, 1-11, http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts00Box374357B00PUBLIC0.pdf (accessed 18 June 2020).
• Kaplan, R.S. and Norton, D.P. (1992). The Balanced Scorecard – Measures that Drive Performance. Harvard Business Review. https://hbr.org/1992/01/the-balanced-scorecard-measures-that-drive-performance-2 (accessed 18 June 2020).
• Lahey, R. (2010). The Canadian monitoring and evaluation (M&E) system: lessons learned from 30 years of development. ECD Working Paper Series. http://documents.worldbank.org/curated/en/865531468226748462/pdf/654070NWP0230c0C0disclosed011040110.pdf (accessed 18 June 2020).
• Nkwinti, G. (nd). National Development Plan and the New Growth Path: Transforming the Economy. http://kzntopbusiness.co.za/site/search/downloadencode/nLaqaaKelpO8mnjc (accessed 18 June 2020).
5.2 LEGISLATION
• Department of Public Service and Administration. (2012). Public Service Act, 1994. http://www.dpsa.gov.za/dpsa2g/documents/acts&regulations/psact1994/PublicServiceAct.pdf (accessed 18 June 2020).
• National Treasury. (2003). Local Government: Municipal Finance Management Act No. 56 of 2003. http://mfma.treasury.gov.za/MFMA/Legislation/Local%20Government%20-%20Municipal%20Finance%20Management%20Act/Municipal%20Finance%20Management%20Act%20(No.%2056%20of%202003).pdf (accessed 18 June 2020).
• National Treasury. (2005). Treasury regulations for departments, trading entities, constitutional institutions and public entities. http://www.treasury.gov.za/legislation/pfma/regulations/gazette_27388%20showing%20amendments.pdf (accessed 18 June 2020).
• Public Service Commission. (2012). Evolution of monitoring and evaluation in the South African public service. http://www.psc.gov.za/newsletters/docs/2012/K-9555%20PSC_6th%20edition%20magazine_DevV11.pdf (accessed 18 June 2020).
Most study guide and virtual course links should open directly when you click on them, provided your
browser is open and connected to the internet. However, to access articles and e-books on EbscoHost
or Emerald, you must be logged in to the student portal and have these databases open.
If a link does not work (this can depend on which browser you are using), copy and paste the URL
(the www address) into your browser's address bar. Use Chrome, Firefox or Safari as your browser;
do not use Internet Explorer, as many applications no longer support it. Check that you have copied
the whole URL and have not left out a part after a hyphen. There should be no spaces in the URL,
and the whole address should be on one line.
Please report any broken links – or any other problems encountered on your educational journey
that we can solve – to mdt@regenesys.co.za so we can fix them for you.
Links to additional media that may prompt discussion and help you complete this course will be
saved in Around the Net, a couple of clicks down from the EbscoHost database links in the portal
library. Visit the site regularly to see what’s new.
As a higher education student, you are responsible for sourcing additional information that will assist
you in completing this course successfully. Here are sources you can consult to obtain additional
information on the topics to be discussed in this course. You will find more on the portal.
EbscoHost and Emerald: These online databases contain journal articles, e-books and multimedia relevant to your studies. Registered Regenesys students in good standing can access them through the student portal.
NetMBA: MBA constructs and discussion. http://www.netmba.com/
MindTools: Ideas, constructs, management models and commentary. http://www.mindtools.com/
ProvenModels: Provides management models – generalisations of business situations that, when applied in context, can be powerful tools for solving business problems. http://www.provenmodels.com/
12manage.com: More models, principles and global commentary. http://www.12manage.com/
The Free Management Library: Comprehensive overviews of strategic planning. http://managementhelp.org/strategicplanning/index.htm
TED: TED (Technology, Entertainment and Design) is a nonprofit organisation that devotes itself to spreading new, transformative ideas in science, business and global issues, among other topics. TED’s website will take you to each of the groundbreaking TED Talks, and also to TEDx, a programme that helps communities, organisations and individuals to create local TED-like experiences. https://www.ted.com/about/our-organization
A word of caution – not all information available on the internet is necessarily of a high academic
standard. Always compare the information you find with that in reputable sources, such as articles
published in accredited journals.
6.1 EXPECTATIONS
It is crucial in any learning process to identify the learners’ expectations and needs. Doing so
enables the facilitator to create a relevant, learner-focused learning process.
Expectations
Time: 10 minutes
In most group situations it is important to collectively develop ground rules or norms of behaviour in
order to create an environment conducive to learning. Ground rules set the tone for future group
discussions and behaviour.
Ground Rules
1. Find a partner.
2. With your partner, list and discuss two issues that you feel would create an environment conducive to learning.
3. Each pair will then brainstorm and refine their list.
This Monitoring and Evaluation course focuses on the aspects that make up the government's
monitoring and evaluation system.
From a general introduction to the subject, we move on to explain the monitoring and evaluation
system in the context of government and in the legal context. We examine matters such as the
outcomes-based approach and the Government-Wide Monitoring and Evaluation System (GWMES).
From these systems you can learn how to apply a monitoring and evaluation framework in your own
organisation and how to ensure that this framework is a success.
The timetable under each section heading provides guidance on how long to spend studying the
section. Follow the timetable to ensure that you spend a suitable length of time on each section,
cover the required sections relevant to each assignment, and have enough time to prepare for the
examination.
Recommended reading:
• Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S. (2012). Establishing a national M&E system in South Africa. The World Bank Special Series on The Nuts & Bolts of Monitoring and Evaluation Systems, 21, 1-11, http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts00Box374357B00PUBLIC0.pdf (accessed 18 June 2020).
• Kaplan, R.S. and Norton, D.P. (1992). The Balanced Scorecard – Measures that Drive Performance. Harvard Business Review. https://hbr.org/1992/01/the-balanced-scorecard-measures-that-drive-performance-2 (accessed 18 June 2020).
According to Görgens-Albino and Zall Kusek (2009:2), “monitoring and evaluation is a powerful
public management tool that can be used to improve the way governments and organisations
achieve results. Just as governments need financial, human resource, and accountability systems,
they also need good performance feedback systems.” Monitoring and evaluation is a crucial part of
this feedback system.
In 2008 the Public Service Commission (PSC) published a manual explaining the basic concepts of
monitoring and evaluation. The manual defines the concepts as follows:
Monitoring
“A continuing function that uses systematic collection of data on specified indicators to provide
management and the main stakeholders of an ongoing development intervention with indications
of the extent of progress and achievement of objectives and progress in the use of allocated
funds.”
Evaluation
“Evaluation also refers to the process of determining the worth or significance of an activity, policy
or programme. An assessment, as systematic and objective as possible, of a planned, ongoing,
or completed development intervention.
“Note: Evaluation in some instances involves the definition of appropriate standards, the
examination of performance against those standards, an assessment of actual and expected
results and the identification of relevant lessons.”
To get an overview of the PSC’s constitutional mandate and its vision and mission, visit
the Public Service Commission website.
It is crucial to consider the terminology described in Table 1 if we are to fully understand monitoring
and evaluation.
Table 1: Monitoring and evaluation terminology

Data terrains: The Government-Wide Monitoring and Evaluation System (GWMES) is composed of three data terrains: programme performance information, evaluation, and census data or statistics.
Evidence-based decision-making: The systematic application of the best available evidence to the evaluation of options and to decision-making in management and policy settings. Evidence should be based on the three data terrains of the GWMES.
Government-Wide Monitoring and Evaluation System (GWMES): A management framework within public sector organisations that works with other management systems to integrate monitoring and evaluation practices into all elements of the organisation.
Outcomes-based approach: An approach that clarifies what we expect to achieve, how we expect to achieve it, and how we will know we are achieving it. It is composed of inputs, activities, outputs, outcomes and impacts, terms we will explore in more detail later in this course.
Performance indicator: A predetermined signal that a certain point in a process has been reached or a result has been achieved.
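Performance indicators are usually read against a baseline (the starting value) and a target (the planned value). As a purely illustrative sketch, not drawn from any official government system (the indicator name and figures below are invented), progress on an indicator can be expressed as the share of the baseline-to-target distance covered so far:

```python
# Illustrative only: how far an indicator has moved from its baseline
# towards its target, expressed as a fraction between 0 and 1.

def indicator_progress(baseline: float, target: float, actual: float) -> float:
    """Share of the baseline-to-target distance covered, capped at 0..1."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    progress = (actual - baseline) / (target - baseline)
    return max(0.0, min(1.0, progress))

# Hypothetical indicator: percentage of households with access to clean water.
baseline, target, actual = 60.0, 90.0, 75.0
print(f"Progress towards target: {indicator_progress(baseline, target, actual):.0%}")
# prints "Progress towards target: 50%"
```

A reading of 50% here means the intervention has covered half the planned improvement, which is exactly the kind of progress signal the monitoring definitions above describe.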
While monitoring and evaluation overlap and reinforce each other, they are two distinct processes.
The Department of Performance Monitoring and Evaluation (DPME) (2011:3) distinguishes between
them as follows:
“Monitoring involves the continuous collecting, analysing and reporting of data in a way that
supports effective management. Monitoring aims to provide managers with regular (and real-time)
feedback on progress in implementation and results, and early indicators of problems that need
to be corrected. It usually reports on actual performance against what was planned or expected.
“In summary, monitoring asks whether the things we planned are being done right, while
evaluation is asking are we doing the right things, are we effective, efficient and providing value
for money, and how can we do it better? Evaluation has the element of judgment, and must be
(made) against objectives or criteria.”
“Monitoring involves collecting, analysing, and reporting data on inputs, activities, outputs,
outcomes and impacts as well as external factors in a way that supports effective management.
Monitoring aims to provide managers, decision makers and other stakeholders with regular
feedback on progress in implementation and results, and early indicators of problems that need
to be corrected.”
“Evaluation is a time-bound and periodic exercise that seeks to provide credible and useful
information to answer specific questions to guide decision making by staff, managers, and policy
makers. Evaluations may assess relevance, efficiency, effectiveness, impact and sustainability.
Impact evaluations examine whether underlying theories and assumptions were valid, what
worked, what did not and why. Evaluation can also be used to extract cross-cutting lessons from
operating unit experiences and (for) determining the need for modifications to strategic results
frameworks.”
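The distinction can be made concrete with a small, purely illustrative sketch (the programme and figures below are invented, not taken from any government data set). Monitoring, as defined above, reports actual performance against plan period by period and raises early warnings; evaluation would later judge whether the programme was the right intervention at all.

```python
# Illustrative only: a monitoring-style check that compares actual delivery
# against planned delivery each quarter and flags significant shortfalls.

def flag_deviations(planned, actual, tolerance=0.10):
    """Return the periods where actual falls short of plan by more than
    the given tolerance (0.10 = 10%)."""
    flagged = []
    for period, plan in planned.items():
        shortfall = (plan - actual.get(period, 0)) / plan
        if shortfall > tolerance:
            flagged.append(period)
    return flagged

# Hypothetical programme: houses delivered per quarter.
planned = {"Q1": 100, "Q2": 100, "Q3": 120}
actual = {"Q1": 95, "Q2": 80, "Q3": 70}
print(flag_deviations(planned, actual))  # prints ['Q2', 'Q3']
```

Quarters flagged this way would feed into the "early indicators of problems that need to be corrected" that the DPME definition mentions; deciding why the shortfalls occurred and whether the programme remains the right choice is the work of evaluation.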
Table 2: Differences between monitoring and evaluation

Focus (what?)
• Monitoring: focuses on the outputs of projects, programmes, partnerships and activities, and their contribution to outcomes; checks progress against plans, and identifies areas for action and improvement.
• Evaluation: compares planned with actual outcome achievement; focuses on how and why outputs and strategies contributed to the achievement of outcomes and impacts; addresses questions of relevance, effectiveness, sustainability and change.

Responsibility (who?)
• Monitoring: internal management and programme or project manager responsibility at all levels: city-wide; clusters; entities or departments; mayoral committee; performance management, monitoring and evaluation reporting unit.
• Evaluation: external evaluators and partners, plus internal evaluators: executive management team (EMT); mayoral committee; council; performance management, monitoring, evaluation and reporting unit.

Timing (when?)
• Monitoring: continuous and systematic.
• Evaluation: time-bound, periodic and in-depth; before (formative), during (aiding improvements) or after a project or programme (summative).

Outcomes-based position
• Monitoring: inputs, activities and outputs.
• Evaluation: impacts, outcomes, purpose and overall objectives; outputs vs inputs (effectiveness and efficiency); impact; results vs costs; relevance to priorities.

Data sources
• Monitoring: progress reports; management information systems; performance management data.
• Evaluation: evaluation reports; monitoring data; primary and secondary data sources, including case studies, surveys and statistical data.

(City of Johannesburg, 2012:10)
Section 195(1)(c) of the constitution provides that: “Public administration must be development-
oriented”. State institutions should ensure that all programmes comply with this principle. The PSC’s
State of the Public Service Report (2007) described the context of the developmental state thus:
“South Africa’s efforts to promote growth and development are being pursued within the context
of building a developmental state…. Such a state seeks to ‘capably intervene and shepherd
societal resources to achieve national developmental objectives,’ rather than simply rely on the
forces of the market.
“What gives rise to and shapes the nature of a developmental state depends on the context and
history of a country…. Against this background, many have quite correctly cautioned against any
attempts to suggest that there is a prototype of a developmental state that can be constructed on
the basis of what worked in other countries.
“What then is the specific context within which to locate a South African developmental state?
The PSC believes that the Constitution provides the basis on which to understand
developmentalism in South Africa given how it captures the collective will and determination of
her people to create a better life for themselves.”
It is therefore essential that the monitoring and evaluation system of the state complies with this
principle.
According to Chabane (2013), monitoring and evaluation in the public service aim to address:
• Resistance to change;
• A focus on completing activities rather than assessing their results;
• Insufficient measurement, collection and analysis of the data needed to inform improvements;
• Monitoring and reporting done for compliance rather than for improvement;
• Weak programme planning, indicators and targets, logic models or theories of change;
• Weak design of data measurement and collection processes; and
• A failure to value evidence-based planning and decision-making.
M&E is also used to guide management decision-making, to support organisational learning and accountability, to solicit support for programmes and advocacy, and to promote transparency. See Table 4.
Management decision-making: Monitoring and evaluation can augment and complement management, as they provide evidence for decision-making. This is possible if monitoring and evaluation information is appropriate and feeds into existing managerial processes. Decisions about resource allocation, strategy implementation, policy decisions and programme design are easier with accurate information.
Organisational learning: M&E helps to create learning organisations. It is a useful tool for establishing which programme design will be best to implement and which will bring the best return on investment. Information gathered through monitoring and evaluation should be communicated in action-orientated reports. It can therefore be deduced that monitoring and evaluation produce new knowledge.
Accountability: Public servants are accountable for how public money is spent, how objectives are achieved, and for ensuring that this is done with integrity. Monitoring and evaluation provide this information in a structured and formalised manner.
Soliciting support for programmes: Support for a programme is validated by means of evaluation findings, which monitoring and evaluation provide.
Government’s monitoring and evaluation system was developed from the Fifteen-Year Review of
Government (2009), which stated that there had to be a radical shift in government policy in order to
improve performance to an acceptable level. A completely new approach to M&E was needed, and
so the outcomes-based approach was introduced. This is the basis for the government-wide
monitoring and evaluation system or GWMES (Clear, 2012:144). We will study this framework in
detail later in this course. To understand the implementation of the government-wide M&E system,
we must understand the logic model and the Kaplan and Norton balanced scorecard perspectives.
Logic model
The logic model explains the relationship between means (inputs, activities and outputs) and ends
(outcomes and impacts). It consists of a hierarchy of inputs, activities, outputs, outcomes and
impacts (see Figure 1).
Figure 1: The logic model hierarchy (PSC, 2008:42)
• Impacts – what do we aim to change?
• Outcomes – what do we wish to achieve?
• Outputs – what do we produce or deliver?
• Activities – what do we do?
• Inputs – what do we use to do the work?
The figure notes that we plan, budget and implement at the level of inputs, activities and outputs, and manage towards achieving results at the level of outcomes and impacts.
Inputs: All the resources that contribute to the production and delivery of outputs. Inputs are “what we use to do the work”. They include finances, personnel, equipment and buildings.
Activities: The processes or actions that use a range of inputs to produce the desired outputs and ultimately outcomes. In essence, activities describe “what we do”.
Outputs: The final products, or goods and services produced for delivery. Outputs may be defined as “what we produce or deliver”.
Outcomes: The medium-term results for beneficiaries that are a logical consequence of achieving certain outputs. Outcomes should relate clearly to an institution’s strategic goals and objectives, which should be set out in its plans. Outcomes are “what we wish to achieve”.
Impacts: The results of achieving specific outcomes, such as reducing poverty and creating jobs.
1. In groups, discuss a project or programme currently in the implementation phase of delivery. Assign the
components of the logic model to this project or programme.
2. Critically discuss how this model can be used as a monitoring and evaluation tool.
In an article published in the Harvard Business Review, Kaplan and Norton identified four
perspectives for evaluating the performance of an organisation, shown in Table 6.
Financial: Is the organisation financially successful? Does the project or programme deliver value for money?
Customer: Is the public satisfied with service delivery?
Learning and growth: Is the organisation achieving its vision and goals? Monitoring and evaluation is intended to develop a learning organisation. If the organisation achieves its vision and goals, growth will be inevitable.
Internal business process: This perspective assesses implementation procedures.
(PSC, 2008:19)
• Kaplan, R.S. and Norton, D.P. (1992). The Balanced Scorecard – Measures that Drive Performance. Harvard Business Review, https://hbr.org/1992/01/the-balanced-scorecard-measures-that-drive-performance-2 (accessed 18 June 2020).
The monitoring and evaluation process involves inputs, activities, outputs, outcomes and impacts.
In this section, we consider government’s interpretation of the theories and models presented above.
The components of the logic model make up government’s outcomes-based approach to managing
performance. See Figure 2.
• Inputs: what is required to complete a task? For example, financial resources, human
resources, infrastructure;
• Activities: the functions, actions, and tasks that use the inputs and produce results. For
example, contract for services, answer queries, give advice;
• Outputs: the products or services made from activities. For example, “service providers
obtained” and “work initiated”;
• Outcomes: the end goal – what we wish to achieve. They are the product of effective outputs.
Outcomes are linked to the organisation’s strategic plan; and
• Impacts: what results from achieving the outcomes? For example, faster production of official
documents at home affairs, reduction of poverty, etc.
(Republic of South Africa, 2010a:12)
This model is considered in more detail in the fourth section of this course, titled “Implement a
Monitoring and Evaluation System”.
We have discussed the developmental purpose of the government’s monitoring and evaluation
system. In this section we consider the involvement of national and provincial government, line
departments and constitutional institutions.
National level
The Presidency
The Presidency is responsible for formulating the medium-term strategic framework and the
government’s programme of action. The implementation of these plans is then monitored against
their priorities. The Presidency publishes bi-annual progress reports on the implementation of the
government’s programme of action. The Presidency relies on data provided by the monitoring and
evaluation systems (PSC, 2008:13) to compile these reports.
National Treasury
The minister of finance, supported by the National Treasury, determines fiscal policy. The Treasury
compiles the national budget and devises and implements financial management policy. Parliament
allocates money according to strategic objectives. Indicators and targets are set to measure the
attainment of objectives, and the National Treasury plays an important role in monitoring
performance against these objectives. The Treasury evaluates whether expenditure achieved value
for money. The results of the evaluations are published in quarterly reports in the Budget Review,
Provincial Budgets and Expenditure Review and the Local Government Budgets and Expenditure
Review (PSC, 2008:14).
The DPSA provides monitoring and evaluation information through the bodies shown in Figure 3.
Figure 3: The bodies through which the DPSA provides monitoring and evaluation information (Molepo, 2011). The figure places DPSA monitoring and evaluation at the centre, linked to:
• Co-ordinating departments: National Treasury (NT), DPSA, OPSC, the National School of Government, Co-operative Governance and Traditional Affairs, Stats SA, etc;
• The Ministry of Performance Monitoring and Evaluation, the G&A cluster and the NPC;
• The minister, deputy minister, Parliament, the Portfolio Committee on Public Service and Administration, and so on;
• Government departments, directors-general, premiers’ offices, exco/manco, staff, and DPSA provincial and local offices;
• Other government agencies, research institutions, trade unions and other partners; and
• The public.
Department of Co-operative Governance and Traditional Affairs
This department devises policy on the structure and functioning of provincial and local government,
and therefore evaluates the performance of local and provincial government. Local government is
essential for the delivery of basic services, and the department therefore plays a direct role in
monitoring and evaluating service delivery (PSC, 2008:15).
Statistics South Africa
This organisation collects, analyses and publishes information generated by the national statistics
system. The system also collects statistics on development indicators drawn from the government’s
strategies. Without reliable statistics, planning, service monitoring and evaluation would not be
possible (PSC, 2008:15).
The National School of Government replaces the Public Administration Leadership and Management
Academy (PMG, 2014). The purpose of the school is “to build an effective and professional public
service through the provision of relevant, mandatory training programmes” (PMG, 2014). The
National School of Government is therefore involved in training staff in M&E principles.
While the PSC had sole responsibility for monitoring and evaluation functions for the first decade
and a half of the democratic dispensation, in January 2010 a new department in the Presidency –
the Department of Performance Monitoring and Evaluation – was established (DPME, 2012:2).
This means there are now two government bodies responsible for public service M&E. To prevent
duplication of responsibilities, the PSC and the DPME have specific monitoring and evaluation
functions, shown in Table 7.
PSC:
• Monitoring of heads of department performance management
• Evaluation of the state of the public service
DPME:
• Frontline service delivery monitoring and the presidential hotline
• Government-wide monitoring and evaluation system and capacity building
• Development indicators
Constitutional institutions
The Public Service Commission (PSC) was established in accordance with sections 195 and 196 of
the constitution, which stipulate that it would be the body in charge of monitoring and evaluation of
public service performance. However, as you have seen, this has now changed, with responsibility
for performance monitoring and evaluation being split between the PSC and the Department of
Performance Monitoring and Evaluation (DPME).
1. Compare and contrast the monitoring and evaluation responsibilities of the PSC and the DPME.
2. Analyse whether these differences are actually adhered to in the public service. Using this analysis, recommend
ways that the PSC and DPME could use a more integrated approach to monitoring and evaluation.
The auditor-general is responsible for auditing the accounts and financial statements of national and
provincial departments (PSC, 2008:16). From a monitoring and evaluation perspective, the auditor-
general's most important role is performance auditing. This involves determining how well the
organisation or department being audited has spent money. In addition, part of the audit can be
determining how well the audited entity has determined its performance indicators (PSC, 2008:16).
As part of the Bill of Rights (enshrined in the constitution), the Human Rights Commission must
protect, promote and ensure respect for the rights of citizens (PSC, 2008:16). As a large portion of
these rights are socioeconomic, the Human Rights Commission has a role to play in ensuring that
government delivers essential services to its people (PSC, 2008:16). This involves effective M&E.
Provincial level
The key departments in monitoring and evaluation are the offices of the premiers and the provincial
treasuries. Key strategic objectives are set for each province in the Provincial Growth and
Development Strategy and the Provincial Government Programme of Action. Offices of the premier
monitor and evaluate the performance of provincial departments according to the direction set in the
growth and development strategy (PSC, 2008:15).
Line departments
Line departments implement government policy. They must monitor and evaluate the
implementation of policy, the impact of policy and the quality of service delivery (PSC, 2008:15).
Why is it necessary to have multiple entities to ensure effective monitoring and evaluation in the
public service? Discuss critically the implications of this for efficiency.
Read the case study below and answer the questions that follow.
Building a results-based monitoring and evaluation system for the Western Cape government of
South Africa
The Western Cape government has developed a series of provincial strategic objectives on which to base its
priorities, transversal planning processes and service delivery. The government has also worked on building a
results-based monitoring and evaluation system to capture information on the extent to which these objectives are
being achieved. This system enables integrated province-wide monitoring and evaluation to occur. It consists of
seven phases as described later in this paper. This seven-phase process provides a toolkit for any government
institution to set up its own RBM&E system. This is the main value of the system.
Context of results-based monitoring and evaluation for the Western Cape government
The primary aim of monitoring and evaluation in government is to provide information for decision-making. But data
collected for monitoring and evaluation is often wide-ranging and fragmented. To make sense of the complexity and
diversity of the data collected against indicators, and to turn data into useful strategic management information it is
necessary to integrate the data into a system.
The central feature of monitoring and evaluation for the Western Cape government is that it is used to improve
performance. The results-based monitoring and evaluation system is being used to measure the performance of the
desired outcomes in relation to the strategic objectives government aims to achieve. Other provinces may wish to use
a similar results-based monitoring and evaluation system to measure their performance.
Focus of the Western Cape government’s results-based monitoring and evaluation system
The main focus of the system is measuring outcomes against a set of indicators. Given that achieving outcomes
depends, in part, on factors beyond the direct control of government, outcomes and their measurement, in our
approach, are clearly distinguished from outputs and their measurement. Thus, outputs are about what the province
as a whole, and each department in it, actually delivers, while outcomes are about what they wish to achieve through
these outputs. Indicators measuring outputs are therefore clearly differentiated from indicators that measure
outcomes.
The Western Cape model aims to provide a platform for setting up results-based systems for public sector monitoring
and evaluation in the province. The system should ensure that it provides data and information that is necessary to
measure government’s achievements against a core set of indicators contained within it. Such information enables
evidence-based decision-making in line with the provincial government’s policies, strategies, programmes and
projects.
As a starting point for developing this toolkit, a strategic framework for province-wide monitoring and evaluation
system was developed. This framework examined how such a system could measure the results of the work done in
the province. Its emphasis was on measuring outcomes directly linked to specific provincial strategic objectives. It
also sketched how the processing of monitoring and evaluation data collection would be supported by an electronic
application.
To develop a results-based monitoring and evaluation system for the province, a specific seven-phase sequence was
formulated, taking into account the principles of Kusek and Rist (2004: 23): formulating outcomes and goals;
selecting outcome indicators for monitoring; gathering baseline information on the current condition; setting specific
targets to reach; setting timelines for reaching them; collecting data to assess whether the targets were being met.
The development of the system was also informed by other international practices, such as the Malaysian
Government Result-Based Budgeting System and the International IMA Model. The results-based monitoring and
evaluation system for the province was developed in-house with technical support conducting quality assurance in
the field of indicators and data governance.
Each of the seven phases constitutes a subsystem. These subsystems are interdependent and contained within the
overarching province-wide monitoring and evaluation system. They provide the necessary components of the system,
so that they can operate as a whole through effective indicator and data management. The components are then
aligned to the core processes and supporting processes of the province-wide system’s mandate. The subsystems are
reviewed annually to maintain an up-to-date and comprehensive monitoring and evaluation system that will function
effectively.
Phase 1: Readiness assessment
The readiness assessment involved conducting provincial audits with the monitoring and evaluation staff in the
Western Cape provincial government, ascertaining the capacity and readiness in each department to build a results-
based monitoring and evaluation system that could be aligned to this province-wide system, and the critical
challenges faced in each department in relation to building the results-based monitoring and evaluation systems.
Phase 2: Stakeholder engagement
Stakeholder engagement involved identifying relevant stakeholders at local, provincial, national and international
levels, and institutionalising stakeholder engagement through the establishment of a monitoring and evaluation forum
and an external reference group, which met on a regular basis. This phase was important for understanding the
stakeholder environment.
This phase focused on the development of the strategic monitoring and evaluation frameworks that would provide the
conceptual and strategic understanding of the province-wide system’s mandate, its results-based approach and its
relationship to the policy context of the provincial government. This phase set out the strategic approach on how to
implement results-based monitoring and evaluation to assess how well the provincial government was doing in
meeting its strategic objectives.
This phase was the starting point in translating the provincial strategic objectives into broad, outcomes-based themes,
and then subdividing these themes into aims or desired outcomes for the period 2010 to 2014. In this phase a
compendium of indicators was selected to measure each aim or desired outcome for the province’s strategic
objectives, taking into account the national statistical production areas and global imperatives.
In this phase, attributes for each core indicator were identified in order to build the monitoring system. These
attributes included information on appropriate data sources; the frequency of data collection; responsible data
producers and level of disaggregation to measure results based on the indicators. Baseline data was also collected,
and targets were set against which the outcome indicators could be measured. The indicators and their attributes
culminated in a monitoring and results framework for each strategic objective. This phase was the essence of the
results-based monitoring and evaluation system, and was interlinked with phases 3 and 5.
This phase related to the collection of data on the outcome indicators to observe the situation and the changes that
occurred as well as the analysis and reporting of results. It included the identification and location of the data
sources, and the assessment of the data quality by building quality standards into provincial administrative data
records. This phase was critical, as it related to broader data governance matters such as data profiling, data quality
standards and data architecture, and it was interlinked with phases 3 and 4.
The information architecture was designed to support the manual processes regarding collecting province-wide
information, and to manage data collected to measure the indicators.
The information architecture of the province-wide monitoring and evaluation system was included in a broader
computer-based relational database. This database contains data collected not only for the province-wide system,
but also for the Annual Performance Assessment System and the Executive Projects Dashboard. The province-wide
system, as an electronic system, draws its data by interfacing with other e-platform systems. The Annual
Performance Assessment System, as an electronic application, stores output indicator results, while the Executive
Projects Dashboard, as an electronic application, receives and captures information on departmental projects and
tracks progress and budget utilisation. This phase was the essence of automating the work done in phases 2, 3, 4
and 5.
Phase 7: Planning to implement and sustain the province-wide monitoring and evaluation system
In this phase the Western Cape government ensured that the system delivered an effective indicator and data
management system for collecting relevant data and information for strategic management purposes. The annual
review of the subsystems of the province-wide system takes place in this phase. This review is pivotal in maintaining
an up-to-date, comprehensive monitoring and evaluation system, and ensuring that the components in each phase
adhere to the necessary policy context, monitoring and evaluation elements and mechanisms for such a system. This
phase indicates when the results-based monitoring and evaluation system is ready to start, to be implemented and to
be sustained.
In conclusion, building the results-based system and its application set the direction for improving monitoring and
evaluation processes and methods within the provincial government, ultimately improving the measuring of results on
a continuous basis.
(Ishmail, 2012)
Questions
1. Critically reflect on the components of the logic model in reference to the case study.
2. Evaluate the balanced scorecard perspectives using the case study.
3. Explain the interrelationships between the components of the outcomes-based approach using the case study.
7.2.6 Conclusion
Monitoring and evaluation allows government to track progress in public organisations and to
correct course where needed. It is therefore a vital tool for improving government services, as
organisations continually refine their processes and strategies in response to what they learn from
M&E.
• Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S.
(2012). Establishing a national M&E system in South Africa. The World Bank Special
Series on The Nuts & Bolts of Monitoring and Evaluation Systems, 21, 1-11,
http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts
00Box374357B00PUBLIC0.pdf (accessed 18 June 2020).
Once you have read the article, complete the following tasks:
1. Critically evaluate the six types of evaluation. Do you think these are comprehensive enough to cover all areas
of evaluation in the public service? Why?
2. Given the overview of the emerging successes, challenges and sustainability issues for a monitoring and
evaluation system provided in the article, explain how obstacles to effective monitoring and evaluation can be
overcome.
3. Using Table 1 in the article, evaluate the ability of the management performance assessment tool to measure
performance of government departments and administrations.
4. Evaluate how Figure 1 in the article represents the relationship between the different stakeholders of monitoring
and evaluation in South Africa. Draw a diagram in which you represent the stakeholder relationships more
appropriately. Present your alternative diagram in a presentation to the rest of the class, explaining the
adaptations you made to it and why you felt these changes were necessary.
• Various definitions of monitoring and evaluation were offered. The essential point is that M&E
helps us to improve the quality of our work;
• We distinguished between monitoring and evaluation;
• Monitoring and evaluation is a tool for development;
• Among the purposes of M&E are:
o Improved management decision-making
o Organisational learning
o Enhanced accountability
o Soliciting support for programmes
o Supporting advocacy
o Promoting transparency
• We identified M&E stakeholders:
o The Presidency, National Treasury, the Department of Public Service and
Administration, the Department of Co-operative Governance and Traditional Affairs
o Statistics South Africa, the National School of Government, the Department of
Performance Monitoring and Evaluation in the Presidency
o The Public Service Commission, the auditor-general, the Human Rights Commission
o Premiers’ offices and line departments
• We distinguished between the M&E function of the Public Service Commission and the
Department of Performance Monitoring and Evaluation; and
• We identified various M&E models and techniques, namely the logic model, the balanced
scorecard approach, and government’s outcomes-based approach.
Completing these assessments will help you strengthen and embed your understanding of the course. You will not be able to
change your answers once you have submitted them, so make sure you have completed the
relevant section of coursework first. Where you see Select all that are relevant, be aware that
any number of the options presented could be correct. You will lose marks for incorrect
selections, so choose carefully. Your combined marks from these assessments count towards a
total of 20% of your course mark.
Learning outcome
• Understand the relevant legislation informing the monitoring and evaluation process.
Recommended reading
• Department of Public Service and Administration. (2012). Public Service Act, 1994. http://www.dpsa.gov.za/dpsa2g/documents/acts&regulations/psact1994/PublicServiceAct.pdf (accessed 18 June 2020).
• Nkwinti, G. (n.d.). National Development Plan and the New Growth Path: Transforming the Economy. http://kzntopbusiness.co.za/site/search/downloadencode/nLaqaaKelpO8mnjc (accessed 18 June 2020).
• National Treasury. (2005). Treasury regulations for departments, trading entities,
constitutional institutions and public entities.
http://www.treasury.gov.za/legislation/pfma/regulations/gazette_27388%20showing%20ame
ndments.pdf (accessed 18 June 2020).
• National Treasury. (2013). National Treasury Strategic Plan 2013/2017. http://www.treasury.gov.za/publications/strategic%20plan/Strat%20Plan%202013-2017.pdf (accessed 18 June 2020).
• National Treasury. (2014). Division of Revenue Act, 2014.
http://www.treasury.gov.za/legislation/acts/2014/Division%20of%20Revenue%20Act,%2020
14%20(Act%20No.%2010%20of%202014).pdf (accessed 18 June 2020).
• National Treasury. (2014). Public Finance Management Act No. 1 of 1999.
http://www.treasury.gov.za/legislation/pfma/act.pdf (accessed 18 June 2020).
• National Treasury. (2003). Local Government: Municipal Finance Management Act No. 56 of 2003. http://mfma.treasury.gov.za/MFMA/Legislation/Local%20Government%20-%20Municipal%20Finance%20Management%20Act/Municipal%20Finance%20Management%20Act%20(No.%2056%20of%202003).pdf (accessed 18 June 2020).
• Department of Economic Development. (2011). The New Growth Path: Framework.
http://www.economic.gov.za/communications/publications/new-growth-path-series/download
(accessed 18 June 2020).
Section overview
Monitoring and evaluation in South Africa originates from legislation mandating the efficient and
effective monitoring and evaluation of policies, actions and use of resources in the public service.
In this section, we study the most important of these laws, to reinforce the importance of M&E in
complying with legislation and to impart an understanding that monitoring and evaluation is
integral to achieving the goals of this legislation.
Various laws support the implementation of a monitoring and evaluation system: the overarching
framework provided by the constitution, the Public Finance Management Act, the Public Service Act,
and the Municipal Finance Management Act (DPME, 2011:2).
Be aware that these acts are frequently amended. It is your responsibility to ensure that you have
the latest version of each act, which you can find on the relevant department’s website.
In this section we will review this legislation in terms of the purposes it serves in relation to monitoring
and evaluation.
The constitution is the highest level of legislation in South Africa. It is from the constitution that the
public service was first established. The constitution contains the values by which the public service
must strive to operate. It also mandates the establishment of entities to manage monitoring and
evaluation in the public service.
The values enshrined in the constitution shape and define M&E practices in South Africa. Every
constitutional principle implies that the public service must be monitored and evaluated to ensure
compliance with the constitution’s values. In other words, achieving the ideals of the constitution
requires the monitoring and evaluation of all public entities.
This may be more clearly understood in Table 8, in which the PSC links indicators and standards to
two constitutional values as an example of how values can be made into measurable indicators.
1. Using the format of the table from the PSC above, choose two constitutional values of your own and draw a table
converting them into measurable indicators.
2. Present your table to the rest of the class, clearly explaining how you developed your indicators.
The constitution enumerates values in section 195 that the PSC uses to define good governance
(PSC, 2008:17). In addition, the constitution sets out the principles by which the public service
must operate (PSC, 2008:17).
Using the constitutional values listed above, explain the importance of monitoring and evaluation for the public service.
The Public Service Act, No. 103 of 1994, as amended by the Public Service Amendment Act, No.
30 of 2007, prescribes how national and provincial departments should function. It also regulates the
appointment and performance of government employees (Clear, 2012:150).
In order to understand the Public Service Act, you can read it here:
• Department of Public Service and Administration. (2012). Public Service Act, 1994.
http://www.dpsa.gov.za/dpsa2g/documents/acts&regulations/psact1994/PublicServiceAct.pdf
(accessed 18 June 2020).
The Public Finance Management Act, No. 1 of 1999, as amended by the Public Finance Management
Amendment Act, No. 29 of 1999, is intended to ensure the fair and ethical use of finances in national and
provincial government. It is therefore important to understand the act, which is largely
concerned with the monitoring and evaluation of government finances.
The Public Finance Management Act emphasises the need to move away from an input approach
to an output approach, while focusing on the responsible managing of public funds. The following
sections of the act are relevant to monitoring and evaluation:
• Section 6.1 (f), under the responsibilities of the National Treasury, states that the
Treasury must monitor the implementation of provincial budgets;
• Section 6.2 (c), which also deals with the responsibilities of the National Treasury, states that
the Treasury “must monitor and assess the implementation of (the act), including any
prescribed norms and standards, in provincial departments, in public entities and in
constitutional institutions”;
• Section 18.2 (c), under the functions and powers of a provincial treasury, requires the
provincial treasury to comply with the annual Division of Revenue Act and to monitor and
assess the implementation of that act in provincial public entities; in terms of section 18.2
(d), the provincial treasury “must monitor and assess the implementation in provincial public
entities of national and provincial norms and standards”;
• Section 27 states that measurable objectives must be submitted for each programme (Clear,
2012:150);
• Section 38 (a) (iv), under the responsibilities of accounting officers, states that “a system for
properly evaluating all major capital projects prior to a final decision on the project” must be
maintained by the accounting officer in all departments, trading entities and constitutional
institutions;
• Section 45 states that department officials must assume responsibility for the “effective,
efficient, economical and transparent use of financial resources” (Clear, 2012:150);
• Section 51 (a) (iv), under the general responsibilities of accounting authorities, states that the
accounting authority must ensure “a system for properly evaluating all major capital projects
prior to final decisions on the project”.
We recommend that you familiarise yourself with the Public Finance Management Act.
The Municipal Finance Management Act, No. 56 of 2003, applies the requirements for adequate
reporting and responsibilities laid down by the Public Finance Management Act at the municipal
level. It also prescribes how performance management is monitored in municipalities
(Clear, 2012:150).
The following sections of the Municipal Finance Management Act are useful for monitoring and
evaluation:
• Section 5.2, which outlines the general functions of the National Treasury and provincial
treasuries, states that the National Treasury may:
o (a) “Monitor the budgets of municipalities to establish whether they are consistent with
the national government’s fiscal and macroeconomic policy”;
o (b) “Promote good budget and fiscal management by municipalities, and for this purpose
monitor the implementation of municipal budgets, including their expenditure, revenue
collection and borrowing”; and
o (c) “Monitor and assess compliance by municipalities and municipal entities with (i) this
act; and (ii) any applicable standards of generally recognised accounting practice and
uniform expenditure and revenue classification systems”.
• Section 5.4 requires a provincial treasury to monitor:
o (i) Compliance with this act by municipalities and municipal entities in the province;
o (ii) The preparation by municipalities in the province of their budgets;
o (iii) The monthly outcome of those budgets; and
o (iv) The submission of reports by municipalities in the province as required in terms of
this act.
• Section 34.3, under the heading “Capacity Building”, refers to the monitoring function in terms
of section 155(6) of the Constitution, and says a provincial government:
o (a) “Must share with a municipality the results of its monitoring to the extent that those
results may assist the municipality in improving its financial management”;
o (b) “Must, upon detecting any emerging or impending financial problems in a
municipality, alert the municipality to those problems”; and
o (c) “May assist the municipality to avert or resolve financial problems”.
• Section 41.1, under the heading “Monitoring of prices and payments for bulk resources”,
states that the National Treasury must monitor:
o “The pricing structure of organs of state for the supply of electricity, water or any other
bulk resources that may be prescribed, to municipalities and municipal entities for the
provision of municipal services”; and
o “Payments made by municipalities and municipal entities for such bulk resources.”
• National Treasury. (2003). Local Government: Municipal Finance Management Act No. 56
of 2003, http://mfma.treasury.gov.za/MFMA/Legislation/Local%20Government%20-
%20Municipal%20Finance%20Management%20Act/Municipal%20Finance%20Manage
ment%20Act%20(No.%2056%20of%202003).pdf (accessed 18 June 2020).
1. Explain the importance of the Municipal Finance Management Act in the context of monitoring.
2. How might the act be used to evaluate the use of municipal funds?
The regulations that took effect in 2000 and were amended in 2005 represent a significant shift from
the previous approach (Treasury instructions), in that they allow for more flexibility and place
responsibility for decisions in the hands of the accounting officer.
Table 9 outlines significant features of the regulations. However, students are reminded that they
should read the regulations in full to appreciate the extent of this document.
Part 1 General definitions, application, and date of commencement are stated here.
Part 2 Here “management” is defined, including:
• Corporate management (including chief financial officer);
• Internal control (ie audit committees that act consistently with the Institute of Internal Auditors);
• Risk management strategy (including a fraud prevention plan); and
• Financial misconduct (including investigation, criminal proceedings, and reporting).
Part 3 Planning (at various levels) and budgeting are prescribed here, including:
• Strategic planning (annual preparation, submission and content to facilitate departmental
votes); and
• Budgeting and related matters (ie format, virement, rollovers, transfer of functions, additional
funds and adjustment budgets).
Part 4 The two key responsibilities of revenue and expenditure are dealt with here (including unauthorised,
irregular, fruitless and wasteful expenditure).
Revenue management:
• Application (identification, collection, recording and safeguarding of all revenue for which the
institution is responsible);
• Responsibility for revenue management; and
• Services rendered by the state.
Expenditure management:
• Accounting officer’s responsibilities;
• Approval;
• Compensation of employees (personnel costs);
• Transfer payments and subsidies (excluding division of revenue grants and other allocations to
municipalities);
• Division of revenue grants and other allocations to municipalities;
• The charging of expenditure against a particular vote (or main division of a vote); and
• Recovery, disallowance, and adjustment of payments.
Unauthorised, irregular, fruitless and wasteful expenditure:
• Prevention and detection;
• Reporting;
• Disciplinary steps; and
• Recovery of losses (or damages).
From time to time regulations are repealed (as with all other legislation). The onus is on you to take
note of these and keep abreast of current legislation.
The strategic plan, the National Treasury says, should take into account all the relevant policies,
legislation and other mandates for which the department is responsible and should reflect the
strategic outcome-oriented goals and objectives that the department will strive to achieve over the
stated period.
Additionally, the minister of finance (as the political principal of the department) is guided by active
collaboration with Parliament (including the Standing Committee on Finance, the Select Committee
on Finance, and the Standing Committee on Public Accounts).
Read these articles to gain more insight into the government’s strategic frameworks:
• Nkwinti, G. (nd). National Development Plan and the New Growth Path: Transforming the
Economy. http://kzntopbusiness.co.za/site/search/downloadencode/nLaqaaKelpO8mnjc
(accessed 18 June 2020).
• Department of Economic Development. (2011). The New Growth Path: Framework.
http://www.economic.gov.za/communications/publications/new-growth-path-
series/download (accessed 18 June 2020).
Strategic Plan
1. Discuss what is meant by the following terms in the National Treasury Strategic Plan 2013/2017 and identify how
these concepts affect the monitoring and evaluation process, their impact on the provision of services and why it
is important to consider these concepts both locally and nationally in strategic planning:
2. Revenue and expenditure strategies are based on various factors, including broad strategic frameworks. Why is it
important to consider these factors and broad strategic frameworks when devising revenue and expenditure
strategies?
The constitution requires that every year a Division of Revenue Act (DoRA) determine the equitable
division of nationally raised revenue between national government, the nine provinces and
municipalities.
The act follows a highly consultative process. The following organisations and institutions are
consulted:
• The Financial and Fiscal Commission (as discussed in Section 1 of this course);
• The South African Local Government Association (Salga); and
• The national and provincial departments.
Schedule 1 shows the equitable division of revenue raised nationally among the three spheres of
government (National Treasury, 2014).
Schedule 3 (too extensive to replicate here) determines each municipality’s equitable share of the
local government sphere’s share of revenue raised nationally.
Schedule 4 sets out the allocations to provinces to supplement the funding of programmes or
functions funded from provincial budgets (by vote), with Schedule 5A showing specific-purpose
allocations to provinces (by vote) and 5B to municipalities (by vote).
Schedule 6A shows the allocations-in-kind to provinces for designated special programmes and 6B
likewise to municipalities. Schedule 7A gives the unallocated provisions for provinces for disaster
response.
Accompanying memorandum
The Intergovernmental Fiscal Relations Act (1997) requires that the Division of Revenue Bill be
accompanied by a memorandum explaining:
• How the bill takes account of the respective sections of the constitution;
• The extent to which the Financial and Fiscal Commission’s recommendations have been
taken into account; and
• Any assumptions (or formulae) used to allocate the funds between the three spheres of
government.
Compare the Division of Revenue Act from last year to the current Division of Revenue Act or Bill, whichever is most
recent.
Policy mandates the processes and procedures implemented in an organisation. Figure 4 shows how
monitoring and evaluation fit into the policy life cycle.
Figure 4: The policy life cycle — problem identification; review of policy objectives; evaluation of
policy options; feasibility of options; policy decisions; planning; implementation; and monitoring
(PSC, 2008:9)
Since there are not many completely new problems that the state has never addressed before,
the cycle probably starts with the review of existing policy. The stages of problem identification,
determining policy objectives, examining policy options, and taking a policy decision are a
complex process filtered through many layers of stakeholders. These stakeholders include
political parties, civil society, legislative and executive arms of government, and government
departments. Policy is further argued and explained in various documents, like discussion and
policy documents.
The process is invariably not as sequential or rational as depicted. Identification of options and
rational evaluation of the feasibility, or the costs and benefits, of options, in any precise sense,
assume perfect knowledge of what will work, which is frequently not the case. Policy options
emerge through political debate, and the best policies through taking a considered decision and
making adjustments when the effect of a policy is seen in practice.
As soon as a policy decision has been taken, government departments initiate the processes of
designing a programme that can achieve the policy objectives, detailed planning of the
programme, and implementation. To ensure that implementation proceeds as planned and
that the envisaged objectives are achieved, the programme is monitored and evaluated.
Depending on the results achieved by the programme, the initial policy decision, or aspects of the
design, implementation and resource allocation to the programme may be reviewed.
The evaluation of the success of policy and the reasons for success or failure are critical parts of
the process. This evaluation is not necessarily a formal, technical evaluation but one that is
intricately part of administrative and political processes, where the judgements and power of key
decision-makers play the primary role. Monitoring and evaluation mediates this by producing valid
evidence for policy decisions, ensuring greater objectivity.
Since public policy is a set of statements that “determine what actions government will take, what
effects those actions will have on social conditions, and how those actions can be altered if they
produce undesirable outcomes”, policy evaluation is also an inherent part of monitoring and
evaluation.
7.3.10 Conclusion
The legislation studied above is intended to guide the implementation of monitoring and evaluation
in the public service. While the laws do contain helpful values and directives from which monitoring
and evaluation practitioners can draw, the PSC and DPME have been active in producing many
documents aiming to explain the impact of the legislation in practical terms. This leads to the
question: is the legal framework for monitoring and evaluation sufficient? You may continue to think
about this question as we explore the PSC and DPME frameworks and guidance notes in the section
that follows.
1. Critically evaluate the legislation studied above by considering whether it contains adequate guidelines for the
implementation of a monitoring and evaluation system for the public service. Substantiate your response.
2. In your groups, compile a list of different policies involved in the monitoring and evaluation process in your
department. Critically evaluate why each policy is crucial to the success of the process.
• The Constitution of the Republic of South Africa, 1996, enumerates various constitutional
values, some of which inform the practice of monitoring and evaluation. In other words, M&E
is not simply an administrative requirement, but is drawn from our fundamental national
values;
• Various acts of Parliament flesh out the constitutional requirements of monitoring and
evaluation, namely the:
o Public Service Act
o Public Finance Management Act
o Municipal Finance Management Act
• The Treasury Regulations, 2005, also add detail to M&E practices;
• Departments’ strategic plans and the government’s revenue and expenditure strategy also
inform M&E;
• The annual Division of Revenue Act determines how nationally raised revenue is shared
between the three spheres of government. Application of the act also relates to M&E; and
• Monitoring and evaluation also contributes to (or feeds into) the policy life cycle.
Completing these assessments will help you strengthen and embed your understanding of the
course. You will not be able to change your answers once you have submitted them, so make sure
you have completed the relevant section of coursework first. Where you see “Select all that are
relevant”, be aware that any number of the options presented could be correct. You will lose marks
for incorrect selections, so choose carefully. Your combined marks from these assessments count
towards a total of 20% of your course mark.
Section overview
In the previous section we placed monitoring and evaluation within their legal requirements. In
this section we will examine the organisational values that a monitoring and evaluation system
must enhance. We will evaluate the current values of the public service and ask whether the
theories used to develop a monitoring and evaluation system answer the country’s needs.
Previously, we located monitoring and evaluation in the policy cycle. We will now have a look at how
monitoring and evaluation fits into the planning and implementation processes of government
departments. This is illustrated in Figure 5 and discussed thereafter.
Figure 5: The planning, monitoring, evaluation and review cycle — the annual performance plan
(drawn from the strategic plan and medium-term budget) and performance plans for units and
individuals; 2. monitoring (quarterly and third-quarter reports); 3. evaluation (including specially
commissioned evaluations); and 4. review (annual review)
Each department is responsible for devising a five-year strategic plan. The strategic plan must be
aligned with the government’s strategic direction, which is published in the Medium-Term Strategic
Framework and the Government Programme of Action.
The process starts with a general election, after which government produces new programmes. The
process is the same at provincial level, where provincial strategic plans must align with provincial
government programmes of action. At departmental level, plans must also align with provincial
growth and development strategies and with local integrated development plans.
From the strategic plan, each department prepares a budget (estimates of expenditure/medium-
term expenditure framework) and submits this to the National Treasury. The budget is approved by
Parliament or the provincial legislature.
From the strategic plan and budget, departments must then prepare an annual performance plan.
The plans must contain objectives, outputs, indicators and targets. The annual performance plan is
then broken down into plans for each component of the organisation (for example: the human
resources plan, the risk management plan, the programme management plan, etc). These plans are
implemented and monitoring starts immediately. Monitoring measures are set against the
objectives, outputs, indicators and targets in the plan. The progress is reported in monthly and
quarterly reports.
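The logic of measuring progress against the objectives, outputs, indicators and targets in a performance plan can be sketched in code. The following Python sketch is illustrative only: the indicator names, figures, and the pro-rata rule for what counts as "on track" are hypothetical assumptions, not a prescribed government method.

```python
# A hypothetical sketch of quarterly monitoring: progress on each indicator
# is measured against the annual targets set in the performance plan.
# Indicator names, figures, and the pro-rata rule are invented for illustration.
targets = {"Houses built": 1200, "Title deeds issued": 800}
actuals_q2 = {"Houses built": 430, "Title deeds issued": 510}

def quarterly_report(targets, actuals, quarter):
    """Report each indicator's progress against its annual target,
    flagging those behind a simple pro-rata expectation."""
    report = {}
    for indicator, annual_target in targets.items():
        achieved = actuals.get(indicator, 0)
        expected_so_far = annual_target * quarter / 4  # pro-rata by quarter
        report[indicator] = {
            "achieved": achieved,
            "progress": achieved / annual_target,
            "on_track": achieved >= expected_so_far,
        }
    return report

report = quarterly_report(targets, actuals_q2, quarter=2)
```

Here the report would flag "Houses built" as behind schedule at mid-year, which is exactly the kind of evidence the quarterly evaluation step uses to trigger an action plan.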
Managers then evaluate the quarterly monitoring results, analysing the successes and failures of
programmes, and develop action plans for performance improvement. Quarterly monitoring can be
supplemented by evaluations commissioned from internal or external experts. These reports form
part of the annual review of performance, which in turn feeds into the planning cycle for the next
financial year.
(Adapted from National Treasury, 2007:4)
In addition to the constitution, four acts govern (or organise) the system of intergovernmental
relations and the local government fiscal framework. The integration (coherency) of these is
summarised in Table 10.
• Intergovernmental Fiscal Relations Act (1997): Passed to promote co-operation between the
three spheres of government on fiscal, budgetary and financial matters (it establishes the
Budget Forum); and to prescribe a process for the determination of an equitable sharing and
allocation of revenue raised nationally (requires that a Division of Revenue Bill is tabled
annually).
• Municipal Structures Act (1998) (including amendments): Provides for the establishment of
different types of municipalities, including the division of powers and functions between local
and district municipalities; regulates the internal systems, structures and office bearers of
municipalities.
• Municipal Systems Act (2000) and Municipal Systems Amendment Act (2003): Sets out
detailed requirements for community participation, integrated development planning,
performance management, administration, service provision, debt collection, and the
establishment of municipal entities; regulates the publication of by-laws and determines the
role of national and provincial government in setting standards and monitoring local
government.
• Intergovernmental Relations Framework Act (2005): Established to provide a framework for
the three spheres of government to promote and facilitate intergovernmental relations; and
to provide for mechanisms and procedures to facilitate the settlement of intergovernmental
disputes.
(National Treasury, 2011 and related acts)
The constitution envisages the decentralisation of the administration of many functions – currently
the responsibility of national and provincial government – to municipalities. To enable this, the local
government fiscal framework must provide municipalities with access to revenue sources that are
commensurate with the powers and functions for which they are responsible. As stated by National
Treasury (2011:27), “It is important to keep in mind that the whole local government fiscal framework
is designed to fund local government, and not just the transfers from national government.”
“It is important to understand the relationship between the allocation of functions and the fiscal
framework, the fiscal effort the municipality makes to collect revenues, the appropriate allocation of
those revenues to services, the responsible management of service delivery processes and the
effective delivery of services.”
(National Treasury, 2011:27-28)
The framework and systems continue to evolve as better modes of co-operation and co-ordination
emerge and as functions are shifted between spheres. While changes take place, National Treasury
(2011:29-30) reinforces key elements and principles that underpin the intergovernmental system,
namely:
• Accountability (while each sphere has specific constitutionally defined powers and
responsibilities, intervention from, for example, provincial governments in local government,
occurs when relevant parties fail to carry out their constitutionally defined responsibilities);
• Transparency and good governance (transparent reporting arrangements within and
between spheres; political executives are responsible for policy and outcomes and
accounting officers are responsible for implementation and outputs);
• Mutual support (continually strengthening the capacity of municipalities);
• Redistribution (achieved through the division of revenue and the latest equitable share
formulae);
• Vertical division (driven by priorities, budget process, and trade-offs, where appropriate);
• Revenue-sharing (funded from its own revenues, equitable share allocations, and
conditional and unconditional grants);
• Broadened access to services (innovative but efficient modes of delivery, leveraging public
and private resources to fund infrastructure); and
• Responsibility over budgets (self-determination and responsibility to comply with these;
national government will not bail out provinces or municipalities that mismanage their funds,
nor will it provide guarantees for loans).
The Department of Performance Monitoring and Evaluation (2012:6) offers a model for implementing
a monitoring and evaluation framework. This model is shown in Figure 6.
(DPME, 2012)
Step 1: Situation analysis
• List the policy objectives and main sources.
• List other joint implementation institutions and partners, the sphere of government, and the
nature of co-operative leadership.
Step 2: Describe administrative information systems and data sets
• These are the data records, IT, financial, and other day-to-day systems that are sources of
information.
• List and describe them in terms of their purpose, their location, the frequency of report
extraction, the users of the reports, the nature of the system (manual or electronic), the
nature of the interface, maintenance, etc.
• Indicate any planned systems.
• List and describe data sets in current use.
Step 3: List indicators, targets, and baselines
• Indicators must relate to policy outcomes; cross-cutting issues; prescribed targets; sector
and premiers’ office requirements; and other monitoring and evaluation-related research
and indexes, including international comparisons and requirements.
• Each indicator must relate to the logic model.
Step 4: Group indicators by policy objective
• Clarify which indicators give information on the attainment of each policy objective.
Step 5: If policy objectives have no indicators
• Identify and design new indicators, baselines and targets, and repeat steps 3 and 4.
Step 6: Review links between inputs, outputs, outcomes and impact, and identify causal
relationships and links (results chain)
• There should be causal relationships between the different elements of the results chain: if
the appropriate mix of inputs is combined, this will result in service delivery outputs; if the
appropriate service delivery outputs are achieved, this should contribute towards achieving
policy outcomes/impacts. This is called the logic model.
• Indicators must be measured against six criteria:
o Reliable
o Well-defined
o Verifiable
o Cost-effective
o Appropriate
o Relevant.
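The screening of indicators against the six criteria in Step 6 can be sketched as a simple check. The following Python sketch is purely illustrative: the example indicator, the assessment values, and the all-or-nothing acceptance rule are hypothetical assumptions, not a DPME-prescribed procedure.

```python
# Hypothetical sketch: screening a proposed indicator against the six
# criteria from Step 6 before admitting it to the results chain.
# The example indicator and assessments are invented for illustration.
CRITERIA = ("reliable", "well_defined", "verifiable",
            "cost_effective", "appropriate", "relevant")

def screen_indicator(name, assessment):
    """Accept an indicator only if it satisfies all six criteria;
    otherwise report which criteria it failed."""
    failed = [c for c in CRITERIA if not assessment.get(c, False)]
    return {"indicator": name, "accepted": not failed, "failed_criteria": failed}

result = screen_indicator(
    "Percentage of learners receiving a daily school meal",
    {"reliable": True, "well_defined": True, "verifiable": True,
     "cost_effective": True, "appropriate": True, "relevant": True},
)
```

The design choice worth noting is that a failed criterion does not merely reject the indicator silently; recording which criteria failed tells the planner what to redesign in Step 5.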
As Table 11 confirms, there are many complex issues in the implementation of a monitoring and
evaluation process. We will look more closely in the final section of this course at each of the steps
in the monitoring and evaluation process to ensure complete understanding of the practices that
must be followed when setting up monitoring and evaluation in your organisation or department.
In order to develop and implement a monitoring and evaluation system, you have to understand
against which perspectives the system will be measured. In the previous section we explained the
planning process as well as the steps involved in implementing the monitoring and evaluation
system. In this section we will look at the broader perspectives of monitoring and evaluation.
A government programme is a set of activities that deliver the products of government (PSC,
2008:39). For example:
The Department of Basic Education is currently in the implementation phase of the National
School Nutrition Programme. The programme aims to:
From the example above, it is clear that the programme has complex outcomes and includes
governance, safety and security, social change and services. Evaluating this programme will require
examining whether the objectives of the programme have been achieved and whether they could
have been achieved in a different manner using different strategies and activities. Key factors
relevant to the delivery of the programme and how they relate to each other need to be analysed, as
does its impact. An impact evaluation:
“Impact evaluations can range from large-scale sample surveys in which project populations and
control groups are compared before and after, and possibly at several points during programme
intervention; to small-scale rapid assessment and participatory appraisals where estimates of
impact are obtained from combining group interviews, key informants, case studies and available
secondary data.”
(World Bank, 2004)
The impact of a programme like the National School Nutrition Programme could, for example, be
assessed through learners’ performance at the end of a school semester. The Public Service
Commission (2008:40-41) lists the following as key elements of programme evaluation:
Programmes are complex and not all elements are pre-designed or implemented as planned. The
form that many of the elements take may emerge as the programme is implemented and
adjustments are made based on experience. Monitoring and evaluation provides the evidence for
decisions on what adjustments to make.
Critically evaluate the success of a programme currently in the implementation phase in your department using the
list provided above as guidance.
Financial perspective
Financial statements provide evaluators with the answers to these questions. Financial statements
are presented monthly and quarterly in the form of reports, which are then measured against budget.
These statements are prepared according to the prescriptions of the Public Finance Management
Act (discussed in section 2). The auditor-general audits the financial statements of the department
annually. So, as with other perspectives for monitoring and evaluation, the financial perspective
answers pre-set questions and then digs deeper as more and more questions are asked.
Governance perspective
Good governance in departments means compliance with the values listed in Section 195 of the
Constitution. Good governance is:
“… a system of values, policies and institutions by which a society manages its economic, political
and social affairs through interaction within and among the state, civil society and private sector”
(PSC, 2008:21).
“Monitoring and evaluation is responsible for establishing a high standard of service delivery,
monitoring and good governance in the public service” (PSC, 2013:3). Good governance is
mandated by the eight Batho Pele (“People First”) principles, which guide service delivery in
government organisations. As a performance initiative, the Batho Pele principles are intricately linked
to monitoring and evaluation.
The eight Batho Pele principles were developed to serve as an acceptable policy and legal framework
for public service delivery. These principles are aligned with the Constitutional ideals of:
1. Consultation
There are many ways to consult users of services including conducting customer surveys,
interviews with individual users, consultation with groups, and holding meetings with
consumer representative bodies, NGOs and CBOs. Often, more than one method of
consultation will be necessary to ensure comprehensiveness and representativeness.
Consultation is a powerful tool that enriches and shapes government policies such as the
integrated development plans (IDPs) and their implementation in local government.
2. Setting service standards
This principle reinforces the need for benchmarks to measure constantly the extent to which
citizens are satisfied with the service or products they receive from departments. It also
plays a critical role in the development of service delivery improvement plans to ensure a
better life for all South Africans. Citizens should be involved in the development of service
standards.
Standards are required that are precise and measurable, so that users can judge for
themselves whether they are receiving what was promised. Some standards will cover
processes, such as the length of time taken to authorise a housing claim, to issue a passport
or identity document, or even to respond to letters. To achieve the goal of making South
Africa globally competitive, standards should be benchmarked (where applicable) against
those used internationally, taking into account South Africa's current level of development.
3. Increasing access
One of the prime aims of Batho Pele is to provide a framework for making decisions about
delivering public services to the many South Africans who do not have access to them.
Batho Pele also aims to rectify the inequalities in the distribution of existing services.
Examples of government initiatives to improve access to services include platforms
such as the Gateway, Multipurpose Community Centres and call centres. Access to information
and services empowers citizens, provides value for money and promotes quality services. It
also reduces unnecessary expenditure for citizens.
4. Courtesy
This goes beyond a polite smile, “please” and “thank you”. It requires service providers to
empathise with citizens and treat them with as much consideration and respect as they
would like for themselves.
The public service is committed to continuous, honest and transparent communication with
citizens. This involves communicating services, products, information and problems that
may hamper or delay the efficient delivery of services to promised standards. If
applied properly, this principle will help dispel the negative perceptions that citizens in
general have about the attitude of public servants.
5. Providing information
Information about services should, as a requirement, be available at the point of delivery, but
for users who are far from the point of delivery, other arrangements will be needed ...
Managers and employees should regularly seek to make information about the organisation
and all other service delivery-related matters available to fellow staff members.
6. Openness and transparency
A key aspect of openness and transparency is that the public should know more about the
way national, provincial and local government institutions operate, how well they utilise the
resources they consume, and who is in charge. It is anticipated that the public will take
advantage of this principle to make suggestions for improving service delivery
mechanisms, and even to hold government employees accountable and responsible by
raising queries with them.
7. Redress
This principle emphasises a need to identify quickly and accurately when services are falling
below the promised standard, and to have procedures in place to remedy the situation. This
should be done at the individual transactional level with the public, as well as at the
organisational level, in relation to the entire service delivery programme. Public servants are
encouraged to welcome complaints as an opportunity to improve service, and to deal with
complaints so that weaknesses can be remedied quickly for the good of the citizen.
8. Value for money
Many improvements that the public would like to see often require no additional resources
and can sometimes even reduce costs. Failure to give a member of the public a simple,
satisfactory explanation to an enquiry may, for example, result in an incorrectly completed
application form, which will cost time to rectify.
Explain, using the Batho Pele principles, how monitoring and evaluation are important in achieving good governance
in South Africa's public service.
From a human resources perspective, monitoring and evaluation should examine whether human
resource management objectives have been achieved and whether good human resource
management practices are being applied in the public service. Human resource practices are
underpinned by the following constitutional principles:
• Good human resource management and career development practices must be cultivated to
maximise human potential; and
• Employment and personnel management practices must be based on ability, objectivity and
fairness.
According to the Public Service Commission (2008:22), human resource management objectives
include:
• The recruitment of skilled staff who can meet service delivery requirements;
• Achieving the status of a good employer; and
• The creation of a public service that meets professional standards, is proud to serve the public,
and is patriotic, selfless, non-racial and non-sexist.
Ethics perspective
The ethical perspective of evaluation will examine outcomes linked to change in conduct (for
example fewer incidents of corruption) and whether enough measures are in place to prevent
unwanted outcomes. These measures have been called an ethics infrastructure and include:
National Treasury guidelines perspective
The National Treasury provides guidelines for departments concerning strategic goals. These
guidelines prescribe that departments set strategic goals covering service delivery, management
and organisation, financial management, and training and learning.
These areas inform the annual reports submitted to the National Treasury. They should use the
following headings:
• General Information;
• Programme (or service delivery) Performance;
• Report of the Audit Committee;
• Annual Financial Statements; and
• Human Resource Management.
National Treasury further requires that the reports focus on performance and data gathered from
monitoring and evaluation systems and evaluations.
The National Evaluation Plan is a set of priority evaluations approved by Cabinet.
It provides feedback on ongoing evaluations as well as on the national evaluation system. The current
National Evaluation Policy Framework was approved on 23 November 2011. The purpose of the
plan is:
The Cabinet developed a set of 12 outcomes through consultation and discussion at many levels.
These outcomes reflect the development impacts government seeks to achieve. The outcomes have
been written in terms of measurable outputs and key activities in order to achieve the outputs.
An extract from the National Evaluation Plan (2012), below, summarises the programmes
approved for evaluation for the 2013/2014 fiscal year. Each programme is linked to the 12 outcomes
and to the accountable departments established by Parliament.
For each entry: name of department; name of intervention; title of evaluation; and key motivation for the evaluation, including scale (eg budget, beneficiaries).

• Department of Rural Development and Land Reform — Revitalisation of irrigation schemes. Evaluation: Cost-benefit analysis of the revitalisation of existing irrigation schemes. Motivation: The ultimate objective of the revitalisation of irrigation schemes is directly linked to outcome 7, “Vibrant, equitable and sustainable rural communities and food security”. Over and above that, the irrigation schemes contribute to the achievement of other outcomes, namely outcome 4: decent employment through economic growth. Irrigation is one of the main mechanisms for permitting high-productivity production and for providing significant numbers of smallholder farmers with a decent living. Many of the irrigation schemes were established in the former homelands and are not working effectively.

• Department of Basic Education — Funza Lushaka Bursary Scheme. Evaluation: Evaluation of the Funza Lushaka Bursary Scheme. Motivation: The intervention is linked to outcome 1: improved quality of basic education, and sub-output 1: improve teacher capacity and practices. The budget is R672 million for 2012/13, with 11 650 bursaries awarded for 2012/13. Given the shortage of teachers in key subjects such as maths, physical science and accounting, as well as in the foundation phase, it is important to assess the extent to which the Funza Lushaka Bursary Scheme addresses this problem.

• Department of Agriculture, Forestry and Fisheries — Land Care. Evaluation: Impact evaluation of Land Care. Motivation: The Land Care Programme is about the sustainable use of land and is linked to outcomes 7 (rural development) and 10 (environment). Land care projects are implemented mostly in communal lands, and the programme employs community members to implement activities. The programme benefited 15 867 beneficiaries in 2011/12 and is envisaged to benefit 28 500 people in the 2012/13 financial year. The estimated budget is R115 661 000 for 2012/13 and R108 million for 2013/14. It is not a large programme, but it is innovative in seeking to achieve environmental, production and economic objectives simultaneously.

• Department of Rural Development and Land Reform — National Rural Youth Service Corps (NARYSEC). Evaluation: Diagnostic evaluation of the National Rural Youth Service Corps. Motivation: Half of all 18-to-24-year-olds are unemployed, accounting for about 30 per cent of total unemployment, and National Treasury estimates that the average probability of an 18-to-24-year-old finding a job is just 25 per cent. Overall unemployment is worse in rural areas. The National Rural Youth Service attempts to deal with issues of youth unemployment and rural development, supporting rural youths who lack skills and enabling them to develop skills and take forward productive activities. As such, it is linked to outcomes 7 (rural development), 5 (skills) and 4 (employment). The programme targets unskilled and unemployed rural youths aged 18-35 who have a minimum of a grade 10 certificate (outcome 4 delivery agreement).

• Department of Basic Education — New school curriculum. Evaluation: Evaluation of curriculum implementation. Motivation: A key initiative of government has been changing the school curriculum, affecting 12 million learners. This is a key activity in outcome 1: improved quality of basic education, sub-output 1: improve teacher capacity and practices, and sub-output 2: increase access to high-quality learning materials. An evaluation in 2013/14 is looking at the school certificate more generally; this evaluation will look more particularly at the issue of the school curriculum.
Whatever the place of monitoring and evaluation within an organisation, in the public service the
following must be integrated with monitoring and evaluation:
• The strategic plan of every government organisation: monitoring and evaluation is a part
of the organisation's strategic plan as it specifically uses the organisation's strategic goals
and objectives to set performance indicators and thereby facilitate monitoring and
evaluating;
• Human resource planning: an important part of monitoring and evaluation is capacity (ie
the human resources/people of the organisation); human resources needs to include
monitoring and evaluation in its planning processes;
• Electronic systems: it is crucial that organisations keep electronic records of all monitoring
and evaluation processes, and that these processes are facilitated by the IT systems
implemented within the organisation;
• Planning and budgeting: monitoring and evaluation needs to be part of the organisation's
budget and other planning processes, and to be able to provide monitoring and evaluation
feedback on these systems;
• Framework for reward and recognition: individuals should be recognised and rewarded
appropriately for their successful role in monitoring and evaluation practices; and
• Training and development: monitoring and evaluation training should take place as a way
to create an understanding and a culture of monitoring and evaluation within the
organisation.
Creating this culture depends to a large degree on the approach managers choose to take.
7.4.7 Conclusion
In this section we discussed the broader aspects of monitoring and evaluation within the public
service. We established an understanding of the government-wide monitoring and evaluation system
and the values the system seeks to address. We presented Cabinet's priority evaluations, which
should guide all implementation strategies and plans for monitoring and evaluation. In the next section
we will look at the steps involved in implementing a monitoring and evaluation system.
As a summary for this section and an introduction to the next, read this article:
• Public Service Commission. (2012). Evolution of monitoring and evaluation in the South
African public service. http://www.psc.gov.za/newsletters/docs/2012/K-
9555%20PSC_6th%20edition%20magazine_DevV11.pdf (accessed 18 June 2020).
1. Critically evaluate the summary of proposed evaluations for 2013/2014. Review programmes in your
department and link them to these evaluations. Discuss in your groups the progress of the implementation of
these programmes and explain how they link to the 12 outcomes set by the cabinet.
2. Critically review the summary of proposed evaluations for 2014-2015. Discuss which programmes your
department could implement. Explain how these programmes link to the 12 outcomes set by the cabinet.
3. Evaluate how well your organisation integrates monitoring and evaluation with other management functions.
• Monitoring and evaluation fits into the government’s overall planning cycle, which runs from
one general election to the next;
• M&E takes place in the context of three inter-related spheres of government (national,
provincial and municipal);
• Intergovernmental relations are organised in terms of four acts of Parliament, namely the:
o Intergovernmental Fiscal Relations Act, 1997
o Municipal Structures Act, 1998
o Municipal Systems Act, 2000 (and Municipal Systems Amendment Act, 2003)
o Intergovernmental Relations Framework Act, 2005
• These frameworks and systems continue to evolve, and M&E plays a role in improving them;
• The DPME offers a model for implementing a monitoring and evaluation framework,
illustrated in Figure 6;
• Evaluation can be conducted in more than one way. There are, for example, the
o Programme performance perspective
o Financial perspective
o Governance perspective
o Human resource management perspective
o Ethics perspective
o Perspective of National Treasury guidelines
• There is an overall National Evaluation Plan first introduced in 2013/14. The essential point
is that having such a plan (with a set of identified outcomes) is a way to improve the quality
of public administration; and
• Monitoring and evaluation is one of many management functions, and must take other
management functions into consideration, not so that we can tick boxes, but so that we can
improve the quality of all our work.
It will help you strengthen and embed your understanding of the course. You will not be able to
change your answers once you have submitted them, so make sure you have completed the
relevant section of coursework first. Where you see Select all that are relevant, be aware that
any number of the options presented could be correct. You will lose marks for incorrect
selections, so choose carefully. Your combined marks from these assessments count towards a
total of 20% of your course mark.
• Adato, M. (2011). Combining quantitative and qualitative methods for program monitoring
and evaluation: why are mixed-method designs best?
http://documents.worldbank.org/curated/en/633721468349812987/pdf/643860BRI0Mixe00
Box0361535B0PUBLIC0.pdf (accessed 18 June 2020).
• Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S.
(2012). Establishing a national M&E system in South Africa. The World Bank Special
Series on The Nuts & Bolts of Monitoring and Evaluation Systems, 21, 1-11,
http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts00
Box374357B00PUBLIC0.pdf (accessed 18 June 2020).
Recommended reading:
• Lahey, R. (2010). The Canadian monitoring and evaluation (M&E) system: lessons learned
from 30 years of development. ECD Working Paper Series.
http://documents.worldbank.org/curated/en/865531468226748462/pdf/654070NWP0230c0
C0disclosed011040110.pdf (accessed 18 June 2020).
• Molleman, E. and Timmerman, H. (2003). Performance management when innovation and
learning become critical performance indicators. Personnel Review, 32(1), 93-113.
https://www.researchgate.net/profile/Eric_Molleman/publication/235285519_Performance_
management_when_innovation_and_learning_become_critical_performance_indicators/lin
ks/5948df07458515db1fd8df78/Performance-management-when-innovation-and-learning-
become-critical-performance-indicators.pdf (accessed 18 June 2020).
Section overview
Monitoring and evaluation is, as noted, the responsibility of the Department of Performance
Monitoring and Evaluation and the Public Service Commission. In this section, we examine the
M&E process set by the Department of Performance Monitoring and Evaluation. This will allow
you to create a performance-based framework for monitoring and evaluation in your own
context. You must keep in mind that all of this occurs within the government's framework for
M&E (the Government-wide Monitoring and Evaluation System) and that monitoring and
evaluation is integrated with many management functions.
We briefly discussed the steps involved in the monitoring and evaluation system in the previous
section of the course. In this section we will examine the implementation of the monitoring and
evaluation process, which is not necessarily linear. An evaluation may occur at any time in the life
cycle of a project or programme. Keep this in mind when compiling or adapting monitoring and
evaluation programmes.
Once evaluation is complete, you need to communicate the results to relevant stakeholders. This
involves following the Department of Performance Monitoring and Evaluation’s guidelines for
communicating evaluation results. These will also be explained in this section.
The monitoring and evaluation process must be followed within the framework of the government-
wide monitoring and evaluation system. See Figure 7.
Figure 7 (summary): The legislature provides funding and public officials carry out the activities described in the programme. Data sources (censuses and surveys, administrative data sets and other performance information) are captured, verified and analysed into reports; evaluations and follow-up actions flow from these, and public scrutiny and robust systems result in good management.
The first step of a monitoring and evaluation process is to outline the current state. A situational
analysis provides an overview of the current state of the department. Information about different
aspects of the organisation's current state is gathered, analysed and presented. For the purpose
of a government programme, a situational analysis should give you an overview of the current status
of a department and how this department aligns with the needs of the public or the goals set out in
the strategic plan.
A situational analysis:
You have discussed the planning process in your Strategic Management course. For an overview of
the planning process study Figure 8.
Figure 8 (summary): The planning cycle runs from situation analysis, through priority and objective setting, option appraisal and task setting, to monitoring and evaluation, which feeds back into the next situation analysis.
McCoy and Bamford (1998:6-9) suggest that conducting a situation analysis involves the following
steps. See Table 12.
Step 1: Determine the framework. The framework provides the scope of the analysis. It should be focused and should fit the needs of the research.
Step 2: Identify what information is already available. There is no need to reinvent the wheel. Do research within the department to see what information is already available. Make sure that it is accurate.
Step 3: Identify what information is still required. Analyse the gaps between the information that already exists and the information needed. This will give you an indication of what should still be gathered. Usually at this stage questionnaires or feedback forms will be developed to collect specific information.
Step 4: Collect the required information. Next you need to develop a research plan. Describe the information needed, the process you will follow to gather it, how you will analyse it, and in what format you will present your findings.
Step 5: Compile and write the report. Write a report presenting the situation and its analysis. Usually the format of a research report can be used, but different departments prefer different reporting formats for specific information.
Step 6: Distribute and disseminate the report. Lastly, the information collected should be published. The type of programme or project will determine which channels to use for publication of the report.
The department will have to invest time and resources in the collection of the information. Although
the situation analysis is a useful tool, it can be costly and often requires experts at some stage of the
process (Miller, n.d.).
Agriculture is the major economic activity in the area. The main produce is grapes and sun-dried fruit. Although the
water supply limits the development of agriculture, exploitation of overseas markets provides opportunity for some
economic growth. Apart from some food processing (wine and sun-dried fruit), there is no manufacturing or industrial
activity. There are no accurate unemployment figures for the Northern Cape; the October Household Survey of 1994
estimated that 32,5% of an estimated 278 743 economically active people were unemployed. Rates were higher for
coloureds (37,9%) and blacks (39,4%), than for whites (7,2%).
Fifty-seven percent of unemployed people had been unemployed for more than a year at the time of the survey.
Almost 75% of unemployed people are not trained or skilled for specific work. Employment opportunities are limited,
with strong seasonal variation in the availability of work. Pensions and other grants form an important source of
income for many households. Although there are no accurate figures, there is no doubt that a sizable proportion of
the population lives in poverty. Compared to the province as a whole, more people in this region live
in rural areas, with poorer access to basic services.
Outcome 4 of the National Evaluation Plan commits government to ensuring decent employment through
inclusive economic growth. In the case study, one of the problems underpinning unemployment is
insufficient skills.
Imagine that you are an employee of the Department of Higher Education for this region. You are tasked with
investigating the situation and presenting your findings to the minister of education for the Northern Cape. Design a
plan for the collection of the necessary information needed for the feedback report for the minister.
Step 2 of the process, data collection and analysis, links closely with step 1. In step 1 the current
situation was analysed; in step 2 the information is taken a step further and analysed to inform
decisions. In this way, knowledge is created.
Primary data can be collected in different ways, depending on the project needs, skills levels, time
and budget. Data can also be collected from individuals or from a group of people.
Examples of primary data collection include: questionnaires, surveys, interviews, focus groups and
group interviews. These are explained in Table 13.
Which method of data collection would suit the needs of the programme you discussed in the
previous activity? Why would this be the best method for collecting data?
There are two main types of information produced by the data collection process: qualitative and
quantitative. The most obvious difference between the two is that quantitative data are numerical
(for example amounts, proportions), while qualitative data give information best described in words,
diagrams or pictures.
Most monitoring and evaluation systems require the collection of both quantitative and qualitative
data. Interventions need qualitative data about the nature of results (for example beneficial or harmful
effects, intended or unintended impacts). Interventions also need quantitative data (for example
about the distribution or intensity of the results) to ensure the accuracy of the analysis.
Whether the data we collect is numerical or textual (descriptive) is determined by the type of
questions we ask in our tools. Detailed qualitative data can be obtained by asking open-ended
questions, whereas numerical data can be obtained by asking closed-ended questions.
Mixed research uses both quantitative and qualitative techniques in a single study. For example, a
study could use a qualitative method such as focus groups as well as a quantitative method such as
a questionnaire survey to collect data. Alternatively, a research instrument could use a mix of open-
ended (qualitative) and closed-ended (quantitative) questions to collect responses.
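As a brief sketch of how the two strands can be handled together, the following example (all survey records and ratings are invented for illustration) summarises closed-ended ratings numerically while setting aside open-ended comments for later thematic coding:

```python
from collections import Counter

# Hypothetical mixed-method survey: each record pairs a closed-ended
# rating (quantitative) with an open-ended comment (qualitative).
responses = [
    {"rating": 4, "comment": "Staff were helpful but the queue was long."},
    {"rating": 2, "comment": "No one could explain the application process."},
    {"rating": 5, "comment": "Quick service at the mobile clinic."},
    {"rating": 2, "comment": "The office was closed during its stated hours."},
]

# Quantitative strand: summarise the ratings numerically.
ratings = [r["rating"] for r in responses]
average_rating = sum(ratings) / len(ratings)
distribution = Counter(ratings)

# Qualitative strand: keep the free-text comments for thematic analysis.
comments = [r["comment"] for r in responses]

print(f"Average rating: {average_rating:.2f}")          # 3.25
print(f"Rating distribution: {dict(distribution)}")
print(f"{len(comments)} comments collected for thematic analysis")
```

The same instrument thus yields both the numbers needed for trend monitoring and the narrative material needed to explain those numbers.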
1. In your group, decide whether you will use qualitative or quantitative methods to fix the problem you identified in
the previous activity.
2. Explain how you would use this method to collect the data you require with specific reference to the processes
explained above.
“… a pre-determined signal that a specific point in a process has been reached or result
achieved. The nature of the signal will depend on what is being tracked and needs to be very
carefully chosen. In management terms, an indicator is a variable that is used to assess the
achievement of results in relation to the stated goals/objective.”
Developing indicators involves answering the question: How will I know or what will I see that will tell
me the specific result has been achieved? To answer this question, let us examine the guidelines
for developing performance indicators.
National Treasury (2007:7) requires that all performance indicators conform to the standards
reflected in Table 14.
Reliable The indicator should be accurate enough for its intended use and respond to changes in the level
of performance.
Well-defined The indicator needs to have a clear, unambiguous definition so that data will be collected
consistently, and be easy to understand and use.
Verifiable It must be possible to validate the processes and systems that produce the indicator.
Cost-effective The usefulness of the indicator must justify the cost of collecting the data.
Appropriate The indicator must avoid unintended consequences and encourage service delivery
improvements, and not merely create incentives to meet targets.
Relevant The indicator must relate logically and directly to an aspect of the institution's mandate, and the
realisation of strategic goals and objectives.
Furthermore, performance indicators need to contribute to the four standards required of monitoring
and evaluation:
• Equity
• Effectiveness
• Efficiency
• Economy.
• Economy indicators are used to determine whether the correct inputs are obtained at the lowest
possible cost, and whether the relationship between outputs and cost is appropriate.
Economy indicators are relative, because the correct use of funds will differ from case to
case. The best way to determine whether inputs are economical is therefore
to compare them with international best practice (are similar institutions achieving similar
results with the same amount of resources?);
• Efficiency indicators measure how efficiently inputs have been transformed into
outputs. The most efficient processes are those that produce the maximum number of
outputs per unit of input or, alternatively, use the least input to create a single output.
Efficiency is measured with an input-to-output ratio. Again, efficiency differs between
institutions, so using international best practice to determine efficiency is advised;
• Effectiveness indicators measure how well the outputs achieve the outcomes. The
outcomes must relate to the strategic objectives of the specific organisation, and therefore
effectiveness can be measured based on whether or not these objectives (outcomes) are
being achieved. As an organisation's goals and objectives are likely to be the same over a
period of at least five years, effectiveness only needs to be measured once during this period;
and
• Equity indicators measure how well services are being provided without unfair bias or
discrimination. In other words, equity indicators are used to measure how well an
organisation has attained comparable outputs among different groups; for example, between
those living in rural and urban areas. The best way to measure equity is by conducting
benchmark tests.
The National Treasury (2007:10) offers the following steps for developing performance indicators:
During this stage, managers need to determine what needs to be done to reach the desired outcome
and impacts.
Although the organisation may have developed a range of extensive indicators, it is more effective
to select and work with a few of the most important indicators, rather than attempt to measure every
aspect of service delivery and outputs (National Treasury, 2007:11). The National Treasury offers
the following advice for selecting the best indicators:
• The indicators should clearly communicate the strategic goals and objectives of the
organisation;
• The data used to determine indicators should be easily available; and
• Choose indicators according to their manageability: will the organisation be able to control
these indicators and monitor them closely enough?
After selecting appropriate performance indicators, determine what level of performance will ensure
the achievement of the organisation's outcomes. Performance targets are goals the organisation
sets that determine a specific set level of performance that the organisation, programme, project or
individual aims to achieve within a set time frame (National Treasury, 2007:9).
Setting performance targets involves determining the baseline and performance standards (National
Treasury, 2007:9):
• The baseline refers to the current level of performance, which must be improved; and
• Performance standards are the "minimum acceptable level of performance" required from
the individual, organisation or project.
Objective: “To expand access province-wide by the end of 2015 to an appropriate package of
treatment to all people in the province diagnosed with HIV or AIDS.”
Baseline: “Currently 45 000 people in the province have been diagnosed with HIV or AIDS.
Fewer than 4000 are receiving appropriate treatment.”
Performance standard: “To provide mobile HIV clinic services to all districts at least monthly.”
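The example above can be sketched as a small data structure. Treating the baseline as exactly 4 000 people (the source says "fewer than 4 000") purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class PerformanceTarget:
    objective: str
    baseline: int      # current level of performance
    target: int        # level to be achieved by the deadline
    deadline: str

    def coverage(self) -> float:
        """Baseline performance as a share of the target level."""
        return self.baseline / self.target

# Figures taken from the HIV/AIDS treatment example above
# (baseline simplified to exactly 4 000 for illustration).
hiv_treatment = PerformanceTarget(
    objective="Appropriate treatment for all people diagnosed with HIV or AIDS",
    baseline=4_000,    # people currently receiving treatment
    target=45_000,     # all diagnosed people in the province
    deadline="end of 2015",
)

gap = hiv_treatment.target - hiv_treatment.baseline
print(f"Baseline coverage: {hiv_treatment.coverage():.1%}")  # 8.9%
print(f"Gap to target: {gap} people")                        # 41000
```

Stating the objective, baseline and target side by side like this makes the size of the performance gap, and hence the realism of the target, immediately visible.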
Performance targets are set at the beginning of a strategic planning period. According to the National
Treasury (2007:9), they should be determined using SMART criteria:
• Specific: the form and required level of performance must be clearly identified;
• Measurable: there must be a way to measure the required performance standard;
• Achievable: the performance requirements must be realistic given the context;
• Relevant: the performance requirements must be linked to a specific goal; and
• Time-bound: there must be a limited period for the performance requirements to be met.
As the central aim of this scheme is to be able to monitor and evaluate performance information, it
is crucial that a process for reporting progress and results is integrated into the planning, budgeting
and implementation process.
The organisation needs to determine the best way of getting the information to the right people. This
will of course depend on the institution and its specific structure.
Monitoring and evaluation needs to be a cyclical process. Therefore, at every stage of the plan,
regular monitoring and evaluation has to be taking place in order to determine (National Treasury,
2007:9):
1. Using the criteria for performance indicators, and Molleman and Timmerman's (2003) argument, develop
indicators you could use to measure progress for the monitoring and evaluation project you have been working
on in previous activities.
Once a set of suitable indicators has been defined for an intervention, and a data collection method
chosen, stipulate the performance level that the institution and its employees will strive to achieve.
This includes stipulating performance targets that are relative to current baselines.
Baseline information offers a point of comparison. A baseline is the specific measurement of the
indicators within your monitoring system.
You start collecting baseline information during the first year of the intervention. After that, you can
use this information to measure the progress of the project or programme.
• Compare the indicators measuring the situation before the intervention started with the
situation at a specified period after it started;
• Compare changes in areas where the intervention has taken place with those in similar
locations, but where the intervention has not taken place; and
• Compare the difference between similar groups – one that has been exposed to the
intervention and a so-called control group that is not within intervention influence.
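The third comparison above (intervention group versus control group) amounts to a simple difference-in-differences calculation. The sketch below uses made-up indicator values to show the arithmetic:

```python
# Hypothetical indicator values (e.g. % of households with access to a
# service) measured at baseline and follow-up for two comparable districts.
intervention = {"baseline": 40.0, "followup": 62.0}  # received the programme
control = {"baseline": 42.0, "followup": 49.0}       # did not

change_intervention = intervention["followup"] - intervention["baseline"]  # 22.0
change_control = control["followup"] - control["baseline"]                 # 7.0

# The difference between the two changes estimates the effect attributable
# to the intervention, under the assumption that both groups would have
# changed alike without it.
estimated_effect = change_intervention - change_control                    # 15.0
print(f"Estimated effect: {estimated_effect:.1f} percentage points")
```

Subtracting the control group's change strips out the improvement that would have happened anyway, which is exactly why a comparable control group is worth identifying at baseline.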
Performance targets express a specific level of performance that the intervention is aiming to achieve
within a given time period. The first step in setting performance targets is to identify the baseline,
which may be the performance level recorded in the year prior to the planning period.
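The relationship between a baseline, a target and the latest measurement can be sketched as a short calculation. This is an illustrative example only, not a prescribed method; the indicator name and all figures below are hypothetical.

```python
def progress_to_target(baseline: float, target: float, current: float) -> float:
    """Return the share of the baseline-to-target gap closed so far
    (0.0 = still at baseline, 1.0 = target met)."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Hypothetical indicator: subsidised houses delivered per year.
baseline = 2_000   # measured in the first year of the intervention
target = 5_000     # level the institution aims to reach by the end of the cycle
current = 3_500    # latest measurement

print(f"{progress_to_target(baseline, target, current):.0%} of the gap closed")
# prints "50% of the gap closed"
```

The same calculation works whether the indicator should rise or fall, because the gap is measured relative to the direction of the target.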
Performance standards express the minimum acceptable level of performance, or the performance
level that is generally expected. These should be informed by legal requirements, departmental
policies and service-level agreements. They can also be benchmarked against performance levels
in other institutions, or according to accepted best practices.
Performance standards and performance targets must be specified before the beginning of a service
cycle, such as a strategic planning period or a financial year. This ensures that the institution and
its managers know what they are responsible for, and can therefore be held accountable at the end
of the cycle. While standards are normally timeless, targets must be set in relation to a specific
period. Targets for outcomes will tend to span multi-year periods, while targets for inputs,
activities and outputs should cover either quarterly or annual periods.
An organisation must use standards and targets during the course of the intervention, as part of its
internal management plans and individual performance management system.
When you develop indicators, there may be a temptation to set unrealistic performance targets.
Successful performance management requires targets that are realistic and achievable, but that still
challenge the institution and its staff. Targets should preferably be set with regard to previous and
existing levels of achievement (ie current baselines) and realistic forecasts of what is possible. Where
targets relate to service delivery standards, it is important to take account of current service standards
and what is generally regarded as acceptable.
Well-set targets should:
• Communicate what will be accomplished if the current policies and expenditure programmes
are continued;
• Allow performance to be compared at regular intervals – on a monthly, quarterly or annual
basis as required by departmental standards; and
• Make possible evaluations of the correctness of current policies and expenditure
programmes.
1. Using the project you have been working on in previous activities, develop baseline and performance targets for
your department or organisation.
2. Explain why you have chosen these specific baselines and targets for your project.
This step is fairly straightforward: you link your indicators to the relevant policy objective. This allows
a clear correlation between what you are trying to achieve and how you are going to achieve it. If
there are no indicators for your policy objectives, you proceed to Step 5.
Based on the guidelines highlighted in steps 3 and 4, you must design new indicators, targets and
baselines if these are not present for the specific policy that you are addressing.
To maintain the goals of government's outcomes-based approach, you need to ensure that each link
in the monitoring and evaluation chain is serving a purpose and can be related back to inputs,
outputs, outcomes and impacts.
In Figure 3, performance indicators are shown to be present at each point in the pyramid. This is
because performance indicators define inputs, activities, outputs, outcomes and impacts.
Measurable indicators therefore need to be defined for each element of the pyramid.
The best way to identify and evaluate the links between outcomes and your monitoring and
evaluation system is to use a theory of change or logic model.
According to the DPME (2011:20 in City of Johannesburg, 2012:18), a theory of change "describes
a process of planned change, from the assumptions that guide its design, the planned outputs and
outcomes to the long-term impacts it seeks to achieve." In other words, it allows you to identify the
causal links between the impact (long-term) and the outcomes (as measured by their outputs,
activities and inputs).
“A theory of change is therefore a reflection of the end goal or impact desired, and the outcomes,
outputs, activities and inputs viewed as necessary for this end goal to be achieved. A set of
assumptions underpins identification of each of the elements in the chain. Assumptions may arise
from experience, facts, insights, formal learning, research or other sources. Through ongoing
monitoring and evaluation activities, these assumptions may be surfaced, challenged and refined,
thereby allowing those using the monitoring and evaluation framework to apply insights from past
practice when identifying the most appropriate set of activities, outputs and outcomes through
which to drive the desired long-term goals.”
(City of Johannesburg, 2012:18)
The example below, drawn from the City of Johannesburg's guidelines for the implementation of the
government-wide monitoring and evaluation system, illustrates this.
As the example shows, an impact is identified. This impact is then broken down using outcomes to
specify how it will be achieved. In turn, these outcomes are broken down into different activities and
inputs that will be used to ensure the outcomes are met. The time frame on the left-hand side of the
figure illustrates the period in which each activity and output will be achieved.
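A results chain of this kind can be sketched as ordered levels, each linked to the next. The sketch below is a hypothetical illustration of the structure only, not the City of Johannesburg's actual model; all the element names are placeholders.

```python
# A minimal, hypothetical logic model (results chain) as a dictionary.
# Each level feeds the next: inputs -> activities -> outputs -> outcomes -> impact.
logic_model = {
    "impact": "Sustainable, integrated human settlements",
    "outcomes": ["Increased access to adequate housing"],
    "outputs": ["1 000 subsidised housing units completed"],
    "activities": ["Identify beneficiaries", "Construct housing units"],
    "inputs": ["Housing subsidy budget", "Project management staff"],
}

# Walk the chain from inputs up to impact, the direction in which a
# theory of change is read.
for level in ["inputs", "activities", "outputs", "outcomes", "impact"]:
    items = logic_model[level]
    items = items if isinstance(items, list) else [items]
    print(f"{level}: {'; '.join(items)}")
```

Representing the chain explicitly makes the assumptions visible: if an output cannot be linked to at least one outcome, that link in the chain is not serving a purpose.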
To get to the point where you can create a logic model, you need to follow the two steps discussed below:
A problem tree analysis is a problem-solving technique used to understand the roots of a problem. In
order to generate a plan for evaluating whether outputs and activities are contributing to outcomes, and
thereby ensuring the attainment of the correct inputs (the logic model), you need to understand which
activities and outputs are necessary.
This example shows a problem-tree analysis as applied for national budget execution. The primary and
secondary causes (roots of the problem) are shown in the context of their contribution to the problem (in
this case, weak national audit authority, the new emerging public service ethic, top-down management
systems and culture, and limited HR capacity). These are seen as creating the problems illustrated in the
"trunk" of the tree (limited budgeting and accounting capacity, for example). These ultimately result in the
negative effect highlighted in the top "branches" of the tree.
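The roots-trunk-branches structure of a problem tree can also be sketched as nested data, which makes the later step of linking each root cause to inputs and activities straightforward. The causes, problem and interventions below are hypothetical placeholders, loosely echoing the budget-execution example above.

```python
# Hypothetical problem tree: root causes feed a core problem,
# which in turn produces negative effects.
problem_tree = {
    "effects": ["Poor service delivery", "Low public trust"],          # branches
    "core_problem": "Limited budgeting and accounting capacity",       # trunk
    "causes": ["Limited HR capacity", "Top-down management culture"],  # roots
}

# Each root cause is then linked to inputs/activities intended to address it.
interventions = {
    "Limited HR capacity": ["Recruit specialists", "Train existing staff"],
    "Top-down management culture": ["Introduce participatory planning"],
}

for cause in problem_tree["causes"]:
    print(f"{cause} -> {', '.join(interventions[cause])}")
```

A quick consistency check is to confirm that every root cause has at least one planned intervention; any cause without one marks a gap in the logic model.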
Once you have completed the problem-tree analysis, you can link the "roots" to inputs, activities and
outputs that can be used to solve these problems. Here is an example from the City of Johannesburg:
As the figure shows, the city has developed inputs, activities and outputs that will solve the district's biggest
problems. These are then linked to the outcomes and impacts that will be achieved.
1. Using the problem-tree analysis and your own department's problem/s, create a diagram representing the roots
of the problem.
2. Link these roots to the inputs and activities that could be used to solve them.
3. Present your analysis to the class.
4. Discuss and critique the presentations.
A successful and relevant monitoring system feeds the results of the monitoring to all stakeholders
who require the information.
There are several ways to share information. The most common way is through written reports. As
mentioned before – each step might require a different reporting format to communicate the relevant
information. Usually the purpose of the report dictates the format. A summary of different reports is
provided in Table 15.
Problem analysis or need analysis reports: These reports examine and provide an analysis of a specific
problem or need identified by an organisation.
Project or programme plans: These reports outline the plan of the project or programme.
Feasibility reports: Feasibility reports are written after conducting a feasibility study, which researches
whether a specific project or programme would be successful or profitable. This is done before the
project or programme starts.
Proposals: Proposals are reports containing documents, official statements or letters written to
convince a reader that the project or programme presented should be awarded to the writer.
Progress reports: Progress reports reflect on the progress of a specific project or programme
implementation.
Evaluation reports: These reports summarise the evaluation information gathered and present the
results concerning the success or failure of a programme or project.
Impact assessment reports: These reports provide the impact results of a specific programme or
project.
Annual reports: Annual reports summarise the operational functions of a specific department. The
results are usually linked to the strategic goals of the department.
(BusinessDictionary, 2019)
The Framework for Managing Programme Performance Information (2007) was devised by National
Treasury and provides guidance to national, provincial and local government on managing
performance.
Performance information is useful only if it is consolidated and reported back into
planning, budgeting and implementation processes where it can be used for management decisions,
particularly for taking corrective action.
What this means is getting the right information in the right format to the right people at the right time.
Organisations must find out what information the various users of performance information require,
and must develop formats and systems to ensure that these needs are met.
In section three we discussed the different perspectives of evaluation. We also discussed the
possible outcomes for evaluating a government programme. In this section we will have a look at
implementing evaluation techniques.
Framing the evaluation requires the subject of the evaluation to be clearly identified and defined.
Framing questions linked to the evaluation perspectives will ensure this. An example is provided to
illustrate the point. The example provides evaluation questions linked to the evaluation perspectives
identified in the previous chapter of this course.
• What were the objectives of the programme and how well were they achieved?
• Did a secondary housing market develop so that beneficiaries could realise the
economic value of their asset?
• Did the housing programme create sustainable human settlements? (Included in this
concept are informal settlement upgrade, promoting densification and integration,
enhancing spatial planning, enhancing the location of new housing projects,
supporting urban renewal and inner city regeneration, developing social and economic
infrastructure, designing housing projects so that they support informal economic
activity and enhancing the housing product.)
Programme design
• What are the design features of the programme with regard to:
o The service delivery model (will government build houses, finance housing or
subsidise housing?)
o The financial contribution of government to each household
o The access mechanism: will people apply for houses (demand driven strategy) or
will government undertake housing projects where the need has been identified
by officials (supply driven strategy)?
o The size and quantity of houses
o The location of housing projects.
o Town planning patterns
o The types of units that are provided (family units or rental housing)
• The configuration of institutions through which the programme will be delivered (In the
case of housing the programme is delivered through government departments on
national, provincial and local level plus financial institutions, housing institutions
(landlords), consultants, developers and building contractors)
• How did these design features contribute to the success of the programme?
• How flexible was the programme design so that creative solutions were possible on
the project level?
• How well was the programme implemented?
Responsiveness to needs
Values targeting
• Who were the targeted beneficiaries and how well were they reached?
• What are the eligibility criteria, and does the way the programme is implemented exclude
the very poor?
• Was there any special dispensation for vulnerable groups like women, disabled people,
children and youth?
Scale of engagement
What is the scale of the programme and how does it compare to housing needs?
(PSC, 2008:51-53)
The evaluation questions determine the scope of the evaluation, the type of evaluation and the
methodologies used to evaluate. The evaluation questions will also prescribe the information sources
needed for the process.
Evaluation Questions
Use the programme or project you identified in the first activity of this section and design evaluation questions that will
cover the evaluation perspectives of monitoring and evaluation.
In order to implement the monitoring and evaluation system, staff need two essential skill sets:
• Generic monitoring and evaluation skills among line managers, as required by the
Framework for Managing Programme Performance Information; and
• Specialist monitoring and evaluation skills to ensure monitoring and evaluation strategy
implementation and to ensure quality.
(National Treasury, 2007:15)
Capacity building should ensure that:
• The users of the monitoring and evaluation data understand how to integrate information in
monitoring and evaluation functions and understand how to respond to the findings of the
initial situational analysis report; and
• Monitoring and evaluation managers will understand the monitoring and evaluation system,
be able to manage it, and to produce results related to indicators and needs of the community.
Monitoring and evaluation practitioners must be able to link the various components of the monitoring
and evaluation system to ensure projects and programmes succeed. The approach adopted by
practitioners should be evidence-based, and the data-gathering methodology must be scientific.
A capacity-building plan should be developed once the department's monitoring and evaluation
strategy has been reviewed and the skills needed to implement it have been identified. If these skills
are not available, the options are:
• Training and development – using the National School of Government to train managers;
• Recruitment and selection – finding the right employee with the right skills to complete the
monitoring and evaluation team;
• Talent management – mentoring and coaching individuals identified by the talent
management plan to become expert monitoring and evaluation practitioners;
• On-the-job coaching – existing monitoring and evaluation experts in the department can
transfer skills to other employees; and
• Participation in knowledge transfer – experts and laypeople could take part in conferences
and workshops addressing the skills needed for monitoring and evaluation system
implementation.
(National Treasury, 2007:16)
The results of the entire monitoring and evaluation process must be communicated to all relevant
stakeholders (employees, relevant departments, the public, etc.).
The DPME offers guidelines for communicating evaluation results. These are as follows:
1. Determine what the monitoring and evaluation findings mean: what do they say about the
project/programme/department/organisation? Using the answer to this question, figure out
the best way to convey these findings to the stakeholders: written, verbal, electronic means
(website, social media, etc.)?
2. Produce three summaries:
• Summarise the key findings of the monitoring and evaluation process in plain language
in a one-page summary, containing the key messages you want to convey;
• Write a three-page executive summary of the findings; and
• Write a 25-page summary report.
3. The DPME's communications plan template must be completed.
4. Explain how stakeholders can access and use information.
An evaluation report provides readers with relevant information pertaining to the evaluation and its
results. Generically, an evaluation report includes the components discussed in Table 16.
Executive summary: A short summary of the process followed, the results of the evaluation, the
objectives achieved, lessons learnt, questions answered, and needs fulfilled is presented first.
Introduction: The background of the evaluation, the purpose of the evaluation and the major activities
of the project are presented in the introduction to the report.
Evaluation methods and tools: A short explanation of the tools and methods used to gather
information and conduct the evaluation is provided. Examples of the tools and plans could be
presented as appendices linked to this section of the report.
Summary of results: Here, a data analysis is provided.
Interpretation of results: Results are interpreted and presented. The results must be linked to the
outcomes of the system or the goals and objectives of the plan.
Conclusion: Present how the project objectives were met and whether the purpose of the evaluation
was achieved.
Recommendations: Key points must be summarised in this section and suggestions for improvements
should be made.
(Zarinpoush, 2006:51)
1. What method do you think would be best to communicate the results of your evaluation to the relevant
stakeholders?
2. Explain why this method would be the most suitable given the target audience.
3. Identify and explain the importance of communicating evaluation results.
According to MacKay (2007:22) the success of a monitoring and evaluation system relies on the
information the system provides. The information should be of such calibre that it can be used to:
• Support government policy making – this includes performance budgeting and national
planning;
• Support policy development and programme planning;
• Support programme and project management; and
• Support accountability.
(MacKay, 2007:23)
Information provided by the system should be of such high quality that it becomes institutionalised
and sustainable.
The success of the system relies on the critical success factors identified within each programme or
project. These factors will be linked to the strategy of the department and remain the focus of all
activities involved in the programme or project.
• Lahey, R. (2010). The Canadian monitoring and evaluation (M&E) system: lessons
learned from 30 years of development. ECD Working Paper Series.
http://documents.worldbank.org/curated/en/865531468226748462/pdf/654070NWP023
0c0C0disclosed011040110.pdf (accessed 18 June 2020).
Successfully implementing a monitoring and evaluation system will ultimately help ensure that the 12
outcomes outlined by Cabinet are met. If these objectives are met, all South Africans will benefit
from the dedication and hard work of monitoring and evaluation practitioners.
1. Draw up a model that shows how you could apply the success factors of monitoring and evaluation to the
government-wide monitoring and evaluation system.
2. Within this model, show where your department or organisation would play a role in the success of monitoring
and evaluation.
3. Present your model to the class.
4. Give critical feedback on fellow students' models.
5. After the presentations are complete, use the knowledge you have gained from this course to explain the
importance of monitoring and evaluation from a public-sector perspective.
• The implementation of monitoring and evaluation is not necessarily linear – evaluation can
take place at any stage in the life cycle of a project or programme;
• The results of evaluation must be communicated to stakeholders;
• The DPME offers a 10-step process for monitoring and evaluation as follows:
o Step 1: Examine the context and current state of the problem
o Step 2: Administrative information systems and data sets are used to collect data about
the problem and the programme intended to rectify it
o Step 3: List indicators, targets and baselines, so that you know what you are trying to
achieve, and whether you have achieved it
o Step 4: Group indicators by policy objective
o Step 5: If policy objectives have no indicators you must design them
o Step 6: Review the links between inputs, outputs, outcomes and impacts, and identify causal
relationships: each link in the M&E chain must serve a purpose, and must relate to the inputs,
outputs, outcomes and impacts of the policy or programme
o Step 7: Reporting approach – decide how best to report back to your various
stakeholders
o Step 8: Evaluate approach – here you evaluate the success of the programme or policy
o Step 9: Capacity building plan – make sure you have the required skills to conduct the
M&E process
o Step 10: Communication plan – communicate your results to all who need to know them
• We briefly consider the success factors of M&E: the key is usable information.
These online assessments will help you strengthen and embed your understanding of the course. You will not be able to
change your answers once you have submitted them, so make sure you have completed the
relevant section of coursework first. Where you see Select all that are relevant, be aware that
any number of the options presented could be correct. You will lose marks for incorrect
selections, so choose carefully. Your combined marks from these assessments count towards a
total of 20% of your course mark.
Adato, M. (2011). Combining quantitative and qualitative methods for program monitoring and
evaluation: why are mixed-method designs best?
http://documents.worldbank.org/curated/en/633721468349812987/pdf/643860BRI0Mixe00Box036
1535B0PUBLIC0.pdf (accessed 18 June 2020).
Centre for Learning on Evaluation and Results (Clear), 2012, African Monitoring and Evaluation
Systems: Exploratory Case Studies, Johannesburg: Graduate School of Public and Development
Management, University of the Witwatersrand.
City of Johannesburg, 2012, Annexure 3: The City of Johannesburg's Monitoring and Evaluation
Framework,
https://www.joburg.org.za/documents_/Documents/Intergrated%20Development%20Plan/2013-
16%20IDP%2017may2013%20final.pdf (accessed 18 March 2019).
Department of Public Service and Administration, 2018, ‘The Batho Pele Vision',
http://www.dpsa.gov.za/documents/Abridged%20BP%20programme%20July2014.pdf (accessed
18 March 2019).
Department of Public Service and Administration. (2012). Public Service Act, 1994.
http://www.dpsa.gov.za/dpsa2g/documents/acts&regulations/psact1994/PublicServiceAct.pdf
(accessed 18 June 2020).
Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S. (2012).
Establishing a national M&E system in South Africa. The World Bank Special Series on The Nuts &
Bolts of Monitoring and Evaluation Systems, 21, 1-11,
http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts00Box374
357B00PUBLIC0.pdf (accessed 18 June 2020).
Ishmail, Z. 2012, Building a results-based monitoring and evaluation system for the Western Cape
government of South Africa, In: PSC News, February/March 2012.
http://www.psc.gov.za/newsletters/docs/2012/K-
9555%20PSC_6th%20edition%20magazine_DevV11.pdf (accessed: 18 March 2019).
Kaplan, R.S. and Norton, D.P. 1996, The Balanced Scorecard, Harvard Business School Press,
Boston, Massachusetts.
Lahey, R. 2006, A Framework for Developing an Effective Monitoring and Evaluation System in the
Public Sector – Key Considerations from International Experience, Solutions, Canada.
http://www.ecdg.net/wp-content/uploads/2011/12/Framework-for-developing-an-effective-ME-
system-in-the-public-sector-2009_Lahey_good.doc (accessed: 18 March 2019).
MacKay, K. 2007, ‘How to build M&E systems to support better government’, Washington: The
World Bank.
McCoy, D. and Bamford, D. 1998, How to Conduct a Rapid Situation Analysis: A Guide for Health
Districts in South Africa, Durban: Health Systems Trust, www.hst.org.za/uploads/files/rapid.pdf
(accessed 14 March 2014).
Molepo, A.N. 2011, Monitoring & Evaluation Framework for the Public Service, M&E Learning
Network, SA Reserve Bank Conference Centre, 15 February 2011.
Molleman, E. and Timmerman, H. (2003). Performance management when innovation and learning
become critical performance indicators. Personnel Review, 32(1), 93-113.
https://www.researchgate.net/profile/Eric_Molleman/publication/235285519_Performance_manage
ment_when_innovation_and_learning_become_critical_performance_indicators/links/5948df07458
515db1fd8df78/Performance-management-when-innovation-and-learning-become-critical-
performance-indicators.pdf (accessed 18 June 2020).
Nkwinti, G. (nd). National Development Plan and the New Growth Path: Transforming the
Economy. http://kzntopbusiness.co.za/site/search/downloadencode/nLaqaaKelpO8mnjc (accessed
18 June 2020).
National Treasury. (2005). Treasury regulations for departments, trading entities, constitutional
institutions and public entities.
http://www.treasury.gov.za/legislation/pfma/regulations/gazette_27388%20showing%20amendmen
ts.pdf (accessed 18 June 2020).
PMG, 2014, National School of Government & Public Service Commission mandate and
challenges, https://pmg.org.za/committee-meeting/17515/ (accessed 18 March 2019).
Presidency, 2007, Policy Framework for the Government-wide Monitoring and Evaluation System.
https://www.dpme.gov.za/publications/Guides%20Manuals%20and%20Templates/Functions%20of
%20an%20M%20and%20E%20component%20in%20National%20Government%20Departments.p
df (accessed 18 March 2019).
Presidency, nd, Monitoring and evaluation: Capacity-building within the public sector,
http://www.thepresidency.gov.za/learning/curriculum.pdf (accessed 18 March 2019).
PSC, 2013, Annual Report to Citizens for the 2012/2013 Financial Year,
http://www.psc.gov.za/documents/2013/ARC%20English.pdf (accessed 18 March 2019).
PSC, 2018, Annual Report to Citizens for the 2017/2018 Financial Year,
http://www.psc.gov.za/documents/reports/2018/FINAL_PUBLIC_SERVICE_COMMISSION_Annual
_Report_2017_2018_%2021_SEPT_2018.pdf (accessed 18 March 2019).
Public Service Commission. (2012). Evolution of monitoring and evaluation in the South African
public service. http://www.psc.gov.za/newsletters/docs/2012/K-
9555%20PSC_6th%20edition%20magazine_DevV11.pdf (accessed 18 June 2020).
World Bank, 2004, Influential Evaluations: Evaluations that Improved Performance and Impacts of
Development Programs, Operations Evaluation Department Knowledge Programs and Evaluation
Capacity Development Group (OEDKE): Washington
Zall Kusek, J. and Görgens-Albino, M. 2009, Making Monitoring and Evaluation Systems Work: A
Capacity Development Toolkit, Washington: The World Bank.
Zarinpoush, F. 2006, ‘Project evaluation guide for nonprofit organisations: fundamental methods
and steps for conducting project evaluation.’
http://sectorsource.ca/sites/default/files/resources/files/projectguide_final.pdf (accessed 18 March
2019).
The lists below will help you to understand important terminology used throughout this course.
Use them as a point of reference as you work through the material and feel free to add your own
terms and abbreviations to the list.
Term: Explanation:
Accountability The obligation of government to account for its activities, accept responsibility for them, and
to disclose the results in a transparent manner.
Activity Actions taken or work performed through which inputs, such as funds, technical assistance
and other types of resources, are mobilised to produce specific outputs.
Batho Pele Meaning “people first” in English, Batho Pele consists of eight principles intended to
encourage and promote an efficient and effective public service.
Data Specific quantitative and qualitative information or facts that are collected and analysed.
Department of Performance Monitoring and Evaluation (DPME) Established in 2010, the DPME is
responsible for continuous improvement in service delivery through monitoring and evaluation.
Effectiveness The extent to which a programme/intervention has achieved its objectives.
Efficacy The extent to which an intervention produces the expected results under ideal conditions in a
controlled environment.
Efficiency A measure of how economically inputs are converted into results.
Feedback Process in which the effect or output of an activity or input is returned to change the next
action.
Goal A broad statement of a desired outcome for a programme.
Government-wide monitoring and evaluation system A system developed by the Presidency that
describes monitoring and evaluation in government.
Impact The long-term effect of programmes or interventions.
Inputs The financial, human and material resources used in a programme or intervention.
Intervention A specific activity intended to bring about change in some aspect of an
organisation/department.
Logical framework Management tool used to improve the design of interventions.
Monitoring and evaluation plan A multi-year implementation strategy for the collection, analysis and
use of data for a specific programme.
Abbreviation Explanation
DoRA Division of Revenue Act
DPME Department of Performance Monitoring and Evaluation
GWMES Government-wide monitoring and evaluation system
M&E Monitoring and Evaluation
PSC Public Service Commission
PSA Public Service Act
PFMA Public Finance Management Act
MFMA Municipal Finance Management Act