
POSTGRADUATE

DIPLOMA IN PUBLIC
MANAGEMENT

Monitoring and
Evaluation

Contact details:
Regenesys School of Public Management
Tel: +27 (11) 669-5000
Fax: +27 (11) 669-5001
E-mail: info@regenesys.co.za
www.regenesys.co.za
This study guide highlights key focus areas for you as a student. Because the field of study in question is so
vast, it is critical that you consult additional literature.

Copyright © Regenesys, 2020

All rights reserved. No part of this publication may be reproduced, stored in or introduced into a retrieval
system, or transmitted, in any form, or by any means (electronic, mechanical, photocopying, recording or
otherwise) without written permission of the publisher. Any person who does any unauthorised act in relation
to this publication may be liable for criminal prosecution and civil claims for damages.
CONTENTS
1. WELCOME TO REGENESYS .............................................................................................................. 1
2. TEACHING AND LEARNING METHODOLOGY .................................................................................. 2
2.1 PRINCIPLES FOR RESPONSIBLE MANAGEMENT EDUCATION ............................................. 2
2.2 REGENESYS’ INTEGRATED LEADERSHIP AND MANAGEMENT MODEL .............................. 3
2.3 DEVELOPING REGENESYS GRADUATE ATTRIBUTES ........................................................... 5
3. KEY TO ICONS..................................................................................................................................... 7
4. STUDY MATERIAL ............................................................................................................................... 8
5. PRESCRIBED AND RECOMMENDED RESOURCES ........................................................................ 8
5.1 ARTICLES..................................................................................................................................... 8
5.2 LEGISLATION............................................................................................................................... 9
5.3 ACCESSING JOURNAL ARTICLES AND OTHER ONLINE LINKS ........................................... 11
5.4 ADDITIONAL SOURCES TO CONSULT.................................................................................... 12
6. GROUND RULES AND EXPECTATIONS .......................................................................................... 13
6.1 EXPECTATIONS ........................................................................................................................ 13
6.2 GROUND RULES ....................................................................................................................... 14
7. INTRODUCTION................................................................................................................................. 15
7.1 LEARNING OUTCOMES ............................................................................................................ 15
7.2 AN INTRODUCTION TO MONITORING AND EVALUATION IN THE PUBLIC SECTOR ......... 16
7.2.1 DEFINING MONITORING AND EVALUATION ................................................................ 16
7.2.2 CONCEPTS AND TERMINOLOGY .................................................................................. 17
7.2.3 THE PURPOSE OF MONITORING AND EVALUATION ................................................. 21
7.2.4 MONITORING AND EVALUATION MODELS AND TECHNIQUES ................................. 22
7.2.5 M&E STAKEHOLDERS IN THE GOVERNMENT SPHERE............................................. 25
7.2.6 CONCLUSION .................................................................................................................. 33
7.2.7 KEY POINTS .................................................................................................................... 34
7.3 THE LEGAL CONTEXT FOR MONITORING AND EVALUATION ............................................. 36
7.3.1 INTRODUCTION TO LEGISLATION ................................................................................ 37
7.3.2 CONSTITUTION OF THE REPUBLIC OF SOUTH AFRICA, 1996 .................................. 37
7.3.3 THE PUBLIC SERVICE ACT, 1994 AS AMENDED ......................................................... 39
7.3.4 THE PUBLIC FINANCE MANAGEMENT ACT (PFMA), AS AMENDED .......................... 39
7.3.5 THE MUNICIPAL FINANCE MANAGEMENT ACT .......................................................... 41
7.3.6 TREASURY REGULATIONS............................................................................................ 43
7.3.7 GOVERNMENT’S REVENUE AND EXPENDITURE STRATEGY ................................... 46
7.3.8 DIVISION OF REVENUE ACT (DORA) ............................................................................ 48
7.3.9 MONITORING AND EVALUATION AND POLICY MANAGEMENT................................. 50
7.3.10 CONCLUSION .................................................................................................................. 51
7.3.11 KEY POINTS .................................................................................................................... 52
7.4 THE MONITORING AND EVALUATION PROCESS.................................................................. 53
7.4.1 PLACEMENT IN THE GOVERNMENT’S PLANNING CYCLE ......................................... 53
7.4.2 INTERGOVERNMENTAL RELATIONS AND THE LOCAL GOVERNMENT FISCAL
FRAMEWORK .................................................................................................................. 55
7.4.3 MONITORING AND EVALUATION SYSTEM .................................................................. 57
7.4.4 EVALUATION PERSPECTIVES....................................................................................... 60
7.4.5 NATIONAL EVALUATION PLAN 2013/2014 .................................................................... 66
7.4.6 M&E AS PART OF OTHER MANAGEMENT FUNCTIONS ............................................. 75
7.4.7 CONCLUSION .................................................................................................................. 76
7.4.8 KEY POINTS .................................................................................................................... 77
7.5 IMPLEMENT A MONITORING AND EVALUATION SYSTEM ................................................... 78
7.5.1 INTRODUCTION .............................................................................................................. 79
7.5.2 STEP 1: EXAMINE THE CONTEXT AND CURRENT STATE ......................................... 81
7.5.3 STEP 2: ADMINISTRATIVE INFORMATION SYSTEMS AND DATA SETS ................... 83
7.5.4 STEP 3: LIST INDICATORS, TARGETS AND BASELINES ............................................ 86
7.5.5 STEP 4: GROUP INDICATORS BY POLICY OBJECTIVE .............................................. 91
7.5.6 STEP 5: IF POLICY OBJECTIVES HAVE NO INDICATORS .......................................... 91
7.5.7 STEP 6: REVIEW LINK BETWEEN INPUTS, OUTPUTS, OUTCOMES AND IMPACTS,
AND IDENTIFY CAUSAL RELATIONSHIPS AND LINKS ................................................ 91
7.5.8 STEP 7: REPORTING APPROACH ................................................................................. 96
7.5.9 STEP 8: EVALUATE APPROACH.................................................................................... 97
7.5.10 STEP 9: CAPACITY BUILDING PLAN ............................................................................. 99
7.5.11 STEP 10: COMMUNICATION PLAN ................................................................................ 99
7.5.12 MONITORING AND EVALUATION SUCCESS FACTORS............................................ 101
7.5.13 CONCLUSION ................................................................................................................ 102
7.5.14 KEY POINTS .................................................................................................................. 102
8. REFERENCES.................................................................................................................................. 104
9. GLOSSARY OF KEY TERMS AND ABBREVIATIONS ........................................................................ 109
9.1 KEY TERMINOLOGY ............................................................................................................... 109
9.2 ABBREVIATIONS ..................................................................................................................... 111
10. VERSION CONTROL ....................................................................................................................... 112
List of Tables
TABLE 1: MONITORING AND EVALUATION TERMINOLOGY................................................................ 17
TABLE 2: DIFFERENCES BETWEEN MONITORING AND EVALUATION .............................................. 19
TABLE 3: DIFFERENT TYPES OF RESULTS-BASED MANAGEMENT .................................................. 20
TABLE 4: PURPOSE OF MONITORING AND EVALUATION................................................................... 21
TABLE 5: COMPONENTS OF THE LOGIC MODEL ................................................................................. 23
TABLE 6: BALANCED SCORECARD PERSPECTIVES ........................................................................... 23
TABLE 7: DIVISION OF M&E RESPONSIBILITIES BETWEEN THE PSC AND THE DPME ................... 28
TABLE 8: EXAMPLE OF CONSTITUTIONAL VALUES AND MONITORING AND EVALUATION
INDICATORS ......................................................................................................................... 38
TABLE 9: SAMPLE TREASURY REGULATIONS ..................................................................................... 44
TABLE 10: LEGISLATION THAT ORGANISES INTERGOVERNMENTAL RELATIONS ......................... 55
TABLE 11: STEPS IN THE MONITORING AND EVALUATION PROCESS ............................................. 58
TABLE 12: SITUATION ANALYSIS ........................................................................................................... 82
TABLE 13: PRIMARY DATA COLLECTION METHODS ........................................................................... 84
TABLE 14: PERFORMANCE INDICATORS .............................................................................................. 86
TABLE 15: DIFFERENT REPORTS .......................................................................................................... 96
TABLE 16: COMPONENTS OF AN EVALUATION REPORT ................................................................. 100

List of Figures
FIGURE 1: LOGIC MODEL ........................................................................................................................ 22
FIGURE 2: GOVERNMENT'S OUTCOMES-BASED APPROACH............................................................ 24
FIGURE 3: KEY STAKEHOLDERS IN TERMS OF DPSA MONITORING AND EVALUATION................ 26
FIGURE 4: POLICY LIFE CYCLE .............................................................................................................. 50
FIGURE 5: MONITORING AND EVALUATION IN THE PLANNING CYCLE ............................................ 54
FIGURE 6: THE MONITORING AND EVALUATION PROCESS .............................................................. 57
FIGURE 7: ACHIEVING OUTCOMES USING THE GOVERNMENT-WIDE MONITORING AND
EVALUATION SYSTEM ........................................................................................................ 80
FIGURE 8: THE PLANNING CYCLE ......................................................................................................... 81
FIGURE 9: EXAMPLE OF A THEORY OF CHANGE MODEL FOR THE CITY OF JOHANNESBURG ... 93
FIGURE 10: EXAMPLE OF A PROBLEM TREE ANALYSIS..................................................................... 94
FIGURE 11: EXAMPLE OF APPLICATION OF THE OUTCOMES APPROACH TO MONITORING AND
EVALUATION ........................................................................................................................ 95
1. WELCOME TO REGENESYS

“Have a vision. Think big. Dream, persevere, and your vision will become a reality.
Awaken your potential, knowing that everything you need is within you.”
Dr. Marko Saravanja

At Regenesys we help individuals and organisations achieve their personal and organisational goals
by enhancing their management and leadership potential. Our learning programmes are designed
to transform and inspire your mind, heart and soul, helping you to develop the knowledge, skills,
positive values, attitudes and behaviours required for success.

Having educated more than 100 000 students based in highly reputable local and international
corporations across more than 160 countries since the inception of Regenesys in 1998, we are now
one of the fastest-growing institutions of management and leadership development in the world. Our
ISO 9001:2008 certification bears testimony to our quality management systems meeting
international standards. We are also accredited with the Council on Higher Education.

At Regenesys you will be taught by business experts, entrepreneurs and academics who are inspired
by their passion for human development. You will be at a place where business and government
leaders meet, network, share their experience and develop business relationships.

We encourage holistic leadership development by fostering multiple intelligences at an intellectual,
physical, emotional, and spiritual level. We promote the development of rational intelligence (IQ) by
honing your critical and analytical abilities so that you become a better problem-solver and innovative
thinker. We will foster your spiritual intelligence (SQ) as a purpose- and value-driven individual who
can rise above adversity, take difficult decisions and make a difference. We will help you develop
your emotional intelligence (EQ) so that you can significantly improve your relationships and resolve
conflict effectively. You will have the opportunity to hone your financial acumen through the personal
finance education available on campus. And, because studying often adds pressure to an already
stressful life, we will also help you develop physical intelligence (PQ) by learning how to manage
stress and lead a healthier lifestyle.

We will help you awaken your potential and realise that everything you need to succeed is within
you. And we will be with you every step of the way.

Areas of Expertise

© Regenesys School of Public Management 1


2. TEACHING AND LEARNING METHODOLOGY

Regenesys uses an interactive teaching and learning methodology that encourages self-reflection
and promotes independent and critical thinking. Key to our approach is an understanding of adult
learning principles, which recognise the maturity and experience of participants, and the way that
adult students need to learn.

At the core of this is the integration of new knowledge and skills into existing knowledge structures,
as well as the importance of seeing the relevance of all learning via immediate application in the
workplace. Practical exercises are used to create a simulated management experience to ensure
that the conceptual knowledge and practical skills acquired can be directly applied within the work
environment of the participants. The activities may include scenarios, case studies, self-reflection,
problem solving and planning tasks.

Our courses are developed to cover all essential aspects of the training comprehensively in a user-
friendly and interactive format. Our subject matter experts have extensive experience in
management education, training and development.

2.1 PRINCIPLES FOR RESPONSIBLE MANAGEMENT EDUCATION

Regenesys upholds the UN Global Compact’s Principles for Responsible Management Education:

• Purpose: We will develop the capabilities of students to be future generators of sustainable
value for business and society at large and to work for an inclusive and sustainable global
economy.
• Values: We will incorporate into our academic activities and curricula the values of global
social responsibility as portrayed in international initiatives such as the United Nations Global
Compact.
• Method: We will create educational frameworks, materials, processes and environments that
enable effective learning experiences for responsible leadership.
• Research: We will engage in conceptual and empirical research that advances our
understanding about the role, dynamics, and impact of corporations in the creation of
sustainable social, environmental and economic value.
• Partnership: We will interact with managers of business corporations to extend our
knowledge of their challenges in meeting social and environmental responsibilities and to
explore jointly effective approaches to meeting these challenges.
• Dialogue: We will facilitate and support dialogue and debate among educators, students,
business, government, consumers, media, civil society organisations and other interested
groups and stakeholders on critical issues related to global social responsibility and
sustainability.

(PRME, 2014:1)



2.2 REGENESYS’ INTEGRATED LEADERSHIP AND MANAGEMENT
MODEL

This course will draw on a model developed by Regenesys Management, demonstrating how the
external environment, the levels of an organisation, the team and the components of an individual
are interrelated in a dynamic and systemic way. The success of an individual depends on his or her
self-awareness, knowledge, and ability to manage these interdependent forces, stakeholders and
processes.

The degree of synergy and alignment between the goals and objectives of the organisation, the team
and the individual determines the success or failure of an organisation. It is, therefore, imperative
that each organisation ensures that team and individual goals and objectives are aligned with the
organisation’s strategies (vision, mission, goals and objectives, etc); structure (organogram,
decision-making structure, etc); systems (HR, finance, communication, administration, information,
etc); culture (values, level of openness, democracy, caring, etc). An effective work environment
should be characterised by the alignment of organisational systems, strategies, structures and
culture, and by people who operate synergistically.

Regenesys’ Integrated Leadership and Management Model



As you start to apply the concepts you learn from this course in your work, think about the effect of
your action on all stakeholders – the people your organisation serves, your colleagues and the public
at large. What are the ethical implications of what you intend to do? How will you ensure that your
strategies and activities are sustainable in the triple-bottom-line sense, considering people, planet
and profit requirements?

ETHICAL CONSIDERATIONS … HOW WILL YOUR ACTIONS AFFECT:

[Diagram: the triple-bottom-line model, with Purpose at its centre. People (quality of life) and
Planet (stewardship) overlap as "bearable"; People and Profit/Prosperity (value creation) overlap as
"equitable"; Planet and Profit/Prosperity overlap as "viable"; the intersection of all three is
"sustainable".]

The Ten Principles

The UN Global Compact, in its 10 principles, asks organisations to:

• Support and respect the protection of internationally proclaimed human rights;
• Ensure they are not complicit in human rights abuses;
• Uphold workers’ freedom of association and the effective recognition of the right to collective
bargaining;
• Eliminate all forms of forced and compulsory labour;
• Abolish child labour;
• Eliminate discrimination in respect of employment and occupation;
• Support a precautionary approach to environmental challenges;
• Undertake initiatives to promote greater environmental responsibility;
• Encourage the development and diffusion of environmentally friendly technology; and
• Work against corruption in all its forms, including extortion and bribery.

(United Nations, nd)

As a public manager you have the capacity to bring about real change. As much as organisations
are shaped by their environment, their actions influence the environment. You can contribute to
sustainable change by managing responsibly.



2.3 DEVELOPING REGENESYS GRADUATE ATTRIBUTES

Getting a qualification is not enough, on its own, to prepare you to traverse the rapidly changing
world of work, where Industry 4.0 and 5.0 are rendering many professions obsolete. We will work
with you throughout your studies to help you develop these critical attributes to navigate the new
world order, along with the skills and knowledge you need to excel in any environment.

REGENESYS GRADUATE ATTRIBUTES

Think differently

To think differently, you must be intellectually curious, analytical, open-minded though constructively
critical, with the mental agility to think across disciplines, contexts and domains to solve complex
problems and find innovative ways to do things. Be imaginative but rational. We will systematically
help you cultivate higher-order thinking – the kind of thinking that recognises and makes sense of
patterns others miss, and that facilitates unique linkages and solutions.

Ground decisions in evidence

Both well-informed and knowledgeable, you must be committed to sound research, taking a
multidisciplinary and metacognitive approach to problem-solving, and able to recognise and put
aside personal bias, basing decisions on evidence. This will prepare you to take calculated risks.



Lead consciously

This ties back to the overarching P in the quadruple bottom line: purpose. Purpose-driven, you put
sustainability at the heart of your organisation. Emotionally and spiritually intelligent, you should be
self-aware, understand the interconnectedness of all things, and act ethically and with integrity. As
an ideal graduate, you will be a service-oriented agent of change.

Harness diversity

You will appreciate the value of individual differences. Socially intelligent, collaborative and a skilled
communicator, you should be able to facilitate connections to build, empower and manage high-
functioning teams with diverse skills and personalities, and support them in assuming
responsibilities.

Professional comportment

With a confident and inspiring aura, you are utterly professional, yet accessible: deliberate,
determined, disciplined, and focused. You will model your values, and hold yourself accountable.
You will have the resilience and grit to keep going in the face of adversity.

Have a glocal outlook

Your glocal outlook underpins your ability to operate and compete ethically and profitably as a
responsible global citizen in a borderless world. Your multicultural awareness and wide-ranging
interest in current affairs enable you to recognise and respond to local cultures and needs without
losing sight of the global picture.

Are you ready to start work on what it takes to be a Regenesys graduate?

The next few sections contain practical information that will help you do just that.



3. KEY TO ICONS

• Workbook
• Rate your skills
• Prescribed resources
• References
• Interesting reading
• Articles
• Case study
• Discussion forum
• Example
• Calculations
• Quote
• Definitions and glossary
• Note important information
• In a nutshell – important summary
• Self-reflection activity to complete
• Task to complete/formative assessment
• Course evaluation
• Digital assessment
• Assignment
• Group assignment
• Workplace assessment
• Exam
• Web link
• Video clip
• Audio
• Presentation
• Livestream/webinar
• Folder/portfolio of evidence
• Choice
• Appendix
• Downloadable item
• Upload here



4. STUDY MATERIAL

Your material includes:

• This study guide
• Prescribed reading and multimedia
• Digital assessments at the end of each section of your course
• Individual assignment

These resources provide a starting point for your studies. You are
expected to make good use of your textbooks, the additional
resources provided via online links, and wider reading that you, as a
higher education student, will source yourself.

5. PRESCRIBED AND RECOMMENDED RESOURCES

Various resources are prescribed to help you complete this course.

5.1 ARTICLES

• Adato, M. (2011). Combining quantitative and qualitative methods for program monitoring and evaluation: why
are mixed-method designs best?
http://documents.worldbank.org/curated/en/633721468349812987/pdf/643860BRI0Mixe00Box0361535B0PUBL
IC0.pdf (accessed 18 June 2020).

• Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S. (2012). Establishing a
national M&E system in South Africa. The World Bank Special Series on The Nuts & Bolts of Monitoring and
Evaluation Systems, 21, 1-11,
http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts00Box374357B00PUBL
IC0.pdf (accessed 18 June 2020).

• Kaplan, R.S. and Norton, D.P. (1992). The Balanced Scorecard – Measures that Drive Performance. Harvard
Business Review. https://hbr.org/1992/01/the-balanced-scorecard-measures-that-drive-performance-2
(accessed 18 June 2020).

• Lahey, R. (2010). The Canadian monitoring and evaluation (M&E) system: lessons learned from 30 years of
development. ECD Working Paper Series.
http://documents.worldbank.org/curated/en/865531468226748462/pdf/654070NWP0230c0C0disclosed011040
110.pdf (accessed 18 June 2020).



• Molleman, E. and Timmerman, H. (2003). Performance management when innovation and learning become
critical performance indicators. Personnel Review, 32(1), 93-113.
https://www.researchgate.net/profile/Eric_Molleman/publication/235285519_Performance_management_when_
innovation_and_learning_become_critical_performance_indicators/links/5948df07458515db1fd8df78/Performan
ce-management-when-innovation-and-learning-become-critical-performance-indicators.pdf (accessed 18 June
2020).

• Nkwinti, G. (nd). National Development Plan and the New Growth Path: Transforming the Economy.
http://kzntopbusiness.co.za/site/search/downloadencode/nLaqaaKelpO8mnjc (accessed 18 June 2020).

5.2 LEGISLATION

• Department of Economic Development. (2011). The New Growth Path: Framework.
http://www.economic.gov.za/communications/publications/new-growth-path-series/download (accessed 18
June 2020).

• Department of Public Service and Administration. (2012). Public Service Act, 1994.
http://www.dpsa.gov.za/dpsa2g/documents/acts&regulations/psact1994/PublicServiceAct.pdf (accessed 18
June 2020).

• National Treasury. (2014). Public Finance Management Act No. 1 of 1999.
http://www.treasury.gov.za/legislation/pfma/act.pdf (accessed 18 June 2020).

• National Treasury. (2003). Local Government: Municipal Finance Management Act No. 56 of 2003,
http://mfma.treasury.gov.za/MFMA/Legislation/Local%20Government%20-
%20Municipal%20Finance%20Management%20Act/Municipal%20Finance%20Management%20Act%20(No.%
2056%20of%202003).pdf (accessed 18 June 2020).

• National Treasury. (2005). Treasury regulations for departments, trading entities, constitutional institutions and
public entities.
http://www.treasury.gov.za/legislation/pfma/regulations/gazette_27388%20showing%20amendments.pdf
(accessed 18 June 2020).

• National Treasury. (2013). National Treasury Strategic Plan 2013/2017.
http://www.treasury.gov.za/publications/strategic%20plan/Strat%20Plan%202013-2017.pdf (accessed 18 June
2020).

• National Treasury. (2014). Division of Revenue Act, 2014.
http://www.treasury.gov.za/legislation/acts/2014/Division%20of%20Revenue%20Act,%202014%20(Act%20No.
%2010%20of%202014).pdf (accessed 18 June 2020).

• National Treasury. (2019). Division of Revenue Bill, 2019.
https://www.gov.za/sites/default/files/gcis_document/201902/b5-2019division-revenue-bill_0.pdf (accessed 18
June 2020).



• National Treasury. (2019). Economic transformation, inclusive growth, and competitiveness: Towards an
Economic Strategy for South Africa.
http://www.treasury.gov.za/comm_media/press/2019/Towards%20an%20Economic%20Strategy%20for%20SA.
pdf (accessed 18 June 2020).

• Presidency. (2010). Guide to the Outcomes Approach.
http://www.dpme.gov.za/publications/Guides%20Manuals%20and%20Templates/Guideline%20to%20outcome
%20approach.pdf (accessed 18 June 2020).

• Presidency. (2012). National Evaluation Plan 2013-14 to 2015-16.
https://www.dpme.gov.za/publications/Policy%20Framework/National%20Evaluation%20Plan%202013%20-
%2014.pdf (accessed 18 June 2020).

• Public Service Commission. (2012). Evolution of monitoring and evaluation in the South African public service.
http://www.psc.gov.za/newsletters/docs/2012/K-9555%20PSC_6th%20edition%20magazine_DevV11.pdf
(accessed 18 June 2020).



5.3 ACCESSING JOURNAL ARTICLES AND OTHER ONLINE LINKS

Most study guide and virtual course links should open directly when you click on them, provided your
browser is open and connected to the internet. However, to access articles and e-books on EbscoHost
or Emerald, you must be logged in to the student portal, and have these databases open.

• Click on Tools, Resources, Library, and then on Ebscohost or Emerald:

… and then click on the article link in the study guide.

If this does not work (it can depend on what browser you are using), copy and paste the URL (the
web address) into your browser's address bar to access the link. Use Chrome, Firefox or Safari as
your browser. Do not use Internet Explorer, as it is no longer supported by many applications. Check
that you have copied the whole URL, and have not left out part after a hyphen. There should not be
any spaces in the URL – the whole thing should be on one line.

Please report any broken links – or any other problems encountered on your educational journey
that we can solve – to mdt@regenesys.co.za so we can fix them for you.

Links to additional media that may prompt discussion and help you complete this course will be
saved in Around the Net, a couple of clicks down from the EbscoHost database links in the portal
library. Visit the site regularly to see what’s new.



5.4 ADDITIONAL SOURCES TO CONSULT

As a higher education student, you are responsible for sourcing additional information that will assist
you in completing this course successfully. Here are sources you can consult to obtain additional
information on the topics to be discussed in this course. You will find more on the portal.

• EbscoHost and Emerald: These online databases contain journal articles, e-books and multimedia
relevant to your studies. Registered Regenesys students in good standing can access them through
the student portal.
• NetMBA: MBA constructs and discussion. http://www.netmba.com/
• MindTools: Ideas, constructs, management models and commentary. http://www.mindtools.com/
• ProvenModels: Provides management models – generalisations of business situations that, when
applied in context, can be powerful tools for solving business problems. http://www.provenmodels.com/
• 12manage.com: More models, principles and global commentary. http://www.12manage.com/
• The Free Management Library: Comprehensive overviews of strategic planning.
http://managementhelp.org/strategicplanning/index.htm
• TED: TED (Technology, Entertainment and Design) is a nonprofit organisation that devotes itself
to spreading new, transformative ideas in science, business and global issues, among other topics.
TED's website will take you to each of the groundbreaking TED Talks, and also to TEDx, a programme
that helps communities, organisations and individuals to create local TED-like experiences.
https://www.ted.com/about/our-organization

A word of caution – not all information available on the internet is necessarily of a high academic
standard. Always compare the information you find with that in reputable sources, such as articles
published in accredited journals.



6. GROUND RULES AND EXPECTATIONS

6.1 EXPECTATIONS

It is crucial in any learning process that the expectations and needs of the learners are identified.
The identification of the learners’ expectations and needs enables the facilitator to create a relevant
and learner-focused learning process.

Expectations

Purpose: To identify your expectations and needs around this course

Time: 10 minutes

1. What are your expectations for this course?


2. What can you contribute to make this course successful?
3. What are possible obstacles that could prevent you from achieving your expectations?



6.2 GROUND RULES

In most group situations it is important to collectively develop ground rules or norms of behaviour in
order to create an environment conducive to learning. Ground rules set the tone for future group
discussions and behaviour.

Some examples of ground rules:

• One person speaks at a time.


• People who want to speak should put up their hands.
• Participants should arrive at sessions on time.

Ground Rules

Purpose: To develop ground rules for the group

Time: 3-minute discussion in pairs; 10-minute brainstorm in plenary

1. Find a partner.
2. List and discuss two issues that you feel would create an environment conducive to learning.
3. Each pair will share their list in the plenary brainstorm.



7. INTRODUCTION

This Monitoring and Evaluation course focuses on the aspects that make up the government's
monitoring and evaluation system.

From a general introduction to the subject, we move on to explain the monitoring and evaluation
system in the context of government and in the legal context. We examine matters such as the
outcomes-based approach and the Government-Wide Monitoring and Evaluation System (GWMES).

From these systems you can learn how to apply a monitoring and evaluation framework in your own
organisation and how to ensure that this framework is a success.

7.1 LEARNING OUTCOMES

On completing this course, you should be able to:

• Explain monitoring and evaluation and related terminologies;


• Understand the different monitoring and evaluation models and techniques;
• Explain the monitoring and evaluation process;
• Communicate monitoring and evaluation results;
• Assess the importance of a monitoring and evaluation system;
• Understand the relevant legislation informing the monitoring and evaluation process;
• Compile and implement a performance-based framework for monitoring and evaluation;
• Assess the success factors of a monitoring and evaluation system; and
• Explain the link between monitoring and evaluation and other business functions.

The timetable under each section heading provides guidance on how long to spend studying the
section. Follow the timetable to ensure that you spend a suitable length of time on each section,
cover the required sections relevant to each assignment, and have enough time to prepare for the
examination.



7.2 AN INTRODUCTION TO MONITORING AND EVALUATION IN THE
PUBLIC SECTOR

Timeframe: Minimum of 20 hours

Learning outcomes:

• Explain monitoring and evaluation and related terminologies;
• Understand the different monitoring and evaluation models and techniques; and
• Assess the importance of a monitoring and evaluation system.

Recommended reading:

• Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S. (2012). Establishing a national M&E system in South Africa. The World Bank Special Series on The Nuts & Bolts of Monitoring and Evaluation Systems, 21, 1-11. http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts00Box374357B00PUBLIC0.pdf (accessed 18 June 2020).
• Kaplan, R.S. and Norton, D.P. (1992). The Balanced Scorecard – Measures that Drive Performance. Harvard Business Review. https://hbr.org/1992/01/the-balanced-scorecard-measures-that-drive-performance-2 (accessed 18 June 2020).

Section overview: Monitoring and evaluation (M&E) is government’s system of tracking and reporting on the performance of public sector organisations. In this section we present a definition of monitoring and evaluation relevant to this course. We review the main stakeholders in the M&E process, and the models and techniques that inform the government monitoring and evaluation system.

7.2.1 Defining Monitoring and Evaluation

According to Görgens-Albino and Zall Kusek (2009:2), “monitoring and evaluation is a powerful
public management tool that can be used to improve the way governments and organisations
achieve results. Just as governments need financial, human resource, and accountability systems,
they also need good performance feedback systems.” Monitoring and evaluation is a crucial part of
this feedback system.

In 2008 the Public Service Commission (PSC) published a manual explaining the basic concepts of monitoring and evaluation. This manual defines the concepts as follows:

Monitoring

“A continuing function that uses systematic collection of data on specified indicators to provide
management and the main stakeholders of an ongoing development intervention with indications
of the extent of progress and achievement of objectives and progress in the use of allocated
funds.”



Evaluation

“The systematic and objective assessment of an ongoing or completed project, programme or


policy, its design, implementation and results. The aim is to determine the relevance and
fulfilment of objectives, development efficiency, effectiveness, impact and sustainability. An
evaluation should provide information that is credible and useful, enabling the incorporation of
lessons learned into the decision-making process of both recipients and donors.

“Evaluation also refers to the process of determining the worth or significance of an activity, policy
or programme. An assessment, as systematic and objective as possible, of a planned, ongoing,
or completed development intervention.

“Note: Evaluation in some instances involves the definition of appropriate standards, the
examination of performance against those standards, an assessment of actual and expected
results and the identification of relevant lessons.”

To get an overview of the PSC’s constitutional mandate and its vision and mission, visit
the Public Service Commission website.

7.2.2 Concepts and Terminology

It is crucial to consider the terminology described in Table 1 if we are to fully understand monitoring
and evaluation.

TABLE 1: MONITORING AND EVALUATION TERMINOLOGY

Data terrains: The Government-Wide Monitoring and Evaluation System (GWMES) is composed of three data terrains: programme performance information, evaluation, and census data or statistics.

Evidence-based decision-making: The systematic application of the best available evidence to the evaluation of options and to decision-making in management and policy settings. Evidence should be based on the three data terrains of the GWMES.

Government-Wide Monitoring and Evaluation System (GWMES): A management framework within public sector organisations that works with other management systems to integrate monitoring and evaluation practices into all elements of the organisation.

Outcomes-based approach: The outcomes-based approach clarifies what we expect to achieve, how we expect to achieve it, and how we will know we are achieving it. It is composed of inputs, activities, outputs, outcomes and impacts – terms we will explore in more detail later in this course.

Performance indicators: Predetermined signals that a certain point in a process has been reached or a result has been achieved.



Results-based management: This approach to management is based on four pillars:

• Definition of strategic goals, which provide a focus for action;
• Specification of expected results, which contribute to the achievement of these goals, and the alignment of programmes, processes and resources in support of these expected results;
• Ongoing monitoring and assessment of performance, integrating lessons learnt into future planning; and
• Improved accountability for results – did programmes make a difference in the lives of ordinary South Africans?

(Adapted from DPME, 2012:22 and Presidency, 2010:9)

Distinction between monitoring and evaluation

While monitoring and evaluation overlap and exist in a mutually beneficial capacity, they are still two
distinct processes. The Department of Performance Monitoring and Evaluation (DPME) (2011:3)
provides the following distinction between monitoring and evaluation:

“Monitoring involves the continuous collecting, analysing and reporting of data in a way that
supports effective management. Monitoring aims to provide managers with regular (and real-time)
feedback on progress in implementation and results, and early indicators of problems that need
to be corrected. It usually reports on actual performance against what was planned or expected.

“In summary, monitoring asks whether the things we planned are being done right, while
evaluation is asking are we doing the right things, are we effective, efficient and providing value
for money, and how can we do it better? Evaluation has the element of judgment, and must be
(made) against objectives or criteria.”

The Presidency (2007:1-2) offers the following distinction:

“Monitoring involves collecting, analysing, and reporting data on inputs, activities, outputs,
outcomes and impacts as well as external factors in a way that supports effective management.
Monitoring aims to provide managers, decision makers and other stakeholders with regular
feedback on progress in implementation and results, and early indicators of problems that need
to be corrected.”

“Evaluation is a time-bound and periodic exercise that seeks to provide credible and useful
information to answer specific questions to guide decision making by staff, managers, and policy
makers. Evaluations may assess relevance, efficiency, effectiveness, impact and sustainability.
Impact evaluations examine whether underlying theories and assumptions were valid, what
worked, what did not and why. Evaluation can also be used to extract cross-cutting lessons from
operating unit experiences and (for) determining the need for modifications to strategic results
frameworks.”



In other words, “while monitoring tracks what you have planned to do, evaluation asks deeper
questions of effectiveness, efficiency, causality, relevance and sustainability. Government is taking
forward evaluation to improve government’s performance and development impact, accountability,
decision-making and to widen the knowledge base around government’s work” (DPME, 2013). Table
2 also differentiates between monitoring and evaluation.

TABLE 2: DIFFERENCES BETWEEN MONITORING AND EVALUATION

Objective (why?)
• Monitoring: To establish baseline information; to track changes from baseline conditions to desired outcomes; to identify areas requiring corrective action.
• Evaluation: To validate what results were achieved, and how and why they were or were not achieved; to refine the theory of change by revisiting original assumptions and objectives, thereby improving learning and future approaches.

Focus (what?)
• Monitoring: Focuses on the outputs of projects, programmes, partnerships and activities, and their contribution to outcomes; checks progress against plans, and identifies areas for action and improvement.
• Evaluation: Compares planned with actual outcome achievement; focuses on how and why outputs and strategies contributed to the achievement of outcomes and impacts; addresses questions of relevance, effectiveness, sustainability and change.

Responsibility (who?)
• Monitoring: Internal management and programme or project manager responsibility at all levels: city-wide; clusters; entities or departments; mayoral committee; performance management, monitoring and evaluation reporting unit.
• Evaluation: External evaluators and partners, and internal evaluators: executive management team (EMT); mayoral committee; council; performance management, monitoring, evaluation and reporting unit.

Timing (when?)
• Monitoring: Continuous and systematic.
• Evaluation: Time-bound, periodic and in-depth; conducted before (formative), during (aiding improvements) or after (summative) a project or programme.

Outcomes-based position
• Monitoring: Inputs, activities and outputs.
• Evaluation: Impacts, outcomes, purpose and overall objectives; outputs vs inputs (effectiveness and efficiency); impact; results vs costs; relevance to priorities.

Data sources
• Monitoring: Progress reports; management information systems; performance management data.
• Evaluation: Evaluation reports; monitoring data; primary and secondary data sources, including case studies, surveys and statistical data.

(City of Johannesburg, 2012:10)



Monitoring and evaluation should also be seen as distinct from other results-based management
activities. These differences are explained in Table 3.

TABLE 3: DIFFERENT TYPES OF RESULTS-BASED MANAGEMENT

Inspection or investigation – Activity: detects strange or illegal behaviour. Objective: control and compliance.

Performance audit – Activity: ensures the accuracy of departmental performance information. Objective: accountability, control and compliance.

Monitoring – Activity: continuous tracking of progress against plans. Objective: management, accountability and corrective action.

Evaluation – Activity: systematic collection and objective analysis of evidence. Objective: learning, accountability and improved performance.

Research – Activity: testing theories through observation. Objective: learning or knowledge, creativity and informing policy.

(DPME, 2012:4)

Monitoring and evaluation as a tool for growth

Section 195(1)(c) of the constitution provides that “Public administration must be development-oriented”. State institutions should ensure that all programmes comply with this principle. The PSC’s State of the Public Service Report (2007) described the context of the developmental state thus:

“South Africa’s efforts to promote growth and development are being pursued within the context
of building a developmental state…. Such a state seeks to ‘capably intervene and shepherd
societal resources to achieve national developmental objectives,’ rather than simply rely on the
forces of the market.

“What gives rise to and shapes the nature of a developmental state depends on the context and
history of a country…. Against this background, many have quite correctly cautioned against any
attempts to suggest that there is a prototype of a developmental state that can be constructed on
the basis of what worked in other countries.

“What then is the specific context within which to locate a South African developmental state?
The PSC believes that the Constitution provides the basis on which to understand
developmentalism in South Africa given how it captures the collective will and determination of
her people to create a better life for themselves.”

It is therefore essential that the monitoring and evaluation system of the state complies with this
principle.



The Difference Between Monitoring and Evaluation

1. In groups, compile a working definition of monitoring and evaluation.


2. Using Table 2, identify which activities in your organisation constitute monitoring and which constitute evaluation.
3. Once your table is complete, assess the importance of both monitoring and evaluation.
4. Discuss the implications of monitoring and evaluation in the context of a developmental state, and the
consequences of not implementing monitoring and evaluation.

7.2.3 The Purpose of Monitoring and Evaluation

According to Chabane (2013), monitoring and evaluation in the public service aim to address:

• Opposition to change;
• A focus on completing activities rather than assessing their results;
• Insufficient measurement, collection and analysis of the data that informs improvements;
• Monitoring and reporting carried out for compliance rather than improvement;
• Weak programme planning, indicators and targets, logic models or theories of change;
• Weak design of data measurement and collection processes; and
• A failure to value evidence-based planning and decision-making.

M&E is also used to guide management decision-making, organisational learning and accountability, to solicit support for programmes and advocacy, and to promote transparency. See Table 4.

TABLE 4: PURPOSE OF MONITORING AND EVALUATION

Management decision-making: Monitoring and evaluation can augment and complement management, as they provide evidence for decision-making. This is possible if monitoring and evaluation information is appropriate and feeds into existing managerial processes. Decisions about resource allocation, strategy implementation, policy and programme design are easier with accurate information.

Organisational learning: M&E helps to create learning organisations. It is a useful tool for establishing which programme design will be best to implement and which will bring the best return on investment. Information gathered through monitoring and evaluation should be communicated in action-orientated reports. It can therefore be deduced that monitoring and evaluation produce new knowledge.

Accountability: Public servants are accountable for how public money is spent, how objectives are achieved, and for ensuring that this is done with integrity. Monitoring and evaluation provide this information in a structured and formalised manner.

Soliciting support for programmes: Support for a programme is validated by means of evaluated findings, which monitoring and evaluation provide.

Supporting advocacy: Results presented by monitoring and evaluation systems support arguments for the continuation, adjustment or termination of programmes. Monitoring and evaluation systems help clarify issues, promote understanding of goals, and document programme implementation, creating institutional knowledge.

Promoting transparency: Findings from the monitoring and evaluation system are available to a wider audience, promoting transparency.

(PSC, 2008:4-6)

7.2.4 Monitoring and Evaluation Models and Techniques

Government’s monitoring and evaluation system was developed from the Fifteen-Year Review of
Government (2009), which stated that there had to be a radical shift in government policy in order to
improve performance to an acceptable level. A completely new approach to M&E was needed, and
so the outcomes-based approach was introduced. This is the basis for the government-wide
monitoring and evaluation system or GWMES (Clear, 2012:144). We will study this framework in
detail later in this course. To understand the implementation of the government-wide M&E system,
we must understand the logic model and the Kaplan and Norton balanced scorecard perspectives.

Logic model

The logic model explains the relationship between means (inputs, activities and outputs) and ends
(outcomes and impacts). It consists of a hierarchy of inputs, activities, outputs, outcomes and
impacts (see Figure 1).

FIGURE 1: LOGIC MODEL

The figure presents the logic model as a hierarchy, read from the bottom up:

• Impacts – what do we aim to change?
• Outcomes – what do we wish to achieve?
• Outputs – what do we produce or deliver?
• Activities – what do we do?
• Inputs – what do we use to do the work?

In the figure, “plan, budget, implement” spans the lower levels (inputs, activities and outputs), while “manage towards achieving results” spans the higher levels (outcomes and impacts).

(PSC, 2008:42)



Let us have a closer look at the components of the logic model. According to the National Treasury
(2007:6) the components of the logic model are as shown in Table 5.

TABLE 5: COMPONENTS OF THE LOGIC MODEL

Inputs: All the resources that contribute to the production and delivery of outputs. Inputs are “what we use to do the work”. They include finances, personnel, equipment and buildings.

Activities: The processes or actions that use a range of inputs to produce the desired outputs and ultimately outcomes. In essence, activities describe “what we do”.

Outputs: The final products, or goods and services produced for delivery. Outputs may be defined as “what we produce or deliver”.

Outcomes: The medium-term results for beneficiaries that are a logical consequence of achieving certain outputs. Outcomes should relate clearly to an institution’s strategic goals and objectives, which should be set out in its plans. Outcomes are “what we wish to achieve”.

Impacts: The results of achieving specific outcomes, such as reducing poverty and creating jobs.
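The hierarchy in Table 5 can be sketched as a simple data structure. The short Python sketch below is illustrative only: the five level names come from the table, but the programme itself (a hypothetical clinic-building programme) and all of its entries are invented for the example.

```python
# A minimal sketch of the logic-model hierarchy in Table 5.
# The level names follow the table; the programme entries are hypothetical.

LOGIC_MODEL_LEVELS = ["inputs", "activities", "outputs", "outcomes", "impacts"]

programme = {
    "inputs":     ["budget allocation", "project staff", "building materials"],
    "activities": ["contract builders", "construct and equip clinics"],
    "outputs":    ["10 clinics built and equipped"],
    "outcomes":   ["more residents within reach of primary health care"],
    "impacts":    ["improved community health"],
}

def describe(programme):
    """Print the results chain from means (inputs) to ends (impacts)."""
    for level in LOGIC_MODEL_LEVELS:
        print(f"{level}: {'; '.join(programme[level])}")

describe(programme)
```

Walking the same fixed sequence of levels for every programme is the point of the model: it forces planners to state what they use, do, produce, wish to achieve and aim to change, in that order.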

Logic Model Explanation

1. In groups, discuss a project or programme currently in the implementation phase of delivery. Assign the
components of the logic model to this project or programme.
2. Critically discuss how this model can be used as a monitoring and evaluation tool.

Balanced scorecard perspectives

In an article published by the Harvard Business Review, Kaplan and Norton identified four
perspectives for evaluating the performance of an organisation, shown in Table 6.

TABLE 6: BALANCED SCORECARD PERSPECTIVES

Financial: Is the organisation financially successful? Does the project or programme deliver value for money?

Customer: Is the public satisfied with service delivery?

Learning and growth: Is the organisation achieving its vision and goals? Monitoring and evaluation is intended to develop a learning organisation. If the organisation achieves its vision and goals, growth will be inevitable.

Internal business process: This perspective assesses implementation procedures.

(PSC, 2008:19)



Kaplan and Norton write, “Think of the balanced scorecard as the dials and indicators in an
airplane cockpit. For the complex task of navigating and flying an airplane, pilots need detailed
information about many aspects of the flight… Similarly, the complexity of managing an
organization today requires that managers be able to view performance in several areas
simultaneously.” Visit this link and read the article:

• Kaplan, R.S. and Norton, D.P. (1992). The Balanced Scorecard – Measures that Drive Performance. Harvard Business Review. https://hbr.org/1992/01/the-balanced-scorecard-measures-that-drive-performance-2 (accessed 18 June 2020).
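One way to see how the four perspectives in Table 6 support monitoring is to group performance indicators under them and compare actual values with targets. The sketch below is a hedged illustration: the perspective names come from Table 6, but every indicator, value and target is invented, and all indicators are treated as simple higher-is-better percentages.

```python
# Hypothetical indicators grouped under the four balanced-scorecard
# perspectives (Table 6). Each indicator is a (name, actual, target)
# tuple; all figures are invented and treated as higher-is-better.

scorecard = {
    "financial": [("capital budget spent as planned (%)", 92, 95)],
    "customer": [("citizen satisfaction with service delivery (%)", 78, 75)],
    "learning and growth": [("staff trained in M&E (%)", 60, 50)],
    "internal business process": [("applications processed on time (%)", 88, 90)],
}

def flag_underperformance(scorecard):
    """Return (perspective, indicator) pairs where actual falls below target."""
    flags = []
    for perspective, indicators in scorecard.items():
        for name, actual, target in indicators:
            if actual < target:
                flags.append((perspective, name))
    return flags

for perspective, name in flag_underperformance(scorecard):
    print(f"Below target – {perspective}: {name}")
```

The value of the scorecard lies in this balance: a manager reviewing only the financial perspective would miss the process indicator that is also below target.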

Government's outcomes-based approach

The monitoring and evaluation process involves inputs, activities, outputs, outcomes and impacts.
In this section, we consider government’s interpretation of the theories and models presented above.
The components of the logic model make up government’s outcomes-based approach to managing
performance. See Figure 2.

FIGURE 2: GOVERNMENT'S OUTCOMES-BASED APPROACH

(Adapted from Presidency, 2009:11 and National Treasury, 2007:7)



As the figure shows, the outcomes approach consists of the following components:

• Inputs: what is required to complete a task? For example, financial resources, human
resources, infrastructure;
• Activities: the functions, actions, and tasks that use the inputs and produce results. For
example, contract for services, answer queries, give advice;
• Outputs: the products or services made from activities. For example, “service providers
obtained” and “work initiated”;
• Outcomes: the end goal – what we wish to achieve. They are the product of effective outputs.
Outcomes are linked to the organisation’s strategic plan; and
• Impacts: what results from achieving the outcomes? For example, faster production of official
documents at home affairs, reduction of poverty, etc.
(Republic of South Africa, 2010a:12)

This model is considered in more detail in the fourth section of this course, titled “Implement a
Monitoring and Evaluation System”.

7.2.5 M&E Stakeholders in the Government Sphere

We have discussed the developmental purpose of the government’s monitoring and evaluation
system. In this section we consider the involvement of national, provincial, line department and
constitutional institutions.

National level

The Presidency

The Presidency is responsible for formulating the medium-term strategic framework and the
government’s programme of action. The implementation of these plans is then monitored against
their priorities. The Presidency publishes bi-annual progress reports on the implementation of the
government’s programme of action. The Presidency relies on data provided by the monitoring and
evaluation systems (PSC, 2008:13) to compile these reports.

National Treasury

The minister of finance, supported by the National Treasury, determines fiscal policy. The Treasury
compiles the national budget and devises and implements financial management policy. Parliament
allocates money according to strategic objectives. Indicators and targets are set to measure the
attainment of objectives, and the National Treasury plays an important role in monitoring
performance against these objectives. The Treasury evaluates whether expenditure achieved value
for money. The results of the evaluations are published in quarterly reports in the Budget Review,
Provincial Budgets and Expenditure Review and the Local Government Budgets and Expenditure
Review (PSC, 2008:14).



Department of Public Service and Administration (DPSA)

This department is responsible (PSC, 2008:14) for:

• Macro-organisation of the public service;


• Formulating policy on the functioning of the public service;
• Human resource management policy;
• Determining the conditions of service for the public service;
• Determination of policy, regulations, norms and standards for information management;
• The use of information technology in the public service; and
• The promotion of a public service conforming to the values governing public administration
listed in section 195 of the constitution.

The DPSA provides monitoring and evaluation information through the bodies shown in Figure 3.

FIGURE 3: KEY STAKEHOLDERS IN TERMS OF DPSA MONITORING AND EVALUATION

The figure groups the stakeholders that feed into, and draw on, DPSA monitoring and evaluation:

• Co-ordinating departments: NT, DPSA, OPSC, the National School of Government, Co-operative Governance & Traditional Affairs, Stats SA, etc;
• The Ministry of Performance Monitoring & Evaluation, the G&A cluster and the NPC;
• The minister, deputy minister, Parliament, the Portfolio Committee on Public Service and Administration, and so on;
• Government departments, directors-general, premiers’ offices, exco/manco, and provincial and local offices;
• DPSA staff;
• Other government agencies, research institutions, trade unions and other partners; and
• The public.

(Molepo, 2011)



Department of Co-operative Governance and Traditional Affairs

This department devises policy on the structure and functioning of provincial and local government,
and therefore evaluates the performance of local and provincial government. Local government is
essential for the delivery of basic services, and the department's role in monitoring and evaluating
service delivery is therefore directly linked to monitoring and evaluation (PSC, 2008:15).

Statistics South Africa (Stats SA)

This organisation collects, analyses and publishes information generated by the national statistics
system. The system also collects statistics on development indicators from the government’s
strategies. Without reliable statistics, planning and service delivery monitoring and evaluation would not be possible (PSC, 2008:15).

National School of Government (NSG)

The National School of Government replaced the Public Administration Leadership and Management
Academy (PMG, 2014). The purpose of the school is “to build an effective and professional public
service through the provision of relevant, mandatory training programmes” (PMG, 2014). The
National School of Government is therefore involved in training staff in M&E principles.

The Department of Performance Monitoring and Evaluation in the Presidency

While the PSC had sole responsibility for monitoring and evaluation functions during the first 16
years of the democratic dispensation, in January 2010 a new department in the Presidency – the
Department of Performance Monitoring and Evaluation – was established to:

• Facilitate the outcomes-based approach (performance agreements and delivery


agreements) and monitor the implementation of priorities;
• Develop and implement the management performance assessment tool (MPAT) in
collaboration with key stakeholders;
• Conduct frontline service delivery monitoring, including handling the Presidential hot line;
• Facilitate citizen-based service delivery monitoring;
• Undertake evaluation and research;
• Promote good monitoring and evaluation practices in government; and
• Take steps to address blockages in delivery, in partnership with delivery institutions.

(DPME: 2012:2)

This means there are now two government bodies responsible for public service M&E. To prevent
duplication of responsibilities, the PSC and the DPME have specific monitoring and evaluation
functions, shown in Table 7.



TABLE 7: DIVISION OF M&E RESPONSIBILITIES BETWEEN THE PSC AND THE DPME

PSC responsibilities:

• Evaluation of the success of identified government programmes;
• Evaluation of service delivery;
• Compliance evaluations;
• Monitoring of heads of department performance management;
• Evaluation of departments against the values listed in section 195 of the constitution; and
• Evaluation of the state of the public service.

DPME responsibilities:

• Performance monitoring of government priorities;
• Evaluation of government priorities;
• Assessment of the quality of management practices in government departments;
• Frontline service delivery monitoring and the presidential hot line;
• Citizen-based monitoring;
• Government-wide monitoring and evaluation system and capacity building; and
• Development indicators.

Constitutional institutions

The Public Service Commission

The Public Service Commission (PSC) was established in accordance with sections 195 and 196 of
the constitution, which stipulate that it would be the body in charge of monitoring and evaluation of
public service performance. However, as you have seen, this has now changed, with responsibility
for performance monitoring and evaluation being split between the PSC and the Department of
Performance Monitoring and Evaluation (DPME).

The Differences Between the PSC and the DPME's Responsibilities

1. Compare and contrast the monitoring and evaluation responsibilities of the PSC and the DPME.
2. Analyse whether these differences are actually adhered to in the public service. Using this analysis, recommend
ways that the PSC and DPME could use a more integrated approach to monitoring and evaluation.



The auditor-general

The auditor-general is responsible for auditing the accounts and financial statements of national and
provincial departments (PSC, 2008:16). From a monitoring and evaluation perspective, the auditor-
general's most important role is performance auditing. This involves determining how well the
audited organisation or department has spent money. An audit can also assess how well the
audited entity has defined its performance indicators (PSC, 2008:16).

The Human Rights Commission

In terms of the constitution, the Human Rights Commission must protect, promote and ensure
respect for the rights enshrined in the Bill of Rights (PSC, 2008:16). As a large portion of
these rights are socioeconomic, the Human Rights Commission has a role to play in ensuring that
government delivers essential services to its people (PSC, 2008:16). This involves effective M&E.

Departments at the centre of provincial government

The key departments in monitoring and evaluation are the offices of the premiers and the provincial
treasuries. Key strategic objectives are set for each province in the Provincial Growth and
Development Strategy and the Provincial Government Programme of Action. Offices of the premier
monitor and evaluate the performance of provincial departments according to the direction set in the
growth and development strategy (PSC, 2008:15).

Line departments

Line departments implement government policy. They must monitor and evaluate the
implementation of policy, the impact of policy and the quality of service delivery (PSC, 2008:15).

Why is it necessary to have multiple entities to ensure effective monitoring and evaluation in the
public service? Discuss critically the implications of this for efficiency.


Case Study

Read the case study below and answer the questions that follow.

Building a results-based monitoring and evaluation system for the Western Cape government of
South Africa

The Western Cape government has developed a series of provincial strategic objectives on which to base its
priorities, transversal planning processes and service delivery. The government has also worked on building a
results-based monitoring and evaluation system to capture information on the extent to which these objectives are
being achieved. This system enables integrated province-wide monitoring and evaluation to occur. It consists of
seven phases as described later in this paper. This seven-phase process provides a toolkit for any government
institution to set up its own RBM&E system. This is the main value of the system.

Context of results-based monitoring and evaluation for the Western Cape government

The primary aim of monitoring and evaluation in government is to provide information for decision-making. But data
collected for monitoring and evaluation is often wide-ranging and fragmented. To make sense of the complexity and
diversity of the data collected against indicators, and to turn data into useful strategic management information, it is
necessary to integrate the data into a system.

The central feature of monitoring and evaluation for the Western Cape government is that it is used to improve
performance. The results-based monitoring and evaluation system is being used to measure the performance of the
desired outcomes in relation to the strategic objectives government aims to achieve. Other provinces may wish to use
a similar results-based monitoring and evaluation system to measure their performance.

Focus of the Western Cape government’s results-based monitoring and evaluation system

The main focus of the system is measuring outcomes against a set of indicators. Given that achieving outcomes
depends, in part, on factors beyond the direct control of government, outcomes and their measurement, in our
approach, are clearly distinguished from outputs and their measurement. Thus, outputs are about what the province
as a whole, and each department in it, actually delivers, while outcomes are about what they wish to achieve through
these outputs. Indicators measuring outputs are therefore clearly differentiated from indicators that measure
outcomes.

Aims of the results-based monitoring and evaluation system

The Western Cape model aims to provide a platform for setting up results-based systems for public sector monitoring
and evaluation in the province. The system should ensure that it provides data and information that is necessary to
measure government’s achievements against a core set of indicators contained within it. Such information enables
evidence-based decision-making in line with the provincial government’s policies, strategies, programmes and
projects.


Developing the toolkit

As a starting point for developing this toolkit, a strategic framework for a province-wide monitoring and evaluation
system was developed. This framework examined how such a system could measure the results of the work done in
the province. Its emphasis was on measuring outcomes directly linked to specific provincial strategic objectives. It
also sketched how the processing of monitoring and evaluation data collection would be supported by an electronic
application.

To develop a results-based monitoring and evaluation system for the province, a specific seven-phase sequence was
formulated, taking into account the principles of Kusek and Rist (2004: 23): formulating outcomes and goals;
selecting outcome indicators for monitoring; gathering baseline information on the current condition; setting specific
targets to reach; setting timelines for reaching them; collecting data to assess whether the targets were being met.
The development of the system was also informed by other international practices, such as the Malaysian
Government Result-Based Budgeting System and the International IMA Model. The results-based monitoring and
evaluation system for the province was developed in-house with technical support conducting quality assurance in
the field of indicators and data governance.

Each of the seven phases constitutes a subsystem. These subsystems are interdependent and contained within the
overarching province-wide monitoring and evaluation system. They provide the necessary components of the system,
so that they can operate as a whole through effective indicator and data management. The components are then
aligned to the core processes and supporting processes of the province-wide system’s mandate. The subsystems are
reviewed annually to maintain an up-to-date and comprehensive monitoring and evaluation system that will function
effectively.

The seven phases of the results-based monitoring and evaluation system

Phase 1: Readiness assessment and stakeholder engagements

The readiness assessment involved conducting provincial audits with the monitoring and evaluation staff in the
Western Cape provincial government, ascertaining the capacity and readiness in each department to build a results-
based monitoring and evaluation system that could be aligned to this province-wide system, and the critical
challenges faced in each department in relation to building the results-based monitoring and evaluation systems.

Stakeholder engagement involved identifying relevant stakeholders at local, provincial, national and international
levels, and institutionalising stakeholder engagement through the establishment of a monitoring and evaluation forum
and an external reference group, which met on a regular basis. This phase was important for understanding the
stakeholder environment.

Phase 2: Developing overarching strategic frameworks

This phase focused on the development of the strategic monitoring and evaluation frameworks that would provide the
conceptual and strategic understanding of the province-wide system’s mandate, its results-based approach and its
relationship to the policy context of the provincial government. This phase set out the strategic approach on how to
implement results-based monitoring and evaluation to assess how well the provincial government was doing in
meeting its strategic objectives.


Phase 3: Indicator definition process and indicator development

This phase was the starting point in translating the provincial strategic objectives into broad, outcomes-based themes,
and then subdividing these themes into aims or desired outcomes for the period 2010 to 2014. In this phase a
compendium of indicators was selected to measure each aim or desired outcome for the province’s strategic
objectives, taking into account the national statistical production areas and global imperatives.

Phase 4: Developing monitoring and results frameworks

In this phase, attributes for each core indicator were identified in order to build the monitoring system. These
attributes included information on appropriate data sources; the frequency of data collection; responsible data
producers and level of disaggregation to measure results based on the indicators. Baseline data was also collected,
and targets were set against which the outcome indicators could be measured. The indicators and their attributes
culminated in a monitoring and results framework for each strategic objective. This phase was the essence of the
results-based monitoring and evaluation system, and was interlinked with phases 3 and 5.

Phase 5: Data management and data assessment

This phase related to the collection of data on the outcome indicators to observe the situation and the changes that
occurred as well as the analysis and reporting of results. It included the identification and location of the data
sources, and the assessment of the data quality by building quality standards into provincial administrative data
records. This phase was critical, as it related to broader data governance matters such as data profiling, data quality
standards and data architecture, and it was interlinked with phases 3 and 4.

Phase 6: Information architecture

The information architecture was designed to support the manual processes regarding collecting province-wide
information, and to manage data collected to measure the indicators.

The information architecture of the province-wide monitoring and evaluation system was included in a broader
computer-based relational database. This database contains data collected not only for the province-wide system,
but also for the Annual Performance Assessment System and the Executive Projects Dashboard. The province-wide
system, as an electronic system, draws its data by interfacing with other e-platform systems. The Annual
Performance Assessment System, as an electronic application, stores output indicator results, while the Executive
Projects Dashboard, as an electronic application, receives and captures information on departmental projects and
tracks progress and budget utilisation. This phase was the essence of automating the work done in phases 2, 3, 4
and 5.

Phase 7: Planning to implement and sustain the province-wide monitoring and evaluation system

In this phase the Western Cape government ensured that the system delivered an effective indicator and data
management system for collecting relevant data and information for strategic management purposes. The annual
review of the subsystems of the province-wide system takes place in this phase. This review is pivotal in maintaining
an up-to-date, comprehensive monitoring and evaluation system, and ensuring that the components in each phase
adhere to the necessary policy context, monitoring and evaluation elements and mechanisms for such a system. This
phase indicates when the results-based monitoring and evaluation system is ready to start, to be implemented and to
be sustained.


Conclusion

In conclusion, building the results-based system and its application set the direction for improving monitoring and
evaluation processes and methods within the provincial government, ultimately improving the measuring of results on
a continuous basis.
(Ishmail, 2012)

Questions

1. Critically reflect on the components of the logic model in reference to the case study.
2. Evaluate the balanced scorecard perspectives using the case study.
3. Explain the interrelationships between the components of the outcomes-based approach using the case study.
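Phase 4 of the case study treats each core indicator as a record with defined attributes (data source, frequency of collection, responsible data producer, level of disaggregation, baseline and target). As a purely illustrative sketch (not part of the Western Cape system, and with all field names assumed for illustration), such a record might look like this:

```python
from dataclasses import dataclass

@dataclass
class OutcomeIndicator:
    """One core indicator with attributes like those described in Phase 4
    of the case study (all field names are illustrative assumptions)."""
    name: str
    data_source: str           # where the data comes from
    frequency: str             # how often the data is collected
    responsible_producer: str  # who supplies the data
    disaggregation: str        # level at which results are broken down
    baseline: float            # the current condition when monitoring starts
    target: float              # the value the province aims to reach

    def target_met(self, observed: float) -> bool:
        """A target counts as met when the observed value reaches it."""
        return observed >= self.target

# A hypothetical service-delivery indicator:
water_access = OutcomeIndicator(
    name="Households with access to piped water (%)",
    data_source="Provincial administrative records",
    frequency="Annual",
    responsible_producer="Provincial department",
    disaggregation="District",
    baseline=78.0,
    target=85.0,
)

print(water_access.target_met(86.2))  # observed result exceeds the target: True
```

Capturing indicators and their attributes in a structured form like this is what makes the electronic interfacing and data governance described in phases 5 and 6 possible.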

7.2.6 Conclusion

Monitoring and evaluation allows government to track and correct progress in public organisations.
It is therefore a vital tool for improving government services, as public organisations continue to
modify their processes and strategies in light of what they learn from M&E.

As a helpful overview of monitoring and evaluation, read this article:

• Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S.
(2012). Establishing a national M&E system in South Africa. The World Bank Special
Series on The Nuts & Bolts of Monitoring and Evaluation Systems, 21, 1-11,
http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts
00Box374357B00PUBLIC0.pdf (accessed 18 June 2020).

Once you have read the article, complete the following tasks:


Overview of the Monitoring and Evaluation System

1. Critically evaluate the six types of evaluation. Do you think these are comprehensive enough to cover all areas
of evaluation in the public service? Why?
2. Given the overview of the emerging successes, challenges and sustainability issues for a monitoring and
evaluation system provided in the article, explain how obstacles to effective monitoring and evaluation can be
overcome.
3. Using Table 1 in the article, evaluate the ability of the management performance assessment tool to measure
performance of government departments and administrations.
4. Evaluate how Figure 1 in the article represents the relationship between the different stakeholders of monitoring
and evaluation in South Africa. Draw a diagram in which you represent the stakeholder relationships more
appropriately. Present your alternative diagram in a presentation to the rest of the class, explaining the
adaptations you made to it and why you felt these changes were necessary.

7.2.7 Key Points

Some key points made in this section were:

• Various definitions of monitoring and evaluation were offered. The essential point is that M&E
helps us to improve the quality of our work;
• We distinguished between monitoring and evaluation;
• Monitoring and evaluation is a tool for development;
• Among the purposes of M&E are:
o Improved management decision-making
o Organisational learning
o Enhanced accountability
o Soliciting support for programmes
o Supporting advocacy
o Promoting transparency
• We identified M&E stakeholders:
o The Presidency, National Treasury, the Department of Public Service and
Administration, the Department of Co-operative Governance and Traditional Affairs
o Statistics South Africa, the National School of Government, the Department of
Performance Monitoring and Evaluation in the Presidency
o The Public Service Commission, the auditor-general, the Human Rights Commission
o Premier’s offices and line departments
• We distinguished between the M&E function of the Public Service Commission and the
Department of Performance Monitoring and Evaluation; and
• We identified various M&E models and techniques, namely the logic model, the balanced
scorecard approach, and government’s outcomes-based approach.


Remember to do your digital assessment for this section online!

It will help you strengthen and embed your understanding of the course. You will not be able to
change your answers once you have submitted them, so make sure you have completed the
relevant section of coursework first. Where you see Select all that are relevant, be aware that
any number of the options presented could be correct. You will lose marks for incorrect
selections, so choose carefully. Your combined marks from these assessments count towards a
total of 20% of your course mark.


7.3 THE LEGAL CONTEXT FOR MONITORING AND EVALUATION

Timeframe: Minimum of 15 hours

Learning outcome:

• Understand the relevant legislation informing the monitoring and evaluation process.

Recommended reading:

• Department of Public Service and Administration. (2012). Public Service Act, 1994.
http://www.dpsa.gov.za/dpsa2g/documents/acts&regulations/psact1994/PublicServiceAct.pd
f (accessed 18 June 2020).
• Nkwinti, G. (nd). National Development Plan and the New Growth Path: Transforming the
Economy. http://kzntopbusiness.co.za/site/search/downloadencode/nLaqaaKelpO8mnjc
(accessed 18 June 2020).
• National Treasury. (2005). Treasury regulations for departments, trading entities,
constitutional institutions and public entities.
http://www.treasury.gov.za/legislation/pfma/regulations/gazette_27388%20showing%20ame
ndments.pdf (accessed 18 June 2020).
• National Treasury. (2013). National Treasury Strategic Plan 2013/2017.
http://www.treasury.gov.za/publications/strategic%20plan/Strat%20Plan%202013-2017.pdf
(accessed 18 June 2020).
• National Treasury. (2014). Division of Revenue Act, 2014.
http://www.treasury.gov.za/legislation/acts/2014/Division%20of%20Revenue%20Act,%2020
14%20(Act%20No.%2010%20of%202014).pdf (accessed 18 June 2020).
• National Treasury. (2014). Public Finance Management Act No. 1 of 1999.
http://www.treasury.gov.za/legislation/pfma/act.pdf (accessed 18 June 2020).
• National Treasury. (2003). Local Government: Municipal Finance Management Act No. 56 of
2003, http://mfma.treasury.gov.za/MFMA/Legislation/Local%20Government%20-
%20Municipal%20Finance%20Management%20Act/Municipal%20Finance%20Managemen
t%20Act%20(No.%2056%20of%202003).pdf (accessed 18 June 2020).
• Department of Economic Development. (2011). The New Growth Path: Framework.
http://www.economic.gov.za/communications/publications/new-growth-path-series/download
(accessed 18 June 2020).

Section overview

Monitoring and evaluation in South Africa originates from legislation mandating the efficient and
effective monitoring and evaluation of policies, actions and use of resources in the public service.
In this section, we study the most important of these laws to reinforce the importance of M&E in
complying with legislation, and to impart an understanding that monitoring and evaluation is
integral to achieving the goals of this legislation.


7.3.1 Introduction to Legislation

Various laws support the implementation of a monitoring and evaluation system: the overarching
framework provided by the constitution, the Public Finance Management Act, the Public Service Act,
and the Municipal Finance Management Act (DPME, 2011:2).

Be aware that these acts are frequently amended. It is your responsibility to ensure that you have
the latest version of each act, which you can find on the relevant department’s website.

In this section we will review this legislation in terms of the purposes it serves in relation to monitoring
and evaluation.

7.3.2 Constitution of the Republic of South Africa, 1996

The constitution is the highest level of legislation in South Africa. It is from the constitution that the
public service was first established. The constitution contains values by which the public service
must strive to operate. It also contains mandates for the establishing of entities to manage monitoring
and evaluation in the public service.

Constitutional values, and monitoring and evaluation

The values enshrined in the constitution shape and define M&E practices in South Africa. Implicit
in every constitutional principle is the requirement that the public service be monitored and
evaluated to ensure compliance with the values of the constitution. In other words, to achieve the
ideals of the constitution, monitoring and evaluation of all public entities is necessary.

This may be more clearly understood in Table 8, in which the PSC links indicators and standards to
two constitutional values as an example of how values can be made into measurable indicators.


TABLE 8: EXAMPLE OF CONSTITUTIONAL VALUES AND MONITORING AND EVALUATION INDICATORS

Value: Efficient, economic and effective use of resources must be promoted

M&E criteria/indicators:
1. Expenditure is according to budget.
2. Programme outputs are clearly defined and there is credible evidence that they have been
achieved.

Standards:
1. Expenditure is as budgeted and material variances are explained.
2. More than half of each programme's service delivery indicators are measurable in terms of
quantity, quality and time.
3. Outputs, service delivery indicators and targets are clearly linked with each other as they
appear in the strategic plan, estimates of expenditure and the annual report for the year under
review.
4. Programmes are implemented as planned or changes to implementation are reasonably
explained.
5. A system to monitor and evaluate programmes or projects is operative.

Value: Public administration must be development-orientated

M&E criteria/indicators: The department is effectively involved in programmes or projects that aim
to promote development and reduce poverty.

Standards:
1. Beneficiaries play an active role in the governance, designing and monitoring of projects.
2. A standardised project plan format is used showing:
• All relevant details, including measurable objectives
• Time frame (targets)
• Clear governance arrangements
• Detailed financial projections
• Review meetings
• Consideration of issues such as gender, the environment and HIV/AIDS
3. Poverty reduction projects are aligned with local development plans.
4. Organisational learning takes place.
5. Projects are successfully initiated and/or implemented.
(PSC, 2008:27-28)
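Note that standard 2 under the first value ("more than half of each programme's service delivery indicators are measurable") is itself a simple checkable rule. As an illustrative sketch only (not part of the PSC framework), it could be expressed as:

```python
def more_than_half_measurable(measurable: int, total: int) -> bool:
    """True when more than half of a programme's service delivery
    indicators are measurable in terms of quantity, quality and time."""
    if total == 0:
        return False  # a programme with no indicators cannot meet the standard
    return measurable > total / 2

print(more_than_half_measurable(6, 10))  # True
print(more_than_half_measurable(5, 10))  # exactly half is not "more than half": False
```

Translating a standard into an explicit rule like this is what makes it auditable: anyone applying the rule to the same programme data reaches the same conclusion.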

Constitutional Values as Performance Indicators

1. Using the format of the table from the PSC above, choose two constitutional values of your own and draw a table
converting them into measurable indicators.
2. Present your table to the rest of the class, clearly explaining how you developed your indicators.


Constitutional mandates for monitoring and evaluation in the public service

The constitution enumerates values in section 195 that the PSC uses to define good governance
(PSC, 2008:17). In addition, the constitution requires that the public service be (PSC, 2008:17):

• Accountable to Parliament for:


o The spending of public money
o How it achieves the purposes for which the money was voted
o Going about duties with integrity
• Developmental (promoting growth and development); and
• Monitored and evaluated by the Department of Public Service and Administration (DPSA).

Constitutional Requirements for Monitoring and Evaluation

Using the constitutional values listed above, explain the importance of monitoring and evaluation for the public service.

7.3.3 The Public Service Act, 1994 as Amended

The Public Service Act, No. 103 of 1994, as amended by the Public Service Amendment Act, No.
30 of 2007, prescribes how national and provincial departments should function. It also regulates the
appointment and performance of government employees (Clear, 2012:150).

In order to understand the Public Service Act, you can read it here:

• Department of Public Service and Administration. (2012). Public Service Act, 1994.
http://www.dpsa.gov.za/dpsa2g/documents/acts&regulations/psact1994/PublicServiceA
ct.pdf (accessed 18 June 2020).

7.3.4 The Public Finance Management Act (PFMA), as Amended

The Public Finance Management Act, No. 1 of 1999, as amended by the Public Finance
Management Amendment Act, No. 29 of 1999, is intended to ensure the fair and ethical use of
finances in national and
provincial government. It is therefore important to have an understanding of the act, which is largely
concerned with the monitoring and evaluation of government finances.


From an input approach to an output approach

The Public Finance Management Act emphasises the need to move away from an input approach
to an output approach, while focusing on the responsible managing of public funds. The following
sections of the act are relevant to monitoring and evaluation:

• Section 6.1 (f), which mandates the responsibilities of the National Treasury, states that the
Treasury must monitor the implementation of provincial budgets;
• Section 6.2 (c), which also deals with the responsibilities of the National Treasury, states that
the Treasury “must monitor and assess the implementation of (the act), including any
prescribed norms and standards, in provincial departments, in public entities and in
constitutional institutions”;
• Section 18.2 (c), under the functions and powers of a provincial treasury, says that the
provincial treasury must comply with the annual Division of Revenue Act, and monitor and
assess the implementation of that act in provincial public entities and in terms of section 18.2
(d), “must monitor and assess the implementation in provincial public entities of national and
provincial norms and standards”;
• Section 27 states that measurable objectives must be submitted for each programme (Clear,
2012:150);
• Section 38 (a) (iv), under the responsibilities of accounting officers, states that “a system for
properly evaluating all major capital projects prior to a final decision on the project” must be
maintained by the accounting officer in all departments, trading entities and constitutional
institutions;
• Section 45 states that department officials must assume responsibility for the “effective,
efficient, economical and transparent use of financial resources” (Clear, 2012:150);
• Section 51 (a) (iv), under general responsibilities of accounting officers, states that the
accounting authority must ensure “a system for properly evaluating all major capital projects
prior to final decisions on the project”.

We recommend that you familiarise yourself with the Public Finance Management Act:

• National Treasury. (2014). Public Finance Management Act No. 1 of 1999.


http://www.treasury.gov.za/legislation/pfma/act.pdf (accessed 18 June 2020).

The Public Finance Management Act

Read the act and then complete the tasks below:

1. Briefly summarise the purpose of the act.


2. Explain the monitoring function performed by the National Treasury.
3. Critically assess the monitoring function performed by the provincial treasury.
4. Explain why it is essential to have monitoring of financial matters (particularly public funds).


7.3.5 The Municipal Finance Management Act

The Municipal Finance Management Act, No. 56 of 2003, follows the requirements for adequate
reporting and responsibilities laid down by the Public Finance Management Act at the municipal
level. It also provides instructions for how performance management is monitored by the municipality
(Clear, 2012:150).

Monitoring and evaluation of municipal funds

The following sections of the Municipal Finance Management Act are useful for monitoring and
evaluation:

• Section 5.2, which outlines the general functions of the National Treasury and provincial
treasuries, states that the National Treasury may:
o (a) “Monitor the budgets of municipalities to establish whether they are consistent with
the national government’s fiscal and macroeconomic policy”;
o (b) “Promote good budget and fiscal management by municipalities, and for this purpose
monitor the implementation of municipal budgets, including their expenditure, revenue
collection and borrowing”; and
o (c) “Monitor and assess compliance by municipalities and municipal entities with (i) this
act; and (ii) any applicable standards of generally recognised accounting practice and
uniform expenditure and revenue classification systems.”
• Section 5.4 requires a provincial treasury to monitor:
o (i) Compliance with this act by municipalities and municipal entities in the province;
o (ii) The preparation by municipalities in the province of their budgets;
o (iii) The monthly outcome of those budgets; and
o (iv) The submission of reports by municipalities in the province as required in terms of
this act.
• Section 34.3, under the heading “Capacity Building”, refers to the monitoring function in terms
of section 155(6) of the Constitution, and says a provincial government:
o (a) “Must share with a municipality the results of its monitoring to the extent that those
results may assist the municipality in improving its financial management;
o (b) “Must, upon detecting any emerging or impending financial problems in a
municipality, alert the municipality to those problems;” and
o (c) “May assist the municipality to avert or resolve financial problems.”
• Section 41.1, under the heading “Monitoring of prices and payments for bulk resources”,
states that the National Treasury must monitor:
o “The pricing structure of organs of state for the supply of electricity, water or any other
bulk resources that may be prescribed, to municipalities and municipal entities for the
provision of municipal services; and
o “Payments made by municipalities and municipal entities for such bulk resources.”


• Section 52, under the general responsibilities of mayors, (b) states that the mayor of a
municipality, “in providing general political guidance, may monitor and, to the extent provided
in this act, oversee the exercise of responsibilities assigned in terms of this act to the
accounting officer and the chief financial officer, but may not interfere in the exercise of those
responsibilities”.
• Section 56.2 says the mayor of the municipality may also “monitor the operational functions
of the entity, but may not interfere in the performance of those functions”.
• Section 69.1 (b) emphasises that revenue and expenditure must be properly monitored.
• Section 89 (b) states that the parent municipality of a municipal entity must “monitor and
ensure that the municipal entity reports to the (municipal) council on all expenditure incurred
by that municipal entity on directors and staff remuneration matters, and in a manner that
discloses such expenditure per type of expenditure, namely:
o (i) Salaries and wages
o (ii) Contributions for pensions and medical aid
o (iii) Travel, motor car, accommodation, subsistence and other allowances;
o (iv) Housing benefits and allowances;
o (v) Overtime payments;
o (vi) Loans and advances; and
o (vii) Any other type of benefit or allowance related to directors and staff.
• Section 100 (b) under budget implementation, referring to the responsibilities of the
accounting officer of a municipal entity, says: “The accounting officer of a municipal entity is
responsible for implementing the entity’s budget, including taking effective and appropriate
steps to ensure that revenue and expenditure are properly monitored.”
• Section 116.2 (b), under the responsibility of the accounting officer of a municipality or
municipal entity, the accounting officer must “monitor on a monthly basis the performance of
the contractor under the contract or agreement”.
• Section 120.4 (d), setting out conditions for public-private partnerships, says the municipality
is allowed to form such a partnership if it “explains the capacity of the municipality to
effectively monitor, manage and enforce the agreement”.
• Section 128, the section on compliance to be monitored, says: “The accounting officer of a
parent municipality must:
o (a) “Monitor whether the accounting officer of any municipal entity under the sole or
shared control of the municipality has complied with sections 121 (1) and 126 (2);
o (b) “Establish the reasons for any non-compliance; and
o (c) “Promptly report any non-compliance, together with the reasons for such non-
compliance, to the council of the parent municipality, the relevant provincial treasury and
the auditor-general.”
• Section 112.1 (h), under the supply chain management policy, which must comply with the
prescribed framework, points out that there must be “procedures and mechanisms for the
evaluation of bids to ensure best value for money”.
• Section 158 (c), under the role of the Municipal Financial Recovery Service, states that it may
“on request by the MEC for finance in the province, monitor the implementation of any
financial recovery plans that it has prepared, and may recommend such amendments and
revisions as are appropriate”.
• Section 166.2 (viii), as part of the regulations on audit committees, notes that one of the roles
of the independent advisory body is performance evaluation.


Read the full Municipal Finance Management Act here:

• National Treasury. (2003). Local Government: Municipal Finance Management Act No. 56
of 2003, http://mfma.treasury.gov.za/MFMA/Legislation/Local%20Government%20-
%20Municipal%20Finance%20Management%20Act/Municipal%20Finance%20Manage
ment%20Act%20(No.%2056%20of%202003).pdf (accessed 18 June 2020).

Now complete these tasks:

The Municipal Finance Management Act

1. Explain the importance of the Municipal Finance Management Act in the context of monitoring.
2. How might the act be used to evaluate the use of municipal funds?

7.3.6 Treasury Regulations

The regulations that took effect in 2000 and were amended in 2005 mark a significant shift from the
previous approach (Treasury instructions), in that they allow for more flexibility and place
responsibility for decisions in the hands of the accounting officer.

Read the regulations here:

• National Treasury. (2005). Treasury regulations for departments, trading entities,
constitutional institutions and public entities.
http://www.treasury.gov.za/legislation/pfma/regulations/gazette_27388%20showing%20
amendments.pdf (accessed 18 June 2020).

Table 9 outlines significant features of the regulations. However, students are reminded that they
should read the regulations in full to appreciate the extent of this document.



TABLE 9: SAMPLE TREASURY REGULATIONS

Part 1 General definitions, application, and date of commencement are stated here.
Part 2 Here “management” is defined, including:
• Corporate management (including chief financial officer);
• Internal control (ie audit committees that act consistently with the Institute of Internal Auditors);
• Risk management strategy (including a fraud prevention plan); and
• Financial misconduct (including investigation, criminal proceedings, and reporting).
Part 3 Planning (at various levels) and budgeting are prescribed here, including:
• Strategic planning (annual preparation, submission and content to facilitate departmental
votes); and
• Budgeting and related matters (ie format, virement, rollovers, transfer of functions, additional
funds and adjustment budgets).
Part 4 The two key responsibilities of revenue and expenditure are dealt with here (including unauthorised,
irregular, fruitless and wasteful expenditure).
Revenue management:
• Application (identification, collection, recording and safeguarding of all revenue for which the
institution is responsible);
• Responsibility for revenue management; and
• Services rendered by the state.
Expenditure management:
• Accounting officer’s responsibilities;
• Approval;
• Compensation of employees (personnel costs);
• Transfer payments and subsidies (excluding division of revenue grants and other allocations to
municipalities);
• Division of revenue grants and other allocations to municipalities;
• The charging of expenditure against a particular vote (or main division of a vote); and
• Recovery, disallowance, and adjustment of payments.
Unauthorised, irregular, fruitless and wasteful expenditure:
• Prevention and detection;
• Reporting;
• Disciplinary steps; and
• Recovery of losses (or damages).



Part 5 Asset and liability management is dealt with here as outlined below:
Asset management:
• Responsibilities (ie control systems that eliminate theft, losses, wastage and misuse, and those
that maintain optimal and economical stock levels);
• Disposal and letting of assets; and
• Management of debtors.
Liability management:
• Management of losses and claims;
• Loans, guarantees, leases and other commitments; and
• Money and property held in trust.
Part 6 Several frameworks together with their respective regulations are contained in this part, including:
• Banking, cash management and investment;
• Public-private partnerships (PPPs); and
• Supply chain management (SCM).
Part 7 Basic accounting records must be used together with all of the above. Key areas include:
• Clearing and suspense accounts;
• Maintenance of financial records (retention periods from five to 15 years, availability of or
access to, and disposal of); and
• Monthly and annual reports (including annual financial statements and any additional annual
reporting requirements for departments controlling trading entities and public entities).
Part 8 This part refers to miscellaneous items, including:
• Trading entities;
• Commissions and committees of inquiry;
• Gifts, donations and sponsorships;
• Payments and remissions as an act of grace; and
• Government payroll deductions.
Part 9 Public entities as listed in Schedules 2 and 3 are governed by these regulations. Regulations pertain to:
• Listing;
• Responsibilities of designated accounting officers;
• Internal control and corporate management (including audit committees and chief financial
officers);
• Annual financial statements and annual reports;
• Corporate planning, shareholder’s compacts and annual budgets;
• Strategic planning;
• Cash, banking and investment management;
• Borrowing and leases; and
• Financial misconduct.
(Adapted from Treasury Regulations, 2005)

From time to time regulations are repealed (as with all other legislation). The onus is on you to take
note of these and keep abreast of current legislation.



Internalising the Treasury Regulations

1. To whom do the Treasury regulations apply?


2. Draw a mind map to capture the roles and responsibilities of the audit committee.
3. Broadly outline the content of a strategic plan as prescribed by the regulations, and explain how this affects your
department, division or community.
4. Identify and explain how and where “measurable objectives”, “expected outcomes”, “programme outputs”,
“indicators (measures)” and “targets” are applied to the strategic plans of your department, division or
community.
5. Explain with reference to the “service delivery improvement programme” in Part 3 of the regulations:
• The need for this provision;
• Why monitoring and evaluation is imperative with regards to the delivery of services; and
• The implications of inefficiency in terms of service delivery on a local, provincial and national level.
6. What is “unforeseeable and unavoidable expenditure” according to the regulations and why?
• Analyse the policy or protocols in place in your department or division to prevent unforeseeable and
unavoidable expenditure and discuss its (or their) strengths and weaknesses; and
• Bearing the above in mind, unforeseeable and unavoidable expenditure is a major concern in state
organisations. In your opinion, why is this so, despite the measures in place?
7. Explain the concept and purpose for segregation of duties, using the example from the regulations that state
“activities relating to the authorisation of appointments, the authorisation of payments and the recording of those
payments may not be performed by the same person”.
8. Discuss the importance of being “market-related” with regard to the letting or disposal of assets, and reflect on
the opportunities (cases) for potential abuse.
9. Propose acceptable measures to ensure the accuracy and readiness of accessible financial records, including
information that must be retained even if the national archivist has authorised its disposal.
10. Reflect on why commissions (or committees) of inquiry may be necessary from the public finance perspective to
ensure effective monitoring and evaluation.

7.3.7 Government’s Revenue and Expenditure Strategy

The strategic plan, the National Treasury says, should take into account all the relevant policies,
legislation and other mandates for which the department is responsible and should reflect the
strategic outcome-oriented goals and objectives that the department will strive to achieve over the
stated period.

Refer to the document below:

• National Treasury. (2013). National Treasury Strategic Plan 2013/2017.


http://www.treasury.gov.za/publications/strategic%20plan/Strat%20Plan%202013-
2017.pdf (accessed 18 June 2020).



Revenue and expenditure strategies are based on political, economic, environmental, sociological,
and technological factors, which together position South Africa to address the challenges of the
past, present and future in both a domestic and a global context. These strategies stem from broad
strategic frameworks, including the National Development Plan (NDP) and the 2015 Millennium
Development Goals (United Nations, 2013).

Additionally, the minister of finance (as the political principal of the department) is guided by active
collaboration with Parliament (including the Standing Committee on Finance, the Select Committee
on Finance, and the Standing Committee on Public Accounts).

Read these articles to gain more insight into the government’s strategic frameworks:

• Nkwinti, G. (nd). National Development Plan and the New Growth Path: Transforming the
Economy. http://kzntopbusiness.co.za/site/search/downloadencode/nLaqaaKelpO8mnjc
(accessed 18 June 2020).
• Department of Economic Development. (2011). The New Growth Path: Framework.
http://www.economic.gov.za/communications/publications/new-growth-path-
series/download (accessed 18 June 2020).

Strategic Plan

1. Discuss what is meant by the following terms in the National Treasury Strategic Plan 2013/2017 and identify how
these concepts affect the monitoring and evaluation process, their impact on the provision of services and why it
is important to consider these concepts both locally and nationally in strategic planning:

• “High levels of public and private debt” (pg. 5)


• “Structural economic constraints” (pg. 5)
• “Fiscal sustainability” (pg. 5)
• “Inclusive economic growth” (pg. 5)
• “Commercial incentives for private enterprises … with the aim of increasing employment and promoting
inclusive economic growth” (pg. 5)
• “Countercyclicality” (pg. 5) and
• “Special economic zones” (pg. 6).

2. Revenue and expenditure strategies are based on various factors, including broad strategic frameworks. Why is it
important to consider these factors and broad strategic frameworks when devising revenue and expenditure
strategies?



7.3.8 Division of Revenue Act (DoRA)

The constitution requires that a Division of Revenue Act (DoRA) be passed every year to determine
the equitable division of nationally raised revenue between national government, the nine provinces
and municipalities.

You are required to read the act:

• National Treasury. (2014). Division of Revenue Act, 2014.
http://www.treasury.gov.za/legislation/acts/2014/Division%20of%20Revenue%20Act,%20
2014%20(Act%20No.%2010%20of%202014).pdf (accessed 18 June 2020).

The act follows a highly consultative process. The following organisations and institutions are
consulted:

• The Financial and Fiscal Commission (as discussed in Section 1 of this course);
• The South African Local Government Association (Salga); and
• The national and provincial departments.

Schedule 1 sets out the equitable division of revenue raised nationally among the three spheres of
government (National Treasury, 2014).



Schedule 2 sets out the determination of each province’s equitable share of the provincial sphere’s
share of revenue raised nationally (as a charge against the National Revenue Fund) (National
Treasury, 2014).

Schedule 3 (too extensive to replicate here) determines each municipality’s equitable share of the
local government sphere’s share of revenue raised nationally.

Schedule 4 sets out the allocations to provinces to supplement the funding of programmes or
functions funded from provincial budgets (by vote), with Schedule 5A showing specific-purpose
allocations to provinces (by vote) and 5B to municipalities (by vote).

Schedule 6A shows the allocations-in-kind to provinces for designated special programmes and 6B
likewise to municipalities. Schedule 7A gives the unallocated provisions for provinces for disaster
response.

Accompanying memorandum

The Intergovernmental Fiscal Relations Act (1997) requires that the Division of Revenue Bill be
accompanied by a memorandum explaining:

• How the bill takes account of the respective sections of the constitution;
• The extent to which the Financial and Fiscal Commission’s recommendations have been
taken into account; and
• Any assumptions (or formulae) used to allocate the funds between the three spheres of
government.



Division of Revenue Act

Compare the Division of Revenue Act from last year to the current Division of Revenue Act or Bill, whichever is most
recent.

• What significant differences are evident?


• Discuss possible factors leading to these differences.
• Why is it important to consider these factors and broad strategic frameworks when allocating resources?

7.3.9 Monitoring and Evaluation and Policy Management

Policy mandates the processes and procedures implemented in an organisation. Figure 4 shows how
monitoring and evaluation fit into the policy life cycle.

FIGURE 4: POLICY LIFE CYCLE

The figure depicts the policy life cycle as a continuous loop: Problem → Policy objectives → Policy
options → Feasibility of options → Policy decisions → Planning → Implementation → Monitoring →
Evaluation → Review → back to Problem.

(PSC, 2008:9)



The Public Service Commission explains this process as follows:

Since there are not many completely new problems that the state has never addressed before,
the cycle probably starts with the review of existing policy. The stages of problem identification,
determining policy objectives, examining policy options, and taking a policy decision are a
complex process filtered through many layers of stakeholders. These stakeholders include
political parties, civil society, legislative and executive arms of government, and government
departments. Policy is further argued and explained in various documents, like discussion and
policy documents.

The process is invariably not as sequential or rational as depicted. Identification of options and
rational evaluation of the feasibility, or the costs and benefits, of options, in any precise sense,
assume perfect knowledge of what will work, which is frequently not the case. Policy options
emerge through political debate, and the best policies through taking a considered decision and
making adjustments when the effect of a policy is seen in practice.

As soon as a policy decision has been taken, government departments initiate the processes of
designing a programme that can achieve the policy objectives, detailed planning of the
programme, and implementation. To ensure that implementation proceeds as planned and
that the envisaged objectives are achieved, the programme is monitored and evaluated.
Depending on the results achieved by the programme, the initial policy decision, or aspects of the
design, implementation and resource allocation to the programme may be reviewed.

The evaluation of the success of policy and the reasons for success or failure are critical parts of
the process. This evaluation is not necessarily a formal, technical evaluation but one that is
intricately part of administrative and political processes, where the judgements and power of key
decision-makers play the primary role. Monitoring and evaluation mediates this by producing valid
evidence for policy decisions, ensuring greater objectivity.

Since public policy is a set of statements that “determine what actions government will take, what
effects those actions will have on social conditions, and how those actions can be altered if they
produce undesirable outcomes”, policy evaluation is also an inherent part of monitoring and
evaluation.

7.3.10 Conclusion

The legislation studied above is intended to guide the implementation of monitoring and evaluation
in the public service. While the laws do contain helpful values and directives from which monitoring
and evaluation practitioners can draw, the PSC and DPME have been active in producing many
documents aiming to explain the impact of the legislation in practical terms. This leads to the
question: is the legal framework for monitoring and evaluation sufficient? You may continue to think
about this question as we explore the PSC and DPME frameworks and guidance notes in the section
that follows.



Recap Your Knowledge

1. Critically evaluate the legislation studied above by considering whether it contains adequate guidelines for the
implementation of a monitoring and evaluation system for the public service. Substantiate your response.
2. In your groups, compile a list of different policies involved in the monitoring and evaluation process in your
department. Critically evaluate why each policy is crucial to the success of the process.

7.3.11 Key Points

Some key points made in this section were:

• The Constitution of the Republic of South Africa, 1996, enumerates various constitutional
values, some of which inform the practice of monitoring and evaluation. In other words, M&E
is not simply an administrative requirement, but is drawn from our fundamental national
values;
• Various acts of Parliament flesh out the constitutional requirements of monitoring and
evaluation, namely the:
o Public Service Act
o Public Finance Management Act
o Municipal Finance Management Act
• The Treasury Regulations, 2005, also add detail to M&E practices;
• Departments’ strategic plans and the government’s revenue and expenditure strategy also
inform M&E;
• The annual Division of Revenue Act determines how nationally raised revenue is shared
between the three spheres of government. Application of the act also relates to M&E; and
• Monitoring and evaluation also contributes to (or feeds into) the policy life cycle.

Remember to do your digital assessment for this section online!

It will help you strengthen and embed your understanding of the course. You will not be able to
change your answers once you have submitted them, so make sure you have completed the
relevant section of coursework first. Where you see Select all that are relevant, be aware that
any number of the options presented could be correct. You will lose marks for incorrect
selections, so choose carefully. Your combined marks from these assessments count towards a
total of 20% of your course mark.



7.4 THE MONITORING AND EVALUATION PROCESS

Timeframe: Minimum of 25 hours

Learning outcomes:

• Able to explain the monitoring and evaluation process; and
• Able to explain the link between monitoring and evaluation and other business functions.

Recommended reading:

• Presidency. (2010). Guide to the Outcomes Approach.
http://www.dpme.gov.za/publications/Guides%20Manuals%20and%20Templates/Guideline
%20to%20outcome%20approach.pdf (accessed 18 June 2020).
• Presidency. (2012). National Evaluation Plan 2013-14 to 2015-16.
https://www.dpme.gov.za/publications/Policy%20Framework/National%20Evaluation%20Pl
an%202013%20-%2014.pdf (accessed 18 June 2020).
• Public Service Commission. (2012). Evolution of monitoring and evaluation in the South
African public service. http://www.psc.gov.za/newsletters/docs/2012/K-
9555%20PSC_6th%20edition%20magazine_DevV11.pdf (accessed 18 June 2020).

Section overview: In the previous section we placed monitoring and evaluation within their legal
framework. In this section we will examine the organisational values that a monitoring and
evaluation system must enhance. We will evaluate the current values of the public service and ask
whether the theories used to develop a monitoring and evaluation system answer the country’s needs.

7.4.1 Placement in the Government’s Planning Cycle

Previously, we located monitoring and evaluation in the policy cycle. We will now have a look at how
monitoring and evaluation fits into the planning and implementation processes of government
departments. This is illustrated in Figure 5 and discussed thereafter.



FIGURE 5: MONITORING AND EVALUATION IN THE PLANNING CYCLE

The figure depicts a four-stage cycle: (1) preparation of performance plans (the five-year strategic
plan, medium-term budget, annual performance plan, and performance plans for units and
individuals); (2) monitoring (monthly and quarterly reports feeding a performance information
database); (3) evaluation (specially commissioned evaluations and the third-quarter report); and
(4) review (the annual review, which feeds the next cycle).

(Adapted from National Treasury, 2007:4)

Each department is responsible for devising a five-year strategic plan. The strategic plan must be
aligned with the government’s strategic direction, which is published in the Medium-Term Strategic
Framework and the Government Programme of Action.

The process starts with a general election, after which government produces new programmes. The
process is the same at provincial level, where provincial strategic plans must align with provincial
government programmes of action. At departmental level, plans must align with provincial growth
and development strategies and with local integrated development plans.

From the strategic plan, each department prepares a budget (estimates of expenditure/medium-
term expenditure framework) and submits this to the National Treasury. The budget is then
approved by Parliament or the provincial legislature.



Once approved, the budget becomes law (the Appropriation Act of that year) and the department is
thereby authorised to spend public money for the purposes indicated.

From the strategic plan and budget, departments must then prepare an annual performance plan.
The plans must contain objectives, outputs, indicators and targets. The annual performance plan is
then broken down into plans for each component of the organisation (for example: the human
resources plan, the risk management plan, the programme management plan, etc). These plans are
implemented and monitoring starts immediately. Monitoring measures are set against the
objectives, outputs, indicators and targets in the plan. The progress is reported in monthly and
quarterly reports.

Managers then conduct quarterly monitoring reviews, evaluating and analysing the success and
failure of programmes. Action plans are developed for performance improvement. Quarterly
monitoring can be supplemented by evaluations commissioned from internal or external experts.
These reports form part of the annual review of performance, which then feeds the new planning
cycle for the next financial year.
(Adapted from National Treasury, 2007:4)
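To make the plan-monitor-evaluate-review loop above concrete, the sketch below compares the actuals from quarterly reports against the targets in an annual performance plan, the way an annual review might. This is an illustrative sketch only: every class, field and function name here is hypothetical, invented for this example, and does not come from any National Treasury system.

```python
from dataclasses import dataclass, field

# Hypothetical data model for the planning cycle (names are our own).

@dataclass
class AnnualPerformancePlan:
    """Targets set at the start of the cycle, keyed by indicator."""
    targets: dict[str, float]

@dataclass
class QuarterlyReport:
    """Actual values reported against the same indicators."""
    quarter: int
    actuals: dict[str, float]

@dataclass
class AnnualReview:
    """End-of-cycle comparison of targets to reported actuals."""
    met: list[str] = field(default_factory=list)
    missed: list[str] = field(default_factory=list)

def review_cycle(plan: AnnualPerformancePlan,
                 reports: list[QuarterlyReport]) -> AnnualReview:
    # Monitoring: keep the most recent reported actual per indicator.
    latest: dict[str, float] = {}
    for report in sorted(reports, key=lambda r: r.quarter):
        latest.update(report.actuals)
    # Review: compare the latest actuals against the plan's targets.
    review = AnnualReview()
    for indicator, target in plan.targets.items():
        if latest.get(indicator, 0.0) >= target:
            review.met.append(indicator)
        else:
            review.missed.append(indicator)
    return review

plan = AnnualPerformancePlan(targets={"schools_fed": 100, "audits_done": 12})
reports = [QuarterlyReport(1, {"schools_fed": 40, "audits_done": 3}),
           QuarterlyReport(4, {"schools_fed": 105, "audits_done": 10})]
print(review_cycle(plan, reports))  # met: schools_fed; missed: audits_done
```

In practice the review output would feed the next year's strategic planning, as the cycle description above notes.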

7.4.2 Intergovernmental Relations and the Local Government Fiscal Framework

In addition to the constitution, four acts govern (or organise) the system of intergovernmental
relations and the local government fiscal framework. The integration (coherency) of these is
summarised in Table 10.

TABLE 10: LEGISLATION THAT ORGANISES INTERGOVERNMENTAL RELATIONS

Intergovernmental Passed to promote co-operation between the three spheres of government on fiscal,
Fiscal Relations Act budgetary and financial matters (it establishes the Budget Forum); and to prescribe a
(1997) process for the determination of an equitable sharing and allocation of revenue raised
nationally (requires that a Division of Revenue Bill is tabled annually).
Municipal Structures Act Provides for the establishment of different types of municipalities, including the division
(1998) (including of powers and functions between local and district municipalities; regulates the internal
amendments) systems, structures and office bearers of municipalities.
Municipal Systems Act Sets out detailed requirements for community participation, integrated development
(2000) and Municipal planning, performance management, administration, service provision, debt collection,
Systems Amendment and the establishment of municipal entities; regulates the publication of by-laws and
Act (2003) determines the role of national and provincial government in setting standards and
monitoring local government.
Intergovernmental Established to provide a framework for the three spheres of government to promote and
Relations Framework facilitate intergovernmental relations; to provide for mechanisms and procedures to
Act (2005) facilitate the settlement of intergovernmental disputes.
(National Treasury, 2011 and related acts)



The legislation formalises the different roles and responsibilities of the three spheres of government
with regard to various functions and provides for a range of consultative structures. Municipalities
are generally represented on the national intergovernmental structures by the South African Local
Government Association (Salga). At the provincial level, municipalities are either represented directly
or through the provincial local government associations (eg budget council and budget forum,
technical intergovernmental forums, and the Financial and Fiscal Commission, to name a few).

The constitution envisages the decentralisation of the administration of many functions – currently
the responsibility of national and provincial government – to municipalities. To enable this, the local
government fiscal framework must provide municipalities with access to revenue sources that are
commensurate with the powers and functions for which they are responsible. As stated by National
Treasury (2011:27), “It is important to keep in mind that the whole local government fiscal framework
is designed to fund local government, and not just the transfers from national government.”

The basis for this section is highlighted in the following excerpt:

“It is important to understand the relationship between the allocation of functions and the fiscal
framework, the fiscal effort the municipality makes to collect revenues, the appropriate allocation of
those revenues to services, the responsible management of service delivery processes and the
effective delivery of services.”
(National Treasury, 2011:27-28)

The framework and systems continue to evolve as better modes of co-operation and co-ordination
emerge and as functions are shifted between spheres. While changes take place, National Treasury
(2011:29-30) reinforces key elements and principles that underpin the intergovernmental system,
namely:

• Accountability (while each sphere has specific constitutionally defined powers and
responsibilities, intervention from, for example, provincial governments in local government,
occurs when relevant parties fail to carry out their constitutionally defined responsibilities);
• Transparency and good governance (transparent reporting arrangements within and
between spheres; political executives are responsible for policy and outcomes and
accounting officers are responsible for implementation and outputs);
• Mutual support (continually strengthening the capacity of municipalities);
• Redistribution (achieved through the division of revenue and the latest equitable share
formulae);
• Vertical division (driven by priorities, budget process, and trade-offs, where appropriate);
• Revenue-sharing (funded from its own revenues, equitable share allocations, and
conditional and unconditional grants);
• Broadened access to services (innovative but efficient modes of delivery, leveraging public
and private resources to fund infrastructure); and
• Responsibility over budgets (self-determination and responsibility to comply with these;
national government will not bail out provinces or municipalities that mismanage their funds,
nor will it provide guarantees for loans).



7.4.3 Monitoring and Evaluation System

The Department of Performance Monitoring and Evaluation (2012:6) offers a model for implementing
a monitoring and evaluation framework. This model is shown in Figure 6.

FIGURE 6: THE MONITORING AND EVALUATION PROCESS

(DPME, 2012)

The steps in the model above are summarised in Table 11.



TABLE 11: STEPS IN THE MONITORING AND EVALUATION PROCESS

Step 1: Situation analysis
• List the policy objectives and main sources.
• List other joint implementation institutions and partners, the sphere of government, and nature
of co-operative leadership.
Step 2: Describe administrative information systems and data sets
• These are data records, IT, financial, and other day-to-day systems that are sources of
information.
• You need to list and describe them in terms of their purpose, their location, the frequency of
report extraction, the users of the reports, the nature of the system (manual or electronic), the
nature of the interface, maintenance, etc.
• Indicate any planned systems.
• List and describe data sets in current use.
Step 3: List indicators, targets, and baselines
• Indicators must relate to policy outcomes; cross-cutting issues; targets prescribed; sector and
premiers' office requirements; and other monitoring and evaluation-related research and
indexes, including international comparisons and requirements.
• Each indicator must relate to the logic model.
Step 4: Group indicators by policy objective
Clarify which indicators give information on the attainment of each policy objective.
Step 5: (If policy objectives have no indicators)
Identify and design new indicators, baselines and targets, and repeat steps 3 and 4.
Step 6: Review links between inputs, outputs, outcomes and impact, and identify causal
relationships and links (results chain)
• There should be causal relationships between different elements of the results chain: if the
appropriate mix of inputs is combined, this will result in service delivery outputs; if the
appropriate service delivery outputs are achieved, this should contribute towards achieving
policy outcomes/impacts. This is called the logic model.
• Indicators must be measured against six criteria:
o Reliable
o Well-defined
o Verifiable
o Cost-effective
o Appropriate
o Relevant.
• A scale of 1-4 is used:
o 1 indicates a non-existent/unacceptable indicator
o 2 is unsatisfactory or incomplete
o 3 is acceptable or satisfactory
o 4 is good.
Step 7: Monitoring and reporting
• Departments will have indicator-based reporting requirements (for example, those articulated
in the strategic plan and annual performance plan), as well as non-indicator-based reporting
requirements (eg reports to regulatory bodies, MDGs, development indicators, etc).
• For indicator-based reporting, describe to whom each indicator will be reported, frequency and
date of reporting, and compiler of report.
• Describe other indicator-based reporting.
Step 8: Undertake evaluation
• In line with the National Evaluation Policy Framework, institutions must develop an approach
to evaluate their programmes. To ensure that evaluations are objective and credible, they must
be carefully planned.
• List all evaluations done by your department and those commissioned over the past three
years. It is important to state the potential use of the studies, and develop mechanisms to
ensure the credibility and quality of evaluations.
Step 9: Capacity-building
This must:
• Target users of monitoring and evaluation data, monitoring and evaluation managers, and
monitoring and evaluation practitioners.
• Embrace technical skills in respect of information analysis, integration of monitoring and
evaluation functions and systems, and management and maintenance of the monitoring and
evaluation system.
• Involve recruitment of specialist skills; training of existing staff; mentoring, coaching and skills
transfer; and learning through knowledge forums and networks.
• Outline the role of public participation and involvement in monitoring and evaluation (by
communities, NGOs and civil society).
Step 10: Communications
• Public institutions should analyse how monitoring and evaluation findings can be packaged to
reach their stakeholders. They need to consider what steps are needed to build demand for
monitoring and evaluation in the organisation and in the sector in which it operates.
Communication channels such as websites and other media should be used to report major
evaluation activities and findings.
• Write a one-page summary of the key messages emanating from the monitoring and
evaluation process, and create a three-page executive summary and a 25-page summary
report.
• Outline the key communications activities, including how to use monitoring and evaluation
information and channels.
(DPME, 2012)

As Table 11 confirms, implementing a monitoring and evaluation process involves many complex
issues. In the final section of this course we will look more closely at each of the steps in the
monitoring and evaluation process, to ensure a complete understanding of the practices that must
be followed when setting up monitoring and evaluation in your organisation or department.

© Regenesys School of Public Management 59


7.4.4 Evaluation Perspectives

To develop and implement a monitoring and evaluation system, you need to understand the
perspectives against which the system will be measured. The previous section explained the
planning process and the steps involved in implementing the monitoring and evaluation system.
This section looks at the broader perspectives of monitoring and evaluation.

Programme performance perspective

A government programme is a set of activities that deliver the products of government (PSC,
2008:39). For example:

The Department of Basic Education is currently in the implementation phase of the National
School Nutrition Programme. The programme aims to:

• Contribute to enhanced learning capacity through school feeding programmes;
• Promote and support food production and improve food security in school communities;
and
• Strengthen nutrition education in schools and communities.

From the example above, it is clear that the programme has complex outcomes and includes
governance, safety and security, social change and services. Evaluating this programme will require
examining whether the objectives of the programme have been achieved and whether they could
have been achieved in a different manner using different strategies and activities. Key factors
relevant to the delivery of the programme and how they relate to each other need to be analysed, as
does its impact. An impact evaluation:

“… is the systematic identification of the effects – positive or negative, intended or not – on
individual households, institutions, and the environment caused by a given development activity
such as a programme or project. Impact evaluation helps us better understand the extent to
which activities reach the poor and the magnitude of their effects on people’s welfare.

“Impact evaluations can range from large-scale sample surveys in which project populations and
control groups are compared before and after, and possibly at several points during programme
intervention; to small-scale rapid assessment and participatory appraisals where estimates of
impact are obtained from combining group interviews, key informants, case studies and available
secondary data.”
(World Bank, 2004)

The impact of a programme like the National School Nutrition Programme could, for example, be
assessed through learners’ performance at the end of a school semester. The Public Service
Commission (2008:40-41) lists the following as key elements of programme evaluation:



1. Success of the programme.
2. Needs of citizens.
3. The societal problem the programme is supposed to address (for example, poverty, crime,
environmental degradation).
4. The environment or context in which the programme will be implemented (for example, the
political, social, economic environment).
5. Programme design:
5.1 Objectives of the programme.
5.2 The target population the programme is intended to benefit.
5.3 The course of action government intends to take to address the identified needs or
societal problems. Alternative courses of action can also be viewed as alternative
strategies, means or instruments to achieve desired ends. Instruments that may be
used include a service, a financial grant, regulation of an activity, funding or
subsidisation of an activity, or the provision of infrastructure. Some theory explaining
why it is expected that the chosen instrument will work, and under what conditions,
may be used to justify the choice of instrument. The conditions determining success
can also be called critical success factors. For instance, the success of regulation may
depend on capacity to enforce the regulation. The comparative cost and benefits of
alternative courses of action may also be considered.
5.4 The risks associated with the course of action.
5.5 Legal enablement of the course of action (should a law be enacted or changed?).
5.6 Control over or governance of bodies empowered to take a course of action,
especially if it affects the rights of citizens.
5.7 The scale of the programme. Scale includes the proportion of the population that will
benefit from the programme (its reach) and the level of service. The scale will depend
on the level of funding that the programme can attract.
5.8 The institutional arrangements for delivery of the programme. This may include
government departments, public entities, private institutions and institutions in the
national, provincial, or local sphere of government.
5.9 Procedures for implementing the chosen course of action.
5.10 The human resource capacity available to implement the chosen course of action.
5.11 Effective leadership and management of the programme.
5.12 Government policy with regard to all of the above elements.
6. Implementation of the programme.
A programme evaluation will assess:
• The success of the programme in relation to the needs of citizens and the societal
problem the programme is supposed to address
• Contextual factors that may have influenced the success of the programme
• How the design of the programme determined its success
• How the implementation of the programme determined its success

Programmes are complex and not all elements are pre-designed or implemented as planned. The
form that many of the elements take may emerge as the programme is implemented and
adjustments are made based on experience. Monitoring and evaluation provides the evidence for
decisions on what adjustments to make.



Evaluate Programme Performance

Critically evaluate the success of a programme currently in the implementation phase in your department using the
list provided above as guidance.

Financial perspective

Monitoring and evaluating from a financial perspective will ask:

• Were public funds spent appropriately?
• Was the income due to government collected?
• Were assets protected?
• Has the department paid its creditors?
• Was sound financial control practised and adhered to?

Financial statements provide evaluators with the answers to these questions. Financial statements
are presented monthly and quarterly in the form of reports, which are then measured against budget.
These statements are prepared according to the prescriptions of the Public Finance Management
Act (discussed in section 2). The auditor-general audits the financial statements of the department
annually. So, as with other perspectives for monitoring and evaluation, the financial perspective
answers pre-set questions and then digs deeper as more and more questions are asked.

Governance perspective

Good governance in departments means compliance with the values listed in Section 195 of the
Constitution. Good governance is:

“… a system of values, policies and institutions by which a society manages its economic, political
and social affairs through interaction within and among the state, civil society and private sector”
(PSC, 2008:21).

“Monitoring and evaluation is responsible for establishing a high standard of service delivery,
monitoring and good governance in the public service” (PSC, 2013:3). Good governance is
mandated by the eight Batho Pele (“People First”) principles, which guide service delivery in
government organisations. As a performance initiative, the Batho Pele principles are intricately linked
to monitoring and evaluation.



Batho Pele Principles

The eight Batho Pele principles were developed to serve as an acceptable policy and legal
framework for public service delivery. These principles are aligned with the Constitutional ideals of:

• Promoting and maintaining high standards of professional ethics;
• Providing services impartially, fairly, equitably and without bias;
• Using resources efficiently and effectively;
• Responding to people's needs, and encouraging citizens to participate in policy-making;
and
• Rendering an accountable, transparent and development-orientated public
administration.

The Batho Pele principles are:

1. Consultation

There are many ways to consult users of services, including customer surveys, interviews
with individual users, consultation with groups, and meetings with consumer representative
bodies, NGOs and CBOs. Often, more than one method of consultation will be necessary to
ensure comprehensiveness and representativeness.
Consultation is a powerful tool that enriches and shapes government policies such as the
integrated development plans (IDPs) and their implementation in local government.

2. Setting service standards

This principle reinforces the need for benchmarks to measure constantly the extent to which
citizens are satisfied with the service or products they receive from departments. It also
plays a critical role in the development of service delivery improvement plans to ensure a
better life for all South Africans. Citizens should be involved in the development of service
standards.

Standards must be precise and measurable, so that users can judge for themselves whether
they are receiving what was promised. Some standards will cover
processes, such as the length of time taken to authorise a housing claim, to issue a passport
or identity document, or even to respond to letters. To achieve the goal of making South
Africa globally competitive, standards should be benchmarked (where applicable) against
those used internationally, taking into account South Africa's current level of development.

3. Increasing access

One of the prime aims of Batho Pele is to provide a framework for making decisions about
delivering public services to the many South Africans who do not have access to them.
Batho Pele also aims to rectify the inequalities in the distribution of existing services.
Examples of initiatives by government to improve access to services include such platforms
as the Gateway, Multipurpose Community Centres and Call Centres. Access to information
and services empowers citizens and creates value for money, and quality services. It
reduces unnecessary expenditure for citizens.



4. Ensuring courtesy

This goes beyond a polite smile, “please” and “thank you”. It requires service providers to
empathise with citizens and to treat them with the same consideration and respect that they
would like for themselves.

The public service is committed to continuous, honest and transparent communication with
citizens. This involves communication of services, products, information and problems,
which may hamper or delay the efficient delivery of services to promised standards. If
applied properly, this principle will help dispel the negative perceptions that citizens in
general have about the attitude of public servants.

5. Providing information

Information about services should be available at the point of delivery, but for users who are
far from the point of delivery, other arrangements will be needed ... managers and
employees should regularly make information about the organisation and all other service
delivery-related matters available to fellow staff members.

6. Openness and transparency

A key aspect of openness and transparency is that the public should know more about the
way national, provincial and local government institutions operate, how well they utilise the
resources they consume, and who is in charge. It is anticipated that the public will take
advantage of this principle to make suggestions for improving service delivery
mechanisms, and even to hold government employees accountable and responsible by
raising queries with them.

7. Redress

This principle emphasises a need to identify quickly and accurately when services are falling
below the promised standard, and to have procedures in place to remedy the situation. This
should be done at the individual transactional level with the public, as well as at the
organisational level, in relation to the entire service delivery programme. Public servants are
encouraged to welcome complaints as an opportunity to improve service, and to deal with
complaints so that weaknesses can be remedied quickly for the good of the citizen.

8. Value for money

Many improvements that the public would like to see often require no additional resources
and can sometimes even reduce costs. Failure to give a member of the public a simple,
satisfactory explanation to an enquiry may, for example, result in an incorrectly completed
application form, which will cost time to rectify.

(Independent Police Investigative Directorate, 2013)



Batho Pele

Explain, using the Batho Pele principles, how monitoring and evaluation are important in achieving good governance
in South Africa's public service.

Human resource management perspective

From a human resources perspective, monitoring and evaluation should examine whether human
resource management objectives have been achieved and whether good human resource
management practices are being applied in the public service. Human resource practices are
underpinned by the following constitutional principles:

• Good human resource management and career development practices must be cultivated to
maximise human potential; and
• Employment and personnel management practices must be based on ability, objectivity and
fairness.

According to the Public Service Commission (2008:22) human resource management objectives
include:

• The recruitment of skilled staff who can meet service delivery requirements;
• Achieving the status of a good employer; and
• The creation of a public service that meets professional standards, is proud to serve the
public, and is patriotic, selfless, non-racial and non-sexist.

Ethics perspective

The ethical perspective of evaluation will examine outcomes linked to change in conduct (for
example fewer incidents of corruption) and whether enough measures are in place to prevent
unwanted outcomes. These measures have been called an ethics infrastructure and include:

• Anti-corruption strategies and fraud prevention plans;
• Risk assessment;
• Activities to promote the code of conduct;
• Minimum anti-corruption capacity;
• Investigation procedures and protocols;
• Effective reporting lines (whistle blowing);
• Inter-agency co-operation;
• Management of conflicts of interest;
• Dealing with financial misconduct;
• Assignment of responsibility for the ethics function in the organisation;
• Pre-employment screening; and
• Ethics training.
(PSC, 2008:22)



The National Treasury guidelines

The National Treasury provides guidelines for departments concerning strategic goals. These
guidelines prescribe that departments set strategic goals that include service delivery management
and organisation areas, financial management, and training and learning.

These areas inform the annual reports submitted to the National Treasury, which should use the
following headings:

• General Information;
• Programme (or service delivery) Performance;
• Report of the Audit Committee;
• Annual Financial Statements; and
• Human Resource Management.

National Treasury further requires that the reports focus on performance and data gathered from
monitoring and evaluation systems and evaluations.

7.4.5 National Evaluation Plan 2013/2014

The National Evaluation Plan is a set of summarised evaluations approved by Cabinet as priorities.
It provides feedback on ongoing evaluations as well as the national evaluation system. The current
National Evaluation Policy Framework was approved on 23 November 2011. The purpose of the
plan is:

• Improving policy or programme performance – providing feedback to managers;
• Improving accountability for where public spending is going and the difference it is making;
• Improving decision-making on what is working or not working; and
• Increasing knowledge about what works and what does not with regard to a public policy,
plan, programme, or project.

The Cabinet developed a set of 12 outcomes through consultation and discussion at many levels.
These outcomes reflect the development impacts government seeks to achieve. The outcomes have
been written in terms of measurable outputs and key activities in order to achieve the outputs.



The key outcomes are (Presidency, 2010:13):

1. Improved quality of basic education
2. A long and healthy life for all South Africans
3. All people in South Africa are and feel safe
4. Decent employment through inclusive economic growth
5. A skilled and capable workforce to support an inclusive growth path
6. An efficient, competitive and responsive economic infrastructure network
7. Vibrant, equitable and sustainable rural communities with food security for all
8. Sustainable human settlements and improved quality of household life
9. A responsive, accountable, effective and efficient local government system
10. Environmental assets and natural resources that are well protected and continually enhanced
11. Create a better South Africa and contribute to a better and safer Africa and world
12. An efficient, effective and development-orientated public service and an empowered, fair and
inclusive citizenry

An extract from the National Evaluation Plan (2012), below, summarises the programmes
approved for evaluation for the fiscal year 2013/2014. Each programme is linked to one or more
of the 12 outcomes and to the department accountable for it.



Each entry below gives the department, the intervention, the title of the evaluation, and the key
motivation for the evaluation, including its scale (e.g. budget and beneficiaries).

Department: Department of Basic Education
Intervention: National Senior Certificate (matric)
Evaluation: Evaluation of the quality of the National Senior Certificate
Motivation: The quality of the senior certificate is a key aspect of outcome 1: “Improved quality of
basic education”, and of future education and economic outcomes, both for the children concerned
and for the country as a whole. The estimated budget for 2013/14 (provincial level only) is
approximately R2 billion. About 600 000 learners are affected by this intervention.

Department: Department of Trade and Industry
Intervention: Export Marketing Investment Assistance Incentive programme (EMIA)
Evaluation: Evaluation of Export Marketing Investment Assistance Incentive Programme
Motivation: This programme is linked to outcome 4: “Decent employment through inclusive growth”
in terms of increased export sales and job creation. The estimated budget of the programme for
2013/14 is R189 million. The number of people affected by the intervention from when the first
applications were captured in 2001/02 to 31 March 2012 is 8 169, with a total incentive value of
approximately R471 million.

Department: Department of Trade and Industry
Intervention: Support Programme for Industrial Innovation (SPII)
Evaluation: Evaluation of Support Programme for Industrial Innovation
Motivation: This programme is linked to outcome 4. SPII provides financial assistance to SMMEs
and large companies for them to develop commercially viable innovative products and processes.
Currently the budget is R52.7 million, and R339.4 million from 2007/8 to 2011/12. From 2007/8 to
date, SPII has benefited 269 companies and created 5 012 shop-floor jobs.

Department: Department of Trade and Industry
Intervention: Technology and Human Resources for Industry Programme (THRIP)
Evaluation: Impact evaluation of Technology and Human Resources for Industry Programme
Motivation: THRIP is linked to outcome 4 and to outcome 5 in terms of a “Skilled and capable
workforce”. It supports more labour-absorbing growth in support of an inclusive growth path. It has
a budget of R157 million for 2012/13, rising in the medium-term strategic plan to R184 million for
2015/16. The THRIP beneficiaries are 205 SMMEs, with 322 projects and 780 students.

Department: Department of Military Veterans
Intervention: Military Veterans Economic Empowerment and Skills Transferability and Recognition
Programme
Evaluation: Diagnostic evaluation of Military Veterans Economic Empowerment and Skills
Transferability and Recognition Programme
Motivation: The plight of military veterans was so dire that a Department of Military Veterans was
established to cater for their needs as well as to ensure that the country honours and appreciates
their contribution to bringing about freedom and democracy. Given that the focus on the military
veterans’ mandate is emerging, it is important to position the intervention programmes that deal
with issues that will fast-track the improvement of their lives, hence this proposed diagnostic
evaluation. The evaluation will assess how military veterans should be re-integrated into civilian
life. It will also provide strategic information that will inform the development of the Economic
Empowerment and Skills Transferability and Recognition Programme.



Department: Department of Science and Technology
Intervention: Advanced Manufacturing Technology Strategy (AMTS)
Evaluation: Evaluation of National Advanced Manufacturing Technology Strategy
Motivation: This intervention is linked to outcome 4 (employment). While not a large programme, it
is strategic and innovative, and addresses making South African manufacturing more competitive.
The budget is R43 million in 2012/13, rising to R48 million in 2014/15.

Department: South African Revenue Service
Intervention: Tax compliance cost of small businesses
Evaluation: Impact evaluation on Tax Compliance Cost of small businesses
Motivation: Key to successful job creation (outcome 4) is the success of small businesses. SARS
wishes to undertake a regular evaluation of the cost of tax compliance for small businesses, so
informing the streamlining of the tax system. Approximately 800 000 small businesses are affected
by this system.

Department: Department of Co-operative Governance
Intervention: Community Work Programme (CWP)
Evaluation: Impact evaluation of the Community Work Programme
Motivation: This intervention is linked to outcome 4 and creates a guaranteed level of basic
employment of 100 days per year as a safety net. Participants undertake a range of services such
as home-based care and developing agricultural projects. The budget is R1.4 billion in 2012/13,
rising to R2.7 billion in 2014/15. As at March 2012, CWP had 105 218 participants. Other
beneficiaries include those who benefit from the services provided by CWP participants, but the
extent of these benefits is not known. The evaluation will help to decide whether the benefits justify
scaling up.

Department: Department of Rural Development and Land Reform
Intervention: Land Restitution Programme
Evaluation: Evaluation of the Land Restitution Programme
Motivation: Restitution is directly linked to outcome 7 (rural development). The restitution
programme also contributes to other outcomes, including outcome 4 and outcome 10: “Sustainable
natural resources management”. The programme is politically sensitive and an important part of
the reparations for apartheid. The estimated budget for the current financial year is R3 billion,
benefitting 1.6 million people.

Department: Department of Agriculture, Forestry and Fisheries
Intervention: Comprehensive Agricultural Support Programme (CASP)
Evaluation: Impact evaluation of Comprehensive Agricultural Support Programme
Motivation: The six pillars of CASP provide support to farmers, as planned for in outcome 7:
“Vibrant, equitable and sustainable rural communities with food security for all”, and also impact on
outcome 4. In 2012/13 the budget for CASP is R1.5 billion, and around R1.6 billion for the next two
years. 36 505 beneficiaries were supported by CASP in 2011/12, creating 10 062 jobs. There is a
wider importance in that finding a successful way to provide support for small farmers is key to
successful rural development.



Department: Department of Human Settlements
Intervention: Upgrading of informal settlements
Evaluation: Setting a baseline for future impact evaluations for the informal settlements targeted for
upgrading
Motivation: The upgrading of informal settlements responds to outcome 8: “Sustainable human
settlements and improved quality of household life”. Sub-output 1 addresses the upgrading of
households in informal settlements with access to secure tenure rights and basic services. There is
no estimated budget, and the evaluation will establish the number of poor households that are
targeted for the upgrading.

Department: Department of Human Settlements
Intervention: Access to the city
Evaluation: Evaluating interventions by the Department of Human Settlements to facilitate access
to the city
Motivation: The White Paper on Housing of 1995 (A New Housing Policy and Strategy for South
Africa) acknowledged the increasingly urban nature of South Africa's landscape and the fact that
even those who reside in rural areas will at some time in their life spend time in a town or city. The
question of whether or not the housing programmes of the department have increased access to
the urban space addresses indirectly the issues reflected in outcome 8, including access to land,
the property market, informal settlement upgrading and the acceleration of access to housing.
There is no specific budget allocated to efforts to provide access to the city. Thus far over 15
million people have benefitted from the housing programme.

Department: Department of Human Settlements
Intervention: Provision of state subsidised housing
Evaluation: Diagnostic of whether the provision of state subsidised housing has addressed asset
poverty for households and local municipalities
Motivation: The 1995 White Paper on Housing highlighted the importance of creating assets that
households can leverage to improve their lives and those of their children. The building of
integrated human settlements (outcome 8) deals with an improved property market through the
creation of assets. Over 15 million people have benefitted from the government housing
programme.

Department: Department of Performance Monitoring and Evaluation
Intervention: Outcomes approach
Evaluation: Impact evaluation of the Outcomes Approach
Motivation: This evaluation is based on the lessons from implementation of the outcomes
approach, which affects services covering all of the population, and it is important to learn the
lessons before completion of the first five-year mandate.



Department: Presidency
Intervention: Government’s co-ordination systems
Evaluation: Implementation evaluation of Co-ordination Systems
Motivation: For much of government’s work to be effective requires co-ordination horizontally within
the national and provincial spheres, as well as vertically across spheres. Challenges are being
experienced in co-ordination both vertically and horizontally, which is negatively affecting
implementation. Particular systems to be looked at include the cluster system, outcome
Implementation Forums, MinMECs (which bring together national and provincial departments in a
sector), and interdepartmental co-ordination mechanisms, such as for early childhood
development. These co-ordination mechanisms affect all government actions, and so indirectly
impact on the whole population of the country.

The document further proposes 2014/2015 priority evaluations:

Department: Department of Rural Development and Land Reform
Intervention: Revitalisation of irrigation schemes
Evaluation: Cost-benefit analysis of revitalisation of existing irrigation schemes
Motivation: The ultimate objective of the revitalisation of irrigation schemes is directly linked to
outcome 7: “Vibrant, equitable and sustainable rural communities and food security”. Over and
above that, the irrigation schemes contribute to the achievement of other outcomes, namely
outcome 4: decent employment through economic growth. Irrigation is one of the main
mechanisms for permitting high-productivity production and for providing significant numbers of
smallholder farmers with a decent living. Many of the irrigation schemes were established in the
former homelands and are not working effectively.

Department: Department of Basic Education
Intervention: Funza Lushaka Bursary Scheme
Evaluation: Evaluation of Funza Lushaka Bursary Scheme
Motivation: The intervention is linked to outcome 1: improved quality of basic education, and
sub-output 1: improve teacher capacity and practices. The budget is R672 million for 2012/13, with
11 650 bursaries awarded for 2012/13. Given the shortage of teachers in key subjects such as
maths, physical science and accounting, as well as in the foundation phase, it is important to
assess the extent to which the Funza Lushaka bursary scheme addresses this problem.



Department: Department of Agriculture, Forestry and Fisheries
Intervention: Ilima-Letsema Programme
Evaluation: Impact evaluation of Ilima-Letsema Programme
Motivation: Ilima-Letsema responds to outcome 7 in terms of the establishment of and support
provided to farmers at large, as well as support for domestic food production. Currently the budget
is R415.7 million, rising to R460 million in 2014/15. 54 740 beneficiaries were targeted, but 99 245
have been reached.

Department: Department of Agriculture, Forestry and Fisheries
Intervention: Mafisa
Evaluation: Implementation evaluation of Mafisa
Motivation: Mafisa was set up to provide funding through provisionally accredited DFIs to on-lend
to targeted HDI agricultural micro-businesses, covering irrigation, livestock, equipment and
production inputs. The scheme was first piloted in 2005 and was set up to complement larger-scale
finance provided by the Land Bank. Credit is an important part of the technical package of support
needed by small-scale farmers, and is part of output 7.1 on agrarian reform under outcome 7 on
rural development.

Department: Department of Agriculture, Forestry and Fisheries / Department of Rural Development
and Land Reform
Intervention: Small farmer support
Evaluation: Policy evaluation of small farmer support
Motivation: Support for small farmers is a key component of outcome 7 on rural development,
where the target is to increase the number of smallholder farmers from 200 000 to 250 000. A
number of programmes relating to small farmer support are proposed for evaluation in 2013/14
and 2014/15. It is then proposed to do an overarching review, drawing from these various
evaluations (CRDP, land recapitalisation and development, CASP, Ilima-Letsema, land restitution,
support for irrigation schemes, Mafisa), so as to review policy for small farmer support in an
integrated way.



The document further proposes 2015/2016 priority evaluations:

Department: Department of Agriculture, Forestry and Fisheries
Intervention: Land Care
Evaluation: Impact evaluation on Land Care
Motivation: The Land Care Programme is about the sustainable use of land, and is linked to
outcomes 7 (rural development) and 10 (environment). Land care projects are implemented mostly
in communal lands, and the programme employs community members to implement activities. The
programme benefited 15 867 beneficiaries in 2011/12, and is envisaged to benefit 28 500 people in
the 2012/13 financial year. The estimated budget is R115 661 000 for 2012/13 and R108 million for
2013/14. It is not a large programme, but is innovative in seeking to achieve environmental,
production and economic objectives simultaneously.

Department: Department of Rural Development and Land Reform
Intervention: National Rural Youth Service Corps (NARYSEC)
Evaluation: Diagnostic evaluation of the National Rural Youth Service Corps
Motivation: Half of all 18-to-24-year-olds are unemployed, accounting for about 30 per cent of total
unemployment, and National Treasury estimates that the average probability of an
18-to-24-year-old finding a job is just 25 per cent. Overall unemployment is worse in rural areas.
The National Rural Youth Service attempts to deal with issues of youth unemployment and rural
development, supporting rural youths who lack skills and enabling them to develop skills and take
forward productive activities. As such, it is linked to outcomes 7 (rural development), 5 (skills) and 4
(employment). The programme targets unskilled and unemployed rural youths aged 18 to 35 who
have a minimum of a grade 10 certificate (outcome 4 delivery agreement).

Department: Department of Basic Education
Intervention: New school curriculum
Evaluation: Evaluation of curriculum implementation
Motivation: A key initiative of government has been changing the school curriculum, affecting 12
million learners. This is a key activity in outcome 1: improved quality of basic education, sub-output
1: improve teacher capacity and practices, and sub-output 2: increase access to high-quality
learning materials. An evaluation in 2013/14 is looking at the school certificate more generally; this
evaluation will look more particularly at the issue of the school curriculum.



Department of Performance Monitoring & Evaluation: the evaluation system; Implementation evaluation of the national evaluation policy and system (an evaluation of the impact of evaluations). Cabinet has created the national evaluation system since the adoption of the National Evaluation Policy Framework in November 2011. Evaluations are selected specifically because they are national priorities and linked to the 12 outcomes. Implementing the evaluation system requires investment in time and money. This evaluation will seek to establish whether the system is adding value, and how it can be strengthened to maximise its impact on performance and decision-making, as well as accountability and knowledge sharing.

A breakdown of these priority programmes is provided in your resource pack:

• Presidency. (2012). National Evaluation Plan 2013-14 to 2015-16. https://www.dpme.gov.za/publications/Policy%20Framework/National%20Evaluation%20Plan%202013%20-%2014.pdf (accessed 18 June 2020).



7.4.6 M&E as Part of Other Management Functions

Whatever the place of monitoring and evaluation within an organisation, in the public service the
following must be integrated with monitoring and evaluation:

• The strategic plan of every government organisation: monitoring and evaluation is a part
of the organisation's strategic plan as it specifically uses the organisation's strategic goals
and objectives to set performance indicators and thereby facilitate monitoring and
evaluating;

• The annual performance plan: as monitoring and evaluation tracks the progress of organisations and individuals, it is integral to compiling the annual performance plan;

• Human resource planning: an important part of monitoring and evaluation is capacity (ie
the human resources/people of the organisation); human resources needs to include
monitoring and evaluation in its planning processes;

• The development strategy of the organisation (training for monitoring and evaluation): managers need to be trained to become monitoring and evaluation experts, so
that they understand the importance of the processes and can develop ways of integrating
them with other management functions. In addition, they need to be able to train junior
employees in monitoring and evaluation processes to help them to understand the ways in
which the processes are implemented in the organisation and why this is so important;

• Electronic systems: it is crucial that organisations keep electronic records of all monitoring
and evaluation processes, and that these processes are facilitated by the IT systems
implemented within the organisation;

• Planning and budgeting: monitoring and evaluation needs to be part of the organisation's
budget and other planning processes, and to be able to provide monitoring and evaluation
feedback on these systems;

• Performance management: individual performance should be linked to the monitoring and evaluation system within the organisation, so that individual action leads to the accomplishment of organisational goals;

• Framework for reward and recognition: individuals should be recognised and rewarded
appropriately for their successful role in monitoring and evaluation practices; and

• Training and development: monitoring and evaluation training should take place as a way
to create an understanding and a culture of monitoring and evaluation within the
organisation.



In addition to the formal functions of monitoring and evaluation, the organisation must try to establish
a culture of monitoring and evaluation. This means eliminating any negative attitudes towards
monitoring and evaluation and replacing them with a positive attitude. A positive attitude means
viewing monitoring and evaluation as a way to study and solve problems, rather than as a negative
critique of the organisation and the people within it.

Creating this culture depends to a large degree on the approach managers choose to take.

7.4.7 Conclusion

In this section we discussed the broader aspects of monitoring and evaluation within the public
service. We established an understanding of the government-wide monitoring and evaluation system
and the values the system wishes to address. We provided the cabinet's priority evaluations, which
should guide all implementation strategies and plans for monitoring and evaluation. In the next section
we will look at the steps involved in implementing a monitoring and evaluation system.

As a summary for this section and an introduction to the next, read this article:

• Public Service Commission. (2012). Evolution of monitoring and evaluation in the South
African public service. http://www.psc.gov.za/newsletters/docs/2012/K-
9555%20PSC_6th%20edition%20magazine_DevV11.pdf (accessed 18 June 2020).

Recap Your Knowledge

1. Critically evaluate the summary of proposed evaluations for 2013/2014. Review programmes in your
department and link them to these evaluations. Discuss in your groups the progress of the implementation of
these programmes and explain how they link to the 12 outcomes set by the cabinet.
2. Critically review the summary of proposed evaluations for 2014-2015. Discuss which programmes your
department could implement. Explain how these programmes link to the 12 outcomes set by the cabinet.
3. Evaluate how well your organisation integrates monitoring and evaluation with other management functions.



7.4.8 Key Points

Some key points made in this section were:

• Monitoring and evaluation fits into the government’s overall planning cycle, which runs from
one general election to the next;
• M&E takes place in the context of three inter-related spheres of government (national,
provincial and municipal);
• Intergovernmental relations are organised in terms of four acts of Parliament, namely the:
o Intergovernmental Fiscal Relations Act, 1997
o Municipal Structures Act, 1998
o Municipal Systems Act, 2000, and Municipal Systems Amendment Act, 2003
o Intergovernmental Relations Framework Act, 2005
• These frameworks and systems continue to evolve, and M&E plays a role in improving them;
• The DPME offers a model for implementing a monitoring and evaluation framework,
illustrated in Figure 6;
• Evaluation can be conducted in more than one way. There are, for example, the
o Programme performance perspective
o Financial perspective
o Governance perspective
o Human resource management perspective
o Ethics perspective
o Perspective of National Treasury guidelines
• There is an overall National Evaluation Plan first introduced in 2013/14. The essential point
is that having such a plan (with a set of identified outcomes) is a way to improve the quality
of public administration; and
• Monitoring and evaluation is one of many management functions, and must take other
management functions into consideration, not so that we can tick boxes, but so that we can
improve the quality of all our work.

Remember to do your digital assessment for this section online!

It will help you strengthen and embed your understanding of the course. You will not be able to
change your answers once you have submitted them, so make sure you have completed the
relevant section of coursework first. Where you see Select all that are relevant, be aware that
any number of the options presented could be correct. You will lose marks for incorrect
selections, so choose carefully. Your combined marks from these assessments count towards a
total of 20% of your course mark.



7.5 IMPLEMENT A MONITORING AND EVALUATION SYSTEM

Timeframe: Minimum of 25 hours

Learning outcomes:

• Able to communicate monitoring and evaluation results;
• Equipped to compile and implement a performance-based framework for monitoring and evaluation; and
• Able to assess the success factors of a monitoring and evaluation system.

Recommended reading:

• Adato, M. (2011). Combining quantitative and qualitative methods for program monitoring and evaluation: why are mixed-method designs best? http://documents.worldbank.org/curated/en/633721468349812987/pdf/643860BRI0Mixe00Box0361535B0PUBLIC0.pdf (accessed 18 June 2020).
• Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S. (2012). Establishing a national M&E system in South Africa. The World Bank Special Series on The Nuts & Bolts of Monitoring and Evaluation Systems, 21, 1-11. http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts00Box374357B00PUBLIC0.pdf (accessed 18 June 2020).
• Lahey, R. (2010). The Canadian monitoring and evaluation (M&E) system: lessons learned from 30 years of development. ECD Working Paper Series. http://documents.worldbank.org/curated/en/865531468226748462/pdf/654070NWP0230c0C0disclosed011040110.pdf (accessed 18 June 2020).
• Molleman, E. and Timmerman, H. (2003). Performance management when innovation and learning become critical performance indicators. Personnel Review, 32(1), 93-113. https://www.researchgate.net/profile/Eric_Molleman/publication/235285519_Performance_management_when_innovation_and_learning_become_critical_performance_indicators/links/5948df07458515db1fd8df78/Performance-management-when-innovation-and-learning-become-critical-performance-indicators.pdf (accessed 18 June 2020).

Section overview: Monitoring and evaluation is, as noted, the responsibility of the Department of Performance Monitoring and Evaluation and the Public Service Commission. In this section, we examine the M&E process set by the Department of Performance Monitoring and Evaluation. This will allow you to create a performance-based framework for monitoring and evaluation in your own context. You must keep in mind that all of this occurs within the government's framework for M&E (the Government-wide Monitoring and Evaluation System) and that monitoring and evaluation is integrated with many management functions.



7.5.1 Introduction

We briefly discussed the steps involved in the monitoring and evaluation system in the previous
section of the course. In this section we will examine the implementation of the monitoring and
evaluation process, which is not necessarily linear. An evaluation may occur at any time in the life
cycle of a project or programme. Keep this in mind when compiling or adapting monitoring and
evaluation programmes.

Once evaluation is complete, you need to communicate the results to relevant stakeholders. This
involves following the Department of Performance Monitoring and Evaluation’s guidelines for
communicating evaluation results. These will also be explained in this section.

The monitoring and evaluation process must be followed within the framework of the government-
wide monitoring and evaluation system. See Figure 7.



FIGURE 7: ACHIEVING OUTCOMES USING THE GOVERNMENT-WIDE MONITORING AND EVALUATION SYSTEM

The figure depicts the following flow, underpinned throughout by census and survey data, administrative data sets, performance information, evaluations and follow-up actions:

• An issue is identified as a public concern and a policy is formulated;
• A programme is designed to implement the policy;
• The programme logic clearly shows how undertaking specific activities that have calculated outcomes will lead to the achievement of the intended policy impacts, and ways of checking whether those activities, outcomes and impacts are happening are also chosen (these are indicators);
• The legislature provides funding and the public officials carry out the activities described in the programme;
• As implementation rolls out, work gets done and records are kept; the logic's process flows and the performance indicators send managers and officials clear signals about what they should do and what is important;
• The records are captured, verified and analysed into reports, and public scrutiny and robust systems result in good management;
• Reports are compared to plans and benchmarks such as international best practices;
• Success is identified and replicated, challenges are highlighted and addressed, evidence-based decision making around resources is facilitated, accountability is improved, and affected stakeholders are involved extensively and consistently; and
• The public service becomes more effective and poverty is eradicated.



The steps below discuss the detail of achieving monitoring and evaluation success within the
GWMES.

7.5.2 Step 1: Examine the Context and Current State

The first step of a monitoring and evaluation process is to identify the problem and its context, and then to outline the current state. A situational analysis provides an overview of the current state of the department. Information about different aspects of the organisation's current state is gathered, analysed and presented. For the purposes of a government programme, a situational analysis should give you an overview of the current status of a department and how that department aligns with the needs of the public or the goals set out in the strategic plan.

A situational analysis:

• Forms the first step of a departmental planning cycle;
• Provides the basis of the operational plan and first-quarter report; and
• Identifies gaps in the information available.

(McCoy and Bamford, 1998)

You have discussed the planning process in your Strategic Management course. For an overview of the planning process, study Figure 8.

FIGURE 8: THE PLANNING CYCLE

The cycle moves from situation analysis, to priority and objective setting, to option appraisal, to task setting, and then to monitoring and evaluation, which feeds back into the next situation analysis.

(McCoy and Bamford, 1998)



Conduct a situation analysis

McCoy and Bamford (1998:6-9) suggest that conducting a situation analysis involves the following steps. See Table 12.

TABLE 12: SITUATION ANALYSIS

Step 1: Determine the framework
The framework provides the scope of the analysis. It should be focused and should fit the needs of the research.

Step 2: Identify what information is already available
There is no need to reinvent the wheel. Do research within the department to see what information is already available. Make sure that it is accurate.

Step 3: Identify what information is still required
Analyse the gaps between the information that already exists and the information needed. This will give you an indication of what should still be gathered. Usually at this stage questionnaires or feedback forms will be developed to collect specific information.

Step 4: Collect the required information
Next you need to develop a research plan. Describe the information needed, the process you will follow to gather it, how you will analyse it and in what format you will present your findings.

Step 5: Compile and write the report
Write a report presenting the situation and its analysis. Usually the format of a research report could be used, but different departments prefer different reporting formats for specific information.

Step 6: Distribute and disseminate the report
Lastly, the information collected should be published. The type of programme or project will determine which channels to use for the publication of the report.
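Step 3 of Table 12, identifying what information is still required, is in essence a gap analysis between the information the study needs and the information already held. A minimal sketch in Python (the information items below are invented for illustration, not drawn from the guide):

```python
# Gap analysis for a situation analysis (Table 12, step 3).
# The information items are hypothetical placeholders.

required = {
    "population size",
    "unemployment rate",
    "service delivery backlog",
    "budget allocation",
}

available = {
    "population size",
    "budget allocation",
}

# Items still to be collected, e.g. via questionnaires or feedback forms.
gaps = sorted(required - available)
print(gaps)  # ['service delivery backlog', 'unemployment rate']
```

In practice each "item" would be a data set or report rather than a short label, but the logic of comparing what exists against what is needed is the same.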

The department will have to invest time and resources in the collection of the information. Although
the situation analysis is a useful tool, it can be costly and often requires experts at some stage of the
process (Miller, nd).



Situation Analysis: Case Study

Socioeconomic Conditions in the Kakamas Subdistrict

Agriculture is the major economic activity in the area. The main produce is grapes and sun-dried fruit. Although the
water supply limits the development of agriculture, exploitation of overseas markets provides opportunity for some
economic growth. Apart from some food processing (wine and sun-dried fruit), there is no manufacturing or industrial
activity. There are no accurate unemployment figures for the Northern Cape; the October Household Survey of 1994
estimated that 32,5% of an estimated 278 743 economically active people were unemployed. Rates were higher for
coloureds (37,9%) and blacks (39,4%) than for whites (7,2%).

Fifty-seven percent of unemployed people had been unemployed for more than a year at the time of the survey.
Almost 75% of unemployed people are not trained or skilled for specific work. Employment opportunities are limited,
with strong seasonal variation in the availability of work. Pensions and other grants form an important source of
income for many households. Although there are no accurate figures, there is no doubt that a sizable proportion of
the population lives in poverty. Compared to other regions in the province, more people live in rural areas, with poorer access to basic services than the provincial averages.

Outcome 4 of the national evaluation plan posits that government is committed to ensuring decent employment through inclusive economic growth. In the case study, one of the problems underpinning unemployment is listed as insufficient skills.

Imagine that you are an employee of the Department of Higher Education for this region. You are tasked with
investigating the situation and presenting your findings to the minister of education for the Northern Cape. Design a
plan for the collection of the necessary information needed for the feedback report for the minister.

7.5.3 Step 2: Administrative Information Systems and Data Sets

Step 2 of the process links closely with step 1: data collection and analysis. In step 1 the current situation was analysed; in step 2 the information is taken further and analysed to inform decisions. In this way, knowledge is created.

Primary data collection

Primary data can be collected in different ways, depending on the project needs, skills levels, time
and budget. Data can also be collected from individuals or from a group of people.

Examples of primary data collection include: questionnaires, surveys, interviews, focus groups and
group interviews. These are explained in Table 13.



TABLE 13: PRIMARY DATA COLLECTION METHODS

Unstructured interviews
Unstructured interviews consist of open-ended questions, which are designed to probe and stimulate the respondent to think rather than just give quick answers. Unstructured interviews are useful:
• When you need to know about people's experiences or views in depth;
• When you are able to rely on information from a fairly small number of respondents;
• When the issue is sensitive, and people may not be able to speak freely in groups; and
• When respondents are unable to express themselves fully through a written questionnaire.

Structured interviews
Structured interviews are conducted using questionnaires, ie a written list of questions, mostly closed-ended or precoded, either given or posted to respondents. Structured interviews are necessary:
• When you need information from a large number of respondents;
• When you know exactly what information you need, established through other research methods; and
• When the information needed is fairly straightforward, and you want it in a standard format.

Focus groups
A focus group is a group interview, where six to 12 people are brought together for a discussion. It is not a series of individual interviews conducted in a group: the interaction between group members is part of the process and should be encouraged. Focus groups are useful:
• When in-depth information is needed about how people think about an issue, their reasoning about why things are the way they are, and why they hold the views they do; and
• When you need guidance in setting a framework for some larger-scale research, about what people see as the issues.

Observation
This method involves observing objects, processes, relationships or people's behaviour, and recording these observations. Observation is useful when the information required is about observable things, or when you need to cross-check people's accounts of what happens.

Which method of data collection would suit the needs of the programme you discussed in the previous activity? Why would this be the best method for collecting data?



Quantitative and qualitative information

There are two main types of information produced by the data collection process: qualitative and quantitative. The most obvious difference between the two is that quantitative data are numerical (for example, amounts and proportions), while qualitative data give information best described in words, diagrams or pictures.

Qualitative and quantitative data

Most monitoring and evaluation systems require the collection of both quantitative and qualitative
data. Interventions need qualitative data about the nature of results (for example beneficial or harmful
effects, intended or unintended impacts). Interventions also need quantitative data (for example
about the distribution or intensity of the results) to ensure the accuracy of the analysis.

Whether the data we collect is numerical or textual (descriptive) is determined by the type of
questions we ask in our tools. Detailed qualitative data can be obtained by asking open-ended
questions, whereas numerical data can be obtained by asking closed-ended questions.

Mixed research uses both quantitative and qualitative techniques in a single study. For example, a
study could use a qualitative method such as focus groups as well as a quantitative method such as
a questionnaire survey to collect data. Alternatively, a research instrument could use a mix of open-
ended (qualitative) and closed-ended (quantitative) questions to collect responses.
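The split between closed-ended and open-ended questions also determines how the responses are processed once collected. The sketch below, with invented questions and answers, shows closed-ended responses being tallied numerically while open-ended responses are kept as text for later thematic analysis:

```python
# Processing responses from a mixed instrument: closed-ended answers are
# tallied (quantitative); open-ended answers are retained as text for
# thematic analysis (qualitative). All data below are invented.

from collections import Counter

responses = [
    {"satisfied": "yes", "comment": "Clinic hours are too short."},
    {"satisfied": "no",  "comment": "Staff were helpful."},
    {"satisfied": "yes", "comment": "Queues are long on Mondays."},
]

# Quantitative: frequency counts from the closed-ended question.
tally = Counter(r["satisfied"] for r in responses)
print(tally["yes"], tally["no"])  # 2 1

# Qualitative: free-text answers, kept verbatim for coding.
comments = [r["comment"] for r in responses]
print(len(comments))  # 3
```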

Read more about this in the article indicated below:

• Adato, M. (2011). Combining quantitative and qualitative methods for program monitoring and evaluation: why are mixed-method designs best? http://documents.worldbank.org/curated/en/633721468349812987/pdf/643860BRI0Mixe00Box0361535B0PUBLIC0.pdf (accessed 18 June 2020).

Identify Qualitative and Quantitative Methodologies

1. In your group, decide whether you will use qualitative or quantitative methods to fix the problem you identified in
the previous activity.
2. Explain how you would use this method to collect the data you require with specific reference to the processes
explained above.



7.5.4 Step 3: List Indicators, Targets and Baselines

National Treasury (2007:22) defines a programme indicator as:

“… a pre-determined signal that a specific point in a process has been reached or result
achieved. The nature of the signal will depend on what is being tracked and needs to be very
carefully chosen. In management terms, an indicator is a variable that is used to assess the
achievement of results in relation to the stated goals/objective.”

Developing indicators involves answering the question: How will I know or what will I see that will tell
me the specific result has been achieved? To answer this question, let us examine the guidelines
for developing performance indicators.

National Treasury (2007:7) requires that all performance indicators conform to the standards reflected in Table 14.

TABLE 14: PERFORMANCE INDICATORS

Reliable The indicator should be accurate enough for its intended use and respond to changes in the level
of performance.
Well-defined The indicator needs to have a clear, unambiguous definition so that data will be collected
consistently, and be easy to understand and use.
Verifiable It must be possible to validate the processes and systems that produce the indicator.
Cost-effective The usefulness of the indicator must justify the cost of collecting the data.
Appropriate The indicator must avoid unintended consequences and encourage service delivery
improvements, and not merely create incentives to meet targets.
Relevant The indicator must relate logically and directly to an aspect of the institution's mandate, and the
realisation of strategic goals and objectives.

Furthermore, performance indicators need to contribute to the four standards required of monitoring
and evaluation:

• Equity
• Effectiveness
• Efficiency
• Economy.

1. What is meant by each of these terms?


2. Why are they used in association with monitoring and evaluation?



Economy, efficiency, effectiveness and equity indicators are each related to a specific component of
the outcomes approach (inputs, activities, outputs, outcomes and impacts). Figure 3, section 1,
illustrated this – turn back to this section to review. These specific relationships are explained below
(National Treasury, 2007:8):

• Economy indicators are used to determine whether the correct inputs are attained at the lowest possible cost. They also determine whether the correlation of outputs with cost is correct. Economy indicators are relative, because the correct use of funds differs from case to case. The best way to determine whether inputs are economical is therefore to compare them with international best practice (are similar institutions achieving similar results with the same amount of resources?);

• Efficiency indicators measure how efficiently the inputs have been transformed into outputs. The most efficient processes are those that produce the maximum output per unit of input, or alternatively use the least input to create a single output. Efficiency is measured with an input-to-output ratio. Again, efficiency differs between institutions, so using international best practice to determine efficiency is advised;

• Effectiveness indicators measure how well the outputs achieve the outcomes. The
outcomes must relate to the strategic objectives of the specific organisation, and therefore
effectiveness can be measured based on whether or not these objectives (outcomes) are
being achieved. As an organisation's goals and objectives are likely to be the same over a
period of at least five years, effectiveness only needs to be measured once during this period;
and

• Equity indicators measure how well services are being provided without unfair bias or
discrimination. In other words, equity indicators are used to measure how well an
organisation has attained comparable outputs among different groups; for example, between
those living in rural and urban areas. The best way to measure equity is by conducting
benchmark tests.
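The efficiency and equity calculations described above come down to simple ratios. A rough sketch with invented figures (a real comparison would use benchmarks from similar institutions):

```python
# Illustrative indicator arithmetic; all figures are hypothetical.

# Efficiency: input-to-output ratio, e.g. cost per unit of output.
budget_spent = 1_200_000          # rand spent (input)
houses_built = 300                # outputs delivered
cost_per_output = budget_spent / houses_built
print(cost_per_output)            # 4000.0 rand per house

# Equity: compare outputs per capita between two groups.
rural_outputs, rural_population = 120, 60_000
urban_outputs, urban_population = 180, 40_000
rural_rate = rural_outputs / rural_population
urban_rate = urban_outputs / urban_population
equity_ratio = rural_rate / urban_rate   # 1.0 would indicate parity
print(round(equity_ratio, 2))     # 0.44
```

A ratio well below 1.0, as in this invented example, would flag that rural residents receive proportionally fewer outputs than urban residents, prompting a closer equity review.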

The National Treasury (2007:10) offers the following steps for developing performance indicators:

Step 1: Agree on what you are aiming to achieve

There needs to be a shared understanding of the problem that needs solving.

Step 2: Specify the outputs, activities and inputs

During this stage, managers need to determine what needs to be done to reach the desired outcome
and impacts.



Step 3: Select the most important indicators

Although the organisation may have developed a range of extensive indicators, it is more effective
to select and work with a few of the most important indicators, rather than attempt to measure every
aspect of service delivery and outputs (National Treasury, 2007:11). The National Treasury offers
the following advice for selecting the best indicators:

• The indicators should clearly communicate the strategic goals and objectives of the
organisation;
• The data used to determine indicators should be easily available; and
• Choose indicators according to their manageability: will you be able to control these indicators and monitor them closely enough?

Step 4: Set realistic performance targets

After selecting appropriate performance indicators, determine what level of performance will ensure
the achievement of the organisation's outcomes. Performance targets are goals the organisation
sets that determine a specific set level of performance that the organisation, programme, project or
individual aims to achieve within a set time frame (National Treasury, 2007:9).

Setting performance targets involves determining the baseline and performance standards (National
Treasury, 2007:9):

• The baseline refers to the current level of performance, which must be improved
• Performance standards are the "minimum acceptable level of performance" required from
the individual, organisation or project

Look at the examples below:

Objective: “To expand access province-wide by the end of 2015 to an appropriate package of
treatment to all people in the province diagnosed with HIV or AIDS.”

Baseline: “Currently 45 000 people in the province have been diagnosed with HIV or AIDS.
Fewer than 4000 are receiving appropriate treatment.”

Performance standard: “To provide mobile HIV clinic services to all districts at least monthly.”
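Using the figures from the example above, the baseline can be expressed quantitatively. This sketch computes treatment coverage at baseline and the gap to the stated objective:

```python
# Baseline arithmetic from the HIV/AIDS treatment example above.
diagnosed = 45_000     # people in the province diagnosed with HIV or AIDS
on_treatment = 4_000   # receiving appropriate treatment ("fewer than 4000")

coverage = on_treatment / diagnosed
print(round(coverage * 100, 1))   # 8.9 -> under 9% coverage at baseline

# Gap to the objective of reaching all diagnosed people by end-2015.
gap = diagnosed - on_treatment
print(gap)                        # 41000 people still to be reached
```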

Performance targets are set at the beginning of a strategic planning period. According to the National Treasury (2007:9), they should be determined using SMART criteria:

• Specific: the form and required level of performance must be clearly identified;
• Measurable: there must be a way to measure the required performance standard;
• Achievable: the performance requirements must be realistic given the context;
• Relevant: the performance requirements must be linked to a specific goal; and
• Time-bound: there must be a limited period for the performance requirements to be met.



Step 5: Determine the process and format for reporting performance

As the central aim of this scheme is to be able to monitor and evaluate performance information, it
is crucial that a process for reporting progress and results is integrated into the planning, budgeting
and implementation process.

The organisation needs to determine the best way of getting the information to the right people. This
will of course depend on the institution and its specific structure.

Step 6: Establish processes and mechanisms to facilitate corrective action

Monitoring and evaluation needs to be a cyclical process. Therefore, at every stage of the plan,
regular monitoring and evaluation has to take place in order to determine (National Treasury,
2007:9):

• What has happened so far?
• What is likely to happen if the current action persists?
• What actions, if any, need to occur to increase the probability of attaining performance
targets?

Develop Performance Indicators

1. Using the criteria for performance indicators, and Molleman and Timmerman's (2003) argument, develop
indicators you could use to measure progress for the monitoring and evaluation project you have been working
on in previous activities.

2. Critically evaluate these indicators and recommend improvements:

a. To draw out lessons in order to assess the strength of the programme
b. Consider the benefits relative to the costs of the Grade R programme
c. To assess whether the National School Nutrition Programme is being implemented

The importance of baseline information

Once a set of suitable indicators has been defined for an intervention, and a data collection method
chosen, stipulate the performance level that the institution and its employees will strive to achieve.
This includes stipulating performance targets that are relative to current baselines.

Baseline information offers a point of comparison. A baseline is the specific measurement of the
indicators within your monitoring system.

You collect baseline information at the start of the intervention, ideally before its effects are felt. After
that, you can use this information to measure the progress of the project or programme.

Collecting baseline information

Once collected, baseline information can be used for comparison in three main ways:

• Compare the indicators measuring the situation before the intervention started with the
situation at a specified period after it started;
• Compare changes in areas where the intervention has taken place with changes in similar
locations where the intervention has not taken place; and
• Compare the difference between similar groups – one that has been exposed to the
intervention and a so-called control group that falls outside the intervention's influence.
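These comparison designs reduce to simple arithmetic. The sketch below uses hypothetical figures; the difference-in-differences calculation implements the control-group design by subtracting the control group's change from the intervention group's change:

```python
def before_after_change(before: float, after: float) -> float:
    """Change in an indicator within the intervention area."""
    return after - before

def difference_in_differences(treat_before: float, treat_after: float,
                              control_before: float, control_after: float) -> float:
    """Change in the intervention group minus change in the control group.

    The control group's change approximates what would have happened anyway,
    so subtracting it isolates the intervention's contribution (under the
    usual assumption that both groups would otherwise have moved in parallel).
    """
    return (treat_after - treat_before) - (control_after - control_before)

# Hypothetical figures: people receiving treatment in two comparable districts
print(before_after_change(4000, 15000))                     # 11000
print(difference_in_differences(4000, 15000, 3800, 6000))   # 11000 - 2200 = 8800
```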

Setting performance targets

Performance targets express a specific level of performance that the intervention is aiming to achieve
within a given time period. The first step in setting performance targets is to identify the baseline,
which may be the performance level recorded in the year prior to the planning period.

Performance standards express the minimum acceptable level of performance, or the performance
level that is generally expected. These should be informed by legal requirements, departmental
policies and service-level agreements. They can also be benchmarked against performance levels
in other institutions, or according to accepted best practices.

Performance standards and performance targets must be specified before the beginning of a service
cycle, which may be a strategic planning period or a financial year. This ensures that the institution and
its managers know what they are responsible for, and can therefore be held accountable at the end
of the cycle. While standards are normally timeless, targets must be set in relation to a specific
period. Targets for outcomes tend to span multi-year periods, while targets for inputs,
activities and outputs should cover either quarterly or annual periods.

An organisation must use standards and targets during the course of the intervention, as part of its
internal management plans and individual performance management system.

When you develop indicators, there might be a temptation to set unrealistic performance targets.
Successful performance management requires realistic, achievable targets, but ones that still challenge
the institution and its staff. Targets should preferably be set with regard to previous and existing levels
of achievement (ie current baselines) and realistic forecasts of what is possible. Where targets relate
to service delivery standards, it is important to recognise current service standards and what
is normally regarded as acceptable.

The chosen performance targets should:

• Communicate what will be accomplished if the current policies and expenditure programmes
are continued;
• Allow performance to be compared at regular intervals – on a monthly, quarterly or annual
basis as required by departmental standards; and
• Make possible evaluations of the correctness of current policies and expenditure
programmes.

Read more about this concept here:

• Molleman, E. and Timmerman, H. (2003). Performance management when innovation
and learning become critical performance indicators. Personnel Review, 32(1), 93-113.
https://www.researchgate.net/profile/Eric_Molleman/publication/235285519_Performanc
e_management_when_innovation_and_learning_become_critical_performance_indicator
s/links/5948df07458515db1fd8df78/Performance-management-when-innovation-and-
learning-become-critical-performance-indicators.pdf (accessed 18 June 2020).

Identify Baselines and Performance Targets

1. Using the project you have been working on in previous activities, develop baselines and performance targets for
your department or organisation.
2. Explain why you have chosen these specific baselines and targets for your project.

7.5.5 Step 4: Group Indicators by Policy Objective

This step is fairly straightforward: you link your indicators to the relevant policy objective. This allows
a clear correlation between what you are trying to achieve and how you are going to achieve it. If
there are no indicators for your policy objectives, you proceed to Step 5.
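Linking indicators to objectives is essentially a grouping operation. Below is a minimal sketch, with hypothetical indicator and objective names, that groups indicators under their policy objectives and flags objectives that have none (which must proceed to Step 5):

```python
from collections import defaultdict

# Hypothetical indicators, each tagged with the policy objective it serves
indicators = [
    {"name": "% of diagnosed patients on treatment",
     "objective": "Expand HIV treatment"},
    {"name": "Mobile clinic visits per district per month",
     "objective": "Expand HIV treatment"},
    {"name": "Learners fed per school day",
     "objective": "School nutrition"},
]
objectives = ["Expand HIV treatment", "School nutrition", "Housing delivery"]

# Group indicator names under each policy objective
grouped = defaultdict(list)
for ind in indicators:
    grouped[ind["objective"]].append(ind["name"])

# Objectives with no indicators must proceed to Step 5 (design new ones)
missing = [obj for obj in objectives if obj not in grouped]
print(missing)  # ['Housing delivery']
```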

7.5.6 Step 5: If Policy Objectives have no Indicators

Based on the guidelines highlighted in steps 3 and 4, you must design new indicators, targets and
baselines if these are not present for the specific policy that you are addressing.

7.5.7 Step 6: Review Link Between Inputs, Outputs, Outcomes and Impacts, and Identify Causal Relationships and Links

To maintain the goals of government's outcomes-based approach, you need to ensure that each link
in the monitoring and evaluation chain is serving a purpose and can be related back to inputs,
outputs, outcomes and impacts.

Performance indicators

In Figure 3, performance indicators are shown to be present at each point in the pyramid. This is
because performance indicators define inputs, activities, outputs, outcomes and impacts.
Measurable indicators therefore need to be defined for each element of the pyramid.

Read more about government's outcomes-based approach here:

• Presidency. (2010). Guide to the Outcomes Approach.
http://www.dpme.gov.za/publications/Guides%20Manuals%20and%20Templates/Guideli
ne%20to%20outcome%20approach.pdf (accessed 18 June 2020).

The best way to identify and evaluate the links between outcomes and your monitoring and
evaluation system is to use a theory of change or logic model.

According to the DPME (2011:20 in City of Johannesburg, 2012:18), a theory of change "describes
a process of planned change, from the assumptions that guide its design, the planned outputs and
outcomes to the long-term impacts it seeks to achieve." In other words, it allows you to identify the
causal links between the impact (long-term) and the outcomes (as measured by their outputs,
activities and inputs).

Therefore, we can define a theory of change as follows:

“A theory of change is therefore a reflection of the end goal or impact desired, and the outcomes,
outputs, activities and inputs viewed as necessary for this end goal to be achieved. A set of
assumptions underpins identification of each of the elements in the chain. Assumptions may arise
from experience, facts, insights, formal learning, research or other sources. Through ongoing
monitoring and evaluation activities, these assumptions may be surfaced, challenged and refined,
thereby allowing those using the monitoring and evaluation framework to apply insights from past
practice when identifying the most appropriate set of activities, outputs and outcomes through
which to drive the desired long-term goals.”
(City of Johannesburg, 2012:18)
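A results chain like the one this definition describes can be sketched as an ordered structure with an assumption attached to each link. All entries below are hypothetical illustrations, not drawn from the City of Johannesburg framework:

```python
# Each link in the chain carries the assumption under which it leads to the next
results_chain = [
    ("inputs",     "Budget and staff for mobile clinics", "Funds are released on time"),
    ("activities", "Run monthly mobile clinic visits",    "Districts are accessible"),
    ("outputs",    "Patients screened and enrolled",      "Patients attend the clinics"),
    ("outcomes",   "More diagnosed people on treatment",  "Patients adhere to treatment"),
    ("impact",     "Reduced AIDS-related mortality",      "Treatment remains effective"),
]

# Ongoing monitoring surfaces and challenges each assumption in turn
for level, element, assumption in results_chain:
    print(f"{level:10s} {element}  [assumes: {assumption}]")
```

Making the assumptions explicit at each link is what lets monitoring challenge and refine them, as the quoted definition requires.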

The example below, drawn from the City of Johannesburg's guidelines for the implementation of the
government-wide monitoring and evaluation system, illustrates this.

FIGURE 9: EXAMPLE OF A THEORY OF CHANGE MODEL FOR THE CITY OF JOHANNESBURG

(City of Johannesburg, 2012:19)

As the example shows, an impact is identified. This impact is then broken down into outcomes that
specify how it will be achieved. In turn, these outcomes are broken down into the outputs, activities and
inputs that will be used to ensure the outcomes are met. The time frame on the left-hand side of the
figure illustrates the period in which each activity and output will be achieved.

To get to the point where you can create a logic model, you need to follow the two steps discussed below:

Step 1: Use a problem tree analysis

A problem tree analysis is a problem-solving technique used to understand the roots of a problem. To
generate a plan for evaluating whether outputs and activities are contributing to outcomes, and thereby
ensure that the correct inputs are identified (the logic model), you first need to understand which
activities and outputs are necessary.
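A problem tree lends itself to a small data structure: causes (roots), a core problem (trunk) and effects (branches). The sketch below draws loosely on the budget-execution example discussed in this section, and also shows how roots can later be mapped to interventions as in Step 2; the intervention names are hypothetical:

```python
# A problem tree: effects (branches) <- core problem (trunk) <- causes (roots)
problem_tree = {
    "effects": ["Poor national budget execution"],
    "core_problem": "Limited budgeting and accounting capacity",
    "causes": [
        "Weak national audit authority",
        "Top-down management systems and culture",
        "Limited HR capacity",
    ],
}

def plan_from_roots(tree: dict, interventions: dict) -> list:
    """Map each root cause to the input/activity proposed to address it."""
    return [(cause, interventions.get(cause, "no intervention identified"))
            for cause in tree["causes"]]

# Hypothetical interventions linked to each root cause
interventions = {
    "Weak national audit authority": "Strengthen audit mandate and staffing",
    "Limited HR capacity": "Targeted financial-management training",
}
for cause, action in plan_from_roots(problem_tree, interventions):
    print(f"{cause} -> {action}")
```

The unmatched root ("no intervention identified") is the useful output here: it shows which causes the plan does not yet address.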

An example of a problem tree analysis follows:

FIGURE 10: EXAMPLE OF A PROBLEM TREE ANALYSIS

(City of Johannesburg, 2012:20)

This example shows a problem-tree analysis as applied to national budget execution. The primary and
secondary causes (the roots of the problem) are shown in the context of their contribution to the problem (in
this case, a weak national audit authority, a newly emerging public service ethic, top-down management
systems and culture, and limited HR capacity). These are seen as creating the problems illustrated in the
"trunk" of the tree (limited budgeting and accounting capacity, for example). These ultimately result in the
negative effects highlighted in the top "branches" of the tree.

Step 2: Applying the theory of change

Once you have completed the problem-tree analysis, you can link the "roots" to inputs, activities and
outputs that can be used to solve these problems. Here is an example from the City of Johannesburg:

FIGURE 11: EXAMPLE OF APPLICATION OF THE OUTCOMES APPROACH TO MONITORING AND
EVALUATION

(City of Johannesburg, 2012:22)

As the figure shows, the city has developed inputs, activities and outputs intended to solve its biggest
problems. These are then linked to the outcomes and impacts that will be achieved.

Use the Problem-tree Analysis

1. Using the problem-tree analysis and your own department's problem/s, create a diagram representing the roots
of the problem.
2. Link these roots to the inputs and activities that could be used to solve them.
3. Present your analysis to the class.
4. Discuss and critique the presentations.

7.5.8 Step 7: Reporting Approach

A successful and relevant monitoring system feeds the results of the monitoring to all stakeholders
who require the information.

There are several ways to share information. The most common is through written reports. As
mentioned before, each step might require a different reporting format to communicate the relevant
information. Usually the purpose of the report dictates the format. A summary of different reports is
provided in Table 15.

TABLE 15: DIFFERENT REPORTS

• Problem analysis or need analysis reports: examine and provide an analysis of a specific
problem or need identified by an organisation.
• Project or programme plans: outline the plan of the project or programme.
• Feasibility reports: written after conducting a feasibility study, which researches whether a
specific project or programme would be successful or profitable. This is done before the
project or programme starts.
• Proposals: documents, official statements or letters written to convince a reader that the
project or programme presented should be awarded to the writer.
• Progress reports: reflect on the progress of a specific project or programme
implementation.
• Evaluation reports: summarise the evaluation information gathered and present the results
concerning the success or failure of a programme or project.
• Impact assessment reports: provide the impact results of a specific programme or project.
• Annual reports: summarise the operational functions of a specific department. The results
are usually linked to the strategic goals of the department.
(BusinessDictionary, 2019)

The Framework for Managing Programme Performance Information (2007) was devised by National
Treasury and provides guidance to national, provincial and local government on managing
performance. Performance information is useful only if it is consolidated and reported back into
planning, budgeting and implementation processes, where it can be used for management decisions,
particularly for taking corrective action.

What this means is getting the right information in the right format to the right people at the right time.
Organisations must find out what information the various users of performance information require,
and develop formats and systems to ensure that those needs are met.
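Getting "the right information in the right format to the right people" can be sketched as a simple routing table. The stakeholder groups, formats and frequencies below are hypothetical examples, not a prescribed structure:

```python
# Map each stakeholder group to the report format and frequency it requires
reporting_matrix = {
    "programme managers":  {"format": "progress report",         "frequency": "monthly"},
    "executive authority": {"format": "executive summary",       "frequency": "quarterly"},
    "legislature":         {"format": "annual report",           "frequency": "annually"},
    "public":              {"format": "plain-language summary",  "frequency": "annually"},
}

def reports_due(frequency: str) -> list:
    """Stakeholders whose reports fall due at the given frequency."""
    return sorted(s for s, r in reporting_matrix.items()
                  if r["frequency"] == frequency)

print(reports_due("annually"))  # ['legislature', 'public']
```

An explicit matrix like this makes it easy to audit whether every user of performance information has a format and a reporting cycle assigned.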

7.5.9 Step 8: Evaluate Approach

In section three we discussed the different perspectives of evaluation, as well as the possible
outcomes of evaluating a government programme. In this section we look at implementing
evaluation techniques.

Framing the evaluation requires the subject of the evaluation to be clearly identified. Framing
questions linked to the evaluation perspectives will ensure this. The example below provides
evaluation questions linked to the evaluation perspectives identified in the previous chapter of this
course.

Examples of evaluation questions: Housing Programme

Logic model: outputs, outcomes, impact

• What were the objectives of the programme and how well were they achieved?
• Did a secondary housing market develop so that beneficiaries could realise the
economic value of their asset?
• Did the housing programme create sustainable human settlements? (Included in this
concept are informal settlement upgrade, promoting densification and integration,
enhancing spatial planning, enhancing the location of new housing projects,
supporting urban renewal and inner city regeneration, developing social and economic
infrastructure, designing housing projects so that they support informal economic
activity and enhancing the housing product.)

Programme design

• What are the design features of the programme with regard to:
o The service delivery model (will government build houses, finance housing or
subsidise housing?)
o The financial contribution of government to each household
o The access mechanism: will people apply for houses (demand driven strategy) or
will government undertake housing projects where the need has been identified
by officials (supply driven strategy)?
o The size and quantity of houses
o The location of housing projects.
o Town planning patterns
o The types of units that are provided (family units or rental housing)
• The configuration of institutions through which the programme will be delivered (In the
case of housing the programme is delivered through government departments on
national, provincial and local level plus financial institutions, housing institutions
(landlords), consultants, developers and building contractors)

• The housing process (from identifying the need, planning, budgeting and packaging the
project, project approval, building and inspection, to transfer of the houses to
beneficiaries)

• How did these design features contribute to the success of the programme?
• How flexible was the programme design so that creative solutions were possible on
the project level?
• How well was the programme implemented?

Responsiveness to needs

• How were housing needs identified?
• How quickly is government able to respond to dire housing needs?
• How were the beneficiaries consulted and how much choice did they have?

Values targeting

• Who were the targeted beneficiaries and how well were they reached?
• What are the eligibility criteria, and do they exclude the very poor in the way the
programme is implemented?
• Was there any special dispensation for vulnerable groups like women, disabled people,
children and youth?

Scale of engagement

• What is the scale of the programme and how does it compare to housing needs?

(PSC, 2008:51-53)

The evaluation questions determine the scope of the evaluation, the type of evaluation and the
methodologies used to evaluate. The evaluation questions will also prescribe the information sources
needed for the process.

Evaluation Questions

Use the programme or project you identified in the first activity of this section and design evaluation questions that will
cover the evaluation perspectives of monitoring and evaluation.

7.5.10 Step 9: Capacity Building Plan

In order to implement the monitoring and evaluation system, two sets of skills are essential:

• Line managers must have generic monitoring and evaluation skills, as required by the
Framework for Managing Programme Performance Information; and
• Specialists must have the monitoring and evaluation skills needed to implement the
monitoring and evaluation strategy and to assure its quality.
(National Treasury, 2007:15)

“Building capacity” means that:

• The users of the monitoring and evaluation data understand how to integrate information in
monitoring and evaluation functions and understand how to respond to the findings of the
initial situational analysis report; and
• Monitoring and evaluation managers will understand the monitoring and evaluation system,
be able to manage it, and to produce results related to indicators and needs of the community.

Monitoring and evaluation practitioners must be able to link the various components of monitoring
and evaluation system information to ensure projects and programmes succeed. The approach
adopted by practitioners should be evidence-based, and the data-gathering methodology must
be scientific.

A capacity building plan should be developed once the department's monitoring and evaluation
strategy has been reviewed and the skills needed to implement it have been evaluated. If those
skills are not available, the options are:

• Training and development – using the National School of Government to train managers;
• Recruitment and selection – finding the right employee with the right skills to complete the
monitoring and evaluation team;
• Talent management – mentoring and coaching individuals identified by the talent
management plan to become expert monitoring and evaluation practitioners;
• On-the-job coaching – existing monitoring and evaluation experts in the department can
transfer skills to other employees; and
• Participation in knowledge transfer – experts and laypeople could take part in conferences
and workshops addressing the skills needed for monitoring and evaluation system
implementation.
(National Treasury, 2007:16)

7.5.11 Step 10: Communication Plan

The results of the entire monitoring and evaluation process must be communicated to all relevant
stakeholders (employees, relevant departments, the public, etc.).

DPME guidelines for communicating monitoring and evaluation results

The DPME offers guidelines for communicating evaluation results. These are as follows:

1. Determine what the monitoring and evaluation findings mean: what do they say about the
project/programme/department/organisation? Using the answer to this question, figure out
the best way to convey these findings to the stakeholders: written, verbal, electronic means
(website, social media, etc.)?
2. Produce three summaries:
• Summarise the key findings of the monitoring and evaluation process in plain language
in a one-page summary containing the key messages you want to convey;
• Write a three-page executive summary of the findings; and
• Write a 25-page summary report.
3. Complete the DPME's communications plan template.
4. Explain how stakeholders can access and use information.

Communicating using an evaluation report

An evaluation report provides readers with relevant information pertaining to the evaluation and its
results. Generically, an evaluation report includes the components discussed in Table 16.

TABLE 16: COMPONENTS OF AN EVALUATION REPORT

• Executive summary: a short summary of the process followed, the results of the evaluation, the
objectives achieved, lessons learnt, questions answered and needs fulfilled is presented first.
• Introduction: the background of the evaluation, its purpose and the major activities of the
project are presented in the introduction to the report.
• Evaluation methods and tools: a short explanation of the tools and methods used to gather
information and conduct the evaluation. Examples of the tools and plans could be presented
as appendices linked to this section of the report.
• Summary of results: here, a data analysis is provided.
• Interpretation of results: results are interpreted and presented. The results must be linked to
the outcomes of the system or the goals and objectives of the plan.
• Conclusion: present how the project objectives were met and whether the purpose of the
evaluation was achieved.
• Recommendations: key points must be summarised in this section and improvements
should be suggested.
(Zarinpoush, 2006:51)
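The components in Table 16 double as a checklist when drafting a report. Below is a minimal sketch that assembles a skeleton in the prescribed order and flags missing sections; the draft content is hypothetical:

```python
# The prescribed section order, per Table 16 (Zarinpoush, 2006:51)
REPORT_SECTIONS = [
    "Executive summary", "Introduction", "Evaluation methods and tools",
    "Summary of results", "Interpretation of results",
    "Conclusion", "Recommendations",
]

def build_skeleton(content: dict) -> tuple:
    """Assemble the report in the prescribed order; list any missing sections."""
    missing = [s for s in REPORT_SECTIONS if not content.get(s)]
    body = "\n\n".join(f"## {s}\n{content.get(s, '[TO BE COMPLETED]')}"
                       for s in REPORT_SECTIONS)
    return body, missing

draft = {
    "Introduction": "Background and purpose of the evaluation.",
    "Summary of results": "Data analysis of the indicators.",
}
report, missing = build_skeleton(draft)
print(missing[:2])  # ['Executive summary', 'Evaluation methods and tools']
```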

Here is an example of an evaluation report:

• KZN Department of Economic Development. (2008). DED Composite Report on Project
Evaluation. Monitoring and Evaluation Unit.
http://evaluations.dpme.gov.za/evaluations/26/documents/051705c8-7aa7-471c-bcf4-
dff62dcc7157 (accessed 18 June 2020).

Communicating Monitoring and Evaluation Results

1. What method do you think would be best to communicate the results of your evaluation to the relevant
stakeholders?
2. Explain why this method would be the most suitable given the target audience.
3. Identify and explain the importance of communicating evaluation results.

7.5.12 Monitoring and Evaluation Success Factors

According to MacKay (2007:22), the success of a monitoring and evaluation system relies on the
information the system provides. The information should be of such calibre that it can be used to:

• Support government policy making, including performance budgeting and national
planning;
• Support policy development and programme planning;
• Support programme and project management; and
• Serve accountability purposes.
(MacKay, 2007:23)

Information provided by the system should be of such high quality that it becomes institutionalised
and sustainable.

The success of the system relies on the critical success factors identified within each programme or
project. These factors will be linked to the strategy of the department and remain the focus of all
activities involved in the programme or project.

Read more about this concept in this article:

• Lahey, R. (2010). The Canadian monitoring and evaluation (M&E) system: lessons
learned from 30 years of development. ECD Working Paper Series.
http://documents.worldbank.org/curated/en/865531468226748462/pdf/654070NWP023
0c0C0disclosed011040110.pdf (accessed 18 June 2020).

7.5.13 Conclusion

Implementing a monitoring and evaluation system successfully will ultimately help ensure that the 12
outcomes outlined by Cabinet are met. If these objectives are met, all South Africans will benefit
from the dedication and hard work of monitoring and evaluation practitioners.

Recap Your Knowledge

1. Draw up a model that shows how you could apply the success factors of monitoring and evaluation to the
government-wide monitoring and evaluation system.
2. Within this model, show where your department or organisation would play a role in the success of monitoring
and evaluation.
3. Present your model to the class.
4. Give critical feedback on fellow students' models.
5. After the presentations are complete, use the knowledge you have gained from this course to explain the
importance of monitoring and evaluation from a public-sector perspective.

7.5.14 Key Points

Some key points made in this section were:

• The implementation of monitoring and evaluation is not necessarily linear – evaluation can
take place at any stage in the life cycle of a project or programme;
• The results of evaluation must be communicated to stakeholders;
• The DPME offers a 10-step process for monitoring and evaluation as follows:
o Step 1: Examine the context and current state of the problem
o Step 2: Administrative information systems and data sets are used to collect data about
the problem and the programme intended to rectify it
o Step 3: List indicators, targets and baselines, so that you know what you are trying to
achieve, and whether you have achieved it
o Step 4: Group indicators by policy objective
o Step 5: If policy objectives have no indicators you must design them
o Step 6: Review link between inputs, outputs, outcomes and impacts, and identify causal
relationships: each link in the M&E chain must serve a purpose, and must relate to inputs,
outputs, outcomes and impacts of the policy or programme
o Step 7: Reporting approach – decide how best to report back to your various
stakeholders
o Step 8: Evaluate approach – here you evaluate the success of the programme or policy
o Step 9: Capacity building plan – make sure you have the required skills to conduct the
M&E process
o Step 10: Communication plan – communicate your results to all who need to know them
• We briefly consider the success factors of M&E: the key is usable information.

Remember to do your digital assessment for this section online!

It will help you strengthen and embed your understanding of the course. You will not be able to
change your answers once you have submitted them, so make sure you have completed the
relevant section of coursework first. Where you see Select all that are relevant, be aware that
any number of the options presented could be correct. You will lose marks for incorrect
selections, so choose carefully. Your combined marks from these assessments count towards a
total of 20% of your course mark.

8. REFERENCES

Adato, M. (2011). Combining quantitative and qualitative methods for program monitoring and
evaluation: why are mixed-method designs best?
http://documents.worldbank.org/curated/en/633721468349812987/pdf/643860BRI0Mixe00Box036
1535B0PUBLIC0.pdf (accessed 18 June 2020).

BusinessDictionary.com, 2019, ‘Annual Report’,
http://www.businessdictionary.com/definition/annual-report.html (accessed 18 March 2019).

BusinessDictionary.com, 2019, ‘Feasibility Report’,
http://www.businessdictionary.com/definition/feasibility-study.html (accessed 18 March 2019).

BusinessDictionary.com, 2019, ‘Proposal’,
http://www.businessdictionary.com/definition/proposal.html (accessed 18 March 2019).

Centre for Learning on Evaluation and Results (Clear), 2012, African Monitoring and Evaluation
Systems: Exploratory Case Studies, Johannesburg: Graduate School of Public and Development
Management, University of the Witwatersrand.

Chabane, C. 2011, Views on performance management, http://www.sanews.gov.za (accessed 18
March 2019).

City of Johannesburg, 2012, Annexure 3: The City of Johannesburg's Monitoring and evaluation
Framework,
https://www.joburg.org.za/documents_/Documents/Intergrated%20Development%20Plan/2013-
16%20IDP%2017may2013%20final.pdf (accessed 18 March 2019).

Department of Economic Development. (2011). The New Growth Path: Framework.
http://www.economic.gov.za/communications/publications/new-growth-path-series/download
(accessed 18 June 2020).

Department of Public Service and Administration, 2018, ‘The Batho Pele Vision',
http://www.dpsa.gov.za/documents/Abridged%20BP%20programme%20July2014.pdf (accessed
18 March 2019).

DPME, 2011, National Evaluation Policy Framework,
http://www.thepresidency.gov.za/MediaLib/Downloads/Home/Ministries/National_Evaluation_Policy
_Framework.pdf (accessed 18 March 2019).

DPME, 2012, ‘Government-Wide Monitoring and Evaluation’,
https://www.dpme.gov.za/publications/Policy%20Framework/Functions%20of%20an%20M%20and
%20E%20Component%20in%20National%20Government%20Departments%20(2).pdf (accessed
18 March 2019).

DPME, 2019, ‘Evaluations’,
https://www.dpme.gov.za/keyfocusareas/evaluationsSite/Pages/default.aspx (accessed 18 March
2019).

Department of Public Service and Administration. (2012). Public Service Act, 1994.
http://www.dpsa.gov.za/dpsa2g/documents/acts&regulations/psact1994/PublicServiceAct.pdf
(accessed 18 June 2020).

Goldman, I., Engela, R., Akhalwaya, I., Gasa, N., Leon, B., Mohamed, H. and Phillips, S. (2012).
Establishing a national M&E system in South Africa. The World Bank Special Series on The Nuts &
Bolts of Monitoring and Evaluation Systems, 21, 1-11,
http://documents.worldbank.org/curated/en/556311468101955480/pdf/760630BRI0Nuts00Box374
357B00PUBLIC0.pdf (accessed 18 June 2020).

Ishmail, Z. 2012, Building a results-based monitoring and evaluation system for the Western Cape
government of South Africa, In: PSC News, February/March 2012.
http://www.psc.gov.za/newsletters/docs/2012/K-
9555%20PSC_6th%20edition%20magazine_DevV11.pdf (accessed: 18 March 2019).

Kaplan, R.S. and Norton, D.P. 1996, The Balanced Scorecard, Harvard Business School Press,
Boston, Massachusetts.

Lahey, R. 2006, A Framework for Developing an Effective Monitoring and evaluation System in the
Public Sector – Key Considerations from International Experience, Solutions, Canada.
http://www.ecdg.net/wp-content/uploads/2011/12/Framework-for-developing-an-effective-ME-
system-in-the-public-sector-2009_Lahey_good.doc (accessed: 18 March 2019).

Lahey, R. (2010). The Canadian monitoring and evaluation (M&E) system: lessons learned from 30
years of development. ECD Working Paper Series.
http://documents.worldbank.org/curated/en/865531468226748462/pdf/654070NWP0230c0C0discl
osed011040110.pdf (accessed 18 June 2020).

MacKay, K. 2007, ‘How to build M&E systems to support better government’, Washington: The
World Bank.

McCoy, D. and Bamford, D. 1998, How to Conduct a Rapid Situation Analysis: A Guide for Health
Districts in South Africa, Durban: Health Systems Trust, www.hst.org.za/uploads/files/rapid.pdf
(accessed 14 March 2014).

Miller, R. nd, ‘Situation analysis’,
http://www.unfpa.org/webdav/site/global/shared/documents/publications/2010/srh_guide/tools_situ
ationanalysis.html (accessed 18 March 2019).

Molepo, A.N. 2011, Monitoring & Evaluation Framework for the Public Service, M&E Learning
Network, SA Reserve Bank Conference Centre, 15 February 2011.

Molleman, E. and Timmerman, H. 2003, 'Performance management when innovation and learning become critical performance indicators', Personnel Review, 32(1), 93-113.
https://www.researchgate.net/profile/Eric_Molleman/publication/235285519_Performance_management_when_innovation_and_learning_become_critical_performance_indicators/links/5948df07458515db1fd8df78/Performance-management-when-innovation-and-learning-become-critical-performance-indicators.pdf (accessed: 18 June 2020).

Nkwinti, G. nd, National Development Plan and the New Growth Path: Transforming the Economy.
http://kzntopbusiness.co.za/site/search/downloadencode/nLaqaaKelpO8mnjc (accessed: 18 June 2020).

National Treasury, 2003, Local Government: Municipal Finance Management Act No. 56 of 2003,
http://mfma.treasury.gov.za/MFMA/Legislation/Local%20Government%20-%20Municipal%20Finance%20Management%20Act/Municipal%20Finance%20Management%20Act%20(No.%2056%20of%202003).pdf (accessed: 18 June 2020).

National Treasury, 2005, Treasury Regulations for Departments, Trading Entities, Constitutional Institutions and Public Entities,
http://www.treasury.gov.za/legislation/pfma/regulations/gazette_27388%20showing%20amendments.pdf (accessed: 18 June 2020).

National Treasury, 2007, Framework for Managing Programme Performance Information,
www.treasury.gov.za/publications/guidelines/FMPI.pdf (accessed: 18 March 2019).

National Treasury, 2013, National Treasury Strategic Plan 2013/2017,
http://www.treasury.gov.za/publications/strategic%20plan/Strat%20Plan%202013-2017.pdf (accessed: 18 June 2020).

National Treasury, 2014, Public Finance Management Act No. 1 of 1999,
http://www.treasury.gov.za/legislation/pfma/act.pdf (accessed: 18 June 2020).

National Treasury, 2014, Division of Revenue Act No. 10 of 2014,
http://www.treasury.gov.za/legislation/acts/2014/Division%20of%20Revenue%20Act,%202014%20(Act%20No.%2010%20of%202014).pdf (accessed: 18 June 2020).

National Treasury, 2019, Division of Revenue Bill, 2019,
https://www.gov.za/sites/default/files/gcis_document/201902/b5-2019division-revenue-bill_0.pdf (accessed: 18 June 2020).

PMG, 2014, National School of Government & Public Service Commission mandate and challenges,
https://pmg.org.za/committee-meeting/17515/ (accessed: 18 March 2019).

Presidency, 2007, Policy Framework for the Government-wide Monitoring and Evaluation System,
https://www.dpme.gov.za/publications/Guides%20Manuals%20and%20Templates/Functions%20of%20an%20M%20and%20E%20component%20in%20National%20Government%20Departments.pdf (accessed: 18 March 2019).

Presidency, 2009, Improving Government Performance: Our Approach,
https://www.dpme.gov.za/publications/Policy%20Framework/Improving%20Government%20Performance_Our%20Approach.pdf (accessed: 18 March 2019).

Presidency, 2010, Guide to the Outcomes Approach,
http://www.dpme.gov.za/publications/Guides%20Manuals%20and%20Templates/Guideline%20to%20outcome%20approach.pdf (accessed: 18 June 2020).

Presidency, 2011, National Evaluation Policy Framework,
http://www.thepresidency.gov.za/MediaLib/Downloads/Home/Ministries/National_Evaluation_Policy_Framework.pdf (accessed: 18 March 2019).

Presidency, 2012, National Evaluation Plan 2013-14 to 2015-16,
https://www.dpme.gov.za/publications/Policy%20Framework/National%20Evaluation%20Plan%202013%20-%2014.pdf (accessed: 18 June 2020).

Presidency, nd, Monitoring and Evaluation: Capacity-building within the Public Sector,
http://www.thepresidency.gov.za/learning/curriculum.pdf (accessed: 18 March 2019).

PSC, 2008, Basic Concepts in Monitoring and Evaluation,
http://www.psc.gov.za/documents/docs/guidelines/PSC%206%20in%20one.pdf (accessed: 18 March 2019).

PSC, 2012, 'Evolution of monitoring and evaluation in the South African public service',
http://www.psc.gov.za/newsletters/docs/2012/K-9555%20PSC_6th%20edition%20magazine_DevV11.pdf (accessed: 18 June 2020).

PSC, 2013, Annual Report to Citizens for the 2012/2013 Financial Year,
http://www.psc.gov.za/documents/2013/ARC%20English.pdf (accessed: 18 March 2019).

PSC, 2018, Annual Report to Citizens for the 2017/2018 Financial Year,
http://www.psc.gov.za/documents/reports/2018/FINAL_PUBLIC_SERVICE_COMMISSION_Annual_Report_2017_2018_%2021_SEPT_2018.pdf (accessed: 18 March 2019).

World Bank, 2004, Influential Evaluations: Evaluations that Improved Performance and Impacts of Development Programs, Operations Evaluation Department Knowledge Programs and Evaluation Capacity Development Group (OEDKE), Washington.

Zall Kusek, J. and Görgens-Albino, M. 2009, Making Monitoring and Evaluation Systems Work: A Capacity Development Toolkit, The World Bank, Washington.

Zarinpoush, F. 2006, 'Project evaluation guide for nonprofit organisations: fundamental methods and steps for conducting project evaluation',
http://sectorsource.ca/sites/default/files/resources/files/projectguide_final.pdf (accessed: 18 March 2019).

9. GLOSSARY OF KEY TERMS AND ABBREVIATIONS

The lists below will help you understand important terminology used throughout this course.
Use them as a point of reference as you work through the material, and feel free to add your own
terms and abbreviations.

9.1 KEY TERMINOLOGY

Accountability: The obligation of government to account for its activities, accept responsibility for them, and disclose the results in a transparent manner.

Activity: Actions taken or work performed through which inputs, such as funds, technical assistance and other resources, are mobilised to produce specific outputs.

Batho Pele: Meaning "people first" in English, Batho Pele consists of eight principles intended to encourage and promote an efficient and effective public service.

Data: Specific quantitative and qualitative information or facts that are collected and analysed.

Department of Performance Monitoring and Evaluation (DPME): Established in 2010, the DPME is responsible for continuous improvement in service delivery through monitoring and evaluation.

Effectiveness: The extent to which a programme or intervention has achieved its objectives.

Efficacy: The extent to which an intervention produces the expected results under ideal conditions in a controlled environment.

Efficiency: A measure of how economically inputs are converted into results.

Feedback: The process in which the effect or output of an activity or input is returned to change the next action.

Goal: A broad statement of a desired outcome for a programme.

Government-wide monitoring and evaluation system (GWMES): A system developed by the Presidency that describes monitoring and evaluation across government.

Impact: The long-term effect of programmes or interventions.

Inputs: The financial, human and material resources used in a programme or intervention.

Intervention: A specific activity intended to bring about change in some aspect of an organisation or department.

Logical framework: A management tool used to improve the design of interventions.

Monitoring and evaluation plan: A multi-year implementation strategy for the collection, analysis and use of data for a specific programme.

Objective: A statement of a desired programme or intervention result that meets the criteria of being specific, measurable, achievable, realistic and time-bound (SMART).

Outputs: The direct products, goods and services that result from a programme or intervention.

Performance: The degree to which an intervention or organisation operates according to specific criteria or standards.

Performance indicators: Measures used to track performance in relation to inputs, activities, outputs, outcomes and impacts.

Performance information: A generic term for non-financial information about government services and activities.

Performance standards: Expressions of the minimum acceptable level of performance.

Performance targets: Expressions of the specific level of performance to be achieved by the organisation or by an individual.

Relevance: The extent to which the objectives, outputs or outcomes of an intervention are consistent with beneficiaries' requirements, the organisation's policies and the country's needs.

Reliability: The consistency or dependability of the data collected.

Service delivery: The provision of services to citizens.

Transparency: The minimum degree of disclosure through which agreements, dealings, practices and transactions can be seen clearly by the public.

© Regenesys School of Public Management 110


9.2 ABBREVIATIONS

DoRA: Division of Revenue Act
DPME: Department of Performance Monitoring and Evaluation
GWMES: Government-wide monitoring and evaluation system
M&E: Monitoring and evaluation
MFMA: Municipal Finance Management Act
PFMA: Public Finance Management Act
PSA: Public Service Act
PSC: Public Service Commission

10. VERSION CONTROL

Date of first draft: January 2014
Version number: 7.1_e_f
Date of publication: June 2020
Publisher: Regenesys Management, Sandton

Document Change History

Date               Version   Initials  Description of Change
29 July 2015       5.6_e_f   SK        Updated SG template and checked hyperlinks
11 December 2015   5.7_e_f   SK        Finalised SG for 2016 intake
27 February 2017   5.8_e_f   SK        Updated SG template and checked hyperlinks
19 March 2018      5.9_e_f   SK        Prepared for 2018 intake
23 August 2018     6_e_f     SK        Links up to date
18 March 2019      6.1       KA        Slight amendments, new content additions, checked hyperlinks, edited glossary, updated and added to digital assessments
22 March 2019      6.2       TS        Reviewed
22 March 2019      6.3_e     RT        Edited; wrote key points
26 March 2019      6.4_e_f   SK        Formatted
04 September 2019  7         TL        Updated
05 September 2019  7_e_f     SK        Formatted
18 June 2020       7.1_e_f   SK        Updated links, intro section and models
