T&D M&E Framework & Tools Handbook V2010
Department of Education
HANDBOOK
DepED-EDPITAF-STRIVE
Training and Development
June 2010
This document, The Training and Development Monitoring and
Evaluation Framework and Tools Handbook, was developed and
validated in Regions VI, VII and VIII, Divisions of Negros Occidental,
Bohol/Tagbilaran and Northern Samar through the AusAID-funded
project, STRIVE (Strengthening the Implementation of Basic
Education in selected Provinces in the Visayas), in coordination with
the EDPITAF (Educational Development Project Implementing Task
Force), and in consultation with the TEDP-TWG, NEAP and the
Bureaus of the Department of Education.
T&D System M&E Framework and Tools Handbook, June 2010 Page 2
Table of Contents
Section 1.0: Training and Development (T&D) System Monitoring and Evaluation General Framework .......... 1
Section 2.0: T&D System Monitoring and Evaluation for the Training and Development Needs Assessment (TDNA) System .......... 6
    2.1 The TDNA System, p6
    2.2 Monitoring and Evaluation for the NCBTS-TSNA, p7
    2.3 Monitoring and Evaluation for the TDNASH, p30
    2.4 Monitoring and Evaluation for the Organizational TDNA for the Region and Division, p40
Section 4.0: T&D System Monitoring and Evaluation for the Program Designing and Resource Development (PDRD) System .......... 85
    4.1 The PDRD System, p85
    4.2 Monitoring and Evaluation for Program Designing, p86
    4.3 Monitoring and Evaluation for Resource Development, p95
Acknowledgements .......... 142
Section 1.0: Training & Development System Monitoring and Evaluation
General Framework
Integral to the Training & Development (T&D) System is its monitoring and evaluation (M&E) support, which ensures the effectiveness and efficiency of the system's operations. M&E activities are vital in ensuring that program implementation adheres to the standards for the system's inputs, processes, outputs and outcomes. In carrying out these activities, M&E instruments are indispensable, and the processes relating to the application and use of these instruments are equally important.
The M&E of the Training and Development Needs Assessment (TDNA), Professional Development Planning (PDP), Program Designing and Resource Development (PDRD) and Program Delivery (PDy) Systems supports their integration and adherence to the overall goal and objectives of the entire system. While the M&E framework is the basis for the internal quality assurance of the system, its results also inform external quality assurance of the system's adherence to the standards and specifications expected for outputs at the different levels. Moreover, the M&E results provide information on the strengths and weaknesses of the Training & Development System itself and of its sub-systems, supporting sustainability and improvement.
Below is the General M&E Framework containing the standards at the input, process, output and
outcome system levels covering the T&D operations for the four sub-systems at the region, division
and school levels.
TDNA Sub-system
Output:
- Region: Reliable and valid TDNA results for the Regional Organizational TDNA
- Division: Reliable and valid TDNA results for the Division Organizational TDNA
- School: Reliable and valid NCBTS-TSNA results

Professional Development Planning (PDP) Sub-system
Input:
- Region: Competent and sufficient planners for the conduct of the Regional MPPD; available and relevant support resources such as the MPPD Guide, template, tools, consolidated Organizational TDNA results, REDP, student/pupil performance data and funds; complete and accurate consolidation/analysis/profile of Division MPPDs
- Division: Competent and sufficient planners for the conduct of the Division MPPD and the cluster orientation on the SPPD; available and relevant support resources such as the MPPD Guide, template, tools, consolidated Organizational TDNA and TSNA results, consolidated IPPDs of school heads and teachers, DEDP, student/pupil performance data and funds; complete and accurate consolidation/analysis/profile of SPPDs
- School: Competent and sufficient planners for the conduct of the SPPD and IPPD; available and relevant support resources such as the SPPD and IPPD guides, templates, tools, consolidated TSNA results, consolidated IPPDs for teachers, SIP, student/pupil performance data and funds; complete and accurate consolidation/analysis/profile of IPPDs
Process:
- Region: Systematic and efficient conduct of the Regional MPPD
- Division: Systematic and efficient conduct of the Division MPPD
- School: Systematic and efficient conduct of the IPPD/SPPD
Output:
- Region: Relevant and needs-based Regional MPPD
- Division: Relevant and needs-based Division MPPD
- School: Relevant and needs-based IPPD/SPPD
Outcome:
- Region: MPPD results serve as basis for Program Designing
- Division: Systematic and periodic conduct of the IPPD for school heads
- School: SPPD results serve as basis for Program Designing

Program Designing and Resource Development (PDRD) Sub-system
Input:
- Region: Competent and sufficient personnel of the Regional PDRD-WG; available support resources: MPPD (Region and Division), adequate funds, PDRD System Vol. 4, resource persons
- Division: Competent and sufficient personnel of the Division PDRD-WG; available support resources: MPPD (Division) and SPPD (Schools), adequate funds, PDRD System Vol. 4, resource persons
- School: Competent and sufficient personnel of the school's PDRD-WG; available support resources: SRC, SIP (AIP), EMIS, SPPD, adequate funds, PDRD System Vol. 4, resource persons
Process:
- Region: Systematic and efficient conduct of the Resource Package Development in the region
- Division: Systematic and efficient conduct of the Resource Package Development in the division
- School: Systematic and efficient conduct of the Resource Package Development in the school
Output:
- Region: Comprehensive, flexible and needs-based program designs of the region
- Division: Comprehensive, flexible and needs-based program designs of the division
- School: Comprehensive, flexible and needs-based program designs of the school
Outcome:
- Region: Improved work performance of clientele in all areas
- Division: Improved work performance of clientele in all areas
- School: Improved school-based practice

Program Delivery (PDy) Sub-system, F3 Component
Input:
- All levels: Complete, available and relevant support resources (as required in the approved resource package/JEL Contract, funds); enabling policies, standards and processes
Process:
- Region: Effective and efficient management and conduct of F3
- Division: Effective and efficient management and conduct of F3; relevant and needs-based technical assistance to the division conduct of F3
- School: Effective and efficient management and conduct of F3; relevant and needs-based technical assistance to the school conduct of F3
Outcome:
- [Row truncated in source: … clientele]
Section 2.0: T&D System Monitoring and Evaluation for the Training and
Development Needs Assessment (TDNA) System
The Training and Development Needs Assessment (TDNA) System has three major needs assessment processes: the National Competency-Based Teacher Standards – Teacher Strengths and Needs Assessment (NCBTS-TSNA), the Training and Development Needs Assessment for School Heads (TDNASH), and the Organizational Training and Development Needs Assessment at the region and division levels.
The diagram below shows the procedural design for the TDNA monitoring and evaluation at
the Division and Regional levels. The personnel from the TDNA-Working Group (WG)
responsible for M&E are tasked to monitor and evaluate the preparation, conduct and
consolidation of the TDNA results.
[Figure 1.4: TDNA System Monitoring & Evaluation, carried out by the Division T&D Unit/Regional T&D Office, triggered by the call for situational analysis]
1.4.1 Identify and prepare the TDNA-WG and resources for M&E
1.4.2 Monitor the process and compliance with standards
1.4.3 Record the TDNA results of monitoring and evaluation (TDIS database)
1.4.4 Prepare the TDNA System M&E report
1.4.5 Review the M&E report and findings (T&D Office)
1.4.6a Inform monitored schools/Divisions of M&E findings
1.4.6b Make necessary adjustments to the TDNA System process based on M&E
1.4.6c Conduct regional policy review and adjustment of the TDNA System
At the Division or Regional level, the M&E personnel prepare the M&E report and inform the T&D Office or Unit of the findings. The T&D Office/Unit then develops recommendations for improving the process, which inform regional policy review and adjustment of the TDNA System.
2.2. M&E for the NCBTS-TSNA
A number of M&E instruments have been developed to support the NCBTS Orientation and the administration of the NCBTS-TSNA tool. The following tools are available to support the NCBTS-TSNA processes.
Input tools:
- T&D-M&E Form 1: Individual Profile Template (for ES/PSDS, SHs, NCBTS Coordinator)
- NCBTS-M&E Form 1: Teacher Profile for NCBTS-TSNA
- Checklist Prior to Conduct of NCBTS-TSNA (incorporated into the NCBTS Guide)
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation
of the NCBTS-TSNA process:
1. What will be monitored: NCBTS Implementers' details in relation to their current position, their level of experience and qualification
   How it will be monitored: All NCBTS Implementers will be asked to complete the profile
   M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
   Responsible for the monitoring: TDNA-WG
   When the monitoring will take place: Prior to their involvement in the NCBTS-TSNA process
   How the results will be used: Results will be analyzed to ensure NCBTS Implementers have the required KSAs, and will be entered into the TDIS.

2. What will be monitored: Teachers' details in relation to their current position, their level of experience and qualification
   How it will be monitored: All teachers will be asked to complete the profile
   M&E tool to be used: NCBTS-M&E Form 1: Teacher Profile for NCBTS-TSNA
   Responsible for the monitoring: TDNA-WG
   When the monitoring will take place: Prior to the accomplishment of the NCBTS-TSNA Tool
   How the results will be used: Results will be entered into the TDIS database along with their corresponding NCBTS-TSNA results. Recommendations based on an analysis of the results should be included in the Program Completion Report.

3. What will be monitored: The overall effectiveness of the workshop as delivered by the whole team
   How it will be monitored: Each of the trainers will be asked to make an assessment of the orientation
   M&E tool to be used: NCBTS-M&E Form 4: Trainers' Assessment of the NCBTS Orientation Workshop
   Responsible for the monitoring: Division TDNA-WG
   When the monitoring will take place: Upon completion of the NCBTS orientation workshop
   How the results will be used: Results will be collated and analyzed by the TDNA-WG. A summary of the results will be included in the Program Completion Report and will inform future training.

4. What will be monitored: The priority training needs of teachers
   How it will be monitored: The NCBTS Coordinator and the School Head will consolidate the results from the administration of the NCBTS-TSNA tool
   M&E tool to be used: NCBTS-M&E Form 7: School's NCBTS-TSNA Consolidation Template
   Responsible for the monitoring: TDNA-WG
   When the monitoring will take place: After the accomplishment of the NCBTS-TSNA tool
   How the results will be used: Results will be used to inform school and division plans for professional development, and will be submitted to the Division.
T&D-M&E Form 1: Individual Profile Template
I PERSONAL DATA
Name:
III. TRAINING ATTENDED OVER THE LAST THREE YEARS
Please check training focus and management level for all training attended over the last three
years.
Resource Materials
Development
Planning
Management
Policy Development
Research
V. TRAINING AND DEVELOPMENT EXPERIENCES
Identify which of the following specific areas you consider to be your
area(s) of expertise:
I certify that the information I have given in response to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.
Date: Signature:
Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.
NCBTS-M&E Form 1: Teacher’s Profile for NCBTS-TSNA
NCBTS-M&E Form 2: Learning Process Observation and
Facilitation Skills
This form is to be used during the actual delivery of a program. A Process Observer will need to be
assigned to complete the Learning Process Observation for each session. Results should be used to
inform daily debriefing sessions. At the end of this form is a checklist of facilitation skills which may
be observed and recorded.
Activity
Analysis
Abstraction
Application
Concluding
Activity
Observe if the skill has been demonstrated by the Facilitator. If so, put a check in the appropriate
column.
NCBTS-M&E Form 3: NCBTS Coordinator’s Checklist
Please assess the competency of the NCBTS Coordinator according to the following indicators by
checking under the appropriate column.
NCBTS-M&E Form 3: NCBTS Coordinator’s Checklist
Consolidation Template
[Consolidation table: indicator rows 1–10 with tally, percentage (%), Total M, and Grand Total columns]
NCBTS-M&E Form 4: Trainers' Assessment of the NCBTS Orientation Workshop
Please assess the effectiveness of the entire workshop according to the indicators below.
Please refer to the following rating scale:
4-Very High (VH); 3-High (H); 2-Low (L); 1-Very Low (VL)
After the conduct of the Orientation Program by the Team and considering participants' outputs, I believe that …… (Rating: 1 2 3 4)
What changes would you like to make to improve similar workshops in the future? Why?
Recommendations
NCBTS-M&E Form 4: Trainers' Assessment of the NCBTS Orientation Workshop Consolidation Template
Give this instrument to the trainers prior to the beginning of the workshop. Brief the trainers on the content and purpose of the instrument prior to administration. Consolidate the results based on the accomplished instruments.
II. Scoring and Consolidation -This can efficiently be done using MS Excel.
Example: a tally of 7 responses of "1", 9 of "2", 4 of "3" and 5 of "4" gives 7x1=7, 9x2=18, 4x3=12 and 5x4=20. The weighted sum is 7+18+12+20=57 and the total number of responses is 7+9+4+5=25, so the mean rating is 57/25=2.28.

[Consolidation table rows for items 1–16]
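The scoring and consolidation step can also be sketched in a few lines of code (a hypothetical illustration of the same arithmetic as the worked example above; the handbook itself recommends MS Excel):

```python
# Number of trainers who gave each rating (1-4), mirroring the
# worked example: 7, 9, 4 and 5 responses.
tallies = {1: 7, 2: 9, 3: 4, 4: 5}

weighted_sum = sum(rating * count for rating, count in tallies.items())  # 7 + 18 + 12 + 20 = 57
total_responses = sum(tallies.values())                                  # 7 + 9 + 4 + 5 = 25
mean_rating = weighted_sum / total_responses                             # 57 / 25 = 2.28

print(mean_rating)  # 2.28
```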
NCBTS-M&E Form 5: Trainees’ End of the F3 Program
Assessment
Direction: Please assess the effectiveness of the entire F3 component of the program according to the indicators below. Please refer to the following rating scale:
4-Strongly Agree (SA); 3-Agree (A); 2-Disagree (D); 1-Strongly Disagree (SD)

After the conduct of the F3 component of the program, I believe that … (Rating: 1-SD, 2-D, 3-A, 4-SA)
A Program Planning/Management/Preparation
1 the training program was delivered as planned
2 the training program was managed efficiently
3 the training program was well-structured
B Attainment of Objectives
4 the program objectives were clearly presented
5 the session objectives were logically arranged
6 the program and session objectives were attained
C Delivery of Program Content
7 program content was appropriate to trainees’ roles and responsibilities
8 content delivered was based on authoritative and reliable sources
9 new learning was clearly presented
10 the session activities were effective in generating learning
11 adult learning methodologies were used effectively
12 management of learning was effectively structured e.g. portfolio,
synthesis of previous learning, etc.
D Trainees’ Learning
13 trainees were encouraged to consider how ideas and skills gained
during the training could be incorporated into their own practices
14 contribution of all trainees, both male and female, were encouraged
15 trainees demonstrated a clear understanding of the content delivered
E Trainers’ Conduct of Sessions
16 the trainers’ competencies were evident in the conduct of the sessions
17 teamwork among the trainers and staff was manifested
18 trainers established a positive learning environment
19 training activities moved quickly enough to maintain trainees’ interest
F Provision of Support Materials
20 training materials were clear and useful
21 PowerPoint presentations supported the flow of the sessions
22 the resources provided were appropriate to trainees’ needs
G Program Management Team
23 Program Management Team members were courteous
24 Program Management Team was efficient
25 Program Management Team was responsive to the needs of trainees
H Venue and Accommodation
26 the venue was well lighted and ventilated
27 the venue was comfortable with sufficient space for program activities
28 the venue had sanitary and hygienic conditions
29 meals were nutritious and sufficient in quantity and quality
30 the accommodation was comfortable with sanitary and hygienic
conditions
I Overall
31 I have the knowledge and skills to apply the new learning
32 I have the confidence to implement the JEL contract
What do you consider your most significant learning from the program?
What changes would you suggest to improve similar programs in the future?
Briefly describe what you have learned and how it will help you with your work.
Collate the accomplished NCBTS-M&E Form 5: Trainees' End of the F3 Program Assessment, and review the results. Use the table below to consolidate the results for the quantitative items.
Note: The scoring and consolidation can be efficiently done using MS Excel.
Use the scale below to interpret mean rating for each item of the assessment:
3.5 to 4.0 = (SA) Strongly Agree
2.5 to 3.4 = (A) Agree
1.5 to 2.4 = (D) Disagree
1.0 to 1.4 = (SD) Strongly Disagree
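Mapping a mean rating to its verbal interpretation can be sketched as follows (a hypothetical helper function; the band boundaries are taken from the scale above):

```python
def interpret(mean_rating):
    """Return the verbal interpretation band for a mean item rating (1.0-4.0)."""
    if mean_rating >= 3.5:
        return "Strongly Agree (SA)"
    if mean_rating >= 2.5:
        return "Agree (A)"
    if mean_rating >= 1.5:
        return "Disagree (D)"
    return "Strongly Disagree (SD)"

print(interpret(2.28))  # falls in the 1.5-2.4 band: Disagree (D)
```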
[Consolidation table rows for items 1–32, grouped under sections A–I, e.g. D Trainees' Learning, F Provision of Support Materials, I Overall]
What do you consider your most significant learning from the program?
What changes would you suggest to improve similar programs in the future?
Briefly describe what you have learned and how it will help you with your work.
NCBTS-M&E Form 6: Documentation Tool for the Conduct of Division, Cluster or School
Level NCBTS-TSNA Implementation
This form is to be used to support Regional monitoring of the NCBTS-TSNA process at the Division
level and Division monitoring of district and school level activities. It is expected that the assessment
will be based on observations, discussions with the implementing team and review of relevant
documents.
NCBTS-M&E Form 7: School’s NCBTS-TSNA Consolidation
Template
Division/District/School _________________________ Date: __________________
Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way
Use the scale above to assess the extent to which the conduct of TDNA documentation adhered to
the following:
Recommendations:
Name: ___________________________________
Designation: _________________________________
Date: ____________________________________
Columns: Domain/Strand No. | Teacher's Percentage Score (T1, T2, T3, T4, …) | Total | Average Percentage Score = Total / (Number of Teachers)

[Rows: strands grouped by domain — Domain 1 (1.1, 1.2), Domain 2 (2.1–2.5), Domain 3 (3.1), Domain 4 (4.1–4.7), Domain 5 (5.1–5.4), Domain 6 (6.1), Domain 7 (7.1–7.3), each followed by a Total Domain row]
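The Average Percentage Score column is simply each strand's total divided by the number of teachers, for example (the teacher scores below are made-up figures for illustration):

```python
# Hypothetical percentage scores of four teachers (T1-T4) on one strand.
teacher_scores = [75.0, 80.0, 60.0, 85.0]

total = sum(teacher_scores)               # 300.0
average = total / len(teacher_scores)     # Total / (Number of Teachers)
print(average)  # 75.0
```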
2.3. M&E for the TDNASH
The following M&E tools are available to support the Training and Development Needs
Assessment for School Heads (TDNASH) process:
Process tools:
- TDNASH-M&E Form 1: Division Monitoring and Evaluation Tool of the Conduct of TDNASH
- TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of
the TDNASH process:
1. What will be monitored: School Heads' details in relation to their current position, their level of experience and qualification
   How it will be monitored: All School Heads will be asked to complete the profile
   M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
   Responsible for the monitoring: TDNA-WG
   When the monitoring will take place: Prior to the accomplishment of the TDNASH Tool
   How the results will be used: Results will be entered into the TDIS database along with their corresponding TDNASH results.

2. What will be monitored: The implementation of the TDNASH process at the school/cluster level
   How it will be monitored: Members of the Division TDNA-WG will be asked to observe the conduct of the TDNASH at the school/cluster level and complete the tool
   M&E tool to be used: TDNASH-M&E Form 1: Division Monitoring & Evaluation of the Conduct of TDNASH
   Responsible for the monitoring: Division TDNA-WG
   When the monitoring will take place: During the TDNASH process and the consolidation of results
   How the results will be used: Results will be collated and analyzed by the TDNA-WG and used to inform future TDNASH processes.

3. What will be monitored: The training and development needs of the School Heads
   How it will be monitored: The PSDS/ES will be asked to consolidate the School Head TDNASH results for a cluster of schools
   M&E tool to be used: TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template
   Responsible for the monitoring: TDNA-WG
   When the monitoring will take place: After the accomplishment of the TDNASH by a cluster of School Heads
   How the results will be used: Results will be analyzed and used to inform the Division on the training and development needs of School Heads, and will be incorporated into the MPPD and DEDP.
I PERSONAL DATA
Name:
(Surname) (First Name) (Middle Name)
Please check training focus and management level for all training attended over the last three
years.
Curriculum
Resource Materials
Development
Planning
Management
Policy Development
Research
Program Delivery Program Management
I certify that the information I have given in response to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.
Date: Signature:
Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.
TDNASH-M&E Form 2: Division Monitoring & Evaluation of the
Conduct of TDNASH
Activity Monitored:
School Head completion of TDNASH Supervisor
TDNASH-M&E Form 3: TDNASH Consolidated Cluster Results Template
Division _______________________________________
[Rows for School Heads SH 1 – SH 15]
PART II. TDNASH Cluster Summary Sheet for School Leadership Experience Level (SLEL)
Cluster Name: _______________________
Columns: Domains (D) | School Heads' SLEL Overall Rating obtained from the triangulation data (SH 1 – SH 15) | Cluster SLEL Ave. Ratings | Overall Level Equivalent

[Rows: strands grouped by domain — D1 (1.1–1.6), D2 (2.1a, 2.1b, 2.2a, 2.2b, 2.2c, 2.3), D3 (3.1a, 3.1b, 3.2a, 3.2b), D4 (4.1a, 4.1b, 4.1c, 4.1d, 4.2, 4.3, 4.4), D5 (5.1a, 5.1b, 5.2), D6 (6.1, 6.2a, 6.2b, 6.3), D7 (7.1, 7.2a, 7.2b, 7.2c, 7.3, 7.4)]
PART III. TDNASH Cluster Summary Sheet for Level of Importance (LOI)
Cluster Name: ________________________

Columns: Domains (D) | School Heads' LOI Overall Rating obtained from the triangulation data (SH 1 – SH 15) | Cluster LOI Ave. Ratings

[Rows: strands grouped by domain — D1 (1.1–1.6), D2 (2.1a, 2.1b, 2.2a, 2.2b, 2.2c, 2.3), D3 (3.1a, 3.1b, 3.2a, 3.2b), D4 (4.1a, 4.1b, 4.1c, 4.1d, 4.2, 4.3), D5 (5.1a, 5.1b, 5.2), D6 (6.1, 6.2a, 6.2b, 6.3), D7 (7.1, 7.2a, 7.2b, 7.2c, 7.3, 7.4)]
TDNASH-M&E Form 4: Documentation Tool for Division
Implementation of TDNASH
This form is to be used to support Regional monitoring of the TDNASH process at the Division level and Division monitoring of district and school level activities. It is expected that the assessment will be based on observations, discussions with the implementing team and review of relevant documents.
Division/District _________________________ Date: __________________
Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way
Use the scale above to evaluate the extent to which the conduct of TDNASH documentation adhered
to the following:
To what extent …… 1 2 3 4
1. was thorough planning conducted prior to the TDNASH orientation?
2. were participants oriented to the competencies expected of a School Head?
3. was the purpose of the TDNASH explained?
4. was the triangular process used for the TDNASH explained, e.g. three different respondents, group consensual assessment technique?
5. was a clear explanation provided on how to accomplish the TDNASH tools, e.g. manual and/or e-version?
6. was the scoring system for the TDNASH tool explained, e.g. continuum of indicators for each competency matched to school leadership experience levels?
7. were the steps involved in consolidating the triangulation results for an individual school head explained?
8. were the steps involved in consolidating TDNASH results for a group of school heads explained?
9. was an explanation provided on how to interpret individual and consolidated results?
10. was technical assistance provided when required?
11. were the M&E tools and processes implemented?
12. was there evidence of teamwork and collaboration amongst the TDNASH Implementers?
13. were recommendations for improving the TDNASH administration processes identified?
Recommendations:
Name:______________________________________
Designation: _________________________________
Date: _______________________________________
2.4. M&E of the Organizational TDNA for Region and Division
The following M&E tools are available to support the conduct of the Organizational TDNA:
Input tools:
- Region and Division: T&D-M&E Form 1: Individual Profile Template; Checklist of Available Resources for Organizational TDNA at the Region/Division level (incorporated into the Organizational TDNA Guide)
- School: Organizational TDNA not conducted at the school level
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of
the Organizational TDNA process:
1. What will be monitored: Respondents' details in relation to their current position, their level of experience and qualification
   How it will be monitored: All participants in the Organizational TDNA will be asked to complete the profile
   M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
   Responsible for the monitoring: Division/Region TDNA-WG
   When the monitoring will take place: Prior to the accomplishment of the Organizational TDNA Tool
   How the results will be used: Information will be entered into the TDIS database.

2. What will be monitored: The processes followed during the conduct of the Focus Group Discussion (FGD) at the Region/Division level
   How it will be monitored: A process observer will be appointed and will use the tool
   M&E tool to be used: Org'l TDNA-M&E Form 1: Organizational TDNA Tool for FGD Process at the Region/Division level
   Responsible for the monitoring: Division/Region TDNA-WG
   When the monitoring will take place: During the conduct of the FGD for the Organizational TDNA
   How the results will be used: Results will be shared with the FGD facilitators to identify best practices and areas for improvement. Recommendations for improving the process will be included in the Program Completion Report to inform future processes.

3. What will be monitored: The level of competency and the level of importance of the Division/Region for the various management competencies across service areas
   How it will be monitored: Results of the Organizational TDNA will be consolidated using the template provided
   M&E tools to be used: Org'l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template; Org'l TDNA-M&E Form 2b: Regional Organizational TDNA Scores Summary Template
   Responsible for the monitoring: Division/Region TDNA-WG
   When the monitoring will take place: Following the accomplishment of the Division/Region Organizational TDNA
   How the results will be used: Results will inform decisions on the training and development programs offered at the division level and will be incorporated into both the DEDP and REDP.

4. What will be monitored: The Organizational TDNA of the functional divisions/sections/units
   How it will be monitored: Results of the Organizational TDNA will be consolidated for each functional division/section/unit using the template provided
   M&E tool to be used: Org'l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template
   Responsible for the monitoring: Division/Region TDNA-WG
   When the monitoring will take place: Following the accomplishment of the Organizational TDNA
   How the results will be used: Results will inform decisions on the training and development programs offered at the division/regional level and will be incorporated into both the DEDP and REDP.

5. What will be monitored: The Organizational TDNA results of the various divisions across a region
   How it will be monitored: Results of the Organizational TDNA will be consolidated for all divisions within a region using the template provided
   M&E tool to be used: Org'l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template
   Responsible for the monitoring: Region TDNA-WG
   When the monitoring will take place: Following the submission of Division Organizational TDNA results
   How the results will be used: Results will inform decisions on the training and development programs offered at the region level and will be incorporated into the REDP. The results will be analyzed to inform future TDNA policy.

6. What will be monitored: The implementation of the Organizational TDNA at the Division/Region levels
   How it will be monitored: A Process Observer will be identified and asked to complete the tool
   M&E tool to be used: Org'l TDNA-M&E Form 5: Documentation Review of the Division/Region Organizational TDNA
   Responsible for the monitoring: Division/Region TDNA-WG
   When the monitoring will take place: During the conduct of the Organizational TDNA at the Division/Region level
   How the results will be used: Results will be discussed with the Division/Region to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.
I PERSONAL DATA
Name:
POSITION | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD
Please check training focus and management level for all training attended over the last three
years.
Resource Materials
Development
Planning
Management
Policy Development
Research
Identify which of the following areas you consider to be your area(s) of expertise:
School Based Management
Quality Assurance
Monitoring and Evaluation
Access Education
Subject Specialization: _____________
Education Planning
Policy Development
Learning Resource Materials Development
ICT
Delivery of Training
Other, please specify: ________________
Use additional sheet if necessary.
I certify that the information I have given in answer to the foregoing questions is true, complete, and correct to
the best of my knowledge and belief.
Date: Signature:
Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.
Org’l TDNA-M & E Form 1: Organizational TDNA Tool for Focus
Group Discussion (FGD) Process at the Region/Division Level
_______ FGD Flow of Regional TDNA Self Assessment
_______ FGD Flow of Division TDNA Self Assessment
_______ Monitoring of Division Organization TDNA by Regional Team
Please check (✔) under the manifested (M) column if the process was manifested and under
the not manifested (NM) column if the process was not manifested. Please indicate any
variations noted and include any additional comments regarding the facilitation of the
session.
ACTIVITY | M | NM | Variations/Comments
1. Facilitator emphasizes to the participants that as key
respondents to this Organizational TDNA of the
Region/Division, their answers and collaboration with their
colleagues to reach a consensual assessment will be most
helpful for the future development of the management
competencies of the region.
2. Facilitator clearly presents the purpose of the
Organizational TDNA
3. Brief description of the data gathering method (FGD) was
provided by the facilitator
4. Facilitator comprehensively explained the rating scale
5. Systematic walk through of the ‘Management
Competencies per Service Area’ one service area at a
time was carried out.
6. Each section collaboratively reached a consensus on the
level of importance and level of competencies for each
service area
7. Section ratings were recorded properly
8. Each participant provided individual perception on
perceived importance of the competency in the
performance of the region’s/division’s task / job’ and shared
these with the group.
9. TDNA-WG members efficiently performed their assigned
task.
10. Participants carefully deliberated on the average level of
importance and competency ratings
11. A consolidation of the Organizational TDNA results
following the guidelines outlined in the FGD flow was
accomplished.
12. An M&E committee was tasked to monitor and evaluate the
preparation, conduct and consolidation of the TDNA
results.
Total
Date: ____________________________
Org’l TDNA-M & E Form 2a
Division Organizational - TDNA Scores Summary Template
DIVISION ___________________
1 Understanding DepED as an Organization
2 Understanding RA 9155 or the Governance of Basic Education Act
3 Management of Change
4 Organization Analysis/Diagnosis
5 Identifying and Solving Problems
6 Decision-Making
7 Dealing Effectively with Pressure Groups
8 Conflict Management
9 Negotiation Skills
10 Transformational and Enabling Leadership
Columns (recorded separately for Level of Competency, LOC, and Level of Importance, LOI): Raw Scores (Self-A, Reg.) | Weighted Rating (Self-A 60%, Reg. 40%, Total)
SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
11 Strategic Planning
12 Implementation Planning
13 Project/Program Identification
14 Resource Mobilization and Allocation
15 Financial Management and Control
16 Group Process Management
17 Facilitation Skills
18 Communication Skills
19 Advocacy
Columns (recorded separately for LOC and LOI): Raw Scores (Self-A, Reg.) | Weighted Rating (Self-A 60%, Reg. 40%, Total)
SERVICE AREA 2: LEARNING OUTCOME MANAGEMENT (DO)
20 Understanding of the Revitalized Basic Education Curriculum
21 Curriculum Review
22 Curriculum Implementation Planning (Indigenized Curriculum and Instructional Materials)
23 Instructional Materials Development
24 Instructional Supervision and Management
25 Student/Pupil Assessment/Testing
26 Intervention Programming
27 Education Programs/Project Management
28 Tracking Student Progress
29 Quality Management
30 Staff Development
31 Coaching and Mentoring
Columns (recorded separately for LOC and LOI): Raw Scores (Self-A, Reg.) | Weighted Rating (Self-A 60%, Reg. 40%, Total)
SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
32 Monitoring and Evaluation Design and Development
33 Instrument/Tools Development for M&E Data Gathering
34 Data Processing, Analysis and Utilization
35 Communication Skills/Feedback Giving
36 Education Management Information System (EMIS)
SERVICE AREA 4: EDUCATION ADMINISTRATION & MANAGEMENT (CO/RO/DO)
37 Resource Mobilization and Management
38 Resource Procurement and Management
39 Building Partnerships
40 Human Resource Management
41 Delegation
42 Physical Facilities Programming
43 Records Management
44 Understanding the Intent of the Policy & Implementation
Org’l TDNA-M & E Form 2b
Region Organizational - TDNA Scores Summary Template
(Region & Division)
REGION ______
GENERAL COMPETENCIES ACROSS UNITS (CO/RO/DO)
Columns: Raw Scores (S-A, Div) | Weighted Rating (LOC): S-A 60%, Div 40%, Total | Level of Importance (LOI)
1 Understanding DepED as an Organization
2 Understanding RA 9155 or the Governance of Basic Education Act
3 Management of Change
4 Organization Analysis/Diagnosis
5 Identifying and Solving Problems
6 Decision-Making
7 Dealing Effectively with Pressure Groups
8 Conflict Management
9 Negotiation Skills
10 Transformational and Enabling Leadership
SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
Columns: Raw Scores (S-A, Div) | Weighted Rating (LOC): S-A 60%, Div 40%, Total | Level of Importance (LOI)
11 Strategic Planning
12 Implementation Planning
13 Project/Program Identification
14 Resource Mobilization and Allocation
15 Financial Management and Control
16 Group Process Management
17 Facilitation Skills
18 Communication Skills
19 Advocacy
SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
Columns: Raw Scores (S-A, Div) | Weighted Rating: S-A 60%, Div 40%, Total
20 Monitoring and Evaluation Design and Development
21 Instrument/Tools Development for M&E Data Gathering
22 Data Processing, Analysis and Utilization
23 Communication Skills/Feedback Giving
24 Education Management Information System (EMIS)
SERVICE AREA 4: EDUCATION ADMINISTRATION AND MANAGEMENT (CO/RO/DO)
Columns: Raw Scores (S-A, Div) | Weighted Rating (LOC): S-A 60%, Div 40%, Total | Level of Importance (LOI)
25 Resource Mobilization and Management
26 Resource Procurement and Management
27 Building Partnerships
28 Human Resource Management
29 Delegation
30 Physical Facilities Programming
31 Records Management
32 Understanding the Intent of the Policy and Implementation

SERVICE AREA 5: POLICY FORMULATION AND STANDARD SETTING (RO/CO)
33 Policy Framework Development
34 Policy Instrument Development
35 Policy Formulation
36 Policy Review
37 Standard Setting
38 Technical Writing
39 Advocacy for Policy Formulation/Implementation
SERVICE AREA 6: CURRICULUM DEVELOPMENT (CO/RO)
Columns: Raw Scores (S-A, Div.) | Weighted Rating (LOC): S-A 60%, Div 40%, Total | Level of Importance (LOI)
40 Knowledge on the Technical Vocabulary of Curriculum Engineering
41 Understanding of the Foundations of the Curriculum
42 Application of the Foundations of the Curriculum in Curriculum Engineering
43 Curriculum Designing
44 Curriculum Structuring
45 Implementation of Various Curriculum Models
46 Curriculum Evaluation
NOTE: The lower the numerical value of the LOC and LOI, the greater the need for
training.
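The templates' weighting and the note above can be expressed as a short computation. The sketch below is illustrative only; the competency names and scores are hypothetical, but the 60% self-assessment / 40% external-assessment split and the "lower value = greater need" ordering follow the templates:

```python
# Illustrative sketch of the templates' 60/40 weighting; names and scores
# below are hypothetical, not taken from any actual TDNA.

def weighted_rating(self_assessment: float, external: float) -> float:
    """Weighted LOC/LOI: self-assessment at 60% plus external assessment at 40%."""
    return self_assessment * 0.60 + external * 0.40

# (self-assessment LOC, external/regional LOC) per competency
scores = {
    "Strategic Planning": (3.0, 2.0),
    "Facilitation Skills": (2.0, 1.5),
    "Advocacy": (4.0, 3.5),
}

# The lower the weighted value, the greater the training need, so an
# ascending sort puts the training priorities first.
priorities = sorted(scores, key=lambda c: weighted_rating(*scores[c]))
for rank, name in enumerate(priorities, start=1):
    print(f"Priority {rank}: {name} ({weighted_rating(*scores[name]):.2f})")
```

With these sample figures, Facilitation Skills (weighted 1.80) surfaces as Priority 1, ahead of Strategic Planning and Advocacy.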
Org’l TDNA-M & E Form 3: Functional Divisions/Sections/Units
Organizational - TDNA Prioritization Template
(To be accomplished by the TDNA –WG)
LEVEL OF PLAN: REGION DIVISION DATE Accomplished: ______________________
Supply the following data: 1) name of functional divisions/sections/units, and 2) numerical rating of LOI and LOC for each service area of each
division/section/unit.
Priority 1
Priority 2
Priority 3
Org’l TDNA-M & E Form 4: Organizational - TDNA Schools Division Consolidation Template
NOTE: For Regions with more than seven (7) Schools Divisions, additional columns may be added. Only the general average of each service
area is entered.
Priority 1
Priority 2
Priority 3
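Since only the general average of each service area is entered per division, consolidation at the region level reduces to averaging each service area across divisions. A minimal sketch, with division names and figures invented for illustration:

```python
# Hypothetical per-division service-area general averages, as entered in the
# consolidation template (values and division names are invented).
division_averages = {
    "Division A": {"Educational Planning": 2.4, "Monitoring and Evaluation": 1.9},
    "Division B": {"Educational Planning": 3.1, "Monitoring and Evaluation": 2.2},
}

def regional_average(area: str) -> float:
    """General average of one service area across all divisions of the region."""
    values = [per_area[area] for per_area in division_averages.values()]
    return sum(values) / len(values)

for area in ("Educational Planning", "Monitoring and Evaluation"):
    print(f"{area}: {regional_average(area):.2f}")
```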
Org’l TDNA-M&E Form 5: Documentation Review
of Organizational TDNA Region/Division level
This form is to be used to support Regional monitoring of the Organizational TDNA processes at the Division level. It is expected that the assessment will be
based on observations, discussions with the implementing team and review of relevant documents.
Rating Guide:
Numerical Rating | Interpretation | Description
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way
Use the scale above to assess the extent to which the conduct of Organizational TDNA
documentation adhered to the following:
To what extent …….. 1 2 3 4
1. was thorough planning conducted prior to administration?
2. was the purpose of the Organizational TDNA explained?
3. was the data collection method to be followed for administering the
Organizational TDNA explained e.g. group consensual assessment technique,
self assessment and an external assessment?
4. were participants oriented to the Organizational Management Competencies for
each service area?
5. was a clear explanation provided on how to accomplish the Organizational
TDNA process (e.g. consensus agreement within each division/unit regarding
level of competence and level of importance, agreement across divisions/units,
scoring system)?
6. were the steps involved in consolidating the results for individual divisions/units
as well as the overall region/division explained?
7. were the steps involved in consolidating the self assessment and the external
assessment explained?
8. was an explanation provided on how to interpret results to identify priority
training needs?
9. was technical assistance provided when required?
10. were the M&E tools and processes implemented?
11. was there evidence of team work and collaboration amongst the Organizational
TDNA implementers?
12. were recommendations for improving the Organizational TDNA administration
processes identified?
Recommendations:
Name: ___________________________________
Position: _________________________________
Date: ____________________________________
Section 3.0: Training & Development System Monitoring and Evaluation for
the Professional Development Planning (PDP) System
The Professional Development Planning System has three major planning components: the Individual Plan for
Professional Development (IPPD) for teachers and school heads, the School Plan for Professional Development
(SPPD), and the Master Plan for Professional Development (MPPD) at both the region and division levels.
The diagram below shows the Quality Assurance (QA) and M&E scheme for the Professional Development
Planning System at the Division and Regional levels. At both levels, the T&D Office is tasked to prepare the PDP-
WG members who are assigned to monitor the professional development planning conducted by clusters of
schools and at the level of individual school implementation using M&E tools. The System's compliance with
standards, particularly the development and quality of the SPPDs and MPPDs, is quality-assured by the
PDP-WG.
The M&E Report accomplished by the Division PDP-WG is submitted to the Division T&D Chair who in turn
reviews the report with the T&D Division Team. The T&D Office has the responsibility to inform monitored
schools of the significant findings related to the professional development plans. The Report is also used as the
basis for necessary adjustments to the system, if any.
The same process is followed at the Regional level for the completion of professional development plans. The
Regional T&D Office convenes the PDP-WG, which is tasked to monitor the program planning in terms of quality
and processes followed and reports its findings to the Regional T&D Chief. Following any necessary
adjustments, new standards and guidelines are sent to the divisions and schools.
[Flow diagram 2.4: QA-M&E for Professional Development Planning. The Division/Region T&D Office prepares the PDP-WG (2.4.1), which conducts M&E of the completion of the IPPD, SPPD and MPPD (2.4.2a/b) and monitors QA for the Professional Development Planning System at school and district level, checking the products' compliance to standards (2.4.3). M&E and QA results are recorded in the TDIS database, and an M&E/QA report on the PDP process is prepared (2.4.4) and reviewed (2.4.5). Monitored schools/divisions are then identified and informed of the QA/M&E findings (2.4.6a), necessary adjustments are made to the PDP System (2.4.7b), and a report on PDP System adjustment based on the QA/M&E is prepared for regional policy review and adjustment (2.4.8c).]
3.2. M&E for the IPPD for Teachers and School Heads
M&E tools are provided to support the orientation and the implementation of the IPPD as well as the
overall management of the process. The following tools are available:
Input: Resource Materials Checklist for IPPD, incorporated into the IPPD Guide
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of the IPPD
process:
The matrix columns are: What will be monitored | How it will be monitored | M&E tool to be used | Who will be responsible for the monitoring | When will the monitoring take place | How will the results be used

What will be monitored: The process followed in the accomplishment of the IPPD by teachers and school heads
How it will be monitored: A process observer will be identified and will use the IPPD Process Observation Guide
M&E tool to be used: IPPD-M&E Form 1: IPPD Process Observation Guide for Teachers/School Heads
Who will be responsible: School PDP-WG (Teachers' IPPD) and Division PDP-WG (SHs' IPPD)
When: During the IPPD process for teachers at the school level and for School Heads at the cluster level
How the results will be used: Results will be reviewed by the PDP-WG, recommendations developed to improve processes and included in the Program Completion Report
What will be monitored: Teachers'/School Heads' perception of the success of the IPPD planning process
How it will be monitored: Teachers/School Heads will complete an End of IPPD Planning Evaluation
M&E tool to be used: IPPD-M&E Form 2: End of IPPD Planning Evaluation
Who will be responsible: School PDP-WG (Teachers' IPPD) and Division PDP-WG (SHs' IPPD)
When: Following the accomplishment of the IPPD Planning process at the school level for teachers and at the cluster level for School Heads
How the results will be used: End of IPPD Evaluations will be collated by the PDP-WG and reviewed to identify how the processes can be improved. A summary of the results and recommendations will be included in the Program Completion Report and recommendations incorporated into future processes

What will be monitored: The quality of the accomplished IPPD
How it will be monitored: School Heads and Department Heads will review teachers' IPPD at the school level; ES1/PSDS will review completed IPPD of School Heads at the cluster level
M&E tool to be used: IPPD-M&E Form 3: Review of Accomplished IPPD
Who will be responsible: School Heads and Department Heads for teachers; ES1/PSDS/ASDS for School Heads
When: Following the completion of the teachers' IPPD at the school level and the School Heads' IPPD at the cluster level
How the results will be used: Feedback will be provided to individual teachers/school heads to enhance the quality of the IPPD

What will be monitored: The IPPD goals and objectives of teachers/school heads
How it will be monitored: IPPDs will be reviewed and results summarized at the school level for teachers and at the cluster level for School Heads
M&E tool to be used: IPPD-M&E Form 4: Summary Template of IPPD Goal/Objectives
Who will be responsible: School PDP-WG (Teachers' IPPD) and Division PDP-WG (SHs' IPPD)
When: Following the completion of the teachers' IPPD at the school level and the School Heads' IPPD at the cluster level
How the results will be used: For Teachers' IPPD: SHs/Dept Heads will consolidate key findings to inform the SPPD/MPPD. For School Heads' IPPD: Division PDP-WG/PSDS will consolidate key findings for a cluster of School Heads and prepare a report for submission to the Division T&D Chair. The T&D Chair will identify key recommendations to be included in the Program Completion Report and inform the MPPD

What will be monitored: The number of IPPDs accomplished by schools within the division
How it will be monitored: A Division Tracking Form will be completed listing the number of teachers and school heads who have accomplished IPPDs
M&E tool to be used: IPPD/SPPD-M&E Form 5: Division Tracking Form for Accomplished IPPDs/SPPDs
Who will be responsible: Division PDP-WG
When: Following the accomplishment of the IPPD by teachers and school heads
How the results will be used: Results will be included in the Division Program Completion Report and inform future IPPD policy

What will be monitored: The number of IPPDs accomplished within each division within the region
How it will be monitored: A Region Tracking Form will be completed listing the number of teachers and school heads who have accomplished IPPDs across all divisions
M&E tool to be used: IPPD/SPPD-M&E Form 6: Region Tracking Form for Accomplished IPPDs
Who will be responsible: Region PDP-WG
When: Following the accomplishment of the IPPD by teachers and school heads in each division
How the results will be used: Results will be included in the Region Program Completion Report and inform future IPPD policy
IPPD-M&E Form 1: IPPD Process Observation Guide for Teachers /
School Heads
SCHOOL/CLUSTER Observed:_________________________________________________
NAME OF PROCESS OBSERVER: _____________________________________________
DATE: ______________________ VENUE: _____________________
PARTICIPANTS' NAMES: (Attach Attendance Sheet)
DIRECTION: Observe the process involved in the activities associated with the development of the
IPPD. If the activity is accomplished, write YES in the appropriate column, if not, write NO.
ACTIVITIES | ACCOMPLISHED (Yes or No)
I. Development of an understanding of the IPPD and its purpose
a. Conduct of a warm up activity to start the session
b. Discussion on how to further develop oneself as a professional to improve
performance of one’s duties and responsibilities.
c. Presentation of the objective of the IPPD workshop and explanation of the
meaning of IPPD, its purpose and guiding principles.
d. Explanation regarding the accomplishment of the IPPD being a vital
responsibility of all professionals for the development of the school and
improvement of learners
II. Completion of the IPPD
a. Analysis of the information such as TDNA, AIP, School assessment reports
and/or other relevant available documents.
b. Formulation of the IPPD goal
c. Deriving the objectives from the goal by reviewing the list of priority needs and
specific competency areas
d. Identification of the strategies/methods and activities for pursuing one’s
professional development goal and objectives
e. Establishment of the timeframe for the various activities identified in the IPPD
f. Identification of possible resources that can support the implementation of the
IPPD
g. Review of the IPPD
h. Signing of the IPPD
Do you have other comments/suggestions/recommendations for the improvement of the IPPD process?
Please rate how you feel about the IPPD planning session relative to the following processes involved in
the accomplishment of the IPPD. Please tick the appropriate column for your rating using the scale below.
Rating Guide:
Numerical Rating | Interpretation | Description
4 Very High Level In a very significant way
3 High Level In a meaningful way
2 Low Level In a limited way only
1 Very Low Level Not in any meaningful way
IPPD-M&E Form 3: Review of Accomplished IPPD
This form has been developed to support a Review Process of the accomplished Individual
Plan for Professional Development (IPPD). The School Head and Department
Heads/Coordinators should review the IPPDs completed by the teachers, while the PDP-WG
Chair/ES1/PSDS should review the IPPDs of SHs, to evaluate the level of adherence to
standards. Based on the review, feedback should be provided to the IPPD Planner
and the IPPD further enhanced if required.
Rating Guide:
Numerical Rating | Interpretation | Description
4 Very High Level In a very significant way
3 High Level In a meaningful way
2 Low Level In a limited way only
1 Very Low Level Not in any meaningful way
Use the scale above to evaluate the level to which the accomplished IPPD adheres to the following
standards:
5. does the IPPD reflect processes that are embedded in the job, i.e. inherent to
the practice of the profession, and a continuing course of action?
Reviewed by:
IPPD-M&E Form 4: Summary Template of IPPD Goal/Objectives for
Teachers/School Heads
Totals
Directions: (1) List the names of teachers/school heads who accomplished their IPPD. (2) Write the competency numbers from
the IPPD on the top row of the succeeding columns. (3) Enter the 3 prioritized objectives of each teacher/school head in columns O1 to
O3, indicating the IPPD objective of an individual teacher/school head based on the set objectives in the IPPD form. (4) Count the
number of entered objectives per column and write this in the Totals row. This information will be useful for the planners.
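The column totals of this template are simple tallies: how many planners targeted each competency. A sketch of that tally, with hypothetical teacher names and competency numbers (not taken from any official list):

```python
from collections import Counter

# Hypothetical entries: each planner's three prioritized IPPD objectives,
# keyed by the competency number each objective addresses.
ippd_objectives = {
    "Teacher 1": ["1.2", "2.3", "4.1"],
    "Teacher 2": ["1.2", "3.5", "4.1"],
    "Teacher 3": ["2.3", "4.1", "5.2"],
}

# Column totals: the number of planners targeting each competency. The
# largest counts show the most common development needs.
totals = Counter(c for objectives in ippd_objectives.values() for c in objectives)
for competency, count in totals.most_common():
    print(f"Competency {competency}: {count}")
```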
IPPD/SPPD-M&E Form 5: Division Tracking Form for Accomplished
IPPDs/SPPDs (electronic version available)
IPPD/SPPD-M&E Form 6: Region Tracking Form for Accomplished
IPPDs/SPPDs (electronic version available)
Region: ____________________________ Date: _______________________
Columns: Divisions | District Name | No. of Schools | No. of Teachers | No. of Teacher IPPDs accomplished | No. of School Head IPPDs accomplished | No. of SPPDs Completed | Comments

Division 1: Districts 1-10, followed by a Sub Total row
Division 2: Districts 11-20, followed by a Sub Total row
Division 3: Districts 21-30, followed by a Sub Total row
TOTALS
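The form's arithmetic is a straightforward roll-up: per-district counts sum into division sub-totals, and sub-totals sum into the region total. A minimal sketch, with all rows invented for illustration:

```python
# Sketch of the Region Tracking Form roll-up; every row below is hypothetical.
rows = [
    # (division, district, teacher IPPDs, school-head IPPDs, SPPDs completed)
    ("Division 1", "District 1", 25, 3, 3),
    ("Division 1", "District 2", 18, 2, 2),
    ("Division 2", "District 11", 30, 4, 4),
]

def sub_total(division):
    """Sub Total row for one division: sum each accomplishment column."""
    picked = [row[2:] for row in rows if row[0] == division]
    return tuple(sum(column) for column in zip(*picked))

def region_total():
    """TOTALS row: grand total of each accomplishment column across divisions."""
    return tuple(sum(row[i] for row in rows) for i in (2, 3, 4))

print(sub_total("Division 1"))
print(region_total())
```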
3.3 : M&E for the SPPD
M&E tools are provided to support the School Plan for Professional Development (SPPD) process.
The following tools are available:
System Level | M&E Tools for Regional Level | M&E Tools for Division/Cluster Level | M&E Tools for School Level
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of the
SPPD process:
The matrix columns are: What is monitored | How it is monitored | M&E tool to be used | Who is responsible for the monitoring | When does the monitoring take place | How are results used

What is monitored: The membership of the teams responsible for the development of the SPPD, in relation to the experiences which individuals bring to the team and the level of representation of the different personnel groups on the team
How it is monitored: All members of planning teams are asked to provide a personal profile outlining their work experiences and qualifications
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who is responsible: School Head
When: During the formation of planning teams
How results are used: The PDP-WG analyzes profiles to ensure teams are well represented by the various personnel groups and have members with relevant experiences. Recommendations based on the analysis are made to improve future team membership and included in the Program Completion Report

What is monitored: The process followed in accomplishing the SPPD and the level of collaboration between team members
How it is monitored: A process observation is completed
M&E tool to be used: SPPD-M&E Form 1: Process Observation Guide for SPPD
Who is responsible: School PDP-WG; Division PDP-WG (represented by ES1/PSDS)
When: During the SPPD process at the school level
How results are used: ES1/PSDS consolidates results from observations from their cluster and prepares a report for submission to the Division T&D Chair. The T&D Chair identifies key recommendations and includes them in the Program Completion Report for the conduct of the SPPD

What is monitored: Team members' perception of the extent to which they successfully completed the SPPD planning process
How it is monitored: Team members complete an End of Program Planning Evaluation
M&E tool to be used: SPPD-M&E Form 2: End of SPPD Planning Evaluation
Who is responsible: School PDP-WG
When: Following the accomplishment of the SPPD
How results are used: End of Program Evaluations are collated by the PDP-WG and reviewed to identify how the processes can be improved. A summary of the results is included in the Program Completion Report and recommendations incorporated into future processes

What is monitored: The SPPD process
How it is monitored: A debriefing meeting will be conducted involving all those involved in facilitating the SPPD process
M&E tool to be used: SPPD-M&E Form 3: SPPD De-briefing Guide Checklist
Who is responsible: School PDP-WG (led by the SH)
When: Following the accomplishment of the SPPD at the school level
How results are used: Key findings and recommendations to be included in the Program Completion Report and will inform future conduct of the SPPD
What is monitored: The accomplished SPPD
How it is monitored: SPPDs will be reviewed
M&E tool to be used: SPPD-M&E Form 4: Review Tool for Accomplished SPPD
Who is responsible: Division PDP-WG (represented by ES1/PSDS)
When: Following the completion of the SPPD
How results are used: ES1/PSDS/ASDS consolidates key findings and prepares a report for submission to the Division T&D Chair. The T&D Chair identifies key recommendations and includes them in the Activity Completion Report for the conduct of the SPPD

What is monitored: All SPPDs submitted at the District Level
How it is monitored: SPPDs' Priority Programs will be consolidated
M&E tool to be used: SPPD-M&E Form 5: Summary Template for Schools' Priority Professional Development Programs based on SPPDs at District Level
Who is responsible: District ES/PSDS
When: Upon submission of the SPPDs at the District level
How results are used: ES1/PSDS consolidates the 3 priority programs for professional development listed in each SPPD. The accomplished Template will be submitted with a cover report to the Division T&D Chair through the PDP-WG

What is monitored: The number of SPPDs accomplished by schools within the division
How it is monitored: A Division Tracking Form will be completed listing the number of schools which have accomplished SPPDs
M&E tool to be used: IPPD/SPPD-M&E Form 6: Division Tracking Form for Accomplished IPPDs/SPPDs
Who is responsible: Division PDP-WG
When: Following the accomplishment of the SPPD by schools in each division
How results are used: Results will be included in the Division Program Completion Report and inform future SPPD policy

What is monitored: The number of SPPDs accomplished in each division within the region
How it is monitored: A Region Tracking Form will be completed listing the number of schools which have accomplished SPPDs across all divisions
M&E tool to be used: IPPD/SPPD-M&E Form 7: Region Tracking Form for Accomplished IPPDs/SPPDs
Who is responsible: Region PDP-WG
When: Following the accomplishment of the SPPD by schools in each division
How results are used: Results will be included in the Region Program Completion Report and inform future SPPD policy
T&D-M&E Form 1: Individual Profile Template
I PERSONAL DATA
Name:
Please check training focus and management level for all training attended over the last three years.
T&D System M&E Framework and Tools Handbook, June 2010 Page 71
Training Focus | Training attended over last 3 years (✓) | Management Level of Training: Central / Region / Division / Cluster / School
Curriculum
Resource Materials
Development
Planning
Management
Policy Development
Research
SEAMEO- INNOTECH Foreign Assisted Projects (FAP) Other, please specify -----
Program Delivery Program Management
I certify that the information I have given in answer to the foregoing questions is true, complete, and correct to the best
of my knowledge and belief.
Date: Signature:
Please submit completed form to Training and Development Division/Unit. Information will be incorporated into
the T&D Information System Database.
SPPD-M&E Form 1: Process Observation Guide for SPPD
SCHOOL OBSERVED:_________________________________________________
NAME OF PROCESS OBSERVER: _____________________________________
DATE: ______________________
VENUE: _____________________
PARTICIPANTS' NAMES & DESIGNATIONS: (Attach Attendance Sheet)
DIRECTION: Observe the process involved in the activities associated with the development of the
SPPD. If the activity is accomplished, write YES in the appropriate column, if not, write NO. Rate
the level of collaboration between participants in completing the various activities using the rating
scale below:
RATING SCALE: (1) Low (2) Moderate (3) Average (4) High
ACTIVITIES | ACCOMPLISHED (Yes or No) | LEVEL OF COLLABORATION (1 2 3 4)
I. Development of an understanding of the SPPD and its purpose
a. Conduct of a warm up activity to form personnel groups
identified in the SPPD
k. Estimation of the budgetary requirements for each
program
l. Identification of sources of funds for each program
m. Review of the SPPD
n. Signing of the SPPD
Process Observer:
___________________________________
Signature Over Printed Name
___________________________________
Designation
SPPD-M&E Form 2: End of SPPD Planning Evaluation
Please rate how you feel the SPPD team fared relative to the following processes involved in the
accomplishment of the SPPD. Please tick the appropriate column for your rating using the scale
below.
Rating Guide:
Numerical Rating | Interpretation | Description
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way
1 the following documents were used in the analysis and development of the context
of the SPPD?
a. BESRA PIP
b. SIP/AIP
c. NCBTS Framework
d. Consolidated TSNA Results
e. Consolidated Teachers IPPD
f. Consolidated Non-Teaching TDNA
g. Student Performance Data
2 the formulation of SPPD overall goal was based on the results of the analysis of
the current context relating to Human Resource Development?
3 the SPPD goal was taken into consideration in formulating the objectives?
8 the various funding sources were identified to support the implementation of the
different programs?
9 the following were considered in setting the timeframe for the different programs?
a. Development Priorities
b. Cumulative Nature of the Programs
c. One-Year Coverage of the SPPD
10 you have been capacitated through your involvement in the planning process?
11 you will be able to apply the learning gained in planning for future similar
activities?
12 you are able to transfer the technology learnt to others?
SPPD-M&E Form 3: SPPD Debriefing Guide Checklist
This form has been developed to guide the facilitators' debriefing meeting following the completion of the
School Plan for Professional Development (SPPD). The T&D Chair/ES1/PSDS/School Head should manage
the debriefing meeting and ensure a record is kept of the discussions. The information from this meeting
should inform future SPPD activities and the Program Completion Report.
------------------------------------------------------------------------------------------------------------
DATE: ______________________
VENUE: _____________________
Directions: Discuss each of the questions below as a group and reach a consensual answer. Check the
appropriate column that corresponds to your group response. Write comments to support your response.
QUESTIONS YES NO COMMENTS
1. Were all targeted participants
present?
9. What suggestions/recommendations can you make that will improve the conduct of the SPPD?
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
10. General comments on the conduct of the activity:
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
Name: ____________________________
(Signature Over Printed Name)
T&D Chair /ES1/PSDS/School Head
SPPD-M&E Form 4: Review Tool for Accomplished SPPD
Rating Guide:
Numerical Rating   Interpretation     Description
4                  Very High Extent   In a very significant way
3                  High Extent        In a meaningful way
2                  Low Extent         In a limited way only
1                  Very Low Extent    Not in any meaningful way
Use the scale above to evaluate the extent to which the accomplished SPPD adheres to the following
standards:
5. is the SPPD part of a formative and cyclical process where data from previous
planning experiences are analyzed and used to improve the process?
6. does the SPPD reflect a unified approach in improving human resource
development by taking into consideration the national goals and thrusts?
7. does the SPPD provide for alternative programs for professional development to
incorporate emerging priorities?
Name: _____________________________________
Position: ____________________________________
Date: _______________________________________
SPPD-M&E Form 5: Summary Template of School Professional Development Priority Programs
Based on SPPDs at District Level
(Note: This Form is for one clientele group only, e.g. Teachers group. For non-teaching personnel, a separate sheet should be
used.)
Direction: Write the name of each school and its respective priority programs. Then, check the appropriate
column that represents the domain/strand related to each priority program listed.
(Template table: schools are listed in groups A–E with their numbered priority programs and domain/strand columns.)
IPPD/SPPD-M&E Form 6: Division Tracking Form for Accomplished IPPDs/SPPDs (electronic version available)
IPPD/SPPD-M&E Form 7: Region Tracking Form for Accomplished IPPDs/SPPDs (electronic version available)
(Tracking table: numbered rows for schools grouped by division, with a Sub Total per division and overall TOTALS.)
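The Sub Total and TOTALS computation implied by the division/region tracking forms can be expressed as a minimal sketch. The division names, schools, and counts below are illustrative assumptions, not data from the handbook, and the official electronic versions of the forms may organize the records differently.

```python
from collections import defaultdict

# Illustrative records: (division, school, number of accomplished IPPDs/SPPDs).
records = [
    ("Division 1", "School A", 12),
    ("Division 1", "School B", 9),
    ("Division 3", "School C", 15),
]

# Sub Total per division, as in the tracking form rows.
sub_totals = defaultdict(int)
for division, _school, accomplished in records:
    sub_totals[division] += accomplished

# Overall TOTALS row across all divisions.
grand_total = sum(sub_totals.values())

for division in sorted(sub_totals):
    print(f"{division} Sub Total: {sub_totals[division]}")
print(f"TOTALS: {grand_total}")
```

A spreadsheet implementing the same form would reproduce this with a SUM per division block and a grand SUM at the bottom.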
3.4: M&E for the MPPD
M&E tools are provided to support the Master Plan for Professional Development (MPPD)
process. The following tools are available:
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of the
MPPD process:

What is monitored: The members of the teams responsible for the development of the MPPD, in relation to (a) the experiences which individual members bring to the team and (b) the appropriateness of representation of the different personnel groups on the team.
How it is monitored: All members of planning teams are asked to provide a personal profile outlining their work experiences and qualifications.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who is responsible: PDP-WG, Division and Region level teams
When it takes place: During the formation of planning teams
How results are used: The PDP-WG analyzes profiles to ensure teams are well represented by the various personnel groups and have members with relevant experiences. Recommendations based on the results are made to improve future team membership and included in the Program Completion Report. Profiles of planners are to be entered into the TDIS database.

What is monitored: The process followed in accomplishing an MPPD and the level of collaboration between team members.
How it is monitored: A process observation is completed.
M&E tool to be used: MPPD-M&E Form 1: Process Observation Guide for the Division/Region
Who is responsible: PDP-WG at the Region and Division
When it takes place: During the MPPD at the division/region level
How results are used: Results are reviewed by the PDP-WG and recommendations developed to improve processes and included in the Program Completion Report.

What is monitored: Team members' perception of the extent they successfully completed the planning process.
How it is monitored: Team members complete an End of Program Planning Evaluation.
M&E tool to be used: MPPD-M&E Form 2: End of MPPD Planning Evaluation – Division/Region
Who is responsible: PDP-WG
When it takes place: Following the accomplishment of the Program Planning process at the region and division level
How results are used: End of Program Evaluations are collated by the PDP-WG and reviewed to identify how the processes can be improved. Recommendations are included in the Program Completion Report.

What is monitored: The MPPD process at the region and division level.
How it is monitored: A debriefing meeting involving all those involved in facilitating the MPPD process.
M&E tool to be used: MPPD-M&E Form 3: Division/Region MPPD De-briefing Guide Checklist
Who is responsible: PDP-WG
When it takes place: Following the accomplishment of the MPPD at the region and division level
How results are used: Results from the de-briefing are incorporated into Region/Division Program Completion Reports and used by the T&D Chief/Chair to improve future processes.

What is monitored: The accomplished MPPD.
How it is monitored: Completed MPPDs will be reviewed at the region and division level.
M&E tool to be used: MPPD-M&E Form 4: Review Tool for Accomplished MPPD – Division/Region
Who is responsible: Regional PDP-WG members, for both the Region and Division
When it takes place: Following the completion of the MPPD at the region and division level
How results are used: Results from the review of MPPDs are incorporated into Region/Division Program Completion Reports and used by the T&D Chief/Chair to improve future processes.
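The collation of End of Program Planning Evaluation forms described in the matrix could be tallied as in this minimal sketch. The item numbers and ratings are illustrative assumptions, not actual responses; each form records a 1–4 rating per item using the handbook's scale.

```python
# Each completed evaluation form maps item number -> rating on the 1-4 scale
# (4 = Very High Extent ... 1 = Very Low Extent). Values are illustrative only.
forms = [
    {1: 4, 2: 3, 3: 4},
    {1: 3, 2: 2, 3: 4},
    {1: 4, 2: 3, 3: 3},
]

# Mean rating per item, suitable for a summary in the Program Completion Report.
items = sorted(forms[0])
summary = {
    item: round(sum(form[item] for form in forms) / len(forms), 2)
    for item in items
}
print(summary)
```

Items with low mean ratings would then flag the processes the PDP-WG should recommend for improvement.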
I. PERSONAL DATA
Name:
Use additional sheet if necessary.
Please check training focus and management level for all training attended over the last three
years.
Resource Materials
Development
Planning
Management
Policy Development
Research
Use additional sheet if necessary.
I certify that the information I have given to the foregoing questions is true, complete, and correct to
the best of my knowledge and belief.
Date: Signature:
Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.
MPPD-M&E Form 1: Process Observation Guide for Division/Region MPPD
___________________________________
Signature Over Printed Name
MPPD-M&E Form 2: End of MPPD Planning Evaluation – Division/Region
Name of MPPD Planner: ________________________ Sex: Male Female
Please rate how you feel the MPPD team fared relative to the following processes involved
in the accomplishment of the MPPD. Please tick the appropriate column for your rating using
the scale below.
Rating Guide:
Numerical Rating   Interpretation     Description
4                  Very High Extent   In a very significant way
3                  High Extent        In a meaningful way
2                  Low Extent         In a limited way only
1                  Very Low Extent    Not in any meaningful way
7 the following were taken into consideration when estimating the budget
for the different programs?
a. Cost associated with Program Designing/Resource
Materials Development
b. Cost associated with Program Delivery
c. Cost associated with Program Monitoring and Evaluation
8 you were able to successfully identify various funding sources to support
the implementation of the different programs?
9 the following were taken into consideration when setting the timeframe
for the different programs?
a. Development Priorities
b. Cumulative Nature of the Programs
c. Three-Year Coverage of the MPPD
10 you have been capacitated by your involvement in the planning process?
11 you will be able to apply the learning gained in planning for future
similar activities?
12 you are able to transfer the technology learnt to others?
MPPD-M&E Form 3: Division/Region MPPD Debriefing Guide Checklist
This form has been developed to guide the facilitators' debriefing meeting following
the completion of the Master Plan for Professional Development (MPPD) at the
region or division level. The T&D Chief/Chair should manage the debriefing meeting
and ensure a record is kept of the discussions. The information from this meeting
should inform future MPPD activities and the Program Completion Report.
DATE: ______________________ VENUE: _____________________
QUESTIONS                                          YES   NO   COMMENTS
1. Were all targeted participants present?
9. What suggestions/recommendations can you make that will improve the conduct
of the MPPD?
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
Name: ____________________________
(Signature Over Printed Name)
T&D Chief/Chair
MPPD-M&E Form 4: Review Tool for Accomplished MPPD – Division/Region
Rating Guide:
Numerical Rating   Interpretation     Description
4                  Very High Extent   In a very significant way
3                  High Extent        In a meaningful way
2                  Low Extent         In a limited way only
1                  Very Low Extent    Not in any meaningful way
Use the scale above to evaluate the extent to which the accomplished MPPD adheres to the
following:
To what extent …….. 1 2 3 4
1. does the MPPD focus on improving student learning and consider the
development priorities of the division/region, national goals and
thrusts?
2. does the MPPD outline opportunities for all personnel groups to
participate in continuous professional development programs to
increase their current level of competencies?
3. do the objectives and competencies directly relate to the overall goal of
the MPPD?
4. have related competencies been organized to form programs?
5. are related programs logically sequenced?
6. have a range of delivery modes been recommended?
7. will the output and outcomes identified provide evidence that program
objectives have been met and competencies enhanced?
8. does the budget estimate take into consideration the costs associated
with the design, resource materials development, implementation and
monitoring of all programs?
9. have sources of funds been identified for the proposed programs?
10. are the programs logically scheduled across the years covered by the
MPPD?
11. are the strategies identified in the MPPD effective in increasing
participation and involvement of education personnel in professional
learning?
12. does the MPPD incorporate formative and cyclical processes and
promote the accurate collection and analysis of data to improve future
activities?
13. does the MPPD reflect a unified approach to improve human resource
development?
Name: _____________________________________
Position: ____________________________________
Date: _______________________________________
Section 4.0: T&D System Monitoring and Evaluation for the Program
Designing and Resource Development (PDRD) System
4.1 M&E for the Program Designing and Resource Development (PDRD) System
The Program Designing and Resource Development System has two major components:
Program Designing and Resource Development.
It is the responsibility of the PDRD-WG to complete the M&E and QA processes associated
with the development of program designs and resource packages. The PDRD-WG will be
expected to report on their findings and to incorporate any recommendations for
improvement into future processes.
The M&E and QA for the PDRD System is shown in the diagram below. The systems flow is the
same as that in the M&E for the other subsystems of Training and Development and is
followed at the division and regional levels. Basically, the members of the PDRD-WG
responsible for the M&E prepare the resources they need for the task, then implement the
M&E Plan as scheduled. The nature of the task is to ensure the compliance of the
implementers monitored to the standards set for program designing and resource
development activities for the various clientele. Part of the M&E task is to review the quality
of the program designs and the resource packages that have been produced and another
group is convened by the T&D Chief/Chair to review the quality of the plans based on
standards set for professional development plans. Results are recorded, reports prepared and
submitted to the T&D Office for uploading in the TDIS.
The T&D Office, in turn, informs the schools and the divisions monitored of the findings and
makes the necessary adjustments to the system. Reports are prepared on the monitored
processes to inform Regional policy review and adjustment.
3.4 QA–M&E for Program Designing and Resource Development (Region/Division)
(Diagram summary: 3.4.1 Upon completion of the MPPDs, the Region/Division T&D Unit prepares the PDRD-WG and resources for M&E. 3.4.2 The Regional/Division PDRD-WG monitors and quality assures program designing and resource development at the regional and division/school levels, assessing compliance to standards. 3.4.4 M&E results are recorded in the TDIS database. 3.4.5 The M&E report is prepared. 3.4.6 The M&E report is reviewed, after which (3.4.6a) the monitored divisions are identified and informed of the M&E findings and (3.4.6b) PDRD guidelines are formulated or revised through a review of Region PDRD policy, standards and guidelines. 3.4.7 Adjustments are made and the standards and guidelines are communicated to the divisions.)
M&E tools are provided to support the program designing process. The following tools are available:
Level: Regional, Division/Cluster and School (the same tools are used at all three levels)
Output: D-M&E Form 2: Program Design Review/Quality Assurance Tool
Process: D-M&E Form 1: End of Program Designing Evaluation
Input: T&D-M&E Form 1: Individual Profile Template
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of the
Program Designing process:
What will be monitored: The membership of the team responsible for the development of the Program Design, in relation to the experiences and expertise which individuals bring to the team.
How it will be monitored: All members of the Program Designing Team are asked to provide a personal profile outlining their work experiences and qualifications.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible: PDRD-WG
When it will take place: During the formation of the Program Designing Team
How the results will be used: The PDRD-WG analyzes the profiles to ensure that members have the relevant experience and expertise to support the program design process. Profiles are to be entered into the TDIS database of Program Designers at the Region, Division and School levels.

What will be monitored: Team members' perception of the extent they successfully completed the designing process.
How it will be monitored: Program Designing Team members will individually complete the End of Program Designing Evaluation.
M&E tool to be used: D-M&E Form 1: End of Program Designing Evaluation
Who will be responsible: PDRD-WG
When it will take place: Following the completion of the Program Designing process
How the results will be used: End of Program Evaluation Forms are collated by the PDRD-WG and reviewed to identify how the processes can be improved. A summary of the results is included in the Program Completion Report and the recommendations are incorporated in future processes.

What will be monitored: Completed Program Designs.
How it will be monitored: The Program Designing Team and a QA Team will review and quality assure the completed Program Designs at the region, division and school levels.
M&E tool to be used: D-M&E Form 2: Program Design Review/Quality Assurance Tool
Who will be responsible: Program Designing Team and a QA Team
When it will take place: At the completion of a program design at the region, division and school level
How the results will be used: The Program Design is refined based on recommendations from the review/QA. Based on the review, a decision is made regarding whether the program is to be implemented or not. Recommendations are made to improve future program designing processes and included in the Program Completion Report.
I. PERSONAL DATA
Name:
(Surname) (First Name) (Middle Name)
Please check training focus and management level for all training attended over the last three years.
Resource Materials
Development
Planning
Management
Policy Development
Research
Use additional sheet if necessary.
I certify that the information I have given to the foregoing questions is true, complete, and correct to the best
of my knowledge and belief.
Date: Signature:
Please submit completed form to Training and Development Division/Unit. Information will be incorporated
into the T&D Information System Database.
D-M&E Form 1: End of Program Designing Evaluation
As a member of the Program Designing Team please rate how you think the team implemented the following
processes involved in the development of the program design. Please tick the appropriate column for your
rating using the scale below.
Numerical Rating Interpretation
4 Very High Extent
3 High Extent
2 Low Extent
1 Very Low Extent
Do you have other comments/suggestions/recommendations for the improvement of the
program designing process?
D-M&E Form 2: Program Design Review/Quality Assurance Tool
This form is used by both the PDRD-WG and the Quality Assurance Team to support the review
and quality assurance of the developed program designs at the region, division and school level.
The PDRD-WG will use the form to internally review its work before submitting the Program
Design for QA, through the T&D Chief/Chair or School Head. The T&D Chief/Chair or School
Head will establish a Quality Assurance Team to review the developed program design to ensure
that it meets the standards set for program designs.
Rating Guide:
Numerical Rating Interpretation
4 Very High Extent
3 High Extent
2 Low Extent
1 Very Low Extent
Use the scale above to evaluate the Program Design by checking the appropriate column
To what extent …….. 1 2 3 4
1 does the program design build on quality program design concepts?
2 do the rationale, objectives and competencies identified in the
program design relate to current demands in education as stipulated
in the SPPD/MPPD?
3 does the program design take into consideration the specific needs
of the target group and the context in which they work in
identifying:
a. the delivery mode (formal and job-embedded)?
b. innovative strategies?
c. research-based practices?
4 does the Program Content Matrix provide sufficient information in
relation to the KSAs to be developed for:
a. the F3 program delivery?
b. the JEL program delivery?
5 is the content described in the program design:
a. logically sequenced?
b. accurately presented?
c. sufficiently covered?
6 is the schedule of activities:
a. logically organized
b. an accurate reflection of required resources needed to
successfully implement both the formal face-to-face and
job-embedded learning components of the program?
7 have the required support materials been accurately identified?
8 has an accurate budget for the program been prepared?
9 is the program design a product of collaboration between qualified
and competent educators?
10 is the program design user friendly, technology enabled and cost
effective?
11 does the suggested job-embedded learning (JEL) component
encourage the engagement of participants in applying their learning
from the face-to-face (F3) training in their daily work?
12 does the JEL promote opportunities for collaborative learning in the
workplace?
13 does the JEL include an appropriate time frame adequate to
complete expected accomplishments/outputs?
14 does the JEL provide a flow of activities for JEL implementation?
15 does the JEL identify means of verifying accomplishments and
outputs?
Name: _____________________________________
Position: ____________________________________
Date: _______________________________________
4.3 M&E for Resource Development
M&E tools are provided to support the resource development process. The following tools are
available:
Process: RD-M&E Form 1: End of Resource Package Development Evaluation (Regional, Division/Cluster and School levels)
Input: T&D-M&E Form 1: Individual Profile Template (Regional, Division/Cluster and School levels)
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation
of the Resource Development process:
What will be monitored: The membership of the teams responsible for the development of the Program Resource Package, in relation to the level of experiences which individuals bring to the team.
How it will be monitored: All members of the Resource Development team are asked to provide a personal profile outlining their work experiences and qualifications.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible: PDRD-WG
When it will take place: During the formation of the team
How the results will be used: The PDRD-WG analyzes the profiles to ensure that teams have members with relevant experiences. Recommendations based on the analysis are made to improve future team membership and are included in the Program Completion Report.

What will be monitored: Team members' perception of the extent they successfully completed the Program Resource Package Development process.
How it will be monitored: Team members complete the End of Program Resource Package Development Evaluation.
M&E tool to be used: RD-M&E Form 1: End of Program Resource Package Development Evaluation
Who will be responsible: PDRD-WG
When it will take place: Following the completion of the Resource Package development process at the region, division and school level
How the results will be used: End of Program Resource Package Development Evaluation Forms are collated by the PDRD-WG and reviewed to identify how the processes can be improved. A summary of the results is included in the Program Completion Report and recommendations incorporated into future processes.
I. PERSONAL DATA
Name:
III. TRAINING ATTENDED OVER THE LAST THREE YEARS
Please check training focus and management level for all training attended over the last three
years.
Resource Materials
Development
Planning
Management
Policy Development
Research
V. TRAINING AND DEVELOPMENT EXPERIENCES
Identify which of the following specific areas you consider to be your area(s) of expertise:
I certify that the information I have given to the foregoing questions is true, complete, and correct to
the best of my knowledge and belief.
Date: Signature:
Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.
RD-M&E Form 1: End of Resource Package Development Evaluation
Title of Resource Package Developed: ____________________________________
Target Personnel Group: _______________________________________________
Please rate how you think the Training & Development Resource Package Development team
implemented the following processes involved in the development of the resource package.
Please tick the appropriate column for your rating using the scale below.
Numerical Rating Interpretation
4 Very High Extent
3 High Extent
2 Low Extent
1 Very Low Extent
To what extent did the developers demonstrate the following? 1 2 3 4
1 Considered the Program Design when developing the Resource Package
2 Examined existing resource materials/packages for the purpose of adoption/adaptation
3 Followed the standards and guiding principles in the development of the resource package
4 Conceptualized the context matrix clearly articulating the requirement of the program in
relation to:
a. activities to be conducted
b. specific objectives to be achieved
c. key understanding to be developed
d. resource materials required
5 Outlined a sequentially organized schedule of activities to support the implementation of the
program including both the F3 and JEL Components
6 Articulated in the resource package a clear learning approach and methodology for both the
F3 and JEL components
7 Developed session guides (including accompanying PowerPoint presentations and scripts)
that clearly outline the conduct of all activities
8 Developed all the necessary handouts and reading materials required for the successful
implementation of the program
9 Identified all the materials required to support the delivery of the program
10 Described the monitoring and evaluation process clearly and developed any necessary M&E
tools
11 Reviewed and quality assured the Resource Package
12 Enhanced competencies in program resource development
13 Expressed learning gained during the resource development process
Please submit completed form to PDRD-WG. Results should be incorporated into the Program
Completion Report.
This form is used by both the PDRD-WG and the Quality Assurance Team to support the
review and quality assurance of the developed program resource package at the region,
division and school levels. The PDRD-WG will use the form to internally review its work before
submitting the Program Resource Package for QA, through the T&D Chief/Chair or School
Head. The T&D Chief/Chair or School Head will establish a Quality Assurance Team to
review the developed program resource package to ensure that it meets the standards set for
program resource development.
Rating Guide:
Numerical Rating Interpretation
4 Very High Extent
3 High Extent
2 Low Extent
1 Very Low Extent
Use the scale above to assess the Program Resource Package and check the appropriate
column for each item.
A. Integrity
3. It uses language and symbols of the content domain and its way
of representation, and supports learners in developing and using
them.
4. It uses the following correctly and appropriately:
- terms and expressions
- symbols and notations
- diagrammatic representation
- graphical representation
5. It assists the learner by identifying and differentiating between
different points of view and perspectives presented.
6. It is supported with content that is based on current research and
incorporates innovative strategies and best practices.
7. Presentation of factual content is accurate and up-to-date.
T&D System M&E Framework and Tools Handbook, June 2010 Page 114
C. Usability
13. Content is structured to scaffold learning.
14. The resource package takes into consideration the specific needs
of the target group and the context in which they work.
15. Clear instructions for use are provided (i.e. purpose, processes,
intended outcomes are explicit).
16. Learning and information design is intuitive (i.e. the user knows
what to do and how to do it).
17. The context matrix is clearly conceptualized, articulating the
requirements of the program.
F. Collaborative Development
26. The resource package reflects a product of group effort among
qualified and competent educators.
G. Provision for M&E
27. The resource package provides a monitoring and evaluation scheme and
tools.
Name: _____________________________________
Position: ____________________________________
Date: _______________________________________
Results should be shared with the Program Resource Development Working Group and inform the
development of the Program Completion Report.
The diagram below shows the M&E process flow for the program delivery component of the
system. It includes M&E processes and tools designed to monitor the operations,
adherence to standards, processes and end-of-training evaluation at the Regional, Division
and School levels. It includes the utilization of required tools and methodology (e.g. process
observation tool, rating scales, open-ended questionnaire, journal writing, and evaluation
feedback). The data gathering strategies, data analysis and resources for the preparation of
M&E reports are necessary elements for the system to be operational. The M&E provides
information on strengths and weaknesses of the system for the improvement and
sustainability of operations at the Regional, Division/District and School levels.
M&E tools are provided to support both the Formal face-to-face (F3) and Job-embedded
Learning (JEL) program delivery phases of the training program as well as the overall
management of the program. The following tools are available.
Tools for Job-Embedded Learning (JEL) delivery
JEL-M&E Form 1: Quality Assurance of the JEL Contract
JEL-M&E Form 2: JEL Journal Entry Sheet
JEL-M&E Form 3: JEL Reflection Template
JEL-M&E Form 4: JEL Advising Tracking Form
JEL-M&E Form 5: Trainees End of Job-Embedded Learning Evaluation and Consolidation
Template
JEL-M&E Form 6: JEL Program Completion Template
F3-M&E Form 2: Learning Process Observation and Facilitation Skills (Regional, Division/Cluster and School levels)
F3-M&E Form 4: External M&E for F3 Processes and Accomplishments (all levels)
Input: T&D-M&E Form 1: Individual Profile Template (all levels)
D.2. Job-embedded Learning (JEL)
Output: JEL-M&E Form 6: JEL Program Completion Report Template (all levels)
Process: JEL-M&E Form 2: JEL Journal Entry Sheet (all levels)
The matrix below describes the mechanism and tools to be used for the monitoring and evaluation
of the Program Delivery System.
What will be monitored: The membership and experience of all those involved in the program delivery process, e.g. Program Managers, Trainers, Trainees
How it will be monitored: All members of the Program Management Team and the Trainers will be asked to provide a personal profile outlining their work experiences and qualifications. Trainees will also be asked to complete a profile on registration.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible: PDy-WG
When the monitoring will take place: During the formation of the Program Management Team and Trainers' Team, and upon registration of the Trainees
How the results will be used: The PDy-WG will analyze profiles to ensure team members have relevant experience. Recommendations based on the analysis will be made to improve future selection processes for Program Managers and Trainers and included in the F3 Program Completion Report. Profiles will be entered into the TDIS.

What will be monitored: The effectiveness of the walk-through process in preparing the Program Management Team and the Trainers for the delivery of the training
How it will be monitored: The Program Management Team members and the Trainers will all complete a checklist, and the results will be collated by the Program Management Team
M&E tool to be used: F3-M&E Form 1: Walkthrough Observation Checklist
Who will be responsible: Program Management M&E Team
When the monitoring will take place: At the end of the walk-through of the Resource Package, at least one week prior to the delivery of the training program
How the results will be used: Results will be reviewed to identify how the walk-through process can be improved. Results will also inform the activities that need to be accomplished prior to the training program delivery, and will inform the final F3 Program Completion Report.

What will be monitored: The trainers' learning process and facilitation skills
How it will be monitored: Process observers will be identified for each session to complete the proforma, e.g. an off-duty trainer or a member of the Program Management Team
M&E tool to be used: F3-M&E Form 2: Learning Process Observation and Facilitation Skills
Who will be responsible: Program Management M&E Team
When the monitoring will take place: During the conduct of all sessions in the F3 program delivery
How the results will be used: Results from the Learning Process Observation will be used to inform daily debriefing sessions and to improve the delivery of the training program.

What will be monitored: Trainees', Trainers' and Program Managers' level of satisfaction with the F3 phase of the training program
How it will be monitored: All trainees, trainers and Program Managers will complete an evaluation of the F3 phase of the training program. Program Management Staff will consolidate the results.
M&E tool to be used: F3-M&E Form 3: End of F3 Program Assessment with Consolidation Template
Who will be responsible: Program Management M&E Team
When the monitoring will take place: At the end of the F3 phase of the training program
How the results will be used: Results will be used to inform future delivery of the training program and to enhance the future performance of the Program Management Team and Trainers. Consolidated results will be analyzed and used to inform the final F3 Program Completion Report.

What will be monitored: The quality of the F3 program
How it will be monitored: External monitors will be asked to evaluate compliance to standards
M&E tool to be used: F3-M&E Form 4: External M&E for the F3 Processes and Accomplishments
Who will be responsible: Regional personnel for Division-level F3 programs; Division personnel for cluster/school-level F3 programs
When the monitoring will take place: During the F3 phase of the program
How the results will be used: Results will be discussed with the program management staff and trainers and will be incorporated into the F3 Program Completion Report. Results will be used to inform future F3 programs and T&D policies.

What will be monitored: Trainees' self-perception of their level of competency before and after their involvement in an F3 training program
How it will be monitored: All Trainees will complete a rapid competency assessment before and after the F3 program. Program Management Staff will consolidate the results.
M&E tool to be used: F3-M&E Form 5: Rapid Competency Assessment Before and After the F3 Program
Who will be responsible: Program Management M&E Team
When the monitoring will take place: Prior to the beginning of the F3 phase of the program and again at the end of the F3 program
How the results will be used: Results will be used to inform future delivery of the training program. Consolidated results will be analyzed and used to inform the final F3 Program Completion Report.

What will be monitored: Overall effectiveness, efficiency and success of the F3 training program
How it will be monitored: All members of the Program Management Team will be expected to contribute to the accomplishment of an F3 Program Completion Report
M&E tool to be used: F3-M&E Form 6: F3 Program Completion Report Template
Who will be responsible: PDy-WG
When the monitoring will take place: At the completion of the F3 phase of a training program
How the results will be used: The F3 Program Completion Report will be submitted to the T&D Chief at the Region/Division level and the School Head at the school level, and used to inform future F3 programs and T&D policies.

What will be monitored: The Resource Package used to inform the F3 program delivery
How it will be monitored: All trainers will be asked to review the Resource Package in relation to the sessions they were responsible for delivering and to make recommendations for further enhancement. Program Management will consolidate the recommendations for all sessions.
M&E tool to be used: F3-M&E Form 7: Summary Template for Refining the Resource Package
Who will be responsible: Program Management M&E Team
When the monitoring will take place: Following the delivery of the F3 phase of the program
How the results will be used: Results will be incorporated into the F3 Program Completion Report and submitted to the T&D Chief at the Region/Division level and the School Head at the school level for consideration and action.

What will be monitored: Individual progress, learning, insights gained and issues encountered during the various phases of the JEL program
How it will be monitored: All Trainees will complete a journal to document the JEL process
M&E tool to be used: JEL-M&E Form 2: JEL Journal Entry Sheet
Who will be responsible: JEL Team
When the monitoring will take place: During all phases of the JEL program
How the results will be used: Journals will be reviewed by the JEL Team during the JEL program to inform next steps. The journal will provide a means of verification for the successful completion of the JEL Contract.

What will be monitored: Individual progress and ability to reflect on learning to improve future practice
How it will be monitored: All Trainees will complete the JEL Reflection Template
M&E tool to be used: JEL-M&E Form 3: JEL Reflection Template
Who will be responsible: JEL Team
When the monitoring will take place: During the Reflection Stage of the JEL phase of the program
How the results will be used: Results will be reviewed by the JEL Team and used to support the Trainee in identifying next steps. The template will provide a means of verification for the successful completion of the Reflection phase of the JEL program.

What will be monitored: The level, type and effectiveness of the support provided by a JEL Adviser
How it will be monitored: JEL Advisers will be expected to keep a record of the advice they provide to trainees
M&E tool to be used: JEL-M&E Form 4: JEL Advising Tracking Form
Who will be responsible: Program Management M&E Team
When the monitoring will take place: During the various phases of the JEL program
How the results will be used: Results will be used to improve the JEL Advising process and will be incorporated into the JEL Program Completion Report. Results will also be used to inform the JEL Handbook and future programs and T&D policies.

What will be monitored: Trainees' level of satisfaction with the JEL program
How it will be monitored: All trainees will complete an evaluation of the JEL program. Program Management Staff will consolidate the results.
M&E tool to be used: JEL-M&E Form 5: Trainee's End of JEL Evaluation
Who will be responsible: Program Management M&E Team
When the monitoring will take place: At the end of the JEL program
How the results will be used: Results will be used to inform future delivery of JEL programs. Results will be analyzed and used to inform the final JEL Program Completion Report.

What will be monitored: Overall effectiveness, efficiency and success of the JEL program
How it will be monitored: All members of the Program Management Team will be expected to contribute to the accomplishment of a JEL Program Completion Report
M&E tool to be used: JEL-M&E Form 6: JEL Program Completion Report Template
Who will be responsible: T&D Chief at the Region/Division level; School Head at the school level
When the monitoring will take place: At the completion of the JEL program
How the results will be used: The JEL Program Completion Report will be submitted to the T&D Chief at the Region/Division level and the School Head at the school level, and used to inform future JEL programs and T&D policies.
T&D-M&E Form 1: Individual Profile Template
I. PERSONAL DATA
Name:
III. TRAINING ATTENDED OVER THE LAST THREE YEARS
Please check training focus and management level for all training attended over the last three
years.
Resource Materials
Development
Planning
Management
Policy Development
Research
Use additional sheet if necessary.
I certify that the information I have given in response to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.
Date: Signature:
Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.
F3-M&E Form 1: Walkthrough Observation Checklist

Directions: Read the following statements. Tick the appropriate column (Yes or No) that corresponds to your response.

1. Program Management Team members and Trainers were all present during the walkthrough.
2. A collaborative effort among the Program Management Team members and trainers was manifested during the walkthrough.
3. Individual tasks were understood and fairly assigned to all the Program Management Team members and trainers based on people's strengths.
4. During the walk-through, trainers were made aware of the materials and resources they were required to prepare for their assigned sessions.
5. During the walk-through, the session plans were reviewed in detail and the recommended strategies were discussed and practiced.
6. The trainers were open to suggestions and were willing to learn.
7. The sequencing of and relationship between the different sessions were discussed and the linkages identified.
8. Any required adjustments to the resource package were made while staying faithful to the training program's intent and purpose.
9. Issues and concerns were discussed and settled in a healthy atmosphere.
10. The Program Management Team members and trainers committed to perform their tasks and responsibilities.
Significant Observations:
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
Suggestions/Recommendations:
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
___________________________________
Observer’s Signature over Printed Name
F3-M&E Form 2: Learning Process Observation and Facilitation
Skills
This form is to be used during the actual delivery of a program. A Process Observer will need to be
assigned to complete the Learning Process Observation for each session. Results should be used to
inform daily debriefing sessions. At the end of this form is a checklist of facilitation skills which may
be observed and recorded.
Activity
Analysis
Abstraction
Application
Concluding
Activity
Observe if the skill has been demonstrated by the Facilitator. If so, put a check in the appropriate
column.
Facilitation Skills √
OBSERVING SKILLS
1. noted trainees’ level of involvement in all activities
2. monitored the energy level of the trainees during sessions
3. sensed the needs of the trainees that may affect the learning process
QUESTIONING SKILLS
4. formulated questions in a simple manner
5. asked questions that were clear and focused
6. formulated follow-up questions to trainees' responses appropriately
7. asked Higher Order Thinking Skills (HOTS) questions
8. acknowledged trainees' responses
9. solicited, accepted and acted on feedback from trainees
10. processed responses with probing questions to elicit the desired learning
LISTENING SKILLS
11. listened to and understood the meaning of what had been said
12. responded positively to trainees' insights
13. clarified and checked understanding of what was heard
14. reacted to ideas, not to the person
ATTENDING SKILLS
15. created the proper environment based on adult learning principles
16. directed and redirected the trainees to the learning tasks
17. managed the learning atmosphere throughout the sessions
18. acknowledged greetings and responses of trainees
INTEGRATING SKILLS
19. highlighted important results of the activity that led to the attainment of the objectives of the session
20. deepened and broadened trainees' outlook on the significance of the outputs
ORAL COMMUNICATION SKILLS
21. expressed ideas with clarity, logic and in grammatically correct sentences
22. spoke with a well-modulated voice
23. delivered ideas with confidence and sincerity
SKILL IN USING TRAINING AIDS
24. employed appropriate and updated training aids
25. made training aids that were simple and clear
26. used training aids that were attractive and interesting
27. utilized training aids that were socially, culturally, and gender-fair
F3-M&E Form 3: End of the F3 Program Assessment
Please assess the effectiveness of the entire F3 component of the program according to the indicators
below.
Please refer to the following rating scale:
4-Strongly Agree (SA); 3-Agree (A); 2-Disagree (D); 1-Strongly Disagree (SD)
After the conduct of the F3 component of the program, I believe that … (tick a rating of 1-4 for each item)

A. Program Planning, Management and Preparation
1. the training program was delivered as planned
2. the training program was managed efficiently
3. the training program was well-structured

B. Attainment of Objectives
4. the program objectives were clearly presented
5. the session objectives were logically arranged
6. the program and session objectives were attained

C. Delivery of Program Content
7. program content was appropriate to trainees' roles and responsibilities
8. content delivered was based on authoritative and reliable sources
9. new learning was clearly presented
10. the session activities were effective in generating learning
11. adult learning methodologies were used effectively
12. management of learning was effectively structured, e.g. portfolio, synthesis of previous learning

D. Trainees' Learning
13. trainees were encouraged to consider how ideas and skills gained during the training could be incorporated into their own practices
14. contributions of all trainees, both male and female, were encouraged
15. trainees demonstrated a clear understanding of the content delivered

E. Trainers' Conduct of Sessions
16. the trainers' competencies were evident in the conduct of the sessions
17. teamwork among the trainers and staff was manifested
18. trainers established a positive learning environment
19. training activities moved quickly enough to maintain trainees' interest

F. Provision of Support Materials
20. training materials were clear and useful
21. PowerPoint presentations supported the flow of the sessions
22. the resources provided were appropriate to trainees' needs

G. Program Management Team
23. Program Management Team members were courteous
24. the Program Management Team was efficient
25. the Program Management Team was responsive to the needs of trainees

H. Venue and Accommodation
26. the venue was well lighted and ventilated
27. the venue was comfortable with sufficient space for program activities
28. the venue had sanitary and hygienic conditions
29. meals were nutritious and sufficient in quantity and quality
30. the accommodation was comfortable with sanitary and hygienic conditions

I. Overall
31. I have the knowledge and skills to apply the new learning
32. I have the confidence to implement the JEL contract
What do you consider your most significant learning from the program?
What changes would you suggest to improve similar programs in the future?
Briefly describe what you have learned and how it will help you with your work.
F3-M&E Form 3: End of the F3 Program Assessment
Consolidation Template
Collate the accomplished F3-M&E Form 3: End of the F3 Program Assessment, and review the results. Separate results should be consolidated for each respondent type, e.g. Trainees, Trainers and Program Managers. Use the table below to consolidate the results for the quantitative items.
Note: The scoring and consolidation can be efficiently done using MS Excel.
Use the scale below to interpret mean rating for each item of the assessment:
3.5 to 4.0 = (SA) Strongly Agree
2.5 to 3.4 = (A) Agree
1.5 to 2.4 = (D) Disagree
1.0 to 1.4 = (SD) Strongly Disagree
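The note above suggests MS Excel, but the same consolidation can be scripted. The following is a minimal Python sketch, not part of the handbook's prescribed tooling; the item numbers and ratings are illustrative. It computes the mean rating per item and maps it to the interpretation scale above:

```python
# Minimal sketch: consolidate End of F3 Program Assessment ratings.
# Item numbers and ratings below are illustrative, not from an actual program.

def interpret(mean_rating):
    """Map a mean rating to the handbook's interpretation scale."""
    if mean_rating >= 3.5:
        return "SA"  # Strongly Agree (3.5 to 4.0)
    if mean_rating >= 2.5:
        return "A"   # Agree (2.5 to 3.4)
    if mean_rating >= 1.5:
        return "D"   # Disagree (1.5 to 2.4)
    return "SD"      # Strongly Disagree (1.0 to 1.4)

# item number -> ratings given by all respondents of one type (e.g. Trainees)
responses = {
    1: [4, 3, 4, 4],
    2: [3, 3, 2, 3],
}

for item, ratings in sorted(responses.items()):
    mean = sum(ratings) / len(ratings)
    print(f"Item {item}: mean {mean:.2f} -> {interpret(mean)}")
```

In practice the ratings would be keyed in from the accomplished forms, with one consolidation per respondent type.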
(Consolidation table: one row per assessment item 1-32, grouped under categories A-I as in the assessment form, with a column for recording the mean rating of each item.)
Summary of Qualitative Responses
What do you consider your most significant learning from the program?
What changes would you suggest to improve similar programs in the future?
Briefly describe what you have learned and how it will help you with your work.
F3-M&E Form 4: External M&E for the F3 Process
and Accomplishments
The indicators below are the standards for monitoring F3 sessions. In the "Observation" column, put a check mark (√) if the indicator is observed and a cross mark (x) if it is not. In the "Remarks" column, write your comments.
6. JEL Contract
6.1 The trainee has the knowledge and skills to apply the new learning
6.2 The trainees accomplished the JEL Contract and are confident to implement it

7. Provision of Support Materials
7.1 Training materials are organized, clear and useful
7.2 PowerPoint presentations support the flow of the sessions
7.3 The resources provided are appropriate to trainees' needs

8. Program Management Team
8.1 Program Management Team members are cooperative and courteous
8.2 Program Management Team members are responsive to the needs of trainees
8.3 Program Management Team members are efficient and effective

9. Venue and Accommodation
9.1 Well-lighted, ventilated and with good hygienic conditions
9.2 Comfortable with sufficient space for program activities
9.3 Meals were nutritious and sufficient in quantity and quality
Recommendations
Monitored by:
____________________________
Name and Designation
F3-M&E Form 5: Rapid Competency Assessment
Note to the Program Management Team: This template will guide you in developing the M&E Form, Rapid Competency Assessment Tool, for a specific program to be delivered. Work through the following steps to complete the M&E Form.
Directions: Describe your level of competency for each of the following items. In the column labeled "Pre F3", describe your competency level before you joined this program. In the column labeled "Post F3", describe your competency level after having participated in the training program.
Competency Scale to be used:
4 – I have a mastery of the competency and have demonstrated/applied it
3 – I have adequate competency and need to practice it
2 – I have inadequate competency and no understanding of how to apply it
1 – I have no competency/learning at all.
(Competency items continue session by session: items 10-12 for the preceding session, Session 4 items 13-14, and so on.)
C. The Job-embedded Learning (JEL)
15. The concept and purpose of the JEL in professional
development
16. Accomplishing a JEL Contract
THANK YOU
F3-M&E Form 5: Rapid Competency Assessment Consolidation
Template
Title of F3 : ______________________________________________
Region/Division/School : ___________________________________
Date: ___________________________________________________
Directions: Calculate each participant's Rapid Assessment mean score for the "PRE-F3" and "POST-F3" columns (refer to F3-M&E Form 5) and consolidate the scores in the table below.
Compute the gain for each participant using the formula: Gain = (POST-F3) - (PRE-F3).
Calculate the average gain for the participants using the formula:
Average Gain = Sum of gains / Number of participants
AVERAGE GAIN
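The gain computation above can be sketched in a short script. This is a minimal Python example, not part of the handbook's prescribed tooling; the participant labels and mean scores are illustrative:

```python
# Minimal sketch: compute each participant's gain and the group's average gain
# from Rapid Competency Assessment mean scores (illustrative values).

participants = {
    "Participant A": {"pre": 2.1, "post": 3.4},
    "Participant B": {"pre": 1.8, "post": 3.0},
    "Participant C": {"pre": 2.5, "post": 3.6},
}

# Gain = (POST-F3) - (PRE-F3), per participant
gains = {name: s["post"] - s["pre"] for name, s in participants.items()}

# Average Gain = Sum of gains / Number of participants
average_gain = sum(gains.values()) / len(gains)

for name, gain in gains.items():
    print(f"{name}: gain {gain:.2f}")
print(f"Average gain: {average_gain:.2f}")
```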
F3-M&E Form 6: F3 Program Completion Report Template
(Please use electronic version)
Program Title: (Add title of program)
Location and venue: (Write the city and the actual venue, e.g. Cebu, EcoTech)
Duration: (Include duration of the F3 phase)
Key Results: (Identify the key results from the conduct of the program, taking into consideration the F3 phase)
Resources/Materials: (Identify the resources required to conduct the program, e.g. title of the Resource Package, Operations Manual)
M&E Analysis: After reviewing the F3-M&E results from the program, write a narrative analyzing the results. This should include:
- results from the participants' evaluation of the program
- results from the facilitators' review of the program
- results from the program managers' review of the program
Strengths and areas for improvement should be identified in this section.

General Comments and Issues Encountered: In this section make any general comments about the program and identify any issues encountered in relation to:
its delivery (during the F3 phase)
- trainers/facilitators
- participants
- content of program
- delivery strategies
- training materials
its management
- prior to delivery
- during the F3 phase

Recommendations: In this section discuss any recommendations you may have to improve future programs. Include suggestions for refining the Resource Package.

F3-M&E Form 7: Summary Template for Refining the Resource Package
(electronic version available)
Title of F3 Program: _________________________________________________________________
Directions: Fill in the template with the necessary information for the refinement of the Resource
Package.
Session 2
Session 3
Etc…
(Note: Please attach Resource Package with corrections when/upon endorsing to T&D Chief)
JEL-M&E Form 1: Quality Assurance of JEL Contract
This form supports a quality assurance process for the accomplished JEL Contract. The JEL Team shall review the JEL Contract to assess the extent to which the standards were followed in its accomplishment.
Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way
Use the scale above to assess the extent to which the accomplished JEL Contract adheres to the
following:
Recommendations:
Name: ___________________________________
Position: _________________________________
Date: ____________________________________
JEL-M&E Form 2: JEL Journal Entry Sheet
The JEL Journal identifies the type of information that should be documented by all JEL
Team Members to record progress during the various stages of the JEL Program. The journal
entries will be used to inform discussions during the Reflection Stage and as a Means of
Verification of Learning. Each JEL Member should establish their own Journal Booklet and
record their entries based on the information below, for each entry.
- Problems met
- Identified strengths
JEL-M&E Form 4: JEL Advising Tracking Form
(electronic version available)
NOTE: If you have demonstrated all the objectives outlined in the JEL Contract, proceed to the
Internalization Stage.
If you have not demonstrated all the objectives outlined in the JEL Contract please complete the
sections below before commencing the Enhancement Stage
What are my next steps? (Activities and Strategies) | When? (planned accomplishments and date) | JEL Team support required? | What resources do I need? | What will be the Means of Verification (MOV) of my learning?
Signature of Learner/Trainee: _________________________    Date: ______________
Next Steps
JEL-M&E Form 5: Trainee's End of Job-Embedded Learning (JEL) Evaluation

Please rate how you feel you have fared relative to the following processes involved in the accomplishment of the Job-Embedded Learning phase of the training program. Tick the appropriate column for your rating, using the scale below.
Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way
A. Planning for Implementation
1. the roles and responsibilities of the JEL team and trainees were thoroughly reviewed and understood by all?
2. the objectives of the JEL contract were clearly understood by all?
3. the JEL team schedule of activities was confirmed and agreed to by all?

B. Implementation
4. there was evidence of enhancement of your competencies?
5. there was minimum disruption to your organic functions and the entire learning community when accomplishing JEL activities?
6. activities accomplished were cost-effective and practical?
7. activities accomplished were well-coordinated and well-managed?
8. formative records of learning were kept, e.g. journal?

C. Reflection
9. reflections contained qualitative data on your accomplishments?
10. the documents presented were objective?
11. the reflections made were KSA-oriented?
12. KSAs for further enhancement were identified?
13. strengths were identified?
14. next steps were identified?

D. Enhancement
15. enhancement activities were focused on achieving identified competencies?
16. alternative strategies were employed?
17. additional support from the coach/JEL team was provided?
18. additional time for enhancement was allocated when required?

E. Internalization
19. enhanced competencies were demonstrated in daily work?
20. new KSAs were recognized by others?
21. best practices were voluntarily shared with colleagues?

F. Portfolio (optional)
22. the portfolio provided evidence of the learning that has taken place during the JEL phase of the program?
Describe the major changes you have made to your work practice as a result of the training program (F3 and JEL).
Describe how you shared your learning with colleagues. Give details as to how it was done, who was
involved and the reactions received.
Do you have other comments/suggestions/recommendations for the improvement of the JEL phase
of the training program?
JEL-M&E Form 5: Trainees' End of Job-Embedded Learning (JEL) Evaluation Consolidation Template
Collate the accomplished JEL-M&E Form 5: Trainees' End of JEL Evaluation, and review the results. Use the table below to consolidate the results for the quantitative items.
Note: The scoring and consolidation can be efficiently done using MS Excel.
Use the scale below to interpret mean rating for each item of the assessment:
3.5 to 4.0 = Very High Extent (VHE)
2.5 to 3.4 = High Extent (HE)
1.5 to 2.4 = Low Extent (LE)
1.0 to 1.4 = Very Low Extent (VLE)
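As with the F3 assessment, this consolidation can be scripted. The sketch below is a minimal Python example with illustrative ratings, not prescribed tooling; it tallies the frequency of each rating for one item, computes the mean, and maps it to the extent scale above:

```python
from collections import Counter

def interpret_extent(mean_rating):
    """Map a mean rating to the extent scale used in this form."""
    if mean_rating >= 3.5:
        return "VHE"  # Very High Extent (3.5 to 4.0)
    if mean_rating >= 2.5:
        return "HE"   # High Extent (2.5 to 3.4)
    if mean_rating >= 1.5:
        return "LE"   # Low Extent (1.5 to 2.4)
    return "VLE"      # Very Low Extent (1.0 to 1.4)

# Illustrative ratings for one evaluation item from all trainees
item_ratings = [4, 3, 3, 4, 2, 3]

freq = Counter(item_ratings)                     # trainees per rating value
mean = sum(item_ratings) / len(item_ratings)     # mean rating for the item
print(dict(sorted(freq.items())), f"mean {mean:.2f} -> {interpret_extent(mean)}")
```

The same loop structure as in the F3 consolidation sketch can be applied over all 22 items.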
(Consolidation table: one row per evaluation item 1-22, grouped under categories A-F as in the evaluation form, with a column for recording the mean rating of each item.)
Summary of Qualitative Responses
Major changes made as a result of the training program (F3 and JEL).
Reactions received.
JEL-M&E Form 6: JEL Program Completion Report Template
(Please use electronic version)

Program Title: (Add title of program)
JEL Program Objectives as indicated in the F3 Resource Package: At the end of the JEL program the trainees will have …
Key Results: (Identify the key results from the conduct of the program, taking into consideration the JEL phase)

General Comments and Issues Encountered: In this section make any general comments about the program and identify any issues encountered in relation to:
its delivery (during the JEL phase)
- trainers/advisers
- participants
- content of JEL program
- strategies
- training materials
its management
- prior to delivery
- during the JEL phase

Recommendations: In this section discuss any recommendations you may have to improve future programs.