
Republic of the Philippines

Department of Education

T&D Monitoring and Evaluation


Framework and Tools

HANDBOOK

DepED-EDPITAF-STRIVE
Training and Development
June 2010
This document, The Training and Development Monitoring and
Evaluation Framework and Tools Handbook, was developed and
validated in Regions VI, VII and VIII, Divisions of Negros Occidental,
Bohol/Tagbilaran and Northern Samar through the AusAID-funded
project, STRIVE (Strengthening the Implementation of Basic
Education in selected Provinces in the Visayas), in coordination with
the EDPITAF (Educational Development Project Implementing Task
Force), and in consultation with the TEDP-TWG, NEAP and the
Bureaus of the Department of Education.

Table of Contents

Section 1.0: Training and Development (T&D) System Monitoring and Evaluation General Framework ......... 1

Section 2.0: T&D System Monitoring and Evaluation for the Training and Development Needs Assessment (TDNA) System ......... 6
2.1 The TDNA System, p6
2.2 Monitoring and Evaluation for the NCBTS-TSNA, p7
2.3 Monitoring and Evaluation for the TDNASH, p30
2.4 Monitoring and Evaluation for the Organizational TDNA for the Region and Division, p40

Section 3.0: T&D System Monitoring and Evaluation for the Professional Development Planning (PDP) System ......... 56
3.1 The PDP System, p56
3.2 Monitoring and Evaluation for the IPPD, p59
3.3 Monitoring and Evaluation for the SPPD, p68
3.4 Monitoring and Evaluation for the MPPD, p75

Section 4.0: T&D System Monitoring and Evaluation for the Program Designing and Resource Development (PDRD) System ......... 85
4.1 The PDRD System, p85
4.2 Monitoring and Evaluation for Program Designing, p86
4.3 Monitoring and Evaluation for Resource Development, p95

Section 5.0: T&D System Monitoring and Evaluation for the Program Delivery (PDy) System ......... 104
5.1 The PDy System, p104
5.2 Monitoring and Evaluation for Program Delivery, p104

Acknowledgements ......... 142

Section 1.0: Training & Development System Monitoring and Evaluation
General Framework

Monitoring and Evaluation

Integral to the Training & Development (T&D) System is its monitoring and evaluation (M&E)
support, which ensures the effectiveness and efficiency of its operations. Monitoring and evaluation
activities are vital in ensuring that program implementation adheres to the standards for the system's
inputs, processes, outputs and outcomes. In carrying out monitoring and evaluation activities, M&E
instruments are indispensable, and the processes relating to the application and use of these
instruments are equally important.

The M&E of the Training and Development Needs Assessment (TDNA), the Professional Development
Planning (PDP), the Program Designing and Resource Development (PDRD) and the Program Delivery
(PDy) Systems supports their integration and adherence to the overall goal and objectives of the
entire system. While the M&E framework is the basis for the internal quality assurance of the system,
its results also inform external quality assurance of the system's adherence to the standards and
specifications expected for the outputs at the different levels. Moreover, the M&E results
provide information on the strengths and/or weaknesses of the Training & Development System
itself and of its sub-systems to support sustainability and improvement.

Below is the General M&E Framework containing the standards at the input, process, output and
outcome system levels covering the T&D operations for the four sub-systems at the region, division
and school levels.

T&D System Monitoring and Evaluation General Framework
System Levels: Regional Level Standards | Division Level Standards | School Level Standards

A. T&D Needs Assessment (TDNA) System

Outcome
Regional Level:
- Increased % of RO divisions/units participation in the organizational T&D process
- Systematic and continuous TDNA for the region/divisions
- Informative TDNA results that serve as basis for needs-based Regional planning for HR T&D
Division Level:
- Increased % of DO divisions/units participation in the organizational T&D process
- Increased % of SHs who are assessed of T&D needs
- Systematic and continuous TDNA for the division/district and schools
- Informative TDNA results that serve as basis for needs-based Division planning for HR T&D
School Level:
- Increased % of Teachers who are assessed of T&D needs
- Systematic and continuous NCBTS-TSNA for the teachers
- Informative TDNA results that serve as basis for needs-based School planning for T&D

Output
Regional Level:
- Reliable and valid TDNA results for the Regional Organizational TDNA
- Regularly updated database identifying T&D priority service areas/competency needs for the RO
- Complete and accurate consolidation/analysis and profile of all (or % of) Divisions' priority needs for T&D
Division Level:
- Reliable and valid TDNA results for the Division Organizational TDNA
- Regularly updated database identifying T&D priority service areas/competency needs for the DO
- Complete and accurate consolidation/analysis and profile of all (or % of) the Division's and Districts' priority needs of Teachers (TSNA) and School Heads (TDNASH)
School Level:
- Reliable and valid NCBTS-TSNA results
- Complete and accurate consolidation/analysis and profile of Teachers' NCBTS-TSNA priority needs for T&D

Process
Regional Level:
- Systematic and efficient conduct of the Region's Organizational TDNA
- Well-documented M&E of the conduct of the Region/Division Organizational TDNA, Division management of the TDNASH and NCBTS-TSNA
- Relevant feedback provided to improve the TDNA process in the region and division
Division Level:
- Systematic and efficient conduct of the Division's Organizational TDNA and TDNASH
- Well-documented M&E of the conduct of NCBTS-TSNA
- Relevant feedback provided to improve the TDNA process in the division and schools
School Level:
- Systematic and efficient conduct of NCBTS-TSNA

Input
Regional Level:
- Competent and sufficient personnel of the Regional TDNA-WG
- Sufficient and proper representation of the different sections' respondents of the organizational TDNA
- Available and relevant support resources: Reg-T&D Work Plan, TDNA tools/materials (Volumes 1 & 2), funds
Division Level:
- Competent and sufficient personnel of the Division TDNA-WG
- Sufficient and proper representation of the different sections' respondents of the organizational TDNA
- Available and relevant support resources: Div-T&D Work Plan, TDNA tools/materials (Volumes 1 & 2), funds
School Level:
- Competent and sufficient personnel of the School TDNA-WG (NCBTS Coordinator)
- Available and relevant support resources: School T&D Work Plan, NCBTS-TSNA tools/materials (Volumes 1 & 2), funds
B. Professional Development Planning (PDP) System

Outcome
Regional Level:
- Systematic and periodic conduct of MPPD for the Region
- MPPD results serve as basis for Program Designing
Division Level:
- Systematic and periodic conduct of MPPD for the Division
- Systematic and periodic conduct of IPPD for school heads
- MPPD results serve as basis for Program Designing
School Level:
- Systematic and periodic conduct of IPPD/SPPD
- SPPD results serve as basis for Program Designing

Output
Regional Level:
- Relevant and needs-based Regional MPPD
- Updated database identifying priority training needs across the region
Division Level:
- Relevant and needs-based Division MPPD
- Updated database identifying priority training needs across the division
School Level:
- Relevant and needs-based IPPD/SPPD
- Updated database identifying priority training needs of the school personnel

Process
Regional Level:
- Systematic and efficient conduct of the Regional MPPD
- Well-documented evidence of TA provided and M&E results for the conduct of the Region/Division MPPD and Division management of the conduct of SPPDs
- Relevant and reflective feedback provided to improve the Regional MPPD
Division Level:
- Systematic and efficient conduct of the Division MPPD
- Well-documented evidence of TA provided and M&E results for the conduct of the Division MPPD and SPPDs
- Relevant and reflective feedback provided to improve the Division MPPD
School Level:
- Systematic and efficient conduct of the IPPD/SPPD
- Well-documented evidence of TA provided and M&E results for the conduct of the SPPD and IPPD
- Relevant and reflective feedback provided to improve the IPPD/SPPD

Input
Regional Level:
- Competent and sufficient planners for the conduct of the Regional MPPD
- Available and relevant support resources such as the MPPD Guide, template, tools, consolidated Organizational TDNA results, REDP, student/pupil performance, funds
- Complete and accurate consolidation/analysis/profile of Divisions' MPPDs
Division Level:
- Competent and sufficient planners for the conduct of the Division MPPD and the conduct of the Cluster Orientation on the SPPD
- Available and relevant support resources such as the MPPD Guide, template, tools, consolidated Organizational TDNA, TDNASH and TSNA results, consolidated IPPDs of school heads and teachers, DEDP, student/pupil performance, funds
- Complete and accurate consolidation/analysis/profile of SPPDs
School Level:
- Competent and sufficient planners for the conduct of the SPPD and IPPD
- Available and relevant support resources such as the SPPD and IPPD guides, templates, tools, consolidated TSNA results, consolidated IPPDs of teachers, SIP, student/pupil performance, funds
- Complete and accurate consolidation/analysis/profile of IPPDs

C. Program Designing and Resource Development (PDRD) System

C.1. Program Designing

Outcome
Regional Level:
- Increased access to quality program designs
Division Level:
- Increased access to quality program designs
School Level:
- Increased access to quality program designs

Output
Regional Level:
- Comprehensive, flexible and needs-based program designs of the region
- Quality-assured/standards-based program designs
Division Level:
- Comprehensive, flexible and needs-based program designs of the division
- Quality-assured/standards-based program designs
School Level:
- Comprehensive, flexible and needs-based program designs of the school
- Quality-assured/standards-based program designs

Process
Regional Level:
- Systematic development of program designs at the regional level
- Well-documented M&E of the Region/Division management and development of program designs
- Relevant feedback provided to enhance the quality of program designs and processes in the region
Division Level:
- Systematic development of program designs at the division level
- Well-documented M&E of the conduct of Division/cluster/school program designs
- Relevant feedback provided to enhance the quality of program designs and processes at the division/cluster/school levels
School Level:
- Systematic development of program designs at the school level
- Relevant feedback provided to enhance the quality of program designs and processes in the school

Input
Regional Level:
- Competent and sufficient personnel of the Regional PDRD-WG
- Available support resources: MPPD (Region and Division), adequate funds, PDRD System Vol. 4, resource persons
Division Level:
- Competent and sufficient personnel of the Division PDRD-WG
- Available support resources: MPPD (Division) and SPPD (Schools), adequate funds, PDRD System Vol. 4, resource persons
School Level:
- Competent and sufficient personnel of the School PDRD-WG
- Available support resources: SRC, SIP (AIP), EMIS, SPPD, adequate funds, PDRD System Vol. 4, resource persons

C.2. Resource Development

Outcome
Regional Level:
- Increased access to quality Resource Packages that are relevant to address professional development needs in the region
Division Level:
- Increased access to quality Resource Packages that are relevant to address professional development needs
School Level:
- Increased access to quality Resource Packages that are relevant to address professional development needs

Output
Regional Level:
- Quality-assured/standards-based Resource Packages for professional development
Division Level:
- Quality-assured/standards-based Resource Packages
School Level:
- Quality-assured/standards-based Resource Packages

Process
Regional Level:
- Systematic and efficient conduct of Resource Package development in the region
- Well-documented M&E of the conduct of Resource Package development in the Region/Division
- Relevant feedback provided on the quality of the Resource Package and the conduct of Resource Package development in the Division
Division Level:
- Systematic and efficient conduct of Resource Package development in the Division
- Well-documented M&E of the conduct of Resource Package development in the Division/Clusters/Schools
- Relevant feedback provided on the quality of the Resource Package and the conduct of Resource Package development in the clusters/schools
School Level:
- Systematic and efficient conduct of Resource Package development in the School
- Well-documented M&E of the conduct of Resource Package development in the Schools
- Relevant feedback provided on the quality of the Resource Package and the conduct of Resource Package development in the schools

Input
Regional, Division and School Levels:
- Competent/expert Resource Package developers (updated database profile of resource developers)
- Quality-assured program design with approved budget for the delivery of the program
- Available support resources: adequate funds, PDRD System Vol. 4
D. Program Delivery (PDy) System

Outcome
Regional Level:
- Improved work performance of clientele in all areas
- Improved learning outcomes of clientele
- Systematized and continuous professional development of all in-service personnel
Division Level:
- Improved work performance of clientele in all areas
- Improved learning outcomes of clientele
- Systematized and continuous professional development of all in-service personnel
School Level:
- Improved school-based practice
- Improved learning outcomes of clientele
- Systematized and continuous professional development of all in-service teachers

Output
Regional Level:
- Enhanced competencies (KSAs) of clientele
- Well-documented best practices shared with colleagues
Division Level:
- Enhanced competencies (KSAs) of clientele
- Well-documented best practices shared with colleagues
School Level:
- Enhanced competencies (KSAs) of clientele
- Well-documented best practices shared with colleagues

Process (F3 Component)
Regional Level:
- Effective and efficient management and conduct of F3
- Relevant and needs-based technical assistance to the division conduct of F3
- Standards-based and well-documented M&E of the Region/Division-led F3
- Updated information on the school-based F3 as monitored by the division
- Documentation of significant feedback for the refinement of the Resource Package used in F3
- Accomplished quality outputs in each F3 program
Division Level:
- Effective and efficient management and conduct of F3
- Relevant and needs-based technical assistance to the school conduct of F3
- Standards-based and well-documented M&E of the Division/Cluster/School-based F3
- Updated information on the school-based F3
- Documentation of significant feedback for the refinement of the Resource Package
- Accomplished quality outputs in each F3 program
School Level:
- Effective and efficient management and conduct of F3
- Standards-based and well-documented M&E of the School-based F3
- Documentation of feedback for the refinement of the Resource Package
- Accomplished quality outputs in each F3 program

Process (JEL Component)
Regional Level:
- Systematic and efficient conduct of JEL activities
- Relevant feedback provided to improve the conduct of JEL
Division Level:
- Systematic and efficient conduct of JEL activities
- Relevant feedback provided to improve the conduct of JEL
School Level:
- Systematic and efficient conduct of JEL activities
- Relevant feedback provided to improve the conduct of JEL

Input
Regional Level:
- Competent and sufficient program management staff and trainers for F3 and JEL
- Complete, available and relevant support resources (as required in the approved resource package/JEL Contract, funds)
- Enabling policies, standards and processes
Division Level:
- Competent and sufficient program management staff and trainers for F3 and JEL
- Complete, available and relevant support resources (as required in the approved resource package/JEL Contract, funds)
- Enabling policies, standards and processes
School Level:
- Competent and sufficient program management staff and trainers/School JEL Team
- Complete, available and relevant support resources (as required in the approved resource package/JEL Contract, funds)
- Enabling policies, standards and processes

Section 2.0: T&D System Monitoring and Evaluation for the Training and
Development Needs Assessment (TDNA) System

2.1. M&E for the TDNA System

The Training and Development Needs Assessment (TDNA) System has three major needs assessment
processes: the National Competency-Based Teacher Standards – Teacher Strengths and Needs
Assessment (NCBTS-TSNA), the Training and Development Needs Assessment for School Heads
(TDNASH), and the Organizational Training and Development Needs Assessment at the region and
division levels.

The diagram below shows the procedural design for the TDNA monitoring and evaluation at
the Division and Regional levels. The personnel from the TDNA-Working Group (WG)
responsible for M&E are tasked to monitor and evaluate the preparation, conduct and
consolidation of the TDNA results.

[Figure: TDNA System Monitoring and Evaluation procedural flow at the Division/Region level. Recoverable steps:]
Call for Situational Analysis
1.4.1 Prepare the Division/Region TDNA-WG and resources for M&E
1.4.2 Monitor the TDNA process and compliance with standards
1.4.3 Prepare the report on the results of TDNA monitoring and evaluation (TDIS database)
1.4.4 Prepare the TDNA Process M&E Report
1.4.5 T&D Unit/Regional T&D Office reviews the M&E report
1.4.6a Identify and inform monitored schools/Divisions of the M&E findings
1.4.6b Make necessary adjustments to the TDNA System
1.4.6c Record TDNA System adjustments based on M&E, for Regional TDNA policy review and adjustment

At the Division or Regional level, the M&E personnel prepare the M&E report and inform
the T&D Office or Unit of the findings. The T&D Office/Unit then develops
recommendations for the improvement of the process, which inform regional policy review
and adjustment of the TDNA System.

2.2. M&E for the NCBTS-TSNA

A number of M & E instruments have been developed to support the NCBTS Orientation and
the administration of the NCBTS-TSNA tool. The following tools are available to support the
NCBTS-TSNA processes.

Tools for NCBTS:


T&D-M&E Form 1: Individual Profile Template
NCBTS-M&E Form 1: Teacher’s Profile
NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills
NCBTS-M&E Form 3: NCBTS Coordinator's Checklist and Consolidation Template
NCBTS-M&E Form 4: Trainer’s Assessment of NCBTS Orientation Workshop and Consolidation
Template
NCBTS-M&E Form 5: Trainee’s End of F3 Program Assessment and Consolidation Template
NCBTS-M&E Form 6: Documentation Tool for the Conduct of Cluster or School level NCBTS-
TSNA Implementation
NCBTS-M&E Form 7: School’s NCBTS –TSNA Consolidation Template

A: Training and Development Needs Assessment (TDNA) System

A1. NCBTS-TSNA

System Levels: M&E Tools for Regional Level | M&E Tools for Division/Cluster Level | M&E Tools for School Level

Output
Division/Cluster and School Levels:
- NCBTS-M&E Form 7: School's NCBTS-TSNA Consolidation Template

Process
Regional Level:
- NCBTS-M&E Form 6: Documentation Tool for Division, Cluster or School Level NCBTS-TSNA Implementation
Division/Cluster Level:
- NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills
- NCBTS-M&E Form 3: NCBTS Coordinator's Checklist and Consolidation Template
- NCBTS-M&E Form 4: Trainer's Assessment of the NCBTS Orientation Workshop and Consolidation Template
- NCBTS-M&E Form 5: Trainee's End of F3 Program Assessment and Consolidation Template
- NCBTS-M&E Form 6: Documentation Tool for Division, Cluster or School Level NCBTS-TSNA Implementation
School Level:
- NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills
- NCBTS-M&E Form 4: Trainer's Assessment of the NCBTS Orientation Workshop and Consolidation Template
- NCBTS-M&E Form 5: Trainee's End of F3 Program Assessment and Consolidation Template
- NCBTS-M&E Form 6: Documentation Tool for Division, Cluster or School Level NCBTS-TSNA Implementation

Input
Division/Cluster Level:
- T&D-M&E Form 1: Individual Profile Template (for ES/PSDS, SHs/NCBTS Coordinator)
- Checklist Prior to Conduct of NCBTS-TSNA incorporated into the NCBTS Guide
School Level:
- NCBTS-M&E Form 1: Teacher's Profile for NCBTS-TSNA
- Checklist Prior to Conduct of NCBTS-TSNA incorporated into the NCBTS Guide

The matrix below describes the mechanism and tools to be used for the monitoring and evaluation
of the NCBTS-TSNA process:

What will be monitored: NCBTS Implementers' details in relation to their current position, their level of experience and qualification
How it will be monitored: All NCBTS Implementers will be asked to complete the profile
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Prior to their involvement in the NCBTS-TSNA process
How will the results be used: Results will be analyzed to ensure NCBTS Implementers have the required KSAs. Results will be entered into the TDIS.

What will be monitored: Teachers' details in relation to their current position, their level of experience and qualification
How it will be monitored: All teachers will be asked to complete the profile
M&E tool to be used: NCBTS-M&E Form 1: Teacher's Profile for NCBTS-TSNA
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Prior to the accomplishment of the NCBTS-TSNA Tool
How will the results be used: Results will be entered into the TDIS database along with their corresponding NCBTS-TSNA results.

What will be monitored: Implementation of the NCBTS-TSNA Orientation Package in relation to the processes followed and the facilitation skills demonstrated
How it will be monitored: A Process Observer will be assigned to complete a Learning Process Observation for each session
M&E tool to be used: NCBTS-M&E Form 2: Learning Process Observation and Facilitation Skills
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: During the NCBTS orientation workshop
How will the results be used: Results will be discussed with individual Trainers to identify strengths and areas for improvement during debriefing sessions. Recommendations based on an analysis of the results should be included in the Program Completion Report.

What will be monitored: The competency of the NCBTS Coordinators in relation to the criteria set for the role
How it will be monitored: A TDNA-WG member will be assigned to observe the NCBTS Coordinator during the orientation process
M&E tool to be used: NCBTS-M&E Form 3: NCBTS Coordinator's Checklist
Who will be responsible for the monitoring: Division TDNA-WG
When will the monitoring take place: During the NCBTS orientation workshop
How will the results be used: Results will be discussed with individual NCBTS Coordinators to identify strengths and areas for improvement. Results will be used to inform future decisions regarding the criteria and process for selecting NCBTS Coordinators. Recommendations based on an analysis of the results should be included in the Program Completion Report.

What will be monitored: The overall effectiveness of the workshop as delivered by the whole Team
How it will be monitored: Each of the trainers will be asked to make an assessment of the orientation
M&E tool to be used: NCBTS-M&E Form 4: Trainer's Assessment of the NCBTS Orientation Workshop
Who will be responsible for the monitoring: Division TDNA-WG
When will the monitoring take place: Upon completion of the NCBTS orientation workshop
How will the results be used: Results will be collated and analyzed by the TDNA-WG. A summary of the results will be included in the Program Completion Report and will inform future training.

What will be monitored: Participants' perception of the training in relation to the overall quality of the training, the usefulness of the training, their ability to implement the content of the training, and the strengths and weaknesses of the training
How it will be monitored: All participants will be asked to complete the Trainee's End of F3 Program Assessment Form
M&E tool to be used: NCBTS-M&E Form 5: Trainee's End of F3 Program Assessment Form
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Upon completion of the NCBTS-TSNA orientation workshop
How will the results be used: Participants' evaluations will be collated by the TDNA-WG and the results analyzed. A summary of the results will be included in the Program Completion Report and will inform future training.

What will be monitored: The implementation of the NCBTS-TSNA Orientation at the division, cluster and school level
How it will be monitored: A Process Observer will be identified and asked to complete the tool
M&E tool to be used: NCBTS-M&E Form 6: Documentation Tool for the Conduct of Division, Cluster or School Level NCBTS-TSNA Implementation
Who will be responsible for the monitoring: Region and Division TDNA-WG
When will the monitoring take place: During the NCBTS-TSNA Orientation Workshop at the Division, Cluster or School Level
How will the results be used: Results will be discussed with the Implementers to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future training.

What will be monitored: The priority training needs of teachers
How it will be monitored: The NCBTS Coordinator and the School Head will consolidate the results from the administration of the NCBTS-TSNA tool
M&E tool to be used: NCBTS-M&E Form 7: School's NCBTS-TSNA Consolidation Template
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: After the accomplishment of the NCBTS-TSNA tool
How will the results be used: Results will be used to inform school and division plans for professional development. Results will be submitted to the Division.

T&D-M&E Form 1: Individual Profile Template

I PERSONAL DATA

Name: (Surname) (First Name) (Middle Name)
Employee Number (If Applicable):
Sex: Male / Female
Date of Birth:
Home Address:
Contact #: e-mail address:
Region: Division: District:
Office/School: Address:
Current Position: Other Designations:
Highest Educational Attainment:

II. WORK EXPERIENCE

(List from most current.)
POSITION | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three
years.

Training Focus | Training attended over last 3 years (✓) | Management Level of Training (Central / Region / Division / Cluster / School)
Curriculum

Resource Materials
Development

Planning

Management

Policy Development

Research

Other, please specify


______________

IV. SIGNIFICANT EXPERIENCES


Identify which of the following areas you consider to be your area(s) of expertise:
School Based Management
Quality Assurance
Monitoring and Evaluation
Access Education
Subject Specialization: _____________
Education Planning
Policy Development
Learning Resource Materials Development
ICT
Delivery of Training
Other, please specify: ________________

Certified Trainer by: NEAP Central / NEAP-Region / TEI / SEAMEO-INNOTECH / Foreign Assisted Projects (FAP) / Other, please specify: _____

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES
Identify which of the following specific areas you consider to be your
area(s) of expertise:

Competency Assessment Program Planning

Program Designing Resource Materials


Development

Program Delivery Program Management

Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given to the foregoing questions is true, complete, and correct to
the best of my knowledge and belief.

Date: Signature:

Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.
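
The profile data feed the T&D Information System (TDIS) database. As an illustration only (not part of the handbook), the sketch below shows how a completed Individual Profile might be captured as a structured record before encoding; all field names and sample values are assumptions drawn from the form above.

    # Hypothetical sketch: an Individual Profile record for encoding into a TDIS-style database.
    # Field names and sample values are illustrative assumptions based on the form above.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class IndividualProfile:
        surname: str
        first_name: str
        sex: str
        region: str
        division: str
        office_or_school: str
        current_position: str
        highest_educational_attainment: str
        areas_of_expertise: List[str] = field(default_factory=list)

    # Sample record (fictitious respondent)
    record = IndividualProfile(
        surname="Dela Cruz", first_name="Juan", sex="Male",
        region="VI", division="Negros Occidental", office_or_school="Sample Elementary School",
        current_position="Teacher III", highest_educational_attainment="BSEd with MA units",
        areas_of_expertise=["Delivery of Training", "ICT"],
    )
    print(record.current_position)  # Teacher III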

NCBTS-M&E Form 1: Teacher’s Profile for NCBTS-TSNA

NCBTS-M&E Form 2: Learning Process Observation and
Facilitation Skills

This form is to be used during the actual delivery of a program. A Process Observer will need to be
assigned to complete the Learning Process Observation for each session. Results should be used to
inform daily debriefing sessions. At the end of this form is a checklist of facilitation skills which may
be observed and recorded.

Session No. _____ Title: ____________________________________________

Time Session Started: ________________ Time Session Ended:____________

Process Observer: ___________________ Designation (M&E Team Member/Trainer)

Phases of the Session | Facilitation Skills Demonstrated | Trainees' Knowledge/Insights/Skills/Values Learned | Comments
Introductory

Activity

Analysis
Abstraction

Application

Concluding
Activity

Observe if the skill has been demonstrated by the Facilitator. If so, put a check in the appropriate
column.

Checklist of Facilitation Skills √


OBSERVING SKILLS
1. noted trainees’ level of involvement in all activities
2. monitored the energy level of the trainees during sessions
3. sensed the needs of the trainees that may affect the learning process
QUESTIONING SKILLS
4. formulated questions in a simple manner
5. asked questions that were clear and focused
6. formulated follow-up questions to trainees' responses appropriately
7. asked Higher Order Thinking Skills (HOTS) questions
8. acknowledged trainees’ responses
9. solicited, accepted and acted on feedback from trainees
10. processed responses with probing questions to elicit the desired training
LISTENING SKILLS
11. listened and understood the meaning of what had been said
12. responded positively to trainees' insights
13. clarified and checked understanding of what was heard
14. reacted to ideas, not to the person
ATTENDING SKILLS
15. created the proper environment based on adult learning principles
16. directed and redirected the trainees to the learning tasks
17. managed the learning atmosphere throughout the sessions
18. acknowledged greetings and responses of trainees
INTEGRATING SKILLS
19. highlighted important results of the activity that led to the attainment of the
objectives of the session
20. deepened and broadened trainees' outlook on the significance of the outputs
ORAL COMMUNICATION SKILLS
21. expressed ideas with clarity, logic and in grammatically correct sentences
22. spoke with a well-modulated voice
23. delivered ideas with confidence and sincerity
SKILL IN USING TRAINING AIDS
24. employed appropriate and updated training aids
25. made training aids that were simple and clear
26. used training aids that were attractive and interesting
27. utilized training aids that were socially, culturally, and gender-fair

NCBTS-M&E Form 3: NCBTS Coordinator’s Checklist

Name of NCBTS Coordinator to be monitored: ____________________________________________

Please assess the competency of the NCBTS Coordinator according to the following indicators by
checking under the appropriate column.

Legend: M-Manifested; NM-Not Manifested


The NCBTS Coordinator demonstrates… M NM Comments
1 Proficiency in the use of MS Word

2 Proficiency in the use of MS Excel

3 Proficiency in the use of MS PowerPoint

4 Confidence in using the e-version of the tool with


minimum assistance
5 Understanding of the scoring guide described in
the manual version of the tool
6 Fluency in communicating ideas

7 Active participation in the workshop

8 Positive attitude towards interacting with the


other participants
9 Leadership potential during group activities

10 Readiness to act as anchor/lead person as required

Name and Signature of the Monitor: _________________________________________________

Date Accomplished: _____________________________

NCBTS-M&E Form 3: NCBTS Coordinator’s Checklist
Consolidation Template

INSTRUCTIONS FOR NCBTS COORDINATOR’S CHECKLIST

Instructions for Administration


A. For the TDNAWG Chair
1. Assign each member of the TDNA-WG to observe how the cluster-level or school-level
orientation is conducted.
2. Distribute this instrument to the members of the TDNA-WG. This instrument may
also be given to Education Supervisors or Public Schools District Supervisors who
may assist in the monitoring of the orientation activities.
3. Brief the members, ES, and PSDS on how to use this instrument.
4. Retrieve all the accomplished instruments.
B. For the TDNAWG Member or ES/PSDS
1. Be familiar with the indicators included in this instrument.
2. Observe how the NCBTS Coordinator conducts the orientation activities.
3. Use the instrument to record the indicators manifested by the NCBTS Coordinator.
4. Submit the accomplished instruments to the TDNAWG Chair.

Scoring and Consolidation


Use the Template that follows to consolidate results. *This can efficiently be done using MS
Excel
NCBTS Coordinator Checklist Consolidation

Columns: Items | Tally (M, NM) | Frequency (M, NM) | Total (M + NM) | Percentage of Manifestation (Total M ÷ Grand Total × 100%)

Rows: Items 1–10. Beneath the item rows, record the Total M and the Grand Total, then compute the overall Percentage of Manifestation (___%).
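
The Percentage of Manifestation in the template is simply the total number of "Manifested" marks divided by the grand total of marks, expressed as a percentage. A minimal sketch of that arithmetic is shown below (illustrative only; the per-item counts are hypothetical, and MS Excel can be used instead, as noted above).

    # Sketch of the consolidation arithmetic for the NCBTS Coordinator's Checklist:
    # Percentage of Manifestation = Total M / Grand Total x 100 (hypothetical counts).
    def percentage_of_manifestation(item_counts):
        """item_counts: list of (M, NM) frequency pairs, one pair per checklist item."""
        total_m = sum(m for m, _ in item_counts)
        grand_total = sum(m + nm for m, nm in item_counts)
        return 100.0 * total_m / grand_total

    # Example: ten items, each observed by five monitors (illustrative figures)
    counts = [(5, 0), (4, 1), (5, 0), (3, 2), (5, 0), (4, 1), (5, 0), (5, 0), (4, 1), (3, 2)]
    print(round(percentage_of_manifestation(counts), 1))  # 86.0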

NCBTS-M&E Form 4: Trainer's Assessment of the NCBTS Orientation Workshop

Trainer’s Name: _________________________________ Sex: Male Female

Please assess the effectiveness of the entire workshop according to the indicators below.
Please refer to the following rating scale:
4-Very High (VH); 3-High (H); 2-Low (L); 1-Very Low (VL)

After the conduct of the Orientation Program by the Team and Rating
considering participants’ outputs I believe that ………. 1 2 3 4

1. the workshop was well planned


2. the workshop objectives were met
3. new information was clearly presented
4. new information was appropriate to participants’ roles and responsibilities
5. the strategies and methods used were interesting and enjoyable for
participants
6. the andragogical (4 As) approach was properly applied
7. training activities moved quickly enough to maintain participants’ interest
8. contribution of all participants, both male and female, were encouraged
9. participants were encouraged to consider how ideas and skills gained during
the training could be incorporated into their own practices
10. handout materials were clear
11. workshop topics were summarised
12. powerpoint presentations supported the flow of sessions
13. the resources provided were appropriate to participants' needs

My contribution to the objectives of the workshop: I …


14. contributed in the preparation for the workshop.
15. effectively delivered what was expected of me in the conduct of the
workshop.
16. gave support needed to the Team.

Please provide your honest response to each of the following questions:

What were the successful aspects of the workshop? Why?

What changes would you like to make to improve similar workshops in the future? Why?

Recommendations

Signature: _________________________________ Date Accomplished: ____________

NCBTS-M&E Form 4: Trainer's Assessment of the NCBTS Orientation Workshop Consolidation Template

INSTRUCTION FOR TRAINER’S ASSESSMENT OF WORKSHOP


I. Instructions for Administration

Give this instrument to the trainers prior to the beginning of the workshop. Brief
the trainer of the content and purpose of the instrument prior to administration.
Consolidate the results based on the accomplished instruments.

II. Scoring and Consolidation -This can efficiently be done using MS Excel.

Columns: Items | Tally (T): VL, L, H, VH | Frequency: (a) VL = T×1, (b) L = T×2, (c) H = T×3, (d) VH = T×4 | (e) = a + b + c + d | (f) = number of responses (VL + L + H + VH) | Mean Rating = e ÷ f

Example: Tally VL = 7, L = 9, H = 4, VH = 5; Frequency (a) 7×1 = 7, (b) 9×2 = 18, (c) 4×3 = 12, (d) 5×4 = 20; (e) 7 + 18 + 12 + 20 = 57; (f) 7 + 9 + 4 + 5 = 25; Mean Rating = 57 ÷ 25 = 2.28

Rows: Items 1–16
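
The mean rating per item follows directly from the worked example above: each tally is weighted by its scale value (VL = 1, L = 2, H = 3, VH = 4), the weighted frequencies are summed (e), and the sum is divided by the number of respondents (f). A minimal sketch of this computation, using the same illustrative tallies, is shown below.

    # Sketch of the mean-rating computation for the Trainer's Assessment consolidation.
    def mean_rating(tally, weights):
        """tally: dict of scale label -> number of respondents; weights: label -> scale value."""
        e = sum(weights[label] * count for label, count in tally.items())  # weighted sum
        f = sum(tally.values())                                            # number of responses
        return e / f

    vl_l_h_vh = {"VL": 1, "L": 2, "H": 3, "VH": 4}
    print(round(mean_rating({"VL": 7, "L": 9, "H": 4, "VH": 5}, vl_l_h_vh), 2))  # 2.28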

NCBTS-M&E Form 5: Trainees’ End of the F3 Program
Assessment

Trainee’s Name (Optional): _________________________ Sex: Male Female

Program Title: ________________________ Date: __________________

Direction: Please assess the effectiveness of the entire F3 component of the program according to the indicators below. Please refer to the following rating scale:

4-Strongly Agree (SA); 3-Agree (A); 2-Disagree (D); 1-Strongly Disagree (SD)

After the conduct of the F3 component of the program, I believe that … (Rating: 1 = SD, 2 = D, 3 = A, 4 = SA)
A Program Planning/Management/Preparation
1 the training program was delivered as planned
2 the training program was managed efficiently
3 the training program was well-structured
B Attainment of Objectives
4 the program objectives were clearly presented
5 the session objectives were logically arranged
6 the program and session objectives were attained
C Delivery of Program Content
7 program content was appropriate to trainees’ roles and responsibilities
8 content delivered was based on authoritative and reliable sources
9 new learning was clearly presented
10 the session activities were effective in generating learning
11 adult learning methodologies were used effectively
12 management of learning was effectively structured e.g. portfolio,
synthesis of previous learning, etc.
D Trainees’ Learning
13 trainees were encouraged to consider how ideas and skills gained
during the training could be incorporated into their own practices
14 contribution of all trainees, both male and female, were encouraged
15 trainees demonstrated a clear understanding of the content delivered
E Trainers’ Conduct of Sessions
16 the trainers’ competencies were evident in the conduct of the sessions
17 teamwork among the trainers and staff was manifested
18 trainers established a positive learning environment
19 training activities moved quickly enough to maintain trainees’ interest
F Provision of Support Materials
20 training materials were clear and useful
21 powerpoint presentations supported the flow of the sessions
22 the resources provided were appropriate to trainees’ needs
G Program Management Team
23 Program Management Team members were courteous
24 Program Management Team was efficient
25 Program Management Team was responsive to the needs of trainees
H Venue and Accommodation
26 the venue was well lighted and ventilated
27 the venue was comfortable with sufficient space for program activities

28 the venue had sanitary and hygienic conditions
29 meals were nutritious and sufficient in quantity and quality
30 the accommodation was comfortable with sanitary and hygienic
conditions
I Overall
31 I have the knowledge and skills to apply the new learning
32 I have the confidence to implement the JEL contract

Please provide your honest response to each of the following questions:

What do you consider your most significant learning from the program?

What changes would you suggest to improve similar programs in the future?

Briefly describe what you have learned and how it will help you with your work.

What further recommendations do you have?

NCBTS-M&E Form 5: Trainees’ End of the F3 Program


Assessment Consolidation Template

Collate the accomplished NCBTS-M&E Form 5: Trainees' End of the F3 Program Assessment forms, and
review the results. Use the table below to consolidate the results for the quantitative items.

Note: The scoring and consolidation can be efficiently done using MS Excel.

Use the scale below to interpret mean rating for each item of the assessment:
3.5 to 4.0 = (SA) Strongly Agree
2.5 to 3.4 = (A) Agree
1.5 to 2.4 = (D) Disagree
1.0 to 1.4 = (SD) Strongly Disagree
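
As a minimal sketch (illustrative only, not part of the handbook), the interpretation bands above can be applied to a computed mean item rating as follows.

    # Sketch: map a mean item rating to the interpretation scale above (SD/D/A/SA bands).
    def interpret_mean(mean):
        if mean >= 3.5:
            return "SA (Strongly Agree)"
        if mean >= 2.5:
            return "A (Agree)"
        if mean >= 1.5:
            return "D (Disagree)"
        return "SD (Strongly Disagree)"

    print(interpret_mean(3.2))  # A (Agree)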

Qualitative results should also be summarized below.

Columns: Items | Tally (T): SD, D, A, SA | Frequency: (a) SD = T×1, (b) D = T×2, (c) A = T×3, (d) SA = T×4 | (e) = a + b + c + d | (f) = number of responses (SD + D + A + SA) | Mean Rating = e ÷ f

Example: Tally SD = 0, D = 0, A = 8, SA = 7; Frequency (a) 0×1 = 0, (b) 0×2 = 0, (c) 8×3 = 24, (d) 7×4 = 28; (e) 0 + 0 + 24 + 28 = 52; (f) 0 + 0 + 8 + 7 = 15; Mean Rating = 52 ÷ 15 = 3.47
A Program Planning/Management/Preparation
1
2
3
B Attainment of Objectives
4
5
6

C Delivery of Program Content


7
8
9
10
11

12

D Trainees’ Learning

13

14

15

E Trainers' Conduct of Sessions

16

17

18

19

F Provision of Support Materials

20

21

22

G Program Management Team

23

24

25

H Venue and Accommodation

26

27

28

29

30

I Overall

31

32

Summary of Qualitative Responses

What do you consider your most significant learning from the program?









What changes would you suggest to improve similar programs in the future?











Briefly describe what you have learned and how it will help you with your work.










What further recommendations do you have?










NCBTS-M&E Form 6: Documentation Tool for the Conduct of Division, Cluster or School
Level NCBTS-TSNA Implementation

This form is to be used to support Regional monitoring of the NCBTS-TSNA process at the Division
level and Division monitoring of district and school level activities. It is expected that the assessment
will be based on observations, discussions with the implementing team and review of relevant
documents.

NCBTS-M&E Form 7: School’s NCBTS-TSNA Consolidation
Template
Division/District/School _________________________ Date: __________________

Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way

Use the scale above to assess the extent to which the conduct of TDNA documentation adhered to
the following:

To what extent …….. 1 2 3 4


1. was thorough planning conducted prior to the NCBTS-TSNA orientation
workshop?
2. were participants oriented to the NCBTS?
3. was the purpose of the NCBTS-TSNA explained?
4. was a clear explanation provided on how to accomplish the NCBTS-TSNA tools
e.g. manual and/or e-version
5. was the scoring system for the NCBTS-TSNA tool explained?
6. were the steps involved in developing an Individual Summary TSNA results
explained?
7. were the steps involved in consolidating TSNA results explained?
8. was an explanation on how to interpret individual and consolidated results
provided?
9. was technical assistance provided when required?
10. were the M&E tools and processes implemented?
11. was there evidence of teamwork and collaboration amongst the NCBTS
Implementers?
12. were recommendations for improving the NCBTS-TSNA Orientation and
Administration processes identified?

Recommendations:

Name: ___________________________________

Designation: _________________________________

Date: ____________________________________

Name of School: ____________________________Division _______________________

School NCBTS-TSNA Results

Columns: Domain/Strand No. | Teachers' Percentage Scores (T1, T2, T3, T4, …) | Total | Average Percentage (Total ÷ Number of Teachers)

Rows: 1.1, 1.2, Total Domain 1; 2.1, 2.2, 2.3, 2.4, 2.5, Total Domain 2; 3.1, Total Domain 3; 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 4.7, Total Domain 4; 5.1, 5.2, 5.3, 5.4, Total Domain 5; 6.1, Total Domain 6; 7.1, 7.2, 7.3, Total Domain 7
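
In the template above, the average percentage for each domain/strand is the total of the teachers' percentage scores divided by the number of teachers. The sketch below illustrates that consolidation step; the strand labels and scores are hypothetical placeholders.

    # Sketch of the School NCBTS-TSNA consolidation: average percentage per domain/strand.
    def average_percentage(scores_by_strand):
        """scores_by_strand: dict of strand label -> list of teachers' percentage scores."""
        return {strand: sum(scores) / len(scores) for strand, scores in scores_by_strand.items()}

    sample = {"1.1": [75.0, 80.0, 70.0], "1.2": [60.0, 65.0, 70.0]}  # three teachers, two strands
    print(average_percentage(sample))  # {'1.1': 75.0, '1.2': 65.0}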

School Head _________________________________________

NCBTS Coordinator: __________________________________

2.3. M&E for the TDNASH

The following M&E tools are available to support the Training and Development Needs
Assessment for School Heads (TDNASH) process:

Tools for TDNASH:


T&D-M&E Form 1: Individual Profile Template
TDNASH-M&E Form 1: Division M&E of Conduct of TDNASH
TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template
TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH

A: Training and Development Needs Assessment (TDNA) System


A2. TDNASH
System Levels: M&E Tools for Regional Level | M&E Tools for Division/Cluster Level | M&E Tools for School Level

Output
- TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template

Process
Regional Level:
- TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH
Division/Cluster Level:
- TDNASH-M&E Form 1: Division Monitoring and Evaluation Tool of the Conduct of TDNASH

Input
- T&D-M&E Form 1: Individual Profile Template (for SH, ES/PSDS)
- Checklist of Available Resources for TDNASH incorporated into the TDNASH Guide

The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of
the TDNASH process:

What will be monitored: School Heads' details in relation to their current position, their level of experience and qualification
How it will be monitored: All School Heads will be asked to complete the profile
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: Prior to the accomplishment of the TDNASH Tool
How will the results be used: Results will be entered into the TDIS database along with their corresponding TDNASH results.

What will be monitored: The implementation of the TDNASH process at the school/cluster level
How it will be monitored: Members of the Division TDNA-WG will be asked to observe the conduct of the TDNASH at the school/cluster level and complete the tool
M&E tool to be used: TDNASH-M&E Form 1: Division Monitoring and Evaluation of the Conduct of TDNASH
Who will be responsible for the monitoring: Division TDNA-WG
When will the monitoring take place: During the accomplishment of the TDNASH process and the consolidation of results
How will the results be used: Results will be collated and analyzed by the TDNA-WG and used to inform future TDNASH processes.

What will be monitored: The training and development needs of the School Heads
How it will be monitored: The PSDS/ES will be asked to consolidate the School Head TDNASH results for a cluster of schools
M&E tool to be used: TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template
Who will be responsible for the monitoring: TDNA-WG
When will the monitoring take place: After the accomplishment of the TDNASH by a cluster of School Heads
How will the results be used: Results will be analyzed and used to inform the Division on the training and development needs of School Heads. Results will be incorporated into the MPPD and DEDP.

What will be monitored: The implementation of the TDNASH at the Division level
How it will be monitored: A Process Observer will be identified and asked to complete the tool
M&E tool to be used: TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH
Who will be responsible for the monitoring: Region TDNA-WG
When will the monitoring take place: During the TDNASH process at the Division level
How will the results be used: Results will be discussed with the Division to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.

T&D-M&E Form 1: Individual Profile Template

I PERSONAL DATA
Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable):


Sex: Male Female
Date of Birth:
Home Address:
Contact #: e-mail address:
Region: Division: District:
Office/School: Address:
Current Position: Other Designations:
Highest Educational Attainment:

II. WORK EXPERIENCE


(List from most current.)
POSITION | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three
years.

Training Focus | Training attended over last 3 years (✓) | Management Level of Training (Central / Region / Division / Cluster / School)

Curriculum

Resource Materials
Development

Planning

Management

Policy Development

Research

Other, please specify


______________

IV. SIGNIFICANT EXPERIENCES


Identify which of the following areas you consider to be your area(s) of expertise:
School Based Management
Quality Assurance
Monitoring and Evaluation
Access Education
Subject Specialization: _____________
Education Planning
Policy Development
Learning Resource Materials Development
ICT
Delivery of Training
Other, please specify: ________________

Certified Trainer by: NEAP Central / NEAP-Region / TEI / SEAMEO-INNOTECH / Foreign Assisted Projects (FAP) / Other, please specify: _____

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES


Identify which of the following specific areas you consider to be your
area(s) of expertise:

Competency Assessment Program Planning

Program Designing Resource Materials


Development

Program Delivery Program Management

Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given to the foregoing questions is true, complete, and correct to
the best of my knowledge and belief.

Date: Signature:

Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.

TDNASH-M&E Form 1: Division Monitoring and Evaluation of the Conduct of TDNASH

TDNASH Administrator Monitored: ____________________ Dates: ________________

Activity Monitored:
School Head completion of TDNASH
Supervisor
Teachers focus group discussion


Please consider the administration of the TDNASH in line with the indicators listed below. Please check (✔ ) the
appropriate column to indicate your level of agreement for each of the statements.

Strongly Agree | Agree | Disagree | Strongly Disagree
Administration of the TDNASH
Materials for conducting the TDNASH were organized and
prepared in advance
Respondents were informed of the purpose for conducting the
TDNASH
Clear instructions were provided on the process to be followed
Clarification was provided by the TDNASH Administrator when
necessary
Answer Sheets were collected and checked to ensure
respondent information is complete and answers have been
provided for all indicators/competencies
Results Analysis
Results have been accurately analysed to identify the School
Leadership Experience Levels and the Level of Importance
Results have been accurately consolidated for individual School
Heads
Results have been accurately consolidated for clusters of
School Heads
Results have been submitted to the School Head, Cluster Lead
School Head and TDNA-WG Chair in a timely manner
General Comments

Recommendations to improve processes

TDNASH-M&E Form 2: TDNASH Consolidated Cluster Results Template

PART I. Cluster School Heads Identification

Cluster Name: ____________________________

Division _______________________________________

No. School Head Name School

SH 1

SH 2

SH 3

SH 4

SH 5

SH 6

SH 7

SH 8

SH 9

SH 10

SH 11

SH 12

SH 13

SH 14

SH 15

PART II. TDNASH Cluster Summary Sheet for School Leadership Experience Level (SLEL)

Cluster Name: _______________________

Columns: Domains (D)/Competencies | School Heads' SLEL Overall Rating obtained from the triangulation data (SH1 … SH15) | Cluster SLEL Ave. Ratings | Overall Level Equivalent

Rows (competencies by domain):
D1: 1.1, 1.2, 1.3, 1.4, 1.5, 1.6
D2: 2.1a, 2.1b, 2.2a, 2.2b, 2.2c, 2.3
D3: 3.1a, 3.1b, 3.2a, 3.2b
D4: 4.1a, 4.1b, 4.1c, 4.1d, 4.2, 4.3, 4.4
D5: 5.1a, 5.1b, 5.2
D6: 6.1, 6.2a, 6.2b, 6.3
D7: 7.1, 7.2a, 7.2b, 7.2c, 7.3, 7.4
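
The Cluster SLEL average for each competency can be taken as the mean of the School Heads' overall ratings obtained from the triangulation data. The sketch below (illustrative only) shows that arithmetic with hypothetical competency labels and ratings.

    # Sketch of the Cluster SLEL averaging: mean overall rating per competency across School Heads.
    def cluster_averages(ratings_by_competency):
        """ratings_by_competency: dict of competency (e.g. '1.1') -> list of SH overall ratings."""
        return {c: round(sum(r) / len(r), 2) for c, r in ratings_by_competency.items()}

    sample = {"1.1": [2, 3, 3, 4, 2], "1.2": [3, 3, 4, 4, 3]}  # five School Heads in the cluster
    print(cluster_averages(sample))  # {'1.1': 2.8, '1.2': 3.4}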

PART III. TDNASH Cluster Summary Sheet for Level of Importance (LOI)

Cluster Name: ________________________

Columns: Domains (D)/Competencies | School Heads' LOI Overall Rating obtained from the triangulation data (SH1 … SH15) | Cluster LOI Ave. Ratings

Rows (competencies by domain):
D1: 1.1, 1.2, 1.3, 1.4, 1.5, 1.6
D2: 2.1a, 2.1b, 2.2a, 2.2b, 2.2c, 2.3
D3: 3.1a, 3.1b, 3.2a, 3.2b
D4: 4.1a, 4.1b, 4.1c, 4.1d, 4.2, 4.3
D5: 5.1a, 5.1b, 5.2
D6: 6.1, 6.2a, 6.2b, 6.3
D7: 7.1, 7.2a, 7.2b, 7.2c, 7.3, 7.4

TDNASH-M&E Form 3: Documentation Tool for Division Implementation of TDNASH
This form is to be used to support Regional monitoring of the TDNASH process at the Division level
and Division monitoring of district and school level activities. It is expected that the assessment will be
based on observations, discussions with the implementing team and review of relevant documents.
Division/District _________________________ Date: __________________

Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way
Use the scale above to evaluate the extent to which the conduct of TDNASH documentation adhered
to the following:
To what extent …….. 1 2 3 4
1. was thorough planning conducted prior to the TDNASH orientation?
2. were participants oriented to the competencies expected of a School Head?
3. was the purpose of the TDNASH explained?
4. Was the triangular process used for the TDNASH explained e.g. three different
respondents, group consensual assessment technique
5. was a clear explanation provided on how to accomplish the TDNASH tools e.g.
manual and/or e-version
6. was the scoring system for the TDNASH tool explained e.g. continuum of
indicators for each competency matched to school leadership experience
levels?
7. were the steps involved in consolidating the triangulation results for an
individual school head explained?
8. were the steps involved in consolidating TDNASH results for a group of school
heads explained?
9. was an explanation on how to interpret individual and consolidated results
provided?
10. was technical assistance provided when required?
11. were the M&E tools and processes implemented?
12. was there evidence of teamwork and collaboration amongst the TDNASH
Implementers?
13. were recommendations for improving the TDNASH administration processes
identified?
Recommendations:

Name:______________________________________
Designation: _________________________________
Date: _______________________________________

2.4. M&E of the Organizational TDNA for Region and Division

The following M&E tools are available to support the conduct of the Organizational TDNA:

T&D-M&E Form 1: Individual Profile Template


Org’l TDNA-M&E Form 1: Organizational TDNA Tool for the Focus Group Discussion
(FGD) Process at the Region/Division Level
Org’l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template
Org’l TDNA-M&E Form 2b: Region Organizational TDNA Scores Summary Template
Org’l TDNA-M&E Form 3: Functional Divisions/Units Organizational TDNA
Prioritization Template
Org’l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation
Template
Org’l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational
TDNA

A: Training and Development Needs Assessment (TDNA) System


A3. Organizational TDNA
System Levels: M&E Tools for Regional Level | M&E Tools for Division/Cluster Level | M&E Tools for School Level

Output
Regional Level:
- Org'l TDNA-M&E Form 2b: Region Organizational TDNA Scores Summary Template
- Org'l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template
- Org'l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template
- Org'l TDNA-M&E Form 5: Documentation Review of Region/Division Organizational TDNA
Division/Cluster Level:
- Org'l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template
- Org'l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template
- Org'l TDNA-M&E Form 5: Documentation Review of Region/Division Organizational TDNA
School Level: Organizational TDNA not conducted at School Level

Process
Regional Level:
- Org'l TDNA-M&E Form 1: Organizational TDNA Tool for the FGD Process at the Region/Division Level
Division/Cluster Level:
- Org'l TDNA-M&E Form 1: Organizational TDNA Tool for the FGD Process at the Region/Division Level
School Level: Organizational TDNA not conducted at School Level

Input
Regional Level:
- T&D-M&E Form 1: Individual Profile Template
- Checklist of Available Resources for Organizational TDNA (Region/Division Level) incorporated into the Organizational TDNA Guide
Division/Cluster Level:
- T&D-M&E Form 1: Individual Profile Template
- Checklist of Available Resources for Organizational TDNA (Region/Division Level) incorporated into the Organizational TDNA Guide
School Level: Organizational TDNA not conducted at School Level

The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of
the Organizational TDNA process:

1. What will be monitored: Respondents' details in relation to their current position, their level of experience and qualification.
   How it will be monitored: All participants in the Organizational TDNA will be asked to complete the profile.
   M&E tool to be used: T&D-M&E Form 1: Individual Profile Template.
   Who will be responsible for the monitoring: Division and Region TDNA-WG.
   When the monitoring will take place: Prior to the accomplishment of the Organizational TDNA Tool.
   How the results will be used: Information will be entered into the TDIS database.

2. What will be monitored: The processes followed during the conduct of the Focus Group Discussion (FGD) at the Region/Division level.
   How it will be monitored: A process observer will be appointed and will use the tool.
   M&E tool to be used: Org'l TDNA-M&E Form 1: Organizational TDNA Tool for FGD Process at the Region/Division level.
   Who will be responsible for the monitoring: Division and Region TDNA-WG.
   When the monitoring will take place: During the conduct of the FGD for the Organizational TDNA.
   How the results will be used: Results will be shared with the FGD facilitators to identify best practices and areas for improvement. Recommendations for improving the process will be included in the Program Completion Report to inform future processes.

3. What will be monitored: The level of competency and the level of importance of the Division/Region for the various management competencies across Service Areas.
   How it will be monitored: Results of the Organizational TDNA will be consolidated using the templates provided.
   M&E tools to be used: Org'l TDNA-M&E Form 2a: Division Organizational TDNA Scores Summary Template; Org'l TDNA-M&E Form 2b: Regional Organizational TDNA Scores Summary Template.
   Who will be responsible for the monitoring: Division and Region TDNA-WG.
   When the monitoring will take place: Following the accomplishment of the Division/Region Organizational TDNA.
   How the results will be used: Results will inform decisions on the training and development programs offered at the division level and will be incorporated into both the DEDP and REDP.

4. What will be monitored: The Organizational TDNA of the functional divisions/sections/units.
   How it will be monitored: Results of the Organizational TDNA will be consolidated for each functional division/section/unit using the template provided.
   M&E tool to be used: Org'l TDNA-M&E Form 3: Functional Divisions/Sections/Units Organizational TDNA Prioritization Template.
   Who will be responsible for the monitoring: Division and Region TDNA-WG.
   When the monitoring will take place: Following the accomplishment of the Organizational TDNA.
   How the results will be used: Results will inform decisions on the training and development programs offered at the division/regional level and will be incorporated into both the DEDP and REDP.

5. What will be monitored: The Organizational TDNA results of the various divisions across a region.
   How it will be monitored: Results of the Organizational TDNA will be consolidated for all divisions within a region using the template provided.
   M&E tool to be used: Org'l TDNA-M&E Form 4: Organizational TDNA Schools Division Consolidation Template.
   Who will be responsible for the monitoring: Region TDNA-WG.
   When the monitoring will take place: Following the submission of Division Organizational TDNA results.
   How the results will be used: Results will inform decisions on the training and development programs offered at the region level and will be incorporated into the REDP. The results will also be analyzed to inform future TDNA policy.

6. What will be monitored: The implementation of the Organizational TDNA at the Division/Region levels.
   How it will be monitored: A Process Observer will be identified and asked to complete the tool.
   M&E tool to be used: Org'l TDNA-M&E Form 5: Documentation Review of Division/Region Organizational TDNA.
   Who will be responsible for the monitoring: Division and Region TDNA-WG.
   When the monitoring will take place: During the conduct of the Organizational TDNA at the Division/Region level.
   How the results will be used: Results will be discussed with the Division/Region to identify strengths and areas for improvement. Observations will be collated by the TDNA-WG and the results analyzed to inform future TDNA policy.

T&D-M&E Form 1: Individual Profile Template

I PERSONAL DATA
Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable):


Sex: Male Female
Date of Birth:
Home Address:
Contact #: e-mail address:
Region: Division: District:
Office/School: Address:
Current Other
Position: Designations:
Highest Educational Attainment:

II. WORK EXPERIENCE


(List from most current.)

POSITION | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three
years.

Training Focus | Training attended over the last 3 years (✓) | Management Level of Training: Central / Region / Division / Cluster / School
Curriculum

Resource Materials
Development

Planning

Management

Policy Development

Research

Other, please specify


______________

IV. SIGNIFICANT EXPERIENCES

Identify which of the following areas you consider to be your area(s) of expertise:
School Based Management
Quality Assurance
Monitoring and Evaluation
Access Education
Subject Specialization: _____________
Education Planning
Policy Development
Learning Resource Materials Development
ICT
Delivery of Training
Other, please specify: ________________

Certified Trainers by: NEAP Central / NEAP-Region / TEI / SEAMEO-INNOTECH / Foreign Assisted Projects (FAP) / Other, please specify: ____________

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES


Identify which of the following specific areas you consider to be your
area(s) of expertise:

Competency Assessment Program Planning

Program Designing Resource Materials


Development

Program Delivery Program Management

Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in response to the foregoing questions is true, complete, and correct to
the best of my knowledge and belief.

Date: Signature:

Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.

Org’l TDNA-M & E Form 1: Organizational TDNA Tool for Focus
Group Discussion (FGD) Process at the Region/Division Level
_______ FGD Flow of Regional TDNA Self Assessment
_______ FGD Flow of Division TDNA Self Assessment
_______ Monitoring of Division Organization TDNA by Regional Team
Please check (✔) under the manifested (M) column if the process was manifested and under
the not manifested (NM) column if the process was not manifested. Please indicate any
variations noted and include any additional comments regarding the facilitation of the
session.
ACTIVITY | M | NM | Variations/Comments
1. Facilitator emphasizes to the participants that as key
respondents to this Organizational TDNA of the
Region/Division, their answers and collaboration with their
colleagues to reach a consensual assessment will be most
helpful for the future development of the management
competencies of the region.
2. Facilitator clearly presents the purpose of the
Organizational TDNA
3. Brief description of the data gathering method (FGD) was
provided by the facilitator
4. Facilitator comprehensively explained the rating scale
5. Systematic walk through of the ‘Management
Competencies per Service Area’ one service area at a
time was carried out.
6. Each section collaboratively reached a consensus on the
level of importance and level of competencies for each
service area
7. Section ratings were recorded properly
8. Each participant provided individual perception on
perceived importance of the competency in the
performance of the region’s/division’s task / job’ and shared
these with the group.
9. TDNA-WG members efficiently performed their assigned
task.
10. Participants carefully deliberated on the average level of
importance and competency ratings
11. A consolidation of the Organizational TDNA results
following the guidelines outlined in the FGD flow was
accomplished.
12. An M&E committee was tasked to monitor and evaluate the preparation, conduct and
consolidation of the TDNA results.
Total

Name and Signature of the Process Observer: ___________________________________

Date: ____________________________

Org'l TDNA-M & E Form 2a
Division Organizational TDNA Scores Summary Template

DIVISION ___________________

For each competency below, record the Level of Competency (LOC) and the Level of Importance (LOI), each with its Raw Scores (Self-Assessment and Region) and its Weighted Rating (Self-Assessment 60%, Region 40%, and Total).

GENERAL COMPETENCIES ACROSS UNITS (CO/RO/DO)
1. Understanding DepED as an Organization
2. Understanding RA 9155 or the Governance of Basic Education Act
3. Management of Change
4. Organization Analysis/Diagnosis
5. Identifying and Solving Problems
6. Decision-Making
7. Dealing Effectively with Pressure Groups
8. Conflict Management
9. Negotiation Skills
10. Transformational and Enabling Leadership

SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
11. Strategic Planning
12. Implementation Planning
13. Project/Program Identification
14. Resource Mobilization and Allocation
15. Financial Management and Control
16. Group Process Management
17. Facilitation Skills
18. Communication Skills
19. Advocacy

SERVICE AREA 2: LEARNING OUTCOME MANAGEMENT (DO)
20. Understanding of the Revitalized Basic Education Curriculum
21. Curriculum Review
22. Curriculum Implementation Planning (Indigenized Curriculum and Instructional Materials)
23. Instructional Materials Development
24. Instructional Supervision and Management
25. Student/Pupil Assessment/Testing
26. Intervention Programming
27. Education Programs/Project Management
28. Tracking Student Progress
29. Quality Management
30. Staff Development
31. Coaching and Mentoring

SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
32. Monitoring and Evaluation Design and Development
33. Instrument/Tools Development for M&E Data Gathering
34. Data Processing, Analysis and Utilization
35. Communication Skills/Feedback Giving
36. Education Management Information System (EMIS)

SERVICE AREA 4: EDUCATION ADMINISTRATION & MANAGEMENT (CO/RO/DO)
37. Resource Mobilization and Management
38. Resource Procurement and Management
39. Building Partnerships
40. Human Resource Management
41. Delegation
42. Physical Facilities Programming
43. Records Management
44. Understanding the Intent of the Policy and Implementation
Org'l TDNA-M & E Form 2b
Region Organizational TDNA Scores Summary Template
(Region & Division)

REGION ______

For each competency below, record the Level of Competency (LOC) and the Level of Importance (LOI), each with its Raw Scores (Self-Assessment and Division) and its Weighted Rating (Self-Assessment 60%, Division 40%, and Total).

GENERAL COMPETENCIES ACROSS UNITS (CO/RO/DO)
1. Understanding DepED as an Organization
2. Understanding RA 9155 or the Governance of Basic Education Act
3. Management of Change
4. Organization Analysis/Diagnosis
5. Identifying and Solving Problems
6. Decision-Making
7. Dealing Effectively with Pressure Groups
8. Conflict Management
9. Negotiation Skills
10. Transformational and Enabling Leadership

SERVICE AREA 1: EDUCATIONAL PLANNING (DO/RO)
11. Strategic Planning
12. Implementation Planning
13. Project/Program Identification
14. Resource Mobilization and Allocation
15. Financial Management and Control
16. Group Process Management
17. Facilitation Skills
18. Communication Skills
19. Advocacy

SERVICE AREA 3: MONITORING AND EVALUATION (DO/RO)
20. Monitoring and Evaluation Design and Development
21. Instrument/Tools Development for M&E Data Gathering
22. Data Processing, Analysis and Utilization
23. Communication Skills/Feedback Giving
24. Education Management Information System (EMIS)

SERVICE AREA 4: EDUCATION ADMINISTRATION AND MANAGEMENT (CO/RO/DO)
25. Resource Mobilization and Management
26. Resource Procurement and Management
27. Building Partnerships
28. Human Resource Management
29. Delegation
30. Physical Facilities Programming
31. Records Management
32. Understanding the Intent of the Policy and Implementation

SERVICE AREA 5: POLICY FORMULATION AND STANDARD SETTING (RO/CO)
33. Policy Framework Development
34. Policy Instrument Development
35. Policy Formulation
36. Policy Review
37. Standard Setting
38. Technical Writing
39. Advocacy for Policy Formulation/Implementation

SERVICE AREA 6: CURRICULUM DEVELOPMENT (CO/RO)
40. Knowledge on the Technical Vocabulary of Curriculum Engineering
41. Understanding of the Foundations of the Curriculum
42. Application of the Foundations of the Curriculum in Curriculum Engineering
43. Curriculum Designing
44. Curriculum Structuring
45. Implementation of Various Curriculum Models
46. Curriculum Evaluation

NOTE: The lower the numerical value of the LOC and LOI, the greater is the need for
training.
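Where the TDNA-WG consolidates the scores electronically rather than on the printed template, the short sketch below illustrates the arithmetic behind Forms 2a and 2b: a weighted rating of 60% self-assessment and 40% Region/Division assessment for each competency, with lower values signalling greater training need as stated in the NOTE above. The Python function name, competency names and sample scores are illustrative only and are not part of the prescribed forms.

# Illustrative sketch only: the weighted LOC/LOI rating used in Org'l TDNA-M&E
# Forms 2a and 2b (60% self-assessment, 40% Region/Division assessment).
# Competency names and raw scores below are hypothetical sample data.

def weighted_rating(self_assessment: float, external: float) -> float:
    """Combine the two raw scores (scale 1-4) into one weighted rating."""
    return round(0.6 * self_assessment + 0.4 * external, 2)

# (self-assessment, external assessment) raw scores for LOC and LOI.
sample_scores = {
    "Strategic Planning": {"LOC": (2, 3), "LOI": (4, 3)},
    "Records Management": {"LOC": (1, 2), "LOI": (2, 2)},
}

for competency, raw in sample_scores.items():
    loc = weighted_rating(*raw["LOC"])
    loi = weighted_rating(*raw["LOI"])
    # Per the NOTE above, the lower the LOC and LOI, the greater the training need.
    print(f"{competency}: weighted LOC = {loc}, weighted LOI = {loi}")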

Org’l TDNA-M & E Form 3: Functional Divisions/Sections/Units
Organizational - TDNA Prioritization Template
(To be accomplished by the TDNA –WG)
LEVEL OF PLAN: REGION DIVISION DATE Accomplished: ______________________

Supply the following data: 1) name of functional divisions/sections/units, and 2) numerical rating of LOI and LOC for each service area of each
division/section/unit.

Name of Functional Divisions/Sections/Units


Competencies /
Service Areas LOI LOC LOI LOC LOI LOC LOI LOC LOI LOC LOI LOC LOI LOC
General Competencies
Service Area 1:
Educational Planning
Service Area 2:
Learning Outcome
Management
Service Area 3:
Monitoring&
Evaluation
Service Area 4:
Education
Administration &
Management
Service Area 5: Policy
Formulation and
Standard Setting
Service Area 6:
Curriculum
Development

Priority 1
Priority 2
Priority 3
Org’l TDNA-M & E Form 4: Organizational - TDNA Schools Division Consolidation Template

(To be accomplished by the Regional TDNA-WG)


REGION ___________________________ DATE Accomplished: ______________________

NOTE: For Regions with more than seven (7) Schools Divisions, additional columns may be added. Only the general average of each service
area is entered.

Names of Divisions within Region


Competencies / Service Areas Division 1: Division 2: Division 3: Division 4: Division 5: Division 6: Division 7:
____________ __________ ____________ ___________ ___________ ____________ ___________
__ _ _ _
LOI LOC LOI LOC LOI LOC LOI LOC LOI LOC LOI LOC LOI LOC
General Competencies
Service Area 1: Educational
Planning
Service Area 2: Learning Outcome
Management
Service Area 3: Monitoring&
Evaluation
Service Area 4: Education
Administration & Management
Service Area 5: Policy Formulation
and Standard Setting
Service Area 6: Curriculum
Development

Priority 1
Priority 2
Priority 3
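For Regional TDNA-WGs that consolidate the Division submissions in a spreadsheet or script rather than by hand, the sketch below shows one possible reading of Form 4, assuming each Division reports a single general average LOC per service area and that the three priority service areas are those with the lowest region-wide averages (lower values indicating greater training need, as noted under Form 2b). Division names and figures are hypothetical.

# Minimal sketch of the Form 4 consolidation, assuming each Division reports one
# general average LOC per service area (scale 1-4). All names and figures are
# hypothetical sample data, not actual TDNA results.

from statistics import mean

division_loc = {
    "Division 1": {"Educational Planning": 2.1, "Monitoring & Evaluation": 1.8,
                   "Curriculum Development": 2.9},
    "Division 2": {"Educational Planning": 2.6, "Monitoring & Evaluation": 2.0,
                   "Curriculum Development": 3.1},
}

# Region-wide average LOC per service area across all Divisions.
service_areas = sorted({area for scores in division_loc.values() for area in scores})
region_loc = {area: mean(scores[area] for scores in division_loc.values())
              for area in service_areas}

# Lower average LOC = greater training need, so rank ascending and take the top three.
for rank, area in enumerate(sorted(region_loc, key=region_loc.get)[:3], start=1):
    print(f"Priority {rank}: {area} (regional average LOC {region_loc[area]:.2f})")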

Org’l TDNA-M&E Form 5: Documentation Review
of Organizational TDNA Region/Division level
This form is to be used to support Regional monitoring of the Organizational TDNA processes at the Division level. It is expected that the assessment will be
based on observations, discussions with the implementing team and review of relevant documents.

Division/Region _________________________ Date: __________________

Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way

Use the scale above to assess the extent to which the conduct and documentation of the Organizational TDNA
adhered to the following:
To what extent …….. 1 2 3 4
1. was thorough planning conducted prior to administration?
2. was the purpose of the Organizational TDNA explained?
3. was the data collection method to be followed for administering the
Organizational TDNA explained e.g. group consensual assessment technique,
self assessment and an external assessment?
4. were participants oriented to the Organizational Management Competencies for
each service area?
5. was a clear explanation provided on how to accomplish the Organizational
TDNA process, e.g. consensus agreement within each division/unit regarding
level of competence and level of importance, agreement across divisions/units,
scoring system?
6. were the steps involved in consolidating the results for individual divisions/units
as well as the overall region/division explained?
7. were the steps involved in consolidating the self assessment and the external
assessment explained?
8. was an explanation provided on how to interpret results to identify priority training
needs?
9. was technical assistance provided when required?
10. were the M&E tools and processes implemented?
11. was there evidence of teamwork and collaboration amongst the Organizational
TDNA implementers?
12. were recommendations for improving the Organizational TDNA administration
processes identified?

Recommendations:

Name: ___________________________________
Position: _________________________________
Date: ____________________________________

Section 3.0: Training & Development System Monitoring and Evaluation for
the Professional Development Planning (PDP) System

3.1 M&E for the Professional Development Planning (PDP) System

The Professional Development Planning System has three major planning components: the Individual Plan for
Professional Development (IPPD) for teachers and school heads, the School Plan for Professional Development
(SPPD), and the Master Plan for Professional Development (MPPD) at both the region and division levels.

The diagram below shows the Quality Assurance (QA) and M&E scheme for the Professional Development
Planning System at the Division and Regional levels. At both levels, the T&D Office is tasked to prepare the PDP-
WG members who are assigned to monitor the professional development planning conducted by clusters of
schools and at the level of individual school implementation using M&E tools. The System's compliance with
standards, particularly the development and quality of the SPPDs and MPPDs, is quality-assured by the PDP-WG.

The M&E Report accomplished by the Division PDP-WG is submitted to the Division T&D Chair who in turn
reviews the report with the T&D Division Team. The T&D Office has the responsibility to inform monitored
schools of the significant findings related to the professional development plans. The Report is also used as the
basis for necessary adjustments to the system, if any.

The same process is followed at the Regional level for the completion of professional development plans. The
Regional T&D Office convenes the PDP-WG who is tasked to monitor the program planning in terms of quality
and processes followed and reports its findings to the Regional T&D Chief. Following any necessary
adjustments, new standards and guidelines are sent to the divisions and schools.
[Diagram 2.4: QA-M&E for Professional Development Planning. The flow shows the Division/Region T&D Office preparing the PDP-WG for M&E and QA of the Professional Development Planning System at the school, district and division/region levels; the PDP-WG monitoring the completion of the IPPD, the SPPD (Division) and the MPPD (Division/Region) and reviewing the products' compliance to standards; recording M&E results in the TDIS database; preparing the M&E/QA report; identifying and informing monitored schools/divisions of the QA/M&E findings; making necessary adjustments to the PDP System; and preparing a report on the PDP process for regional policy review and adjustment.]
3.2. M&E for the IPPD for Teachers and School Heads

M&E tools are provided to support the orientation and the implementation of the IPPD as well as the
overall management of the process. The following tools are available:

Tools for IPPD


IPPD-M&E Form 1: Process Observation Guide for Teachers/School Heads
IPPD-M&E Form 2: End of IPPD Planning Evaluation for Teachers/School Heads
IPPD-M&E Form 3: Review of Accomplished IPPD
IPPD-M&E Form 4: Summary Template of IPPD Goals/Objectives for Teachers/School Heads
IPPD/SPPD-M&E Form 5: Division Tracking Form of Accomplished IPPDs/SPPDs
IPPD/SPPD-M&E Form 6: Region Tracking Form of Accomplished IPPDs/SPPDs

B. Professional Development Planning (PDP) System


B.1. IPPD
Output
- Regional Level: IPPD/SPPD-M&E Form 6: Region Tracking Form of Accomplished IPPDs/SPPDs
- Division/Cluster Level: IPPD-M&E Form 3: Review of Accomplished IPPD; IPPD-M&E Form 4: Summary Template of IPPD Goal/Objectives; IPPD/SPPD-M&E Form 5: Division Tracking Form of Accomplished IPPDs/SPPDs
- School Level: IPPD-M&E Form 3: Review of Accomplished IPPD; IPPD-M&E Form 4: Summary Template of IPPD Goal/Objectives

Process
- Division/Cluster Level: IPPD-M&E Form 1: IPPD Process Observation Guide for Teachers/School Heads; IPPD-M&E Form 2: End of IPPD Planning Evaluation for Teachers/School Heads
- School Level: IPPD-M&E Form 1: IPPD Process Observation Guide for Teachers/School Heads; IPPD-M&E Form 2: End of IPPD Planning Evaluation for Teachers/School Heads

Input
- Division/Cluster Level: Resource Materials Checklist for IPPD incorporated into the IPPD Guide
- School Level: Resource Materials Checklist for IPPD incorporated into the IPPD Guide

The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of the IPPD
process:

1. What will be monitored: The process followed in the accomplishment of the IPPD by teachers and school heads.
   How it will be monitored: A process observer will be identified and will use the Process Observation Guide.
   M&E tool to be used: IPPD-M&E Form 1: IPPD Process Observation Guide for Teachers/School Heads.
   Who will be responsible for the monitoring: School PDP-WG (Teachers' IPPD) and Division PDP-WG (School Heads' IPPD).
   When the monitoring will take place: During the IPPD process for teachers at the school level and for School Heads at the cluster level.
   How the results will be used: Results will be reviewed by the PDP-WG, recommendations developed to improve processes and included in the Program Completion Report.

2. What will be monitored: Teachers'/School Heads' perception of the success of the IPPD planning process.
   How it will be monitored: Teachers/School Heads will complete an End of IPPD Planning Evaluation.
   M&E tool to be used: IPPD-M&E Form 2: End of IPPD Planning Evaluation.
   Who will be responsible for the monitoring: School PDP-WG (Teachers' IPPD) and Division PDP-WG (School Heads' IPPD).
   When the monitoring will take place: Following the accomplishment of the IPPD Planning process at the school level for teachers and at the cluster level for School Heads.
   How the results will be used: End of IPPD Evaluations will be collated by the PDP-WG and reviewed to identify how the processes can be improved. A summary of the results and recommendations will be included in the Program Completion Report and recommendations incorporated into future processes.

3. What will be monitored: The quality of the accomplished IPPD.
   How it will be monitored: School Heads and Department Heads will review teachers' IPPDs at school level; ES1/PSDS will review completed IPPDs of School Heads at the cluster level.
   M&E tool to be used: IPPD-M&E Form 3: Review of Accomplished IPPD.
   Who will be responsible for the monitoring: School Heads and Department Heads for teachers; ES1/PSDS/ASDS for School Heads.
   When the monitoring will take place: Following the completion of the teachers' IPPD at the school level and the School Heads' IPPD at the cluster level.
   How the results will be used: Feedback will be provided to individual teachers/school heads to enhance the quality of the IPPD.

4. What will be monitored: The IPPD goals and objectives of teachers/school heads.
   How it will be monitored: IPPDs will be reviewed and results summarized at the school level for teachers and at the cluster level for School Heads.
   M&E tool to be used: IPPD-M&E Form 4: Summary Template of IPPD Goal/Objectives.
   Who will be responsible for the monitoring: School PDP-WG (Teachers' IPPD) and Division PDP-WG (School Heads' IPPD).
   When the monitoring will take place: Following the completion of the teachers' IPPD at the school level and the School Heads' IPPD at the cluster level.
   How the results will be used: For Teachers' IPPD: SHs/Department Heads will consolidate key findings to inform the SPPD/MPPD. For School Heads' IPPD: Division PDP-WG/PSDS will consolidate key findings for a cluster of School Heads and prepare a report for submission to the Division T&D Chair. The T&D Chair will identify key recommendations to be included in the Program Completion Report and to inform the MPPD.

5. What will be monitored: The number of IPPDs accomplished by schools within the division.
   How it will be monitored: A Division Tracking Form will be completed listing the number of teachers and school heads who have accomplished IPPDs.
   M&E tool to be used: IPPD/SPPD-M&E Form 5: Division Tracking Form for Accomplished IPPDs/SPPDs.
   Who will be responsible for the monitoring: Division PDP-WG.
   When the monitoring will take place: Following the accomplishment of the IPPD by teachers and school heads.
   How the results will be used: Results will be included in the Division Program Completion Report and inform future IPPD policy.

6. What will be monitored: The number of IPPDs accomplished within each division within the region.
   How it will be monitored: A Region Tracking Form will be completed listing the number of teachers and school heads who have accomplished IPPDs across all divisions.
   M&E tool to be used: IPPD/SPPD-M&E Form 6: Region Tracking Form for Accomplished IPPDs/SPPDs.
   Who will be responsible for the monitoring: Region PDP-WG.
   When the monitoring will take place: Following the accomplishment of the IPPD by teachers and school heads in each division.
   How the results will be used: Results will be included in the Region Program Completion Report and inform future IPPD policy.

IPPD-M&E Form 1: IPPD Process Observation Guide for Teachers /
School Heads

SCHOOL/CLUSTER Observed:_________________________________________________
NAME OF PROCESS OBSERVER: _____________________________________________
DATE: ______________________ VENUE: _____________________
PARTICIPANTS NAME: (Attached Attendance Sheet)

DIRECTION: Observe the process involved in the activities associated with the development of the
IPPD. If the activity is accomplished, write YES in the appropriate column, if not, write NO.

ACTIVITIES | ACCOMPLISHED (Yes or No)
I. Development of an understanding of the IPPD and its purpose
a. Conduct of a warm up activity to start the session
b. Discussion on how to further develop oneself as a professional to improve
performance of one’s duties and responsibilities.
c. Presentation of the objective of the IPPD workshop and explanation of the
meaning of IPPD, its purpose and guiding principles.
d. Explanation regarding the accomplishment of the IPPD being a vital
responsibility of all professionals for the development of the school and
improvement of learners
II. Completion of the IPPD
a. Analysis of the information such as TDNA, AIP, School assessment reports
and/or other relevant available documents.
b. Formulation of the IPPD goal
c. Deriving the objectives from the goal by reviewing the list of priority needs and
specific competency areas
d. Identification of the strategies/methods and activities for pursuing one’s
professional development goal and objectives
e. Establishment of the timeframe for the various activities identified in the IPPD
f. Identification of possible resources that can support the implementation of the
IPPD
g. Review of the IPPD
h. Signing of the IPPD

Do you have any comments regarding the IPPD process?

Do you have other comments/suggestions/recommendations for the improvement of the IPPD process?

Process Observer :___________________________________


Signature Over Printed Name
___________________________________
Designation

IPPD-M&E Form 2: End of IPPD Planning Evaluation for Teachers/


School Heads

Name: _______________________________ Designation:_________________________

Sex: Male Female

Please rate how you feel about the IPPD planning session relative to the following processes involved in
the accomplishment of the IPPD. Please tick the appropriate column for your rating using the scale below.

Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Level In a very significant way
3 High Level In a meaningful way
2 Low Level In a limited way only
1 Very Low Level Not in any meaningful way

IPPD Accomplishment Rating Scale


To what level do you feel: 1 2 3 4
1 the following documents were used in the analysis and development of the
context of your IPPD?
a. TDNA results: NCBTS-TSNA/ National Competency Based
Standards for School Heads (NCBS –SH)
b. SIP/AIP
c. Student/Pupil Performance Data
2 the formulation of IPPD overall goal was based on the results of the analysis of
the current development needs of the school and learners?
3 the IPPD goal was taken into consideration in formulating the objectives?
4 the formulation of program objectives was based on your own professional
need and the learning needs of your school learners?
5 the decisions about the strategies, methods and activities were based on the
objectives to be achieved and the competencies to be enhanced?
6 the various funding sources were identified to support the implementation of
the different programs?
7 you have considered development priorities and the one-year coverage in
setting the timeframe for the different professional activities?
8 you have appropriately identified the success indicators for your:
a. professional competencies enhanced?
b. student learning competencies improved?
9 you have enhanced your knowledge and skill in professional development
planning?
10 you will be able to apply the learning gained in future similar activities?
11 you are able to transfer the technology learned to others?
Do you have other comments/suggestions/recommendations for the improvement of the IPPD
process?

IPPD-M&E Form 3: Review of Accomplished IPPD

Name of IPPD Planner:_______________________ Designation: ______________________

This form has been developed to support a Review Process of the accomplished Individual
Plan for Professional Development (IPPD). The School Head and Department
Heads/Coordinators should review the IPPD completed by the teachers while the PDP-WG
Chair/ES1/PSDS should review the IPPD of SHs to evaluate the level of adherence to
standards. Based on the review, feedback should be provided to the IPPD Planner
and the IPPD further enhanced if required.

Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Level In a very significant way
3 High Level In a meaningful way
2 Low Level In a limited way only
1 Very Low Level Not in any meaningful way

Use the scale above to evaluate the level to which the accomplished IPPD adheres to the following
standards:

To what level …….. 1 2 3 4


1. does the IPPD focus on the mandated functions, competency standards for the
profession and the development priorities of the school, national goals and
thrusts?
2. does the IPPD goal focus on improvement of school effectiveness and learning
outcomes?
3. does the IPPD adhere to the following SMART standards:
 Specific and focused on learners and school priorities?

 Measurable progress and accomplishments through a monitoring


and evaluation scheme?

 Attainable and results-oriented?

 Relevant strategies appropriately connected to goals and


objectives?

 Time-bound within targets but flexible to afford revisions and


updates?

4. does the IPPD reflect andragogical (adult learning) methodologies/ activities


that are known to be effective in attaining the IPPD goal and objectives?

5. does the IPPD reflect processes that are embedded in the job, i.e. inherent to
the practice of the profession, and a continuing course of action?

Reviewed by:

Name: _____________________________________ Designation: _________________________________


Date: _______________________________________

IPPD-M&E Form 4: Summary Template of IPPD Goal/Objectives for
Teachers/School Heads

Summary of the Teachers’/School Heads’ IPPD Priorities based on objectives set


IPPD Objectives Set Based on Competency Domains/Strands
Names 1.1 1.2

Totals
Directions: (1) List the names of teachers/school heads who accomplished their IPPD. (2) Write the competency numbers from
the IPPD on the top row of the succeeding columns. (3) Enter the 3 prioritized objectives of each teacher/school head (as O1, O2
and O3, indicating the IPPD objectives of the individual teacher/school head) under the columns corresponding to the objectives
set in the IPPD form. (4) Count the number of entered objectives per column and write this in the Totals row. This information
will be useful for the planners of the SPPD/MPPD.
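Where the summary is kept electronically, the Totals row of this template amounts to a simple count of objectives per competency domain/strand. The sketch below is a hypothetical illustration of that tally; the names and strand codes are invented for the example and do not come from any accomplished form.

# Hypothetical sketch of the Totals row in IPPD-M&E Form 4: count how many
# prioritized IPPD objectives fall under each competency domain/strand.

from collections import Counter

# Each entry: teacher/school head -> strand codes of their 3 prioritized objectives.
ippd_objectives = {
    "Teacher A": ["1.1", "2.3", "4.2"],
    "Teacher B": ["1.1", "1.2", "2.3"],
    "Teacher C": ["2.3", "4.2", "5.1"],
}

totals = Counter(strand for strands in ippd_objectives.values() for strand in strands)

for strand, count in sorted(totals.items()):
    print(f"Strand {strand}: {count} objective(s)")
# Strands with the highest counts would typically surface as priorities for the SPPD/MPPD planners.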

IPPD/SPPD-M&E Form 5: Division Tracking Form for Accomplished
IPPDs/SPPDs (electronic version available)

Division: _________________________ Date: __________________

Districts | School Name | No. of Teachers | No. of Teacher IPPDs accomplished | School Head IPPD accomplished | SPPD Completed | Comments
District 1 1.
2.
3.
4.
5.
6.
7.
8.
9.
10
Sub Total
District 2 11
12
13
14
15
16
17
18
19
20
Sub Total
District 3 21
22
23
24
25
26
27
28
29
30
Sub Total
TOTALS
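The sub-totals and division totals in this tracking form are straightforward sums of the per-school entries. The sketch below shows how a Division PDP-WG might compute them from an electronic list of schools; the district names, school names and counts are hypothetical.

# Illustrative sketch of the Division Tracking Form arithmetic: per-district
# sub-totals and division totals of accomplished IPPDs/SPPDs. All entries are
# hypothetical sample data.

schools = [
    # (district, school, no. of teachers, teacher IPPDs accomplished,
    #  school head IPPD accomplished, SPPD completed)
    ("District 1", "School A", 25, 23, True, True),
    ("District 1", "School B", 18, 18, True, False),
    ("District 2", "School C", 40, 35, False, True),
]

district_totals = {}
for district, _school, teachers, teacher_ippds, sh_ippd, sppd in schools:
    totals = district_totals.setdefault(district, [0, 0, 0, 0])
    totals[0] += teachers
    totals[1] += teacher_ippds
    totals[2] += int(sh_ippd)
    totals[3] += int(sppd)

for district, (teachers, teacher_ippds, sh_ippds, sppds) in district_totals.items():
    print(f"{district}: {teacher_ippds}/{teachers} teacher IPPDs, "
          f"{sh_ippds} school head IPPD(s), {sppds} SPPD(s) completed")

division_totals = [sum(column) for column in zip(*district_totals.values())]
print("Division totals (teachers, teacher IPPDs, SH IPPDs, SPPDs):", division_totals)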

IPPD/SPPD-M&E Form 6: Region Tracking Form for Accomplished IPPDs/SPPDs (electronic version available)

Region: ____________________________ Date: _______________________

Columns: Divisions | District Name | No. of Schools | No. of Teachers | No. of Teacher IPPDs accomplished | No. of School Head IPPDs accomplished | No. of SPPDs Completed | Comments

Division 1: rows 1-10 (one row per district), followed by a Sub Total row
Division 2: rows 11-20, followed by a Sub Total row
Division 3: rows 21-30, followed by a Sub Total row
TOTALS

3.3: M&E for the SPPD

M&E tools are provided to support the School Plan for Professional Development (SPPD) process.
The following tools are available:

Tools for SPPD


T&D-M&E Form 1: Individual Profile Template
SPPD-M&E Form 1: Process Observation Guide for SPPD
SPPD-M&E Form 2: End of SPPD Planning Evaluation
SPPD-M&E Form 3: SPPD Debriefing Guide Checklist
SPPD-M&E Form 4: Review Tool for Accomplished SPPD
SPPD-M&E Form 5: Summary Template for Schools’ Professional Development Priority Programs
Based on SPPDs at District Level
IPPD/SPPD-M&E Form 6: Division Tracking Form for Accomplished IPPD/SPPD
IPPD/SPPD- M&E Form 7: Region Tracking Form for Accomplished IPPD/SPPD

B.2. SPPD

Output
- Regional Level: IPPD/SPPD-M&E Form 7: Region Tracking Form for Accomplished IPPDs/SPPDs
- Division/Cluster Level: SPPD-M&E Form 3: SPPD De-briefing Guide Checklist; SPPD-M&E Form 4: Review Tool for Accomplished SPPD; SPPD-M&E Form 5: Summary Template for Schools' Priority Professional Development Programs based on SPPDs at District Level; IPPD/SPPD-M&E Form 6: Division Tracking Form for Accomplished IPPDs/SPPDs
- School Level: SPPD-M&E Form 2: End of SPPD Planning Evaluation

Process
- Division/Cluster Level: SPPD-M&E Form 1: Process Observation Guide for SPPD
- School Level: SPPD-M&E Form 1: Process Observation Guide for SPPD

Input
- T&D-M&E Form 1: Individual Profile Template
- Resource Materials Checklist for SPPD incorporated into the SPPD Guide

The matrix below describes the mechanism and tools to be used for the monitoring and evaluation of the
SPPD process:

1. What is monitored: The membership of the teams responsible for the development of the SPPD in relation to: the experiences which individuals bring to the team; and the level of representation of the different personnel groups on the team.
   How it is monitored: All members of planning teams are asked to provide a personal profile outlining their work experiences and qualifications.
   M&E tool to be used: T&D-M&E Form 1: Individual Profile Template.
   Who is responsible for the monitoring: School Head.
   When the monitoring takes place: During the formation of planning teams.
   How the results are used: The PDP-WG analyzes profiles to ensure teams are well represented by the various personnel groups and have members with relevant experiences. Recommendations based on the analysis are made to improve future team membership and included in the Program Completion Report.

2. What is monitored: The process followed in accomplishing the SPPD and the level of collaboration between team members.
   How it is monitored: A process observation is completed.
   M&E tool to be used: SPPD-M&E Form 1: Process Observation Guide for SPPD.
   Who is responsible for the monitoring: School PDP-WG and Division PDP-WG (represented by ES1/PSDS).
   When the monitoring takes place: During the SPPD process at the school level.
   How the results are used: ES1/PSDS consolidates results from observations across their cluster and prepares a report for submission to the Division T&D Chair. The T&D Chair identifies key recommendations and includes them in the Program Completion Report for the conduct of the SPPD.

3. What is monitored: Team members' perception of the extent to which they successfully completed the SPPD planning process.
   How it is monitored: Team members complete an End of Program Planning Evaluation.
   M&E tool to be used: SPPD-M&E Form 2: End of SPPD Planning Evaluation.
   Who is responsible for the monitoring: School PDP-WG.
   When the monitoring takes place: Following the accomplishment of the SPPD.
   How the results are used: End of Program Evaluations are collated by the PDP-WG and reviewed to identify how the processes can be improved. A summary of the results is included in the Program Completion Report and recommendations incorporated into future processes.

4. What is monitored: The SPPD process.
   How it is monitored: A debriefing meeting will be conducted involving all those involved in facilitating the SPPD process.
   M&E tool to be used: SPPD-M&E Form 3: SPPD De-briefing Guide Checklist.
   Who is responsible for the monitoring: School PDP-WG (led by the SH).
   When the monitoring takes place: Following the accomplishment of the SPPD at the school level.
   How the results are used: Key findings and recommendations are included in the Program Completion Report and will inform the future conduct of the SPPD.

5. What is monitored: The accomplished SPPD.
   How it is monitored: SPPDs will be reviewed.
   M&E tool to be used: SPPD-M&E Form 4: Review Tool for Accomplished SPPD.
   Who is responsible for the monitoring: Division PDP-WG (represented by ES1/PSDS).
   When the monitoring takes place: Following the completion of the SPPD.
   How the results are used: ES1/PSDS/ASDS consolidates key findings and prepares a report for submission to the Division T&D Chair. The T&D Chair identifies key recommendations and includes them in the Activity Completion Report for the conduct of the SPPD.

6. What is monitored: All SPPDs submitted at the District level.
   How it is monitored: SPPD priority programs will be consolidated.
   M&E tool to be used: SPPD-M&E Form 5: Summary Template for Schools' Priority Professional Development Programs based on SPPDs at District Level.
   Who is responsible for the monitoring: District ES/PSDS.
   When the monitoring takes place: Upon submission of the SPPDs at the District level.
   How the results are used: ES1/PSDS consolidates the 3 priority programs for professional development listed in each SPPD. The accomplished Template will be submitted with a cover report to the Division T&D Chair through the PDP-WG.

7. What is monitored: The number of SPPDs accomplished by schools within the division.
   How it is monitored: A Division Tracking Form will be completed listing the number of schools which have accomplished SPPDs.
   M&E tool to be used: IPPD/SPPD-M&E Form 6: Division Tracking Form for Accomplished IPPDs/SPPDs.
   Who is responsible for the monitoring: Division PDP-WG.
   When the monitoring takes place: Following the accomplishment of the SPPD by schools in each division.
   How the results are used: Results will be included in the Division Program Completion Report and inform future SPPD policy.

8. What is monitored: The number of SPPDs accomplished in each division within the region.
   How it is monitored: A Region Tracking Form will be completed listing the number of schools which have accomplished SPPDs across all divisions.
   M&E tool to be used: IPPD/SPPD-M&E Form 7: Region Tracking Form for Accomplished IPPDs/SPPDs.
   Who is responsible for the monitoring: Region PDP-WG.
   When the monitoring takes place: Following the accomplishment of the SPPD by schools in each division.
   How the results are used: Results will be included in the Region Program Completion Report and inform future SPPD policy.

T&D-M&E Form 1: Individual Profile Template

I PERSONAL DATA
Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable):


Sex: Male Female
Date of Birth:
Home Address:
Contact #: e-mail address:
Region: Division: District:
Office/School: Address:
Current Other
Position: Designations:
Highest Educational Attainment:

II. WORK EXPERIENCE


(List from most current.)
POSITION | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three years.

Training Focus | Training attended over the last 3 years (✓) | Management Level of Training: Central / Region / Division / Cluster / School
Curriculum

Resource Materials
Development

Planning

Management

Policy Development

Research

Other, please specify


______________

IV. SIGNIFICANT EXPERIENCES


Identify which of the following areas you consider to be your area(s) of expertise:
School Based Management
Quality Assurance
Monitoring and Evaluation
Access Education
Subject Specialization: _____________
Education Planning
Policy Development
Learning Resource Materials Development
ICT
Delivery of Training
Other, please specify: ________________

Certified Trainers by: NEAP Central / NEAP-Region / TEI / SEAMEO-INNOTECH / Foreign Assisted Projects (FAP) / Other, please specify: ____________

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES


Identify which of the following specific areas you consider to be your area(s) of
expertise:

Competency Assessment Program Planning

Program Designing Resource Materials Development

Program Delivery Program Management

Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in response to the foregoing questions is true, complete, and correct to the best
of my knowledge and belief.

Date: Signature:

Please submit completed form to Training and Development Division/Unit. Information will be incorporated into
the T&D Information System Database.

SPPD-M&E Form 1: Process Observation Guide for SPPD

SCHOOL OBSERVED:_________________________________________________
NAME OF PROCESS OBSERVER: _____________________________________
DATE: ______________________
VENUE: _____________________
PARTICIPANTS NAME & DESIGNATION: (Attached Attendance Sheet)

DIRECTION: Observe the process involved in the activities associated with the development of the
SPPD. If the activity is accomplished, write YES in the appropriate column, if not, write NO. Rate
the level of collaboration between participants in completing the various activities using the rating
scale below:

RATING SCALE: (1) Low (2) Moderate (3) Average (4) High

ACTIVITIES | ACCOMPLISHED (Yes or No) | LEVEL OF COLLABORATION (1 2 3 4)
I. Development of an understanding of the SPPD and its purpose
a. Conduct of a warm up activity to form personnel groups

b. Discussion on how the school’s personnel can further


develop themselves as professionals to improve school
performance
c. Presentation of the objective of the SPPD workshop and
explanation of the meaning of SPPD, its purpose and
guiding principles.
d. Explanation regarding the accomplishment of the SPPD
being the joint responsibility of all stakeholders and
personnel groups who will be affected by the plan.

II. Completion of the SPPD


a. Analysis of the national and regional/division context for
the SPPD based on the recommended documents and/or
other relevant available documents.
b. Development of a narrative based on reviewed documents
for inclusion in Section 1a of the SPPD.
c. Analysis of the data on student performance
d. Analysis of data on needs assessment results for the
different personnel groups
e. Formulation of the SPPD goal
f. Formulation of the objectives from the goal by reviewing
the list of priority needs and the specific competency areas
g. Identification of the target group for each of the objectives
formulated
h. Decision on the content for all programs identified in the
SPPD for each target group
i. Identification of the mode of delivery for each program
j. Establishment of the timeframe for the various programs

identified in the SPPD
k. Estimation of the budgetary requirements for each
program
l. Identification of sources of funds for each program
m. Review of the SPPD
n. Signing of the SPPD

Do you have any comments regarding the SPPD process?

Do you have other comments/suggestions/recommendations for the improvement of the SPPD


process?

Process Observer:

___________________________________
Signature Over Printed Name

___________________________________
Designation

SPPD-M&E Form 2: End of SPPD Planning Evaluation

Name of SPPD Planner: ___________________________ Sex: Male Female

Personnel Group Represented: _______________________________

Please rate how you feel the SPPD team fared relative to the following processes involved in the
accomplishment of the SPPD. Please tick the appropriate column for your rating using the scale
below.

Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way

SPPD Accomplishment Rating Scale


To what extent do you feel: 1 2 3 4

1 the following documents were used in the analysis and development of the context
of the SPPD?
a. BESRA PIP
b. SIP/AIP
c. NCBTS Framework
d. Consolidated TSNA Results
e. Consolidated Teachers IPPD
f. Consolidated Non-Teaching TDNA
g. Student Performance Data
2 the formulation of SPPD overall goal was based on the results of the analysis of
the current context relating to Human Resource Development?
3 the SPPD goal was taken into consideration in formulating the objectives?

4 the formulation of program objectives was based on the professional development


needs of the different personnel groups?
5 decisions about the content of the programs were based on the objectives to be
achieved and the competencies to be enhanced?
6 the following were considered in identifying the delivery modes for the different
programs?
a. Specific Context
b. Target Participants
c. Effective Modes of Learning
d. Available Resources
7 the following were considered in estimating the budget for the different
programs?
a. Cost associated with Program Pre-Implementation
b. Cost associated with Program Implementation

T&D System M&E Framework and Tools Handbook, June 2010 Page 76
8 the various funding sources were identified to support the implementation of the
different programs?
9 the following were considered in setting the timeframe for the different programs?
a. Development Priorities
b. Cumulative Nature of the Programs
c. One-Year Coverage of the SPPD
10 you have been capacitated through your involvement in the planning process?
11 you will be able to apply the learning gained in planning for future similar
activities?
12 you are able to transfer the technology learnt to others?

Do you have other comments/suggestions/recommendations for the improvement of the SPPD


process?

SPPD-M&E Form 3: SPPD Debriefing Guide Checklist

This form has been developed to guide the facilitators' debriefing meeting following the completion of the
School Plan for Professional Development (SPPD). The T&D Chair /ES1/PSDS/School Head should manage
the debriefing meeting and ensure a record is kept of the discussions. The information from this meeting
should inform future SPPD activities and the Program Completion Report.
------------------------------------------------------------------------------------------------------------
DATE: ______________________
VENUE: _____________________

Facilitating Team MEMBERS present at the debriefing: (Attached Attendance Sheet)

Directions: Discuss each of the questions below as a group and reach a consensual answer. Check the
appropriate column that corresponds to your group response. Write comments to support your response.
QUESTIONS YES NO COMMENTS
1. Were all targeted participants
present?

2. Were all the necessary


materials organized in
advance and made available
during the activity?
3. Was the venue conducive to
the development of the
SPPD?

4. Were the steps/processes


outlined in the session guide
properly followed by the
facilitators?
5. Was sufficient time provided
for participants to seek
clarification on the various
steps involved in the SPPD
process?
6. Was there sufficient time to
thoroughly conduct all the
planned activities?
7. Were the participants able to
successfully accomplish the
SPPD? If not, was a strategy
developed to ensure the
SPPD will be completed in a
timely manner?

8. Were all the objectives


achieved?

9. What suggestions/recommendations can you make that will improve the conduct of the SPPD?
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
10. General comments on the conduct of the activity:
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________

Name: ____________________________
(Signature Over Printed Name)
T&D Chair /ES1/PSDS/School Head

SPPD-M&E Form 4: Review Tool for Accomplished SPPD

This form has been developed to support a review of the accomplished School Plan for Professional
Development (SPPD).

Rating Guide:
Numerical Interpretation Description
Rating
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way

Use the scale above to evaluate the extent to which the accomplished SPPD adheres to the following
standards:

To what extent …….. 1 2 3 4


1. does the SPPD focus on improving student learning and consider the development
priorities of the school, national goals and thrusts?
2. does the SPPD outline opportunities for all personnel groups to participate in
continuous professional development programs to increase their current level of
competencies?
3. are the strategies identified in the SPPD proven to be effective in increasing
participation of personnel in professional development and in the implementation
of new learning into work practices?
4. does the SPPD reflect the collaborative undertakings of all personnel groups?

5. is the SPPD part of a formative and cyclical process where data from previous
planning experiences are analyzed and used to improve the process?
6. does the SPPD reflect a unified approach in improving human resource
development by taking into consideration the national goals and thrusts?
7. does the SPPD provide for alternative programs for professional development to
incorporate emerging priorities?

Name: _____________________________________

Position: ____________________________________

Date: _______________________________________

SPPD-M&E Form 5: Summary Template of School Professional Development Priority Programs
Based on SPPDs at District Level

(Note: This Form is for one clientele group only, e.g. Teachers group. For non-teaching personnel, a separate sheet should be
used.)
Direction: Write the name of each school and its respective priority programs. Then, check the appropriate column that represents the
domain/strand related to each priority program listed.

Summary of the Schools’ Professional Development Priority Programs based on SPPDs


Columns: Schools | SPPD Priority Programs | Domain 1 (S1-S2) | Domain 2 (S1-S5) | Domain 3 (S1) | Domain 4 (S1-S7) | Domain 5 (S1-S4) | Domain 6 (S1) | Domain 7 (S1-S3)
A.. 1.

2.

3.

B. 1.

C. 1

D. 1

E.

(Continue the list to enter all the School’s data.)

ES/PSDS Name and Signature:_________________________________________ Date Accomplished:


__________________________

IPPD/SPPD-M&E Form 6: Division Tracking Form for Accomplished IPPDs/SPPDs (electronic version available)

Division: _________________________ Date: __________________

Districts | School Name | No. of Teachers | No. of Teacher IPPDs accomplished | School Head IPPDs accomplished | SPPD Completed | Comments
District 1 1.
2.
3.
4.
5.
6.
7.
8.
9.
10
Sub Total
District 2 11
12
13
14
15
16
17
18
19
20
Sub Total
District 3 21
22

23
24
25
26
27
28
29
30
Sub Total
TOTALS

IPPD/SPPD-M&E Form 7: Region Tracking Form for Accomplished IPPDs/SPPDs (electronic version available)

Region: ____________________________ Date: _______________________


Divisions | District Name | No. of Schools | No. of Teachers | No. of Teacher IPPDs accomplished | No. of School Head IPPDs accomplished | No. of SPPDs Completed | Comments
Division 1 1.
2.
3.
4.
5.
6.
7.
8.
9.
10
Sub Total
Division 2 11
12
13
14

T&D System M&E Framework and Tools Handbook, June 2010 Page 84
15
16
17
18
19
20
Sub Total
Division 3 21
22
23
24
25
26
27
28
29
30
Sub Total
TOTALS

3.4: M&E for the MPPD

M&E tools are provided to support the Master Plan for Professional Development (MPPD)
process. The following tools are available:

Tools for MPPD


T&D-M&E Form 1: Individual Profile Template
MPPD-M&E Form 1: Process Observation Guide for Division/Region MPPD
MPPD-M&E Form 2: End of MPPD Planning Evaluation Division/Region
MPPD-M&E Form 3: Division/Region MPPD Debriefing Guide Checklist
MPPD-M&E Form 4: Review Tool for Accomplished MPPD for Division/Region

B. Professional Development Planning (PDP) System


B.3. MPPD
Output
- Regional Level: MPPD-M&E Form 4: Review Tool for Accomplished MPPD for Division/Region; MPPD-M&E Form 3: Division/Region MPPD De-briefing Guide Checklist; MPPD-M&E Form 2: End of MPPD Planning Evaluation - Division/Region
- Division/Cluster Level: MPPD-M&E Form 2: End of MPPD Planning Evaluation - Division/Region; MPPD-M&E Form 3: Division/Region MPPD De-briefing Guide Checklist

Process
- Regional Level: MPPD-M&E Form 1: Process Observation Guide for Division/Region MPPD
- Division/Cluster Level: MPPD-M&E Form 1: Process Observation Guide for Division/Region MPPD

Input
- Regional Level: T&D-M&E Form 1: Individual Profile Template; Resource Materials Checklist for Regional MPPD incorporated into the Region MPPD Guide
- Division/Cluster Level: T&D-M&E Form 1: Individual Profile Template; Resource Materials Checklist for Division MPPD incorporated into the Division MPPD Guide

The following describes the mechanism and tools to be used for the monitoring and evaluation of the MPPD process:

What is monitored: The members of the teams responsible for the development of the MPPD, in relation to (a) the experiences which individual members bring to the team and (b) the appropriateness of representation of the different personnel groups on the team.
How it is monitored: All members of planning teams are asked to provide a personal profile outlining their work experiences and qualifications.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who is responsible: PDP-WG at the Division and Region levels
When the monitoring takes place: During the formation of planning teams
How the results are used: The PDP-WG analyzes profiles to ensure teams are well represented by the various personnel groups and have members with relevant experiences. Recommendations based on the results are made to improve future team membership and included in the Program Completion Report. Profiles of planners are to be entered into the TDIS database.

What is monitored: The process followed in accomplishing an MPPD and the level of collaboration between team members.
How it is monitored: A process observation is completed.
M&E tool to be used: MPPD-M&E Form 1: Process Observation Guide for Division/Region MPPD
Who is responsible: PDP-WG at the Region and the Division
When the monitoring takes place: During the MPPD at the division/region level
How the results are used: Results are reviewed by the PDP-WG and recommendations developed to improve processes and included in the Program Completion Report.

What is monitored: Team members' perception of the extent to which they successfully completed the planning process.
How it is monitored: Team members complete an End of Program Planning Evaluation.
M&E tool to be used: MPPD-M&E Form 2: End of MPPD Planning Evaluation - Division/Region
Who is responsible: PDP-WG
When the monitoring takes place: Following the accomplishment of the program planning process at the region and division level
How the results are used: End of Program Evaluations are collated by the PDP-WG and reviewed to identify how the processes can be improved. Recommendations are included in the Program Completion Report.

What is monitored: The MPPD process at the region and division level.
How it is monitored: A debriefing meeting involving all those involved in facilitating the MPPD process.
M&E tool to be used: MPPD-M&E Form 3: Division/Region MPPD Debriefing Guide Checklist
Who is responsible: PDP-WG
When the monitoring takes place: Following the accomplishment of the MPPD at the region and division level
How the results are used: Results from the debriefing are incorporated into Region/Division Program Completion Reports and used by the T&D Chief/Chair to improve future processes.

What is monitored: The accomplished MPPD.
How it is monitored: Completed MPPDs are reviewed at the region and division level.
M&E tool to be used: MPPD-M&E Form 4: Review Tool for Accomplished MPPD for Division/Region
Who is responsible: Regional PDP-WG members, for both the Region and Division MPPDs
When the monitoring takes place: Following the completion of the MPPD at the region and division level
How the results are used: Results from the review of MPPDs are incorporated into Region/Division Program Completion Reports and used by the T&D Chief/Chair to improve future processes.

T&D-M&E Form 1: Individual Profile Template

I PERSONAL DATA
Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable):


Sex: Male Female
Date of Birth:
Home Address:
Contact #: e-mail address:
Region: Division: District:
Office/School: Address:
Current Position: Other Designations:
Highest Educational Attainment:

II. WORK EXPERIENCE


(List from most current.)
POSITION | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three
years.

Training Focus Training Management Level of Training


attended Central Region Division Cluster School
over last 3
years ()
Curriculum

Resource Materials
Development

Planning

Management

Policy Development

Research

Other, please specify


______________

IV. SIGNIFICANT EXPERIENCES


Identify which of the following areas you consider to be your area(s) of expertise:
S School Based Management
Quality Assurance Monitoring and Evaluation
Access Education Subject Specialization:
_____________)
Education Planning Policy Development
Learning Resource Materials Development ICT
Delivery of Training Other, please specify
________________

Certified Trainers by NEAP Central NEAP-Region TEI

SEAMEO- INNOTECH Foreign Assisted Projects (FAP) Other, please


specify --

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES


Identify which of the following specific areas you consider to be your
area(s) of expertise:

Competency Assessment Program Planning

Program Designing Resource Materials


Development

Program Delivery Program Management

Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in answer to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.

Date: Signature:

Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.
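The handbook does not prescribe a file format for the T&D Information System, so the following is only a sketch of how one accomplished profile might be captured as a structured record before it is uploaded; the field names are hypothetical and simply mirror the sections of the template above.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorkExperience:
    position: str
    level: str               # e.g. Elem/Sec/ALS school, district, division, region
    responsibility: str      # e.g. subjects taught, level supervised
    inclusive_period: str

@dataclass
class IndividualProfile:
    surname: str
    first_name: str
    region: str
    division: str
    current_position: str
    highest_educational_attainment: str
    work_experience: List[WorkExperience] = field(default_factory=list)
    training_last_three_years: List[str] = field(default_factory=list)
    areas_of_expertise: List[str] = field(default_factory=list)

# A hypothetical record for one planner, ready for entry into the TDIS database.
profile = IndividualProfile(
    surname="Santos",
    first_name="Maria",
    region="Region VI",
    division="Negros Occidental",
    current_position="Master Teacher I",
    highest_educational_attainment="MA in Education",
    training_last_three_years=["Curriculum", "Planning"],
    areas_of_expertise=["Delivery of Training", "Program Planning"],
)
print(profile.surname, profile.areas_of_expertise)
```

Storing profiles in a consistent structure like this makes it straightforward for the working groups to check whether a planning team has members with the relevant experience.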

MPPD-M&E Form 1: Process Observation Guide for Division/Region MPPD

LEVEL OF PLAN: REGION DIVISION

NAME OF PROCESS OBSERVER: _____________________________________


DATE: ______________________ VENUE: _____________________
PARTICIPANTS NAME & DESIGNATION: (Attached Attendance Sheet)
DIRECTION: Observe the process involved in the activities associated with the development of the
MPPD. If the activity is accomplished, write YES in the appropriate column; if not, write NO. Rate the
level of collaboration between participants in completing the various activities using the rating scale
below:
RATING SCALE: (1) Low (2) Moderate (3) Average (4) High

ACTIVITIES | ACCOMPLISHED (Yes or No) | LEVEL OF COLLABORATION (1  2  3  4)
I. Development of an understanding of the MPPD and its purpose

a. Conducted a warm-up activity to form personnel groups


b. Discussion on how the region/division personnel can further develop
themselves as professionals to improve performance
c. Presentation of the objective of the MPPD workshop and explanation of
the meaning of MPPD, its purpose and guiding principles.
d. Explanation regarding the accomplishment of the MPPD being the joint
responsibility of all stakeholders and personnel groups who will be
affected by the plan.
II. Completion of the MPPD
a. Analysis of the national and regional/division context for the MPPD based on the required and other available documents.
b. Development of a narrative based on the reviewed documents for inclusion in Section 1a of the MPPD
c. Analysis of the data on student performance
d. Analysis of data on needs assessment results for the different
personnel groups
e. Formulation of the MPPD goal
f. Formulation of the objectives from the goal by reviewing the list of
priority needs and the specific competency areas
g. Identification of the target group for each of the objectives formulated
h. Decision on the content of the MPPD for each target group
i. Identification of the mode of delivery for each professional
development activity
j. Establishment of the timeframe for the various programs identified in the MPPD
k. Estimation of the budgetary requirements for each professional
development activity
l. Identification of sources of funds for each professional development
activity
m. Review of the MPPD
n. Signing of the MPPD

Process Observer: ___________________________________   Designation: _____________________
(Signature Over Printed Name)

MPPD-M&E Form 2: End of MPPD Planning Evaluation - Division/Region

Name of MPPD Planner: ________________________ Sex: Male Female

Client Group Represented: ________________________

Please rate how you feel the MPPD team fared relative to the following processes involved in the accomplishment of the MPPD. Please tick the appropriate column for your rating using the scale below.

Rating Guide:
Numerical Rating | Interpretation | Description
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way

MPPD Accomplishment Rating Scale


To what extent do you feel …  1  2  3  4
1 the following documents were used in the analysis and development of
the context of the MPPD?
a. BESRA PIP
b. SBM Framework
c. NCBTS Framework
d. REDP/DEPD
e. Regional/Division Master Training Plan
f. Student Performance Data
g. TDNA Results
2 the formulation of the MPPD overall goal was based on the results of the analysis of the context of the MPPD?
3 the formulation of the objectives was based on the professional development needs of the different personnel groups?
4 the MPPD goal was taken into consideration in formulating the objectives?
5 decisions about the content of the programs were based on the objectives
to be achieved and the competencies to be enhanced?
6 the following were taken into consideration when identifying the
delivery modes for the different programs?
a. Specific Context
b. Target Participants
c. Effective Modes of Learning
d. Available Resources

7 the following were taken into consideration when estimating the budget
for the different programs?
a. Cost associated with Program Designing/Resource
Materials Development
b. Cost associated with Program Delivery
c. Cost associated with Program Monitoring and Evaluation
8 you were able to successfully identify various funding sources to support
the implementation of the different programs?
9 the following were taken into consideration when setting the timeframe
for the different programs?
a. Development Priorities
b. Cumulative Nature of the Programs
c. Three-Year Coverage of the MPPD
10 you have been capacitated by your involvement in the planning process?
11 you will be able to apply the learning gained in planning for future
similar activities?
12 you are able to transfer the technology learnt to others?

Do you have other comments/suggestions/recommendations for the improvement of the


MPPD process?
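Since every member of the planning team completes this evaluation, the PDP-WG consolidates the responses before preparing the Program Completion Report. A minimal sketch of that consolidation is shown below; the item labels are hypothetical abbreviations of the questions above, and the values follow the 1-4 rating guide.

```python
# Hypothetical responses: one dict per MPPD planner, mapping item label to a 1-4 rating.
responses = [
    {"Item 2 (goal based on context)": 4, "Item 3 (objectives based on needs)": 3},
    {"Item 2 (goal based on context)": 3, "Item 3 (objectives based on needs)": 2},
    {"Item 2 (goal based on context)": 4, "Item 3 (objectives based on needs)": 3},
]

items = sorted({item for response in responses for item in response})
for item in items:
    ratings = [response[item] for response in responses if item in response]
    mean = sum(ratings) / len(ratings)
    low_ratings = sum(1 for value in ratings if value <= 2)  # "Low" or "Very Low Extent"
    print(f"{item}: mean {mean:.2f}, {low_ratings} low rating(s) out of {len(ratings)}")
```

Item means and the count of low ratings are two simple summaries that can be carried into the Program Completion Report alongside the planners' written comments.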

MPPD-M&E Form 3: Division/Region MPPD Debriefing Guide Checklist

This form has been developed to guide the facilitators' debriefing meeting following
the completion of the Master Plan for Professional Development (MPPD) at the
region or division level. The T&D Chief/Chair should manage the debriefing meeting
and ensure a record is kept of the discussions. The information from this meeting
should inform future MPPD activities and the Program Completion Report.

LEVEL OF PLAN: REGION DIVISION

DATE: ______________________ VENUE: _____________________

Facilitating Team MEMBERS present at the debriefing: (Attached Attendance


Sheet)

Directions: Discuss each of the questions below as a group and reach a


consensual answer. Check the appropriate column that corresponds to your group
response. Write comments to support your response.

QUESTIONS | YES | NO | COMMENTS
1. Were all targeted participants present?

2. Were all the necessary materials


organized in advance and made available
during the activity?
3. Was the venue conducive to the
development of the MPPD?

4. Were the steps/processes outlined in the


session guide properly followed by the
facilitators?
5. Was sufficient time provided for
participants to seek clarification on the
various steps involved in the MPPD
process?
6. Was there sufficient time to thoroughly
conduct all the planned activities?

7. Were the participants able to successfully


accomplish the MPPD?

8. Were all the objectives achieved?

9. What suggestions/recommendations can you make that will improve the conduct
of the MPPD?
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________

10. General comments on the conduct of the activity:


___________________________________________________________

___________________________________________________________
___________________________________________________________
___________________________________________________________
___________________________________________________________

Name: ____________________________
(Signature Over Printed Name)
T&D Chief/Chair

MPPD-M&E Form 4: Review Tool for Accomplished MPPD for Division/Region

Rating Guide:
Numerical Rating | Interpretation | Description
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way

Use the scale above to evaluate the extent to which the accomplished MPPD adheres to the
following:
To what extent …….. 1 2 3 4
1. does the MPPD focus on improving student learning and consider the
development priorities of the division/region, national goals and
thrusts?
2. does the MPPD outline opportunities for all personnel groups to
participate in continuous professional development programs to
increase their current level of competencies?
3. do the objectives and competencies directly relate to the overall goal of
the MPPD?
4. have related competencies been organized to form programs?

5. are related programs logically sequenced?
6. have a range of delivery modes been recommended?
7. will the output and outcomes identified provide evidence that program
objectives have been met and competencies enhanced?
8. does the budget estimate take into consideration the costs associated with the design, resource materials development, implementation and monitoring of all programs?
9. have sources of funds been identified for the proposed programs?
10. are the programs logically scheduled across the years covered by the
MPPD?
11. are the strategies identified in the MPPD effective in increasing
participation and involvement of education personnel in professional
learning?
12. does the MPPD incorporate formative and cyclical processes and
promote the accurate collection and analysis of data to improve future
activities?
13. does the MPPD reflect a unified approach to improve human resource
development?

Name: _____________________________________

Position: ____________________________________

Date: _______________________________________

Section 4.0: T&D System Monitoring and Evaluation for the Program
Designing and Resource Development (PDRD) System

4.1 M&E for the Program Designing and Resource Development (PDRD) System

The Program Designing and Resource Development System has two major components: Program Designing and Resource Development.

It is the responsibility of the PDRD-WG to complete the M&E and QA processes associated
with the development of program designs and resource packages. The PDRD-WG will be
expected to report on their findings and to incorporate any recommendations for
improvement into future processes.

The M&E and QA for the PDRD System is shown in the diagram below. The system's flow is the same as that of the M&E for the other subsystems of Training and Development and is followed at the division and regional levels. Basically, the members of the PDRD-WG responsible for the M&E prepare the resources they need for the task, then implement the M&E Plan as scheduled. The nature of the task is to ensure the compliance of the implementers monitored with the standards set for program designing and resource development activities for the various clientele. Part of the M&E task is to review the quality of the program designs and resource packages that have been produced; another group is convened by the T&D Chief/Chair to review the quality of the plans against the standards set for professional development plans. Results are recorded, and reports are prepared and submitted to the T&D Office for uploading to the TDIS.

The T&D Office, in turn, informs the schools and the divisions monitored of the findings and
makes the necessary adjustments to the system. Reports are prepared on the monitored
processes to inform Regional policy review and adjustment.
[Process flow diagram: 3.4 QA-M&E for Program Designing and Resource Development. Following the completion of the MPPD, the Regional/Division PDRD-WG prepares for the M&E task and its resources (3.4.1); monitors and QA-assesses compliance with the standards for program designing and resource development at the regional and division/school levels (3.4.2); records the M&E results in the TDIS database (3.4.4); and prepares the M&E report (3.4.5). The Region/Division T&D Unit reviews the M&E report (3.4.6), identifies and informs the monitored divisions of the findings (3.4.6a), formulates/revises the PDRD guidelines against the Region PDRD policy, standards and guidelines (3.4.6b), and makes adjustments and communicates the standards and guidelines to the divisions (3.4.7).]

4.2. M&E for Program Designing

M&E tools are provided to support the program designing process. The following tools are available:

Tools for Program Designing:


T&D-M&E Form 1: Individual Profile Template
D-M&E Form 1: End of Program Designing Evaluation
D-M&E Form 2: Program Design Review/Quality Assurance Tool

C. Program Designing and Resource Development (PDRD) System


C.1. Program Designing
The same tools are used at the Regional, Division/Cluster and School levels:

Output: D-M&E Form 2: Program Design Review/Quality Assurance Tool
Process: D-M&E Form 1: End of Program Designing Evaluation
Input: T&D-M&E Form 1: Individual Profile Template; Resource Requirements Checklist for Program Designing (incorporated into the Program Designing Guide)

The following describes the mechanism and tools to be used for the monitoring and evaluation of the Program Designing process:

What will be monitored: The membership of the team responsible for the development of the Program Design, in relation to the experiences and expertise which individuals bring to the team.
How it will be monitored: All members of the Program Designing Team are asked to provide a personal profile outlining their work experiences and qualifications.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible: PDRD-WG
When the monitoring will take place: During the formation of the Program Designing Team
How the results will be used: The PDRD-WG analyzes the profiles to ensure that members have the relevant experience and expertise to support the program design process. Profiles are to be entered into the TDIS database of Program Designers at the Region, Division and School levels.

What will be monitored: Team members' perception of the extent to which they successfully completed the designing process.
How it will be monitored: Program Designing Team members will individually complete the End of Program Designing Evaluation.
M&E tool to be used: D-M&E Form 1: End of Program Designing Evaluation
Who will be responsible: PDRD-WG
When the monitoring will take place: Following the completion of the Program Designing process
How the results will be used: End of Program Evaluation Forms are collated by the PDRD-WG and reviewed to identify how the processes can be improved. A summary of the results is included in the Program Completion Report and the recommendations are incorporated in future processes.

What will be monitored: Completed Program Designs.
How it will be monitored: The Program Designing Team and a QA Team will review and quality assure the completed Program Designs at the region, division and school levels.
M&E tool to be used: D-M&E Form 2: Program Design Review/Quality Assurance Tool
Who will be responsible: Program Designing Team and a QA Team
When the monitoring will take place: At the completion of a program design at the region, division and school level
How the results will be used: The Program Design is refined based on recommendations from the review/QA. Based on the review, a decision is made regarding whether the program is to be implemented or not. Recommendations are made to improve future program designing processes and included in the Program Completion Report.

T&D-M&E Form 1: Individual Profile Template

I PERSONAL DATA
Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable):


Sex: Male Female
Date of Birth:
Home Address:
Contact #: e-mail address:
Region: Division: District:
Office/School: Address:
Current Position: Other Designations:
Highest Educational Attainment:

II. WORK EXPERIENCE


(List from most current.)
POSITION | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three years.

Training Focus Training Management Level of Training


attended Central Region Division Cluster School
over last 3
years ()
Curriculum

Resource Materials
Development

Planning

Management

Policy Development

Research

Other, please specify


______________

IV. SIGNIFICANT EXPERIENCES


Identify which of the following areas you consider to be your area(s) of expertise:
S School Based Management
Quality Assurance Monitoring and Evaluation
Access Education Subject Specialization: _____________)
Education Planning Policy Development
Learning Resource Materials Development ICT
Delivery of Training Other, please specify ________________

Certified Trainers by NEAP Central NEAP-Region TEI

SEAMEO- INNOTECH Foreign Assisted Projects (FAP) Other, please specify --

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES


Identify which of the following specific areas you consider to be your area(s) of expertise:

Competency Assessment Program Planning

Program Designing Resource Materials Development

Program Delivery Program Management

Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in answer to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.

Date: Signature:

Please submit completed form to Training and Development Division/Unit. Information will be incorporated
into the T&D Information System Database.

D-M&E Form 1: End of Program Designing Evaluation

Name of Program Designer: _________________________Sex: Male Female

Title of the Program Design: _____________________________________________

Target Personnel Group: ________________________________________________

As a member of the Program Designing Team please rate how you think the team implemented the following
processes involved in the development of the program design. Please tick the appropriate column for your
rating using the scale below.
Numerical Rating Interpretation
4 Very High Extent
3 High Extent
2 Low Extent
1 Very Low Extent

To what extent did the Program Designing Team implement/demonstrate 1 2 3 4


the following?
1 Examined existing Program Designs for the purpose of adoption/
adaption
2 Supplied adequate general program information for identifying the
Program Design
3 Aligned the rationale of the Program Design to the MPPD/SPPD goal
4 Considered the objectives in the MPPD/SPPD when developing the
Program Design objectives
5 Provided adequate details in the Program Design in relation to:
a. the delivery mechanism to be employed (e.g. F3 and JEL)
b. how the program will be facilitated
6 Provided sufficient information in the Program Content Matrix for both
the F3 and JEL components in relation to:
a. specific objectives
b. content (KSAs)
c. suggested activities
d. duration of activities
7 Outlined a clear schedule of activities for the F3 and JEL components
8 Described detailed information regarding the materials required to
implement the program
9 Outlined the M&E details of the program with a clear description of the
M&E process to be employed
10 Itemized the budgetary requirements accurately for the implementation
of the Program Design
11 Enhanced competencies in the program designing process
12 Expressed commitment to apply the learning gained in program
designing in future similar activities
13 Expressed willingness to transfer the technology learned to others

Do you have other comments/suggestions/recommendations for the improvement of the
program designing process?

Please submit completed form to PDRD-WG. Results should be incorporated in


the Program Completion Report

D-M&E Form 2: Program Design Review/Quality Assurance Tool

This form is used by both the PDRD-WG and the Quality Assurance Team to support the review
and quality assurance of the developed program designs at the region, division and school level.
The PDRD-WG will use the form to internally review its work before submitting the Program
Design for QA, through the T&D Chief/Chair or School Head. The T&D Chief/Chair or School
Head will establish a Quality Assurance Team to review the developed program design to ensure
that it meets the standards set for program designs.

Rating Guide:
Numerical Rating Interpretation
4 Very High Extent
3 High Extent
2 Low Extent
1 Very Low Extent

Use the scale above to evaluate the Program Design by checking the appropriate column
To what extent …….. 1 2 3 4
1 does the program design build on quality program design concepts?
2 do the rationale, objectives and competencies identified in the
program design relate to current demands in education as stipulated
in the SPPD/MPPD?
3 does the program design take into consideration the specific needs of the target group and the context in which they work in identifying:
a. the delivery mode (formal and job-embedded)?
b. innovative strategies?
c. research-based practices?
4 does the Program Content Matrix provide sufficient information in
relation to the KSAs to be developed for:
a. the F3 program delivery?
b. the JEL program delivery?
5 is the content described in the program design:
a. logically sequenced?
b. accurately presented?
c. sufficiently covered?
6 is the schedule of activities:
a. logically organized
b. an accurate reflection of the resources needed to successfully implement both the formal face-to-face and job-embedded learning components of the program?
7 have the required support materials been accurately identified?
8 has an accurate budget for the program been prepared?
9 is the program design a product of collaboration between qualified
and competent educators?
10 is the program design user friendly, technology enabled and cost
effective?

11 does the suggested job-embedded learning (JEL) component
encourage the engagement of participants in applying their learning
from the face-to-face (F3) training in their daily work?
12 does the JEL promote opportunities for collaborative learning in the
workplace?
13 does the JEL include an appropriate time frame adequate to
complete expected accomplishments/outputs?
14 does the JEL provide a flow of activities for JEL implementation?
15 does the JEL identify means of verifying accomplishments and
outputs?

Do you have any comments/suggestions/recommendations for the improvement of the


Program Design:

A. Formal Face-to-Face (F3) Design:

B. Job-Embedded Learning (JEL) Design

Name: _____________________________________

Position: ____________________________________

Date: _______________________________________

Results should be used when developing the Program Completion Report
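The handbook leaves the decision rule to the reviewers, so the sketch below is only an illustration of how a QA Team might summarize its ratings when deciding whether a Program Design is ready for implementation or needs refinement. The item labels, ratings and the cut-off value are all assumptions, not prescribed standards.

```python
# Hypothetical ratings (1-4 scale) from three QA reviewers for a few review items.
reviews = {
    "Item 1 (builds on quality design concepts)": [4, 3, 4],
    "Item 5a (content logically sequenced)": [2, 2, 3],
    "Item 8 (accurate budget prepared)": [3, 4, 3],
}

REFINE_BELOW = 2.5  # assumed cut-off; the actual threshold is for the QA Team to set

for item, ratings in reviews.items():
    mean = sum(ratings) / len(ratings)
    status = "needs refinement" if mean < REFINE_BELOW else "acceptable"
    print(f"{item}: mean {mean:.2f} -> {status}")
```

Items flagged for refinement point the PDRD-WG to the parts of the design to revise before the T&D Chief/Chair decides whether the program proceeds to delivery.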

4.3 M&E for Resource Development

M&E tools are provided to support the resource development process. The following tools are
available:

Tools for Resource Development:


T&D-M&E Form 1: Individual Profile Template
RD-M&E Form 1: End of Resource Package Development Evaluation
RD-M&E Form 2: Program Resource Package Review/ Quality Assurance Tool

C. Program Designing and Resource Development (PDRD) System


C.2. Resource Development
The same tools are used at the Regional, Division/Cluster and School levels:

Output: RD-M&E Form 2: Program Resource Package Review/Quality Assurance Tool
Process: RD-M&E Form 1: End of Resource Package Development Evaluation
Input: T&D-M&E Form 1: Individual Profile Template

The following describes the mechanism and tools to be used for the monitoring and evaluation of the Resource Development process:

What will be monitored: The membership of the teams responsible for the development of the Program Resource Package, in relation to the level of experience which individuals bring to the team.
How it will be monitored: All members of the Resource Development Team are asked to provide a personal profile outlining their work experiences and qualifications.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible: PDRD-WG
When the monitoring will take place: During the formation of the team
How the results will be used: The PDRD-WG analyzes the profiles to ensure that teams have members with relevant experiences. Recommendations based on the analysis are made to improve future team membership and are included in the Program Completion Report. At the Region, Division and School levels, profiles are to be entered into the TDIS.

What will be monitored: Team members' perception of the extent to which they successfully completed the Program Resource Package development process.
How it will be monitored: Team members complete the End of Resource Package Development Evaluation.
M&E tool to be used: RD-M&E Form 1: End of Resource Package Development Evaluation
Who will be responsible: PDRD-WG
When the monitoring will take place: Following the completion of the Resource Package development process at the region, division and school level
How the results will be used: End of Resource Package Development Evaluation Forms are collated by the PDRD-WG and reviewed to identify how the processes can be improved. A summary of the results is included in the Program Completion Report and recommendations incorporated into future processes.

What will be monitored: Completed Resource Packages.
How it will be monitored: An initial review of the Resource Packages is conducted at the region, division and school level. QA Teams are organized to review the final Resource Packages developed at the Region, Division and School levels.
M&E tool to be used: RD-M&E Form 2: Program Resource Package Review/Quality Assurance Tool
Who will be responsible: PDRD-WG at the region, division and school level; QA Teams established at the Region, Division and School level
When the monitoring will take place: At the completion of a Resource Package development process
How the results will be used: The Program Resource Package is refined based on recommendations from the review/QA. Based on the results of the review, a decision is made regarding whether the program is to be implemented or not. Recommendations are made to improve future Resource Package development processes and included in the Program Completion Report.

T&D-M&E Form 1: Individual Profile Template

I PERSONAL DATA
Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable):


Sex: Male Female
Date of Birth:
Home Address:
Contact #: e-mail address:
Region: Division: District:
Office/School: Address:
Current Position: Other Designations:
Highest Educational Attainment:

II. WORK EXPERIENCE


(List from most current.)
POSITION | LEVEL (e.g. Elem/Sec/ALS school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three
years.

Training Focus Training Management Level of Training


attended Central Region Division Cluster School
over last 3
years ()
Curriculum

Resource Materials
Development

Planning

Management

Policy Development

Research

Other, please specify


______________

IV. SIGNIFICANT EXPERIENCES


Identify which of the following areas you consider to be your area(s) of expertise:
S School Based Management
Quality Assurance Monitoring and Evaluation
Access Education Subject Specialization:
_____________)
Education Planning Policy Development
Learning Resource Materials Development ICT
Delivery of Training Other, please specify
________________

Certified Trainers by NEAP Central NEAP-Region TEI

SEAMEO- INNOTECH Foreign Assisted Projects (FAP) Other, please


specify --

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES
Identify which of the following specific areas you consider to be your area(s) of expertise:

Competency Assessment Program Planning

Program Designing Resource Materials Development

Program Delivery Program Management

Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in answer to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.

Date: Signature:

Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.

RD-M&E Form 1: End of Resource Package Development Evaluation

Name of Resource Material Developer: ______________________Sex: Male: Female:

Title of Resource Package Developed: ____________________________________
Target Personnel Group: _______________________________________________
Please rate how you think the Training & Development Resource Package Development team
implemented the following processes involved in the development of the resource package.
Please tick the appropriate column for your rating using the scale below.
Numerical Rating Interpretation
4 Very High Extent
3 High Extent
2 Low Extent
1 Very Low Extent
To what extent did the developers demonstrate the following? 1 2 3 4
1 Considered the Program Design when developing the Resource Package
2 Examined existing resource materials/packages for the purpose of adoption/adaption
3 Followed the standards and guiding principles in the development of the resource package
4 Conceptualized the context matrix clearly articulating the requirement of the program in
relation to:
a. activities to be conducted
b. specific objectives to be achieved
c. key understanding to be developed
d. resource materials required
5 Outlined a sequentially organized schedule of activities to support the implementation of the
program including both the F3 and JEL Components
6 Articulated in the resource package a clear learning approach and methodology for both the
F3 and JEL components
7 Developed session guides, (including accompanying powerpoint presentations and scripts)
that clearly outline the conduct of all activities
8 Developed all the necessary handouts and reading materials required for the successful
implementation of the program
9 Identified all the materials required to support the delivery of the program
10 Described the monitoring and evaluation process clearly and developed any necessary M&E
tools
11 Reviewed and quality assured the Resource Package
12 Enhanced competencies in program resource development
13 Expressed learning gained during the resource development process

Do you have other comments/suggestions/recommendations for the improvement of the


resource package development process?

Please submit completed form to PDRD-WG. Results should be incorporated into the Program
Completion Report

RD-M&E Form 2: Program Resource Package Review/Quality Assurance Tool

This form is used by both the PDRD-WG and the Quality Assurance Team to support the
review and quality assurance of the developed program resource package at the region, division and school levels. The PDRD-WG will use the form to internally review its work before
submitting the Program Resource Package for QA, through the T&D Chief/Chair or School
Head. The T&D Chief/Chair or School Head will establish a Quality Assurance Team to
review the developed program resource package to ensure that it meets the standards set for
program resource development.

Rating Guide:
Numerical Rating Interpretation
4 Very High Extent
3 High Extent
2 Low Extent
1 Very Low Extent

Use the scale above to assess the Program Resource Package and check the appropriate
column for each item.

Standards for Professional Development Materials (rate each item 1-4)

A. Integrity
1. It is accurate in content and reflects the ways in which knowledge is conceptualized within the domain.
2. There is a logical and smooth flow of ideas.
3. It uses the language and symbols of the content domain and its ways of representation, and supports learners in developing and using them.
4. It uses the following correctly and appropriately: terms and expressions; symbols and notations; diagrammatic representation; graphical representation.
5. It assists the learner by identifying and differentiating between the different points of view and perspectives presented.
6. It is supported with content that is based on current research and incorporates innovative strategies and best practices.
7. Presentation of factual content is accurate and up-to-date.
8. There is no outdated information, improper use of statistics, inaccurate graphs, or oversimplified models, examples or simulations.
9. Sources of information are identified and properly referenced.

B. Learner-focus
10. It deepens educators' content knowledge and enhances competencies.
11. Learning objectives are made explicit to learners/users.
12. It promotes adult learning principles.
13. Content is structured to scaffold learning.
14. The resource package takes into consideration the specific needs of the target group and the context in which they work.

C. Usability
15. Clear instructions for use are provided (i.e. purpose, processes and intended outcomes are explicit).
16. Learning and information design is intuitive (i.e. the user knows what to do and how to do it).
17. The context matrix is clearly conceptualized, articulating the activities, specific objectives, key understandings and resource materials required to deliver the program.
18. It has a clear schedule of activities outlined to support the implementation of the program.
19. It includes appropriate follow-up and technical assistance that will support the development of competencies in the workplace.
20. The resource package's use of technology enhances the conduct of training and is cost effective for training purposes.

D. Accessibility
21. The learning resource connects to trainees' personal/local knowledge and experience: linguistic and cultural experience; local (community/geographic) conditions; individual and family circumstances, including gender, abilities and economic conditions; and interest and degree of engagement (in particular, it addresses differently abled learners).
22. Material is free of ideological, cultural, religious, racial, and gender biases and prejudices.
23. The resource does not confront or embarrass learners in any of the following ways: requiring learners to expose personal data which may embarrass them; invading learners' privacy; unfavourably comparing learners' learning performance with learners' identity; unfavourably or stereotypically comparing family or community characteristics with learners' identity; or unnecessarily or indiscriminately confronting cultural beliefs or practices.

E. Provision of Support Materials
24. The session guides, including accompanying powerpoints and scripts, are clear and logically sequenced.
25. All the necessary handouts and reading materials required for the successful implementation of the program have been developed.

F. Collaborative Development
26. The resource package reflects a product of group effort among qualified and competent educators.

G. Provision for M&E
27. The resource package provides a monitoring and evaluation scheme and tools.

What are your comments/suggestions/recommendations for the improvement of the resource


package?

Name: _____________________________________

Position: ____________________________________

Date: _______________________________________

Results should be shared with the Program Resource Development Working Group and inform the
development of the Program Completion Report

Section 5.0: T&D System Monitoring and Evaluation for the Program Delivery (PDy) System

5.1. M&E for the Program Delivery System


The Program Management Team has the responsibility for overseeing the Monitoring and
Evaluation Processes for the Program Delivery System.

The diagram below shows the M&E process flow for the program delivery component of the
system. It includes M & E processes and tools designed to monitor the operations,
adherence to standards, processes and end-of-training evaluation at the Regional, Division
and School levels. It includes the utilization of required tools and methodology (e.g. process
observation tool, rating scales, open-ended questionnaire, journal writing, and evaluation
feedback). The data gathering strategies, data analysis and resources for the preparation of
M&E reports are necessary elements for the system to be operational. The M&E provides
information on strengths and weaknesses of the system for the improvement and
sustainability of operations at the Regional, Division/District and School levels.

[Process flow diagram: 4.2 M&E for Program Delivery. Following the completion of the Resource Package, the School/Division/Region PDy-WG prepares for the M&E task and its resources (4.2.1); monitors the Program Delivery processes and compliance with standards (4.2.2a) and conducts the process and end-of-training evaluation for the F3 and JEL phases (4.2.2b); records the M&E results in the TDIS database (4.2.3); and prepares the report on the PDy M&E process (4.2.4). The Division/Regional T&D Office reviews the M&E report (4.2.5), identifies and informs the monitored schools of the findings (4.2.6a), makes the necessary adjustments to the PDy System (4.2.6b), and prepares a report on the Program Delivery process based on the M&E for regional policy review and adjustment (4.2.6c).]

M&E tools are provided to support both the Formal face-to-face (F3) and Job-embedded
Learning (JEL) program delivery phases of the training program as well as the overall
management of the program. The following tools are available.

5.2. Monitoring and Evaluation for Program Delivery

Tools for the Formal Face-to-Face (F3) delivery:

T&D-M&E Form 1: Individual Profile Template


F3-M&E Form 1: Walkthrough Observation Checklist
F3-M&E Form 2: Learning Process Observation and Facilitation Skills
F3-M&E Form 3: End of the F3 Program Assessment and Consolidation Template
F3-M&E Form 4: External M&E for F3 Process and Accomplishments
F3-M&E Form 5: Rapid Competency Assessment and Consolidation Template
F3-M&E Form 6: F3 Program Completion Report Template
F3-M&E Form 7: Summary Template for Refining the Resource Package

Tools for Job-Embedded Learning (JEL) delivery
JEL-M&E Form 1: Quality Assurance of the JEL Contract
JEL-M&E Form 2: JEL Journal Entry Sheet
JEL-M&E Form 3: JEL Reflection Template
JEL-M&E Form 4: JEL Advising Tracking Form
JEL-M&E Form 5: Trainees End of Job-Embedded Learning Evaluation and Consolidation
Template
JEL-M&E Form 6: JEL Program Completion Template

D. Program Delivery (PDy) System


D.1. Formal Face-to-Face (F3)
The same tools are used at the Regional, Division/Cluster and School levels, except where noted:

Output: F3-M&E Form 5: Rapid Competency Assessment with consolidation template; F3-M&E Form 6: F3 Program Completion Report Template; F3-M&E Form 7: Summary Template for Refinement of the Resource Package
Process: F3-M&E Form 1: Walkthrough Observation Checklist; F3-M&E Form 2: Learning Process Observation and Facilitation Skills; F3-M&E Form 3: End of F3 Program Assessment with consolidation template; F3-M&E Form 4: External M&E for F3 Processes and Accomplishments
Input: T&D-M&E Form 1: Individual Profile Template

D.2. Job-embedded Learning (JEL)

Output: JEL-M&E Form 6: JEL Program Completion Report Template
Process: JEL-M&E Form 2: JEL Journal Entry Sheet; JEL-M&E Form 3: Job-Embedded Learning (JEL) Reflection Template; JEL-M&E Form 4: JEL Advising Tracking Form (Regional and Division/Cluster levels); JEL-M&E Form 5: Trainee's End of JEL Evaluation with consolidation template
Input: JEL-M&E Form 1: Quality Assurance of the JEL Contract

The following describes the mechanism and tools to be used for the monitoring and evaluation of the Program Delivery System.

What will be monitored: The membership and experience of all those involved in the program delivery process, e.g. Program Managers, Trainers, Trainees.
How it will be monitored: All members of the Program Management Team and Trainers will be asked to provide a personal profile outlining their work experiences and qualifications. Trainees will also be asked to complete a profile on registration.
M&E tool to be used: T&D-M&E Form 1: Individual Profile Template
Who will be responsible: PDy-WG
When the monitoring will take place: During the formation of the Program Management Team and Trainers' Team, and upon registration of the Trainees
How the results will be used: The PDy-WG will analyze profiles to ensure team members have relevant experiences. Recommendations based on the analysis will be made to improve future selection of Program Managers and Trainers and included in the F3 Program Completion Report. Profiles are to be entered into the TDIS.

What will be monitored: The effectiveness of the walk-through process in preparing the Program Management Team and the Trainers for the delivery of the training.
How it will be monitored: The Program Management Team members and the Trainers will all complete a checklist, and the results will be collated by the Program Management Team.
M&E tool to be used: F3-M&E Form 1: Walkthrough Observation Checklist
Who will be responsible: Program Management M&E Team
When the monitoring will take place: At the end of the walk-through of the Resource Package, at least one week prior to the delivery of the training program
How the results will be used: Results will be reviewed to identify how the walk-through process can be improved. Results will also inform the activities that will need to be accomplished prior to the training program delivery and will inform the final F3 Program Completion Report.

What will be monitored: The trainers' learning process and facilitation skills.
How it will be monitored: Process observers will be identified for each session to complete the proforma, e.g. an off-duty trainer or a member of the Program Management Team.
M&E tool to be used: F3-M&E Form 2: Learning Process Observation and Facilitation Skills
Who will be responsible: Program Management M&E Team
When the monitoring will take place: During the conduct of all sessions during the F3 program delivery
How the results will be used: Results from the Learning Process Observation will be used to inform daily debriefing sessions and to improve the delivery of the training program.

What will be monitored: Trainees', Trainers' and Program Managers' level of satisfaction with the F3 phase of the training program.
How it will be monitored: All trainees, trainers and Program Managers will complete an evaluation of the F3 phase of the training program. Program Management Staff will consolidate results.
M&E tool to be used: F3-M&E Form 3: End of F3 Program Assessment with Consolidation Template
Who will be responsible: Program Management M&E Team
When the monitoring will take place: At the end of the F3 phase of the training program
How the results will be used: Results will be used to inform future delivery of the training program and to enhance Program Management and Trainers' future performance. Consolidated results will be analyzed and used to inform the final F3 Program Completion Report.

What will be monitored: The quality of the F3 program.
How it will be monitored: External monitors will be asked to evaluate compliance with standards.
M&E tool to be used: F3-M&E Form 4: External M&E for F3 Process and Accomplishments
Who will be responsible: Regional personnel for Division-level F3 programs, and Division personnel for Cluster/School-level F3 programs
When the monitoring will take place: During the F3 phase of the program
How the results will be used: Results will be discussed with the program management staff and trainers and will be incorporated into the F3 Program Completion Report. Results will be used to inform future F3 programs and T&D policies.

What will be monitored: Trainees' self-perception of their level of competency before and after their involvement in an F3 training program. (A consolidation sketch is provided after these entries.)
How it will be monitored: All trainees will complete a rapid competency assessment. Program Management Staff will consolidate results.
M&E tool to be used: F3-M&E Form 5: Rapid Competency Assessment, before and after the F3 program
Who will be responsible: Program Management M&E Team
When the monitoring will take place: Prior to the beginning of the F3 phase of the program and again at the end of the F3 program
How the results will be used: Results will be used to inform future delivery of the training program. Consolidated results will be analyzed and used to inform the final F3 Program Completion Report.

What will be monitored: Overall effectiveness, efficiency and success of the F3 training program.
How it will be monitored: All members of the Program Management Team will be expected to contribute to the accomplishment of an F3 Program Completion Report.
M&E tool to be used: F3-M&E Form 6: F3 Program Completion Report Template
Who will be responsible: PDy-WG
When the monitoring will take place: At the completion of the F3 phase of a training program
How the results will be used: The F3 Program Completion Report will be submitted to the T&D Chief at the Region/Division level and the School Head at the school level and used to inform future F3 programs and T&D policies.

What will be monitored: The Resource Package used to inform the F3 program delivery.
How it will be monitored: All trainers will be asked to review the Resource Package in relation to the sessions they were responsible for delivering and make recommendations for further enhancement. Program Management will consolidate the recommendations for all sessions.
M&E tool to be used: F3-M&E Form 7: Summary Template for Refining the Resource Package
Who will be responsible: Program Management M&E Team
When the monitoring will take place: Following the delivery of the F3 phase of the program
How the results will be used: Results will be incorporated into the F3 Program Completion Report and submitted to the T&D Chief at the Region/Division level and the School Head at the school level for consideration and action.

Tools for JEL:

What will be monitored: The quality of the JEL Contract developed by trainees in relation to set standards.
How it will be monitored: The JEL Team will be responsible for reviewing all JEL Contracts and ensuring the completeness of the proforma.
M&E tool to be used: JEL-M&E Form 1: Quality Assurance of the JEL Contract
Who will be responsible: JEL Team
When the monitoring will take place: Prior to the implementation of the JEL Contract
How the results will be used: Results will be used to enhance the JEL Contract and to improve future processes.

What will be monitored: Individual progress, learning, insights gained and issues encountered during the various phases of the JEL program.
How it will be monitored: All trainees will complete a journal to document the JEL process.
M&E tool to be used: JEL-M&E Form 2: JEL Journal Entry Sheet
Who will be responsible: JEL Team
When the monitoring will take place: During all phases of the JEL program
How the results will be used: Journals will be reviewed by the JEL Team during the JEL program to inform next steps. The journal will provide a means of verification for the successful completion of the JEL Contract.

What will be monitored: Individual progress and ability to reflect on learning to improve future practice.
How it will be monitored: All trainees will complete the JEL Reflection Template.
M&E tool to be used: JEL-M&E Form 3: JEL Reflection Template
Who will be responsible: JEL Team
When the monitoring will take place: During the Reflection Stage of the JEL phase of the program
How the results will be used: Results will be reviewed by the JEL Team and used to support the trainee in identifying next steps. They will provide a means of verification for the successful completion of the JEL Reflection phase of the JEL program.

What will be monitored: The level, type and effectiveness of the support provided by a JEL Adviser.
How it will be monitored: JEL Advisers will be expected to keep a record of the advice they provide to trainees.
M&E tool to be used: JEL-M&E Form 4: JEL Advising Tracking Form
Who will be responsible: Program Management M&E Team
When the monitoring will take place: During the various phases of the JEL program
How the results will be used: Results will be used to improve the JEL Advising process and will be incorporated into the JEL Program Completion Report. Results will be used to inform the JEL Handbook and future programs and T&D policies.

What will be monitored: Trainees' level of satisfaction with the JEL program.
How it will be monitored: All trainees will complete an evaluation of the JEL program. Program Management Staff will consolidate results.
M&E tool to be used: JEL-M&E Form 5: Trainee's End of JEL Evaluation
Who will be responsible: Program Management M&E Team
When the monitoring will take place: At the end of the JEL program
How the results will be used: Results will be used to inform future delivery of JEL programs. Results will be analyzed and used to inform the final JEL Program Completion Report.

What will be monitored: Overall effectiveness, efficiency and success of the JEL program.
How it will be monitored: All members of the Program Management Team will be expected to contribute to the accomplishment of a JEL Program Completion Report.
M&E tool to be used: JEL-M&E Form 6: JEL Program Completion Template
Who will be responsible: T&D Chief at the Region/Division level; School Head at the school level
When the monitoring will take place: At the completion of the JEL program
How the results will be used: The JEL Program Completion Report will be submitted to the T&D Chief at the Region/Division level and the School Head at the school level and used to inform future JEL programs and T&D policies.
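As noted in the entry on F3-M&E Form 5 above, the before-and-after Rapid Competency Assessment is consolidated by the Program Management Staff. The sketch below illustrates one way that consolidation might be done; the competency labels and the 1-4 self-ratings are hypothetical.

```python
# Hypothetical self-ratings (1-4) from three trainees, before and after the F3 program.
before = {
    "Uses varied teaching strategies": [2, 2, 3],
    "Designs formative assessments": [1, 2, 2],
}
after = {
    "Uses varied teaching strategies": [3, 3, 4],
    "Designs formative assessments": [3, 3, 3],
}

def mean(values):
    return sum(values) / len(values)

# The average gain per competency is one figure that can feed the F3 Program Completion Report.
for competency in before:
    gain = mean(after[competency]) - mean(before[competency])
    print(f"{competency}: average gain {gain:+.2f}")
```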

T&D-M&E Form 1: Individual Profile Template

I PERSONAL DATA
Name:

(Surname) (First Name) (Middle Name)

Employee Number (If Applicable): Sex: Male Female


Date of Birth:
Home Address:
Contact #: e-mail address:
Region: Division: District:
Office/School: Address:
Current Position: Other Designations:

Highest Educational Attainment:

II. WORK EXPERIENCE


(List from most current.)
POSITION | LEVEL (e.g. Elem/Sec/ALS; school, district, division, region) | MAIN AREA OF RESPONSIBILITY (e.g. subjects taught, level supervised) | INCLUSIVE PERIOD

Use additional sheet if necessary.

III. TRAINING ATTENDED OVER THE LAST THREE YEARS

Please check training focus and management level for all training attended over the last three
years.

Training Focus | Training attended over last 3 years (✓) | Management Level of Training (Central / Region / Division / Cluster / School)
Curriculum

Resource Materials
Development

Planning

Management

Policy Development

Research

Other, please specify


______________

IV. SIGNIFICANT EXPERIENCES


Identify which of the following areas you consider to be your area(s) of expertise:

School Based Management
Quality Assurance Monitoring and Evaluation
Access Education
Subject Specialization: _____________
Education Planning
Policy Development
Learning Resource Materials Development
ICT
Delivery of Training
Other, please specify ________________

Certified Trainer by:
NEAP Central
NEAP-Region
TEI
SEAMEO-INNOTECH
Foreign Assisted Projects (FAP)
Other, please specify ________________

List your significant experiences in the identified areas

Use additional sheet if necessary.

V. TRAINING AND DEVELOPMENT EXPERIENCES


Identify which of the following specific areas you consider to be your
area(s) of expertise:

Competency Assessment
Program Planning
Program Designing
Resource Materials Development
Program Delivery
Program Management
Monitoring and Evaluation of Training

List your significant experiences in the identified areas

Use additional sheet if necessary.

I certify that the information I have given in response to the foregoing questions is true, complete, and correct to the best of my knowledge and belief.

Date: Signature:

Please submit completed form to Training and Development Division/Unit. Information will be
incorporated into the T&D Information System Database.

F3-M&E Form 1: WALKTHROUGH OBSERVATION CHECKLIST

Directions: Read the following statements. Tick the appropriate column that
corresponds to your response.

Statements | YES | NO
1. Program Management Team members and Trainers were all present during the walkthrough.
2. A collaborative effort among the Program Management Team members and trainers was manifested during the walkthrough.
3. Individual tasks were understood and fairly assigned to all the Program Management Team members and trainers based on people's strengths.
4. During the walkthrough, trainers were made aware of the materials and resources they were required to prepare for their assigned sessions.
5. During the walkthrough, the session plans were reviewed in detail and the strategies recommended were discussed and practiced.
6. The trainers were open to suggestions and were willing to learn.
7. The sequencing and the relationship between the different sessions were discussed and the linkages identified.
8. Any required adjustments to the resource package were made while staying faithful to the training program's intent and purpose.
9. Issues and concerns were discussed and settled in a healthy atmosphere.
10. The Program Management Team members and trainers committed to perform their tasks and responsibilities.

Significant Observations:
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________

Suggestions/Recommendations:
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________
__________________________________________________________________________________

___________________________________
Observer’s Signature over Printed Name

F3-M&E Form 2: Learning Process Observation and Facilitation
Skills

This form is to be used during the actual delivery of a program. A Process Observer will need to be
assigned to complete the Learning Process Observation for each session. Results should be used to
inform daily debriefing sessions. At the end of this form is a checklist of facilitation skills which may
be observed and recorded.

Session No. _____ Title: ____________________________________________


Time Session Started: ________________ Time Session Ended:____________
Process Observer: ___________________ Designation (M&E Team Member/Trainer)

Phases of the Session | Facilitation Skills Demonstrated | Trainee's Knowledge/Insights/Skills, Values Learned | Comments

Introductory Activity

Analysis

Abstraction

Application

Concluding Activity

Observe if the skill has been demonstrated by the Facilitator. If so, put a check in the appropriate
column.

Facilitation Skills √
OBSERVING SKILLS
1. noted trainees’ level of involvement in all activities
2. monitored the energy level of the trainees during sessions
3. sensed the needs of the trainees that may affect the learning process
QUESTIONING SKILLS
4. formulated questions in a simple manner
5. asked questions that were clear and focused
6. formulated follow-up questions to trainees' responses appropriately
7. asked Higher Order Thinking Skills (HOTS) questions
8. acknowledged trainees’ responses
9. solicited, accepted and acted on feedback from trainees
10. processed responses with probing questions to elicit the desired learning
LISTENING SKILLS
11. listened and understood the meaning of what had been said
12. responded positively to trainees' insights
13. clarified and checked understanding of what was heard
14. reacted to ideas, not to the person
ATTENDING SKILLS
15. created the proper environment based on adult learning principles
16. directed and redirected the trainees to the learning tasks
17. managed the learning atmosphere throughout the sessions
18. acknowledged greetings and responses of trainees
INTEGRATING SKILLS
19. highlighted important results of the activity that led to the attainment of the objectives of the session
20. deepened and broadened trainees' outlook on the significance of the outputs
ORAL COMMUNICATION SKILLS
21. expressed ideas with clarity, logic and in grammatically correct sentences
22. spoke with a well-modulated voice
23. delivered ideas with confidence and sincerity
SKILL IN USING TRAINING AIDS
24. employed appropriate and updated training aids
25. made training aids that were simple and clear
26. used training aids that were attractive and interesting
27. utilized training aids that were socially, culturally, and gender-fair

F3-M&E Form 3: End of the F3 Program Assessment

Respondent Type: Trainee Trainer Program Manager

Name (Optional): ____________________________ Sex: Male Female

Program Title: ________________________ Date: ______________

Please assess the effectiveness of the entire F3 component of the program according to the indicators
below.
Please refer to the following rating scale:

4-Strongly Agree (SA); 3-Agree (A); 2-Disagree (D); 1-Strongly Disagree (SD)

Rating: 1 (SD), 2 (D), 3 (A), 4 (SA)

After the conduct of the F3 component of the program, I believe that …

A. Program Planning, Management and Preparation
1. the training program was delivered as planned
2. the training program was managed efficiently
3. the training program was well-structured

B. Attainment of Objectives
4. the program objectives were clearly presented
5. the session objectives were logically arranged
6. the program and session objectives were attained

C. Delivery of Program Content
7. program content was appropriate to trainees' roles and responsibilities
8. content delivered was based on authoritative and reliable sources
9. new learning was clearly presented
10. the session activities were effective in generating learning
11. adult learning methodologies were used effectively
12. management of learning was effectively structured, e.g. portfolio, synthesis of previous learning, etc.

D. Trainees' Learning
13. trainees were encouraged to consider how ideas and skills gained during the training could be incorporated into their own practices
14. contributions of all trainees, both male and female, were encouraged
15. trainees demonstrated a clear understanding of the content delivered

E. Trainers' Conduct of Sessions
16. the trainers' competencies were evident in the conduct of the sessions
17. teamwork among the trainers and staff was manifested
18. trainers established a positive learning environment
19. training activities moved quickly enough to maintain trainees' interest

F. Provision of Support Materials
20. training materials were clear and useful
21. PowerPoint presentations supported the flow of the sessions
22. the resources provided were appropriate to trainees' needs

G. Program Management Team
23. Program Management Team members were courteous
24. the Program Management Team was efficient
25. the Program Management Team was responsive to the needs of trainees

H. Venue and Accommodation
26. the venue was well lighted and ventilated
27. the venue was comfortable with sufficient space for program activities
28. the venue had sanitary and hygienic conditions
29. meals were nutritious and sufficient in quantity and quality
30. the accommodation was comfortable with sanitary and hygienic conditions

I. Overall
31. I have the knowledge and skills to apply the new learning
32. I have the confidence to implement the JEL Contract

Please provide your honest response to each of the following questions:

What do you consider your most significant learning from the program?

What changes would you suggest to improve similar programs in the future?

Briefly describe what you have learned and how it will help you with your work.

What further recommendations do you have?

F3-M&E Form 3: End of the F3 Program Assessment
Consolidation Template

Collate the accomplished F3-M&E Form 3: End of the F3 Program Assessment, and review the results. Separate results should be consolidated for each respondent type, e.g. Trainees, Trainers and Program Managers. Use the table below to consolidate the results for the quantitative items.

Note: The scoring and consolidation can be efficiently done using MS Excel.

Use the scale below to interpret mean rating for each item of the assessment:
3.5 to 4.0 = (SA) Strongly Agree
2.5 to 3.4 = (A) Agree
1.5 to 2.4 = (D) Disagree
1.0 to 1.4 = (SD) Strongly Disagree

Qualitative results should also be summarized below.
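
Note: the consolidation above can also be scripted when a spreadsheet is not at hand. The short Python sketch below is illustrative only and is not part of the official M&E tools; the helper names (consolidate, interpret) and the sample ratings are assumptions, and the cut-offs follow the interpretation scale above.

# Illustrative sketch only (not part of the official M&E tools): consolidating
# quantitative ratings for F3-M&E Form 3 without a spreadsheet.
# Assumes each item's responses are collected as a list of integer ratings 1-4
# (1 = SD, 2 = D, 3 = A, 4 = SA). Helper names and sample data are assumptions.

def consolidate(ratings):
    """Return (e, f, mean): weighted sum, number of respondents, mean rating."""
    e = sum(ratings)                      # e.g. 0x1 + 0x2 + 8x3 + 7x4 = 52
    f = len(ratings)                      # e.g. 15 respondents
    mean = round(e / f, 2) if f else 0.0
    return e, f, mean

def interpret(mean):
    """Map a mean rating to the handbook's interpretation scale."""
    if mean >= 3.5:
        return "Strongly Agree (SA)"
    if mean >= 2.5:
        return "Agree (A)"
    if mean >= 1.5:
        return "Disagree (D)"
    return "Strongly Disagree (SD)"

# Worked example matching the sample row below: 8 ratings of 3 and 7 ratings of 4.
item_ratings = [3] * 8 + [4] * 7
e, f, mean = consolidate(item_ratings)
print(e, f, mean, interpret(mean))        # 52 15 3.47 Agree (A)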

Columns: Items | Tally (T) for SD, D, A, SA | Frequency: (a) SD x 1, (b) D x 2, (c) A x 3, (d) SA x 4 | (e) a+b+c+d | (f) SD+D+A+SA | Mean = e/f

Example: Tally – SD: 0, D: 0, A: llll-lll (8), SA: llll-ll (7)
Frequency – (a) 0 x 1 = 0, (b) 0 x 2 = 0, (c) 8 x 3 = 24, (d) 7 x 4 = 28
(e) 0 + 0 + 24 + 28 = 52; (f) 0 + 0 + 8 + 7 = 15; Mean = 52/15 = 3.47
A Program Planning/Management/Preparation
1
2
3
B Attainment of Objectives
4
5
6

C Delivery of Program Content


7
8
9
10
11

12

D Trainees’ Learning

13

14

15

E Trainers Conduct of Sessions

16

17

18

19

F Provision of Support Materials

20

21

22

G Program Management Team

23

24

25

H Venue and Accommodation

26

27

28

29

30

I Overall

31

32

Summary of Qualitative Responses

What do you consider your most significant learning from the program?










What changes would you suggest to improve similar programs in the future?











Briefly describe what you have learned and how it will help you with your work.










What further recommendations do you have?










F3-M&E Form 4: External M&E for the F3 Process
and Accomplishments

Program Title: _________________________________ Date: ________________ Venue: _________

The indicators below are the standards for monitoring F3 sessions. In the “Observation” column, put a check mark (√) if the indicator is observed and a cross mark (x) if it is not observed. In the “Remarks” column, write your comments.

I have verified that the following activities were observed (record an Observation mark and Remarks for each indicator):

1. Program Planning, Management and Preparation
   1.1 Program responsibilities matrix well-implemented (Training matrix and Facilitators'/tasking matrix)
   1.2 Readiness of program management before and during delivery (reproduction of materials, M&E tools for F3 Nos. 1-11, and resource/training materials, e.g. LCD, laptops, etc.)

2. Objectives
   2.1 Clear and logical presentation of session objectives
   2.2 Successful delivery of session objectives

3. Program Content
   3.1 Appropriate to trainees' roles and responsibilities
   3.2 Clear presentation of new learning
   3.3 Effective session activities generated new discoveries
   3.4 Successful use of the 4 A's
   3.5 Effective MOL strategies

4. Trainees' Learning
   4.1 Encouragement of participation from both male and female trainees in sharing their ideas
   4.2 Demonstration by trainees of a clear understanding of the content delivered

5. Trainers' Conduct of Sessions
   5.1 The trainers' competencies were evident in the conduct of the sessions
   5.2 Teamwork among the trainers and staff was manifested
   5.3 Trainers established a positive learning environment
   5.4 Systematic and consistent conduct of training activities to sustain trainees' interest

6. JEL Contract
   6.1 The trainee has the knowledge and skills to apply the new learning
   6.2 The trainees accomplished the JEL Contract and are confident to implement it

7. Provision of Support Materials
   7.1 Training materials are organized, clear and useful
   7.2 PowerPoint presentations support the flow of the sessions
   7.3 The resources provided are appropriate to trainees' needs

8. Program Management Team
   8.1 Program Management Team members are cooperative and courteous
   8.2 Program Management Team members are responsive to the needs of trainees
   8.3 Program Management Team members are efficient and effective

9. Venue and Accommodation
   9.1 Well-lighted, ventilated and with good hygienic conditions
   9.2 Comfortable with sufficient space for program activities
   9.3 Meals were nutritious and sufficient in quantity and quality

Recommendations

Monitored by:

____________________________
Name and Designation

F3-M&E Form 5: Rapid Competency Assessment

Note to the Program Management: This Template will guide you in developing the M & E
Form, Rapid Competency Assessment Tool, for a specific program to be delivered. Work
through the following steps to complete the M&E Form:

1. Write the title of the F3 program in the title box above.


2. The items to be rated are specific to the content of the program that is being
delivered. Identify the items by referring to the overall program content based on the
program objectives and the content of each session identified in the session guides
for the F3 in the Resource Package.
3. Items for the session on JEL are already suggested below.
4. Delete this box when reproducing the Tool to be distributed to the participants.

Name: _________________________ Sex: Male Female

Region/Division: _________________ Date: _____________

Direction: Describe your level of competency in the following items. In the column labeled “Pre F3”, describe your competency level before you joined/attended this program. In the column marked “Post F3”, describe your competency level after having participated in the training program.
Competency Scale to be used:
4 – I have a mastery of the competency and have demonstrated/applied it
3 – I have adequate competency and need to practice it
2 – I have inadequate competency and no understanding of how to apply it
1 – I have no competency/learning at all.

Competencies Enhanced Pre F3 Post F3


A. Overall Content of the Program
1.
2.
3.
B. Content of the Sessions
Session 1:
4.
5.
6.
Session 2:
7.
8.
9.
Session 3:

10.
11.
12.
Session 4…and so on
13.
14.
C. The Job-embedded Learning (JEL)
15. The concept and purpose of the JEL in professional
development
16. Accomplishing a JEL Contract
THANK YOU

F3-M&E Form 5: Rapid Competency Assessment Consolidation
Template

Title of F3 : ______________________________________________
Region/Division/School : ___________________________________
Date: ___________________________________________________

Direction: Each participant's Rapid Assessment Mean Score for the columns “PRE-F3” and “POST-F3” (refer to F3-M&E Form 5) will need to be calculated and then consolidated in the table below.

Compute the gain for each participant using the formula: Gain = (POST-F3) – (PRE-F3 ).
Calculate the Average Gain for the participants using the formula:
Average Gain = Sum of gains /number of participants

A positive difference means an increase in competency; a zero or negative result indicates no improvement.
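
For larger groups, the gain computation can also be scripted. The Python sketch below is illustrative only and not part of the official tools; the participant names and mean scores are assumed sample data.

# Illustrative sketch only: computing gains for the Rapid Competency Assessment.
# Participant names and mean scores below are assumed sample data.
participants = {
    # name: (Rapid Assessment Mean Score PRE-F3, Mean Score POST-F3)
    "Participant 1": (2.1, 3.4),
    "Participant 2": (2.8, 3.6),
    "Participant 3": (3.0, 3.0),
}

# Gain = (POST-F3) - (PRE-F3); a positive gain indicates an increase in competency.
gains = {name: round(post - pre, 2) for name, (pre, post) in participants.items()}

# Average Gain = sum of gains / number of participants
average_gain = round(sum(gains.values()) / len(gains), 2)

for name, gain in gains.items():
    print(f"{name}: gain = {gain}")
print(f"Average Gain = {average_gain}")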

Rapid Assessment Summary


Name of Participant | Rapid Assessment Mean Score PRE-F3 | Rapid Assessment Mean Score POST-F3 | Gain
1.
2.
3.

AVERAGE GAIN

F3-M&E Form 6: F3 Program Completion Report Template
Program Title: (Add title of program) (Please use electronic version)

Facilitator(s): (Add names of the Facilitators/trainers and their positions)

Location and (Write the city and the actual venue e.g. Cebu, EcoTech)
venue:
Duration: (Include duration of the F3 phase )

Date: (include dates for F3)

No. of Participants: Males (Add No.) Females (Add No.) Total (Add Total)

Attendance List (Include as Attachment 1)

Program Objectives: At the end of the program the participants will have
 (Add objectives from the program design/resource package)

Program (Include as Attachment 2)


Schedule

Key Results  (Identify the key results from the conduct of the program, taking into consideration the F3 phase)

Resources/Materials (Identify the resources required to conduct the program, e.g. Title of the Resource Package, Operations Manual)

M&E Analysis After reviewing the F3-M&E results from the program, write a narrative analyzing the results. This should include:
 Results from the participants' evaluation of the program
 Results from the facilitators' review of the program
 Results from the program managers' review of the program
 Strengths and areas for improvement should be identified in this section

(Include as Attachment 3 a copy of the F3-M&E Results)

General In this section make any general comments about the program and identify
Comments and any issues encountered in relation to:
Issues  its delivery (during the F3 phase)
Encountered - trainers/facilitators
- participants
- content of program
- delivery strategies
- training materials
-
 its management (during the F3 phase)
- prior to delivery
- during the F3 phase

Recommendations In this section discuss any recommendations you may have to improve future programs. Include suggestions for refining the Resource Package.

Program Report Attachments

Attachment 1: Program Attendance List


(Insert here)

Attachment 2: Program Schedule of Activities


(Insert here)

Attachment 3: M&E Results


(Insert here)

F3-M&E Form 7: Summary Template for Refining the Resource Package
(electronic version available)

Title of F3 Program: _________________________________________________________________

Venue: _______________________________________________ Date: _______________

Directions: Fill in the template with the necessary information for the refinement of the Resource
Package.

Resource Package: _____________________________________________

Sessions/Titles | Content | Process | Comments/Suggestions
Session 1

Session 2

Session 3

Etc…

Prepared by: ____________________________________


Designation: ____________________________________
Date: _______________________

(Note: Please attach the Resource Package with corrections when endorsing it to the T&D Chief.)

JEL-M&E Form 1: Quality Assurance of JEL Contract

This form aims to support a Quality Assurance process of the accomplished JEL Contract. The JEL Team shall review the JEL Contract to assess the extent to which the standards were followed in its accomplishment.

Rating Guide:
Numerical Rating | Interpretation | Description
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way

Use the scale above to assess the extent to which the accomplished JEL Contract adheres to the
following:

To what extent does the JEL Contract …….. 1 2 3 4


1. clearly identify the goals and objectives to be addressed?
2. clearly state the areas of practice where improvements are to be made?
3. describe the planned activities that are to be accomplished?
4. incorporate activities which are integrated into regular work place practices?
5. identify strategies that are experiential and based on adult learning
approaches?
6. state the roles and responsibilities of the JEL team?
7. outline the resources required to achieve the JEL goals and objectives?
8. specify a realistic timeframe for its accomplishment?
9. identify realistic and achievable MOVs?

Recommendations:

Name: ___________________________________

Position: _________________________________

Date: ____________________________________

JEL-M&E Form 2: JEL Journal Entry Sheet

The JEL Journal identifies the type of information that should be documented by all JEL
Team Members to record progress during the various stages of the JEL Program. The journal
entries will be used to inform discussions during the Reflection Stage and as a Means of
Verification of Learning. Each JEL Member should establish their own Journal Booklet and
record their entries based on the information below, for each entry.

Journal Entries

The following information is a guide to the type of information you should include in your Journal:
- Date of Journal Entry
- Description of activity/activities completed/observed
- Nature of the support provided by the JEL Team
- Learning gained
- Changes in practices/behavior
- Problems met
- How problems were resolved
- Suggested next steps

JEL-M&E Form 3: Job-Embedded Learning (JEL) Reflection Template

The Job-Embedded Learning (JEL) Reflection Template is to be completed by the Trainee during the Reflection phase of the JEL process. It should be completed as part of a review of progress and in consultation with the JEL Team.

Name of Trainee: ___________________________________________
Date: ___________________________________________

Summary of Progress to date: Achievements in relation to the goal and objectives as stated in the JEL Contract

Changes in Behavior as a result of the training program (F3 and JEL)

Identified Strengths

Next Steps
NOTE: If you have demonstrated all the objectives outlined in the JEL Contract, proceed to the Internalization Stage.
If you have not demonstrated all the objectives outlined in the JEL Contract, please complete the sections below before commencing the Enhancement Stage.

What are my next steps? (Activities and Strategies) | When? (planned accomplishments and date) | JEL Team support required? | What resources do I need? | What will be the Means of Verifying (MOV) of my learning?

Signature of Learner/Trainee: _______________________ Date: _______________
Signature of JEL coach/mentor: _______________________
Signature of immediate supervisor: _______________________
Signature of co-learner/s: _______________________

JEL-M&E Form 4: JEL Advising Tracking Form
(electronic version available)

Name of JEL Adviser: ___________________________ Sex: Male  Female

Workplace: __________________________________ Position: _______________________________________

Dates | Name of Advisee/Position, Workplace | Area of JEL Advising | Advisee's Current Activity/Situation | Identified Need for Advice | Adviser's Role | Specific Support/Advice Provided and Results
JEL-M&E Form 5: Trainees' End of Job-Embedded Learning (JEL) Evaluation

Name of Trainee: _______________________________ Sex: Male  Female

Title of Training Program: __________________________ Date: _________________

Please rate how you feel you have fared relative to the following processes involved in the accomplishment of the Job-Embedded Learning phase of the training program. Please tick the appropriate column for your rating, using the scale below.

Rating Guide:
Numerical Rating | Interpretation | Description
4 Very High Extent In a very significant way
3 High Extent In a meaningful way
2 Low Extent In a limited way only
1 Very Low Extent Not in any meaningful way

End of Job-Embedded Learning Phase — Rating Scale: 1 2 3 4

To what extent do you believe:

A. Planning for Implementation
1. the roles and responsibilities of the JEL team and trainees were thoroughly reviewed and understood by all?
2. the objectives of the JEL Contract were clearly understood by all?
3. the JEL team schedule of activities was confirmed and agreed to by all?

B. Implementation
4. there was evidence of enhancement of your competencies?
5. that there was minimum disruption to your organic functions and the entire learning community when accomplishing JEL activities?
6. activities accomplished were cost effective and practical?
7. activities accomplished were well-coordinated and well managed?
8. formative records of learning were kept, e.g. journal?

C. Reflection
9. reflections contained qualitative data on your accomplishments?
10. the documents presented were objective?
11. the reflections made were KSA-oriented?
12. KSAs for further enhancement were identified?
13. strengths were identified?
14. next steps were identified?

D. Enhancement
15. enhancement activities were focused on achieving identified competencies?
16. alternative strategies were employed?
17. additional support from the coach / JEL team was provided?
18. additional time for enhancement was allocated when required?

E. Internalization
19. enhanced competencies were demonstrated in daily work?
20. new KSAs were recognized by others?
21. best practices were voluntarily shared with colleagues?

F. Portfolio (optional)
22. the portfolio provided evidence of the learning that has taken place during the JEL phase of the program?

Describe the major changes you have made to your work practice as a result of the training program (F3 and JEL).

Describe how you shared your learning with colleagues. Give details as to how it was done, who was
involved and the reactions received.

Do you have other comments/suggestions/recommendations for the improvement of the JEL phase
of the training program?
JEL-M&E Form 5: Trainees' End of Job-Embedded Learning (JEL) Evaluation Consolidation Template
Collate the accomplished JEL-M&E Form 5: Trainees’ End of JEL Evaluation, and review the
results. Use the table below to consolidate the results for the quantitative items.

Note: The scoring and consolidation can be efficiently done using MS Excel.

Use the scale below to interpret mean rating for each item of the assessment:
3.5 to 4.0 = Very High Extent (VHE)
2.5 to 3.4 = High Extent (HE)
1.5 to 2.4 = Low Extent (LE)
1.0 to 1.4 = Very Low Extent (VLE)

Qualitative results should also be summarized below.

Columns: Items | Tally (T) for VLE, LE, HE, VHE | Frequency: (a) VLE x 1, (b) LE x 2, (c) HE x 3, (d) VHE x 4 | (e) a+b+c+d | (f) VLE+LE+HE+VHE | Mean = e/f

Example: Tally – VLE: 0, LE: 0, HE: llll-lll (8), VHE: llll-ll (7)
Frequency – (a) 0 x 1 = 0, (b) 0 x 2 = 0, (c) 8 x 3 = 24, (d) 7 x 4 = 28
(e) 0 + 0 + 24 + 28 = 52; (f) 0 + 0 + 8 + 7 = 15; Mean = 52/15 = 3.47
A Planning for Implementation
1
2
3
B Implementation
4
5
6
7
8

C Reflection
9
10
11
12
13

14

D Enhancement
15

16

17

18

E Internalization

19

20

21

F Portfolio (Optional)

22
Summary of Qualitative Responses

Major changes made as a result of the training program (F3 and JEL).








Methods used to share learning with colleagues.







Who was involved








Reactions received.






Comments/suggestions/recommendations for the improvement of the JEL phase of the training


program








JEL-M&E Form 6: JEL Program Completion Report Template
Program Title: (Add title of program) (Please use electronic version)

Facilitator(s): (Add names of the JEL team/Advisers)

Venue: (Write the actual venue e.g. school, division/regional office )

Duration: (Include duration of the F3 and JEL phases )

Date: (include dates for F3 and JEL phase)

No. of Participants: Males (Add No.) Females (Add No.) Total (Add Total)

Attendance List (Include as Attachment 1)

JEL Program Objectives (as indicated in the F3 Resource Package): At the end of the JEL program the trainees will have




JEL Program (Include as Attachment 2)


Schedule

Key Results  (Identify the key results from the conduct of the program, taking into consideration the JEL phase)

Resources/Materials (Identify the resources required to conduct the JEL program)
M&E Analysis After reviewing the JEL-M&E results from the program, write a narrative analyzing the results. This should include:
 Results from the quality assurance of the JEL Contract (JEL-M&E Form 1)
 Results from the trainees' evaluation of the JEL program (JEL-M&E Form 5)
 Strengths and areas for improvement should be identified in this section

(Include as Attachment 3 a copy of the JEL M&E Results)

General In this section make any general comments about the program and identify
Comments and any issues encountered in relation to:
Issues  its delivery (during JEL phase)
Encountered - trainers/advisers
- participants
- content of JEL program
- strategies
- training materials
-
 its management (during the JEL phases)
- prior to delivery
- during the JEL phase

Recommendations In this section discuss any recommendations you may have to improve future programs.

Program Report Attachments

Attachment 1: JEL Trainees List


(Insert here)

Attachment 2: Program Schedule of Activities


(Insert here)

Attachment 3: M&E Results

(Insert here)

Acknowledgements

to

The Project STRIVE 2 Training and Development Component Members
who developed the standards, processes and tools of the PDy System Operations

Region VI Region VII Region VIII


Violenda Gonzales, AO- V Emiliano Elnar, Jr. Division Corazon Abella, Division
Chief Chief
Lydia Antang, Asst. Division Milagros Villanueva, ES-II Alejandra Lagumbay, P-II
Chief
Editha Segubre, ES-II Churchita Villarin, ES-II Adelma Rabuya, PSDS
Renato Ballesteros, ES-II Leah Apao, ES-II Ma. Lita Veloso, P-I
Aylen Tuvilla, ES-II Jovena Amac, HT-III
Amelita Pitalgo, ES-II Susana Acuin, ES-II
Feliciano Buenafe, ES-II
Negros Occidental Bohol/Tagbilaran Northern Samar
Marsette Sabbaluca, ES-I Debra Sabuero, P-I Nimfa Graciano, ES-I
Michell Acoyong, ES-I John Ariel Lagura, P-I Cristito Eco, P-III
Corazon Mohametano, Lilibeth Laroga, P-I Imelda Valenzuela, P-III
PSDS
Regie Sama, P-II Ma. Lileth Calacat, P-I Carlos Balanquit, PSDS
Susan Severino, HT-IV Remigio Arana, MT-I Nedy Tingzon, P-I
Joyce Aringo, P-II Casiana Caberte, PSDS Noe Hermosilla, P-I
Juna Flores, HT III Fe Ty, ES-I
Cristina Zaragoza, TIC
DepED- EDPITAF T&D Coordinator

Jonathan Batenga

Project STRIVE T&D Technical Advisers

Louise A. Quinn Twila G. Punsalan


International Technical Adviser National Technical Adviser
