
This SBM Manual of Operations: Contextualized Version is for the exclusive use of DepEd Region 10.

Should other regions decide to adapt this manual or any portion thereof, they should seek the approval of management.
School-Based Management
Manual of Operations:
A Contextualized Version

A Project of
Department of Education Region 10
spearheaded by
Field Technical Assistance Division (FTAD)
2018

Appendix S. DepEd’s Vision and Mission

The DepEd Vision

We dream of Filipinos
who passionately love their country
and whose values and competencies
enable them to realize their full potential
and contribute meaningfully to building the nation.

As a learner-centered public institution,
the Department of Education
continuously improves itself
to better serve its stakeholders.

The DepEd Mission

To protect and promote the right of every Filipino to quality, equitable, culture-based, and complete basic education where:

Students learn in a child-friendly, gender-sensitive, safe, and motivating environment.
Teachers facilitate learning and constantly nurture every learner.
Administrators and staff, as stewards of the institution, ensure an enabling and supportive environment for effective learning to happen.
Family, community, and other stakeholders are actively engaged and share responsibility for developing life-long learners.
Department of Education
Region 10 – Northern Mindanao
Office of the Regional Director

MESSAGE

Creating A Child and Community-Centered Education System (ACCESs) is the lifeblood of School-Based Management (SBM). Taking a cue from the centrality of learners, the school and the community concretely manifest symbiotic functions in the delivery of basic education services geared toward better access and governance and, most importantly, the improvement of the quality of basic education.

With the K to 12 Basic Education Program, schools are now challenged to imbibe the basic tenets of SBM: Leadership and Governance, Curriculum and Learning, Accountability and Continuous Improvement, and Management of Resources. After undergoing self-assessment, duly supported by the district offices, schools are assessed, accredited, and validated based on the four (4) SBM principles through the SBM Coordinating Teams at the division and regional levels.

Since its implementation in 2013, there has been a clamor from the field for the Region to contextualize the structured mechanism so that it responds to the needs and circumstances in Region 10. Hence, the initiative of the Region’s Field Technical Assistance Division (FTAD) in coming up with this contextualized SBM Manual of Operations is a laudable outcome that will truly facilitate the validation of the SBM practices of schools. Along with this manual is a user-friendly, technology-based SBM Validation Tool.

I am optimistic that this manual will be very useful to everyone for the continuous improvement of basic education in Northern Mindanao.

Mabuhay ang Rehiyon Diyes!

ALLAN G. FARNAZO
Director IV

Appendix R. Region 10 SBM Officers

REGION 10 SBM OFFICERS

President: Para D. Talip, MA (Mis. Occ.)
Vice President: Ariel B. Montecalbo, PhD (Bukidnon)

Board Members:
Secretary: Edelina M. Ebora, MA (Malaybalay)
Treasurer: Imelda D. Pongase, MA (Ozamiz City)
Auditor: Mario Esteban C. Arsenal, MA (Tangub City)
P.I.O.: Joy C. Mangubat, PhD (Gingoog City)

Social Managers:
Roberto L. Dechos, Jr., PhD (Iligan City)
Esther V. Tabañag, PhD (Valencia City)
Ivy T. Jumawan, MA (Lanao del Norte)
Aileen A. Zaballero, MA (Oroquieta City)
Wenie L. Nahial, MA (Camiguin)
Danny A. Asio, MA (Misamis Oriental)
Eulogio R. Suaner, Jr. (Cagayan de Oro City)
Rolly B. Labis, PhD (El Salvador)
Department of Education
Region 10 – Northern Mindanao
Office of the Assistant Regional Director

MESSAGE
The school is the heart of the formal education system. It is where children learn. For years, schools have been pushing hard in their single aim of providing the best possible basic education for all learners. The demands of the times have been the toughest test. The only way to change is to take the challenge, and the most important change in the governance of basic education must occur at the school level.

School-Based Management (SBM) is the institutional expression of such


changes where schools are encouraged to institute local initiatives for improving
the quality of instruction and to ensure that the values, needs and aspirations of
the school community are given the highest consideration. Through SBM,
schools are empowered to make decisions and craft innovations on what are best
for the learners under their care.

I commend the Technical Writing Group (TWG) and the SBM Manual Writers for coming up with the Contextualized School-Based Management Manual of Operations spearheaded by the Field Technical Assistance Division (FTAD). This manual will enable all stakeholders to contribute to quality organizational performance for the sake of the Filipino children, the center of this undertaking.

God Bless.

ATTY. SHIRLEY O. CHATTO


OIC, Assistant Regional Director

Department of Education
Region 10 – Northern Mindanao
Office of the Chief
Field Technical Assistance Division

MESSAGE

The long wait is over. Finally, this SBM Manual of Operations is here. With its
completion, a happy note is resounding. Hear the regional mantra echo once again,
Diyes is it!

When Regional Office 10 felt the need to update SBM implementation by contextualizing it, thus enhancing the contents of DO 83, s. 2012, the newly created Field Technical Assistance Division (FTAD) grappled with where to start, thinking that listening to the schools division offices as they listen to their schools could be the way to accomplish the challenge.

Truly, FTAD drew its strength by conducting a series of consultations and


benchmarking with SBM coordinators within and outside the regional sphere, gathering
and analyzing data after data, until finally it drafted the contextualized SBM Manual of
Operations that could now speak of One Region 10 in all 14 schools divisions.

It was a grueling but worthwhile feat to walk the miles for this Manual. FTAD can never thank enough Regional Director Allan G. Farnazo and Assistant Regional Director Shirley O. Chatto for all the moral and financial support they have given. The same gratitude goes to the select SBM Coordinators who contributed their piece to the entire work; to Dr. Lourdes G. Tolod, former Schools Division Superintendent, for editing the rough manuscripts submitted at the start; and to Atty. Neil Christian D. Villagonzalo, BEST M & E Specialist, and Mr. Rey O. Macalindong, Project Management/M & E Specialist.

Understandably, this first edition may yet be a work in progress, as the agency’s fast-changing work dynamics today may still render the contents of this Manual passé tomorrow. With constant feedback, however, succeeding editions can again be made to ensure success.

May all of the constituents of DepEd Region 10 find this Manual indeed a valu-
able resource in ensuring that SBM will be better implemented in the field.
Mabuhay po tayong lahat.

EDITH B. LAGO-ORTEGA, PhD


Chief, FTAD

Acknowledgment

DepEd Region 10 wishes to express its sincere gratitude and appreciation to


the following who, in one way or another, have contributed to the realization of this
Manual of Operations:

Allan G. Farnazo, Director IV of the Department of Education, Region 10 – Northern Mindanao, for providing us the opportunity to craft this Manual of Operations for School-Based Management (SBM): Contextualized Version in order to simplify SBM matters and make it more understandable and responsive to the needs of the implementers in the field;

Atty. Shirley O. Chatto, Assistant Regional Director, for the legal expertise extended;

Edith L. Ortega, PhD, Chief, Field Technical Assistance Division


(FTAD), for her initiative to conduct a series of writeshops leading to the realization
of this Manual of Operations, together with Lita F. Base, Maria Salome (Marisa)
M. Manlapig, and Marilinda D. Dumpas;

The other functional divisions in the Region: CLMD, HRDD, PPRD, QuAD, ESSD, AD, and FD, for their support and cooperation;

Our consultants: Lourdes G. Tolod, PhD, CESO V (Former SDS, Cagayan de Oro City Division), Professor Emeritus, Xavier University; Atty. Neil Christian D. Villagonzalo, Basic Education Sector Transformation (BEST) M & E Specialist; and Rey O. Macalindong, Project Management/M & E Specialist, for their expertise, patience, perseverance, and guidance in crafting this Manual of Operations;

The writers of this manual for their commitment and expertise: Roberto D.
Napere, Jr., PhD; Susan S. Olana, PhD; Para D. Talip; Ariel B. Montecalbo,
PhD; Joy C. Mangubat, PhD; Rosalyn M. Lato; Fritzie C. Sillabe; Julieto M.
Indonto; Miguelito D. Bendijo; Margarita L. Ruben; Danny A. Asio; Edelina
M. Ebora; Ivy T. Jumawan; Aileen A. Zaballero; Esther V. Tabañag, PhD;
Roie M. Ubayubay, PhD; Jean S. Macasero, PhD; Mario Esteban C. Arsenal;
Imelda D. Pongase; Wenie L. Nahial; Fe D. Arancon; Susan A. Baco; and above
all

To the Almighty God, the ultimate source of wisdom and strength, for the
unconditional grace and love.

TABLE OF CONTENTS

PAGE

PRELIMINARIES

Title Page 1

Messages 2

Acknowledgment 5

Table of Contents 6

TOPICS

Rationale 9

SBM Framework 10

Region 10 SBM Contextualized Guidelines 11

Region 10 SBM Coordinating Teams 13

Duties and Responsibilities of the Regional Coordinating Teams 14

The Process 15

SBM Validation Flowchart 16

The Deliberation Process 18

Responsibilities of the School Head in SBM Assessment Validation 19

Basic Steps in the SBM Assessment Process at the School Level 19

Functions of SBM Coordinating Teams 20

Scoring Guidelines for SBM Level of Practice 21

Revised SBM Assessment Validation Procedure Tool 22

SBM Assessment Validation Procedure 24

SBM Validation Procedure of Document-Analysis, Observation

and Discussion (DOD) 25

Suggested Respondents per SBM Principle during Validation 28

Suggested Guidelines in the Granting of Recognition

and Incentives 30

Provision of Technical Assistance 31

REFERENCES 33

APPENDICES 35

A DepEd Order No. 83, s. 2012 36

B School-Based Management (SBM) Acronyms 38

C SBM Self-Assessment Form (C.1 to C.14) Sample 41

D Revised SBM Assessment Tool DOD (Sample) 55

E Validation Plan Template 74

F Electronic SBM Toolkit User’s Guidelines 75

G SBM Electronic Tracking Tool Manual (ETT) 82

H Suggested Parts of the Program during School Validation 92

I SBM Practices Assessment Result Template 93

J Division Technical Assistance Plan 94

K Steps in conducting the Post-Conference and Sample of

Post-Conference Data Captured Form 95

L Suggested Composition of Division SBM Coordinating Team 96

M SBM Validation Mechanics, Matrix and Suggested Documents

(Artifacts, Evidences, and MOVs in every principle) 97

N Regional Coordinating Team (RCT) TA Sheet 100

O Definition of Performance Indicators 101

P SBM Manual of Operations Writers 110

Q SBM Manual of Operations Writers’ Pictorial 113

R Region 10 SBM Officers 123

S DepEd’s Vision and Mission 124

Rationale

With the passage of RA 9155 (Governance of Basic Education


Act of 2001), the decentralization of powers and functions of field
offices marked a milestone in the Department of Education (DepEd).
The Act ushered in the implementation of School-Based Management
(SBM), thus making the school the heart of formal education.

The educational reform has brought about the introduction of A Child and Community-Centered Education System (ACCESs), enhancing the school and community partnership for greater accountability for the children’s learning outcomes. To realize this, there is a need to establish a common ground among stakeholders for the uniformity and alignment of assessment practice.

As an initial move, DepEd Region 10, through the Field Tech-


nical Assistance Division (FTAD), conducted benchmarking and a se-
ries of consultations with the internal and external stakeholders of
schools from different divisions of Region 12. This was done to observe
their levels of SBM practices which may be applicable in the context of
Region 10.

In the region, a number of schools are still in Level I; many have been classified as Level II; and still others are potential Level III schools. However, there are still issues and concerns pertaining to SBM assessment validation that need to be addressed.

Hence, the FTAD, inspired by Regional Director Allan G.


Farnazo, initiated the crafting of this manual to help and guide the
school heads, teachers, parents, and other stakeholders towards the im-
provement of SBM level of practice.

It is hoped that the use of this contextualized SBM Manual of


Operations will truly serve its purpose so that schools can come up with
the right decisions in becoming more child and community-centered
institutions.

School-Based Management (SBM) Framework
The focus of School-Based Management (SBM) is the learner, who is expected to be a functionally literate citizen possessing the essential 21st-century skills of critical thinking and problem solving, and collaboration and communication, imbued with the values of self-reliance, productivity, patriotism, and service to humanity.

Figure 1 presents the SBM structure, interrelationships, characteristics and


underlying principles.

Reference: D.O. 83, s. 2012

Figure 1. The School-Based Management (SBM) Conceptual Framework

At the core is the output, a functionally literate citizen, the product of a cyclic process that begins with the assessment of the SBM level of practice and concludes with the accreditation of the school. Taking prominence in the process are the stakeholders, who collaboratively work for the elevation of the school’s level of SBM practice. It is in this context that the stakeholders hold the responsibility of advocating a self-managing and self-renewing school learning community.

The entire system is guided by the four principles of A Child and Community-Centered Education System (ACCESs): leadership and governance, curriculum and learning, accountability and continuous improvement, and resource management. The boundary of the system is a broken line, signifying the readiness of the school to accept inputs and changes from the external environment that may affect it positively or negatively depending on the response of the school.

The Division, Regional, and Central Offices are tasked to work with the schools in this SBM undertaking through the provision of technical assistance, professional and administrative support, and policies, to ensure that these are observed, the standards are met, and the programs are implemented.

The framework reflects the vision, mission and goals of the SBM:
to make the community responsible for the education of its children and
make the children responsible for building the community.

Region 10 SBM Contextualized Guidelines


To better guide the SDOs and others concerned in the SBM mecha-
nism of Region 10, the following processes are to be observed:

1. All public elementary, secondary, stand-alone senior high schools and


integrated schools shall be required to submit for SBM validation.

2. Submission of applications for Levels II and III validation shall be every


April and May, respectively, and duly endorsed by the Schools Divi-
sion Superintendent (SDS) to the Regional Office. The following doc-
uments must be attached:
2.1. The filled-in SBM Assessment Validation Form, duly
signed by the Division SBM Coordinating Team; and
2.2. The Key Performance Indicators (KPIs), duly certified by the
Division Planning Officer. These documents are subject to
review by the SBM Regional Coordinating Team.

Appendix Q. SBM Manual of Operations Writers’ Pictorial

3. The minimum rating of the Performance Improvement (PI) required to proceed to the SBM Document-Analysis, Observation, and Discussion (DOD) validation is 1.50 (60%). If the school does not reach the minimum rating, the Division Field Technical Assistance Team (DFTAT) shall provide technical assistance.
4. The SDO shall be notified of the schedule for validation of the concerned
schools through a regional memorandum.

5. There shall be at least one (1) SBM model school for each of the elemen-
tary, secondary, and integrated schools per district as identified by the
SDO.

6. There shall be a pool of Regional SBM Coordinating Teams, composed of


representatives from the different functional divisions and select school
heads.

7. The SDOs shall organize an advocacy mechanism highlighting SBM best


practices anchored on the following principles:

7.1. Leadership and Governance


7.2. Curriculum and Learning
7.3. Accountability and Continuous Improvement
7.4. Management of Resources

8. Teacher and staff involvement in the SBM assessment validation shall be


given weight in their Individual Performance Commitment and Review
(IPCR) Form.

9. There shall be awarding of plaques, certificates, and other forms of recog-


nition by the DO and RO to deserving schools.

10. The Regional SBM Coordinating Teams shall provide technical assistance
to the divisions that need support.

11. A school shall sustain SBM Level II practice for at least three years before applying for SBM Level III validation.
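Guideline 3's gate can be expressed as a small computation. The sketch below is illustrative only: the function names are invented here, and the 2.5-point maximum is merely inferred from the manual's statement that a 1.50 rating corresponds to 60%, not an official scale definition.

```python
# Illustrative sketch of guideline 3: a school proceeds to DOD validation
# only if its Performance Improvement (PI) rating reaches 1.50 (60%).
PI_MINIMUM = 1.50
ASSUMED_RATING_MAX = 2.5  # inferred from 1.50 being labeled "60%"

def proceeds_to_dod(pi_rating: float) -> bool:
    """True if the PI rating clears the 1.50 minimum for DOD validation."""
    return pi_rating >= PI_MINIMUM

def pi_as_percent(pi_rating: float) -> float:
    """Express a PI rating as a percentage of the assumed 2.5 maximum."""
    return round(pi_rating / ASSUMED_RATING_MAX * 100, 1)

print(proceeds_to_dod(1.62), pi_as_percent(1.62))  # → True 64.8
```

A school rated below the minimum (say 1.49) would instead be referred to the DFTAT for technical assistance.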

REGION 10 SBM COORDINATING TEAMS

Chairperson: Regional Director (RD)
Vice-Chairperson: Assistant Regional Director
Lead Implementer: FTAD Chief

Interfacing of Different Functional Divisions per SBM Principle:

Leadership and Governance: AD, QuAD, HRDD, ESSD, and select School Heads
Curriculum and Learning: CLMD and select School Heads
Accountability and Continuous Improvement: PPRD, FTAD, and select School Heads
Management of Resources: FD and select School Heads

SDOs

Schools

CONSULTANTS

Name | Position | Office/Institution
Lourdes G. Tolod, PhD, CESO V | Former Schools Division Superintendent of Cagayan de Oro City | Professor Emeritus, Xavier University
Rey O. Macalindong | Project Management and M and E Specialist | Worked with DepEd since 2004 as Consultant under PRIME, BESRA, and SEDIP; worked at the Development Academy of the Philippines (DAP) as Senior Fellow/Director (1999-2001) for the Systems Development Office (SDO) and as Project Manager (1992-1997) in the Project Development Institute
Atty. Neil Christian D. Villagonzalo | M & E Specialist | Basic Education Sector Transformation (BEST)
DUTIES AND RESPONSIBILITIES OF THE
REGIONAL COORDINATING TEAMS

Chairperson/Vice-Chairperson
 Oversees the implementation of the entire SBM process.
 Acts on SBM matters upon the recommendation of FTAD.

FTAD
 Manages the SBM assessment validation process.
 Receives school applications for assessment validation duly endorsed by the SDO.
 Assigns the applications to any of the Regional Coordinating Teams (RCTs).
 Issues notices to the schools concerned on the scheduled on-site validation.
 Receives results from the different RCTs.
 Recommends approval of the recognition of Level II and III schools, as well as those for Level III accreditation.
 Informs SDOs of the Assessment Validation Results.
 Facilitates the conduct of the awarding of plaques/certificates.
 Leads in the conduct of trainings related to SBM implementation.
 Maintains a database of SBM-validated schools and best practices.
 Facilitates continuous improvement in the management of the SBM validation process.

Other Functional Divisions
 Engage in the assessment validation processes based on their specialized functions.
 Submit results of the assessment validation to the Office of the Regional Director through FTAD, with reports on TA provided (if any), best practices, and recommendations for accreditation.

Select School Heads
 Join the Regional Coordinating Team depending on their availability and the level of the school applicants (Elementary/Secondary).
 Claim travel expenses and allowances from Region 10 funds.

Name | Position/Designation | Division
Edelina M. Ebora, MA | Master Teacher I | Malaybalay City
Florderick S. Velarde, MA | Information Technology Officer I | Lanao del Norte
Rosalyn M. Lato, MA | Senior Education Program Specialist on M&E | Ozamiz City
Joy C. Mangubat, PhD | Education Program Supervisor | Gingoog City
Ivy T. Jumawan, MA | Senior Education Program Specialist on M&E | Lanao del Norte
Mario Esteban C. Arsenal, MA | Senior Education Program Specialist on M&E | Tangub City
Wenie L. Nahial | Senior Education Program Specialist on M&E | Camiguin
Aileen A. Zaballero, MA | Senior Education Program Specialist on M&E | Oroquieta City
Danny A. Asio | Senior Education Program Specialist on M&E | Misamis Oriental
The Process

Validation Plan
FTAD facilitates meeting the RO Coordinating Teams to agree on the availability of members during the on-site validation and the target dates, and to assess the Pre-assessment Result. (Refer to Appendix E: Validation Plan Template)

Appendix P. SBM Manual of Operations Writers

WRITERS

Name | Position/Designation | Division
Para D. Talip, MA | Education Program Supervisor | Misamis Occidental
Ariel B. Montecalbo, PhD | Education Program Specialist on M&E | Bukidnon
Imelda D. Pongase, MA | Education Program Supervisor | Ozamiz City
Fritzie C. Sillabe, MA | Education Program Supervisor | El Salvador
Jean S. Macasero, PhD | Education Program Supervisor | Cagayan de Oro City
Esther V. Tabañag, PhD | Education Program Supervisor | Valencia City
Susan A. Baco, MA | Public School District Supervisor | Misamis Occidental
Roie M. Ubayubay, PhD | Public School District Supervisor | Misamis Oriental
Susan S. Olana, PhD | Principal IV | Malaybalay City
Margarita L. Ruben, MA | Principal IV | Misamis Occidental
Julieto M. Indonto, MA | Principal II | Oroquieta City
Fe D. Arancon, MA | Principal II | Misamis Oriental
Miguelito D. Bendijo, MA | Principal I | Valencia City
Roberto D. Napere, Jr., PhD | Principal I | Iligan City
SBM Validation Flowchart

Activities | Details

Activity 1. Receives SBM Assessment Validation Form (AVF)
1. The FTAD Education Program Supervisor receives the SBM Assessment Validation Forms (AVFs) from the Schools Division Offices, in hard copies.
1.1 The EPS distributes the SBM AVFs in bunches by SDO to the designated Regional Coordinating Teams (RCTs) for review within a week.
1.2 The EPS collects the results of the review from the RCTs a day after the one-week review.

Activity 2. Prepares the SBM Validation Plan (VP)
2. The EPS prepares the SBM Validation Plan (VP), which contains the schedule of schools to be visited, to be attached to a memo for the Chief's review and recommendation for approval by the Regional Director.
2.1 The EPS secures the approved SBM VP and memo from the Regional Director.
2.2 The EPS ensures the memo is released and uploaded on the regional websites.

Activity 3. Facilitates the conduct of SBM Assessment Validation
3. The EPS facilitates the conduct of the SBM AV by the RCT.
3.1 Pre-work with the RCT
3.2 Distribution of SBM AV Kits: tools and a one-sheet certificate of appearance containing the list of schools to be visited in a given period
3.3 RCT on-site validation

Activity 4. Convenes for deliberation and consolidation of AV results
4. The RCT convenes to deliberate on and consolidate the following:
4.1 Results of Validation
4.2 Best Practices
4.3 TA provided during validation
4.4 Other TA needs to address

1.16 Retention Rate
The Retention Rate determines the degree of pupils/students in a particular school year who continue to be in school in the succeeding year. This indicator is also vulnerable to migration and is not advisable to compute at the school level.

Elementary:
Retention Rate = (Enrolment Gr 2-6, SY N / Enrolment Gr 1-5, SY N-1) x 100
Secondary:
Retention Rate = (Enrolment Yr 2-4, SY N / Enrolment Yr 1-3, SY N-1) x 100

Where:
Enrolment Gr 2-6, SY N; Enrolment Yr 2-4, SY N = Aug. 31 Enrolment (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Enrolment Gr 1-5, SY N-1; Enrolment Yr 1-3, SY N-1 = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)
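The Retention Rate formula reduces to a simple ratio. The sketch below is illustrative only: the function name and the enrolment figures are hypothetical, not DepEd data.

```python
def retention_rate(enrol_upper_sy_n: float, enrol_lower_sy_n1: float) -> float:
    """Retention Rate = (Enrolment Gr 2-6, SY N / Enrolment Gr 1-5, SY N-1) x 100.

    enrol_upper_sy_n: Aug. 31 enrolment, Grades 2-6 (or Years 2-4), current SY.
    enrol_lower_sy_n1: previous SY March 31 enrolment + dropouts,
    Grades 1-5 (or Years 1-3).
    """
    return enrol_upper_sy_n / enrol_lower_sy_n1 * 100

# Hypothetical figures: 4,560 learners retained out of a 4,800 base.
print(round(retention_rate(4560, 4800), 2))  # → 95.0
```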
Activity 5. Reviews and finalizes the consolidated RCT reports for the awarding ceremony
5. The EPS reviews the consolidated results submitted by the different RCTs.
5.1 Prepares a memo for the review and recommendation of the Chief.
5.2 Seeks the approval of the Regional Director.
5.3 Ensures the release and uploading of the memo on the regional websites.

Activity 6. Makes a Sustainability Plan (SP)
6.0 The Chief leads in making the SP.
6.1 Communicates with the SDOs through a Region Memo.
6.2 Accomplishes the SP Template, which covers:
Objective/s
Activity
Interventions/TA Needed
Persons Involved
Time Frame
Resources Needed
Expected Outcome
MOVs

1.14 Completion Rate
The Completion Rate measures the percentage of grade/year 1 entrants who graduate from elementary/secondary education. It is available only up to the division level and above. Data for grade/year 1 are based on the predecessor of BEIS, the Unified Data Gathering System (UDGS), which did not have any validation procedures and did not monitor the completeness of the data submitted.

Elementary:
Completion Rate = (Graduates Gr 6, SY N / Enrolment Gr 1, SY N-5) x 100
Secondary:
Completion Rate = (Graduates Yr 4, SY N / Enrolment Yr 1, SY N-3) x 100

Where:
Graduates Gr 6, SY N; Graduates Yr 4, SY N = Previous SY promotees/graduates (Table D - GESP & GSSP; Table A4 - PSP)
Enrolment Gr 1, SY N-5; Enrolment Yr 1, SY N-3 = based on UDGS division-level data

1.15 Failure Rate
This indicator evaluates the extent of pupils/students who failed a given grade/year level.

Elementary:
Failure Rate = (Failures Gr X, SY N / Enrolment Gr X, SY N) x 100
Secondary:
Failure Rate = (Failures Yr X, SY N / Enrolment Yr X, SY N) x 100

Where:
Failures Gr X, SY N; Failures Yr X, SY N = Previous SY March 31 Enrolment - Promotees (Table D - GESP & GSSP; Table A4 - PSP)
Enrolment Gr X, SY N; Enrolment Yr X, SY N = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

Failures = Enrolment - Promotees
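The Completion Rate and Failure Rate formulas above are likewise simple ratios. The sketch below uses hypothetical figures and invented function names, purely for illustration.

```python
def completion_rate(graduates: float, cohort_entrants: float) -> float:
    """Completion Rate = (Graduates Gr 6, SY N / Enrolment Gr 1, SY N-5) x 100."""
    return graduates / cohort_entrants * 100

def failure_rate(enrolment: float, promotees: float) -> float:
    """Failure Rate = (Failures / Enrolment) x 100, where
    Failures = Enrolment - Promotees."""
    return (enrolment - promotees) / enrolment * 100

# Hypothetical cohort: 820 of 1,000 Grade 1 entrants graduate;
# 60 of 1,000 enrollees fail the grade level.
print(round(completion_rate(820, 1000), 1))  # → 82.0
print(round(failure_rate(1000, 940), 1))     # → 6.0
```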
The Deliberation Process

Deliberation is a process of thoughtfully weighing options, usually prior to voting and/or decision-making. It emphasizes the use of logic and reason, and fair and objective judgment. Group decisions are made after a consensus has been arrived at.

Roles and Responsibilities of the Members in the Deliberation Process

1. Team Leader
 Convenes team members
 Assigns and defines the specific roles of each member
 Facilitates the deliberation process
 Signs the final report of the team

2. Co-Team Leader
 Assumes the team leader's responsibilities in the latter's absence
 Assists the team leader in the deliberation process

3. Secretary
 Checks the attendance of the members
 Prepares the minutes of the proceedings
 Collects the filled-out assessment validation forms
 Keeps records of the deliberation proceedings

4. Assistant Secretary
 Acts for the secretary in the latter's absence
 Documents the deliberation process

5. Members
 Participate actively in the deliberation process

Where (these definitions complete the School Leaver Rate formulas of indicator 1.11):
Enrolment Gr X+1, SY N; Enrolment Yr X+1, SY N = Enrolment (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Repeaters Gr X, SY N; Repeaters Gr X+1, SY N = Repeaters (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Enrolment Gr X, SY N-1 = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

1.12 Simple Dropout Rate
The Simple Dropout Rate calculates the percentage of pupils/students who do not finish a particular grade/year level. It does not capture pupils/students who finish a grade/year level but do not enroll in the next grade/year level the following school year.

Elementary:
Simple Dropout Rate = (Dropouts Gr X, SY N / Enrolment Gr X, SY N) x 100
Secondary:
Simple Dropout Rate = (Dropouts Yr X, SY N / Enrolment Yr X, SY N) x 100

Where:
Dropouts Gr X, SY N; Dropouts Yr X, SY N = Previous SY Dropouts (Table D - GESP & GSSP; Table A4 - PSP)
Enrolment Gr X, SY N; Enrolment Yr X, SY N = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

1.13 Transition Rate
This indicator assesses the extent to which pupils are able to move to the next higher level of education (i.e., primary to intermediate and elementary to secondary). Care should be exercised in using this indicator at the level of the Division, Municipal, and Legislative Districts, where migration can increase or reduce the results of the indicator. It is not calculated at the school level for this reason.

Primary to Intermediate:
Transition Rate = (Enrolment Gr 5, SY N / Enrolment Gr 4, SY N-1) x 100
Elementary to Secondary:
Transition Rate = (Enrolment Yr 1, SY N / Graduates Gr 6, SY N-1) x 100

Where:
Enrolment Gr 5, SY N; Enrolment Yr 1, SY N = Aug. 31 Enrolment (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Enrolment Gr 4, SY N-1 = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)
Graduates Gr 6, SY N-1 = Previous SY Promotees/Graduates (Table D - GESP & GSSP; Table A4 - PSP)
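The Simple Dropout Rate and Transition Rate formulas can be computed directly; the figures and function names in the sketch below are hypothetical.

```python
def simple_dropout_rate(dropouts: float, enrolment: float) -> float:
    """Simple Dropout Rate = (Dropouts Gr X, SY N / Enrolment Gr X, SY N) x 100."""
    return dropouts / enrolment * 100

def transition_rate(enrol_next_level: float, base_prev_sy: float) -> float:
    """Transition Rate, e.g. Elementary to Secondary:
    (Enrolment Yr 1, SY N / Graduates Gr 6, SY N-1) x 100."""
    return enrol_next_level / base_prev_sy * 100

# Hypothetical figures: 45 dropouts among 1,500 enrollees; 930 of 1,000
# Grade 6 completers move on to Year 1.
print(round(simple_dropout_rate(45, 1500), 1))  # → 3.0
print(round(transition_rate(930, 1000), 1))     # → 93.0
```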
Responsibilities of the School Head in SBM Assessment Validation

The School Head (SH) is mainly responsible for the school's performance. As the prime mover of SBM, the SH shall:
 organize the SBM Coordinating Team at the school level;
 orient the stakeholders on the policies and guidelines of SBM;
 facilitate the crafting of the SBM Work Plan;
 lead in the conduct of the self-assessment;
 facilitate the post-conference; and
 submit the results to the Division SBM Coordinator.

Qualities of a Validator

Facilitative
Amiable
Respectful
Nurturing
Articulate
Zealous
Open-minded

Basic Steps in the Assessment Process at the School Level

The following are the basic steps:

Step 1. Organize a team of at least 12 members. Each team will have at least three members per principle, composed of teachers, students, and other stakeholders. The team will select a leader and a documenter.

Step 2. Classify documents by principle. Gather evidence using the DOD process and select samples of documents using emergent saturation sampling and snowballing. Summarize the evidence and arrive at a consensus on the rating for each indicator based on the documented evidence.

Where (for the Promotion Rate formulas):
Promotees Gr X, SY N; Promotees Yr X, SY N = Previous SY Promotees (Table D - GESP & GSSP; Table A4 - PSP)
Enrolment Gr X, SY N; Enrolment Yr X, SY N = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

1.10 Repetition Rate
This is an EFA indicator which determines the magnitude of pupils/students who repeat a grade/year level.

Elementary:
Repetition Rate = (Repeaters Gr X, SY N / Enrolment Gr X, SY N-1) x 100
Secondary:
Repetition Rate = (Repeaters Yr X, SY N / Enrolment Yr X, SY N-1) x 100

Where:
Repeaters Gr X, SY N; Repeaters Yr X, SY N = Repeaters (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Enrolment Gr X, SY N-1; Enrolment Yr X, SY N-1 = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

1.11 School Leaver Rate
The School Leaver Rate is the EFA measure for dropout rate. It covers both pupils/students who do not finish a particular grade/year level as well as those who finish but fail to enroll in the next grade/year level the following school year. It is theoretically more comprehensive than the Simple Dropout Rate but becomes unreliable in areas with substantial migration. Care should be exercised in using this indicator at the level of the Division, Municipal, and Legislative Districts. The system does not allow use of the School Leaver Rate at the school level, where it is very likely to result in a misleading measure of dropout rate.

School Leaver Rate = 1 - Promotion Rate - Repetition Rate
Or
Elementary:
School Leaver Rate = [(Enrolment Gr X, SY N-1 - Repeaters Gr X, SY N) - (Enrolment Gr X+1, SY N - Repeaters Gr X+1, SY N)] / Enrolment Gr X, SY N-1 x 100
Secondary:
School Leaver Rate = [(Enrolment Yr X, SY N-1 - Repeaters Yr X, SY N) - (Enrolment Yr X+1, SY N - Repeaters Yr X+1, SY N)] / Enrolment Yr X, SY N-1 x 100
Step 3. Conduct the self-assessment.
  3.1. Compute the Performance Improvement in Step 1, which is 60 percent of the SBM assessment (see Appendix H for a sample computation);
  3.2. proceed to Step 2 for the DOD, which is 40 percent of the SBM assessment;
  3.3. set a schedule for deliberation of the evidence gathered; and
  3.4. invite the representatives assigned to each principle to confirm the evidence presented.

Step 4. Proceed to the validation process through interviews and observation of classes to validate the documented evidence. The DOD process shall be properly handled.

Step 5. Discuss the documents, process the evidence, and summarize the data for each principle/indicator. Classify issues, problems, opportunities, and others. Each indicator is scored.

Step 6. Conduct a post-conference to deliberate on the observations made and the data gathered.

Step 7. Prepare the team's written report.

Step 8. Finalize and submit the SBM self-assessment results to the Division.

Functions of SBM Coordinating Teams

Division
1. Reviews Self-Assessment Validation Forms submitted by the schools;
2. Prepares the schedule on the conduct of assessment validation;
3. Validates the accuracy of data and computations in the SBM Assessment Validation Form;
4. Provides technical assistance if necessary;
5. Consolidates the list of schools qualified for SBM Assessment Validation;
6. Awards certificates and/or other forms of incentives to schools that have attained Level I of SBM practice; and
7. Recommends schools for Levels II and III assessment validation.

Appendix O. Definition of Performance Indicators (continued)

                                       Gr 1      Gr 2     Gr 3     Gr 4     Gr 5     Gr 6
Pupil-years                         1,057.00   901.37   831.79   787.11   744.26   693.15
Total Promotees (incl. repeaters)     871.70   812.89   774.52   733.76   690.61   667.65
Reconstructed Cohort Survival Rate   100.00%   87.17%   81.29%   77.45%   73.38%   69.06%

1.8 Coefficient of Efficiency

This indicator measures the internal efficiency of the education system. It evaluates the impact of repetition and dropout on the efficiency of the educational process in producing graduates. It is calculated using the Pupil-Years and the Total Promotees (including repeaters) used in calculating the Reconstructed Cohort Survival Rate.

Elementary:
Total Promotees Gr 6 (including repeaters)
------------------------------------------ x 6
Pupil-Years Gr 1-6

Secondary:
Total Promotees Yr 4 (including repeaters)
------------------------------------------ x 4
Pupil-Years Yr 1-4

Years input per graduate

This indicator assesses the number of years it takes for an average pupil/student to graduate from the elementary/secondary level. It is calculated using the Pupil-Years and the Total Promotees/Graduates (including repeaters) used in calculating the Reconstructed Cohort Survival Rate.

Elementary:
Pupil-Years Gr 1-6
------------------------------------------
Total Promotees Gr 6 (including repeaters)

Secondary:
Pupil-Years Yr 1-4
------------------------------------------
Total Promotees Yr 4 (including repeaters)

1.9 Promotion Rate

The Promotion Rate assesses the extent to which pupils/students are promoted to the next grade/year level. The Grade 6/Year 4 promotion rate is the graduation rate for the elementary/secondary level. The computation used in the BEIS is slightly different from the official EFA formula, since it utilizes the reported number of promotees rather than computing the promotees from the present enrolment and the previous school year's enrolment.

Elementary:
Promotees Gr X, SY N
-------------------- x 100
Enrolment Gr X, SY N

Secondary:
Promotees Yr X, SY N
-------------------- x 100
Enrolment Yr X, SY N
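As a quick check on the efficiency indicators above, the sample Gr 1-6 cohort figures can be run through a short script. This is an illustrative sketch only; the helper function names are ours, not DepEd's or the BEIS's.

```python
# Efficiency indicators computed from the sample Gr 1-6 cohort table above.

def coefficient_of_efficiency(promotees_final, pupil_years, duration=6):
    """(Graduates x ideal duration) / pupil-years actually spent."""
    return promotees_final * duration / pupil_years

def years_input_per_graduate(pupil_years, promotees_final):
    """Average pupil-years the system spends per graduate."""
    return pupil_years / promotees_final

def promotion_rate(promotees, enrolment):
    """Promotees Gr X, SY N / Enrolment Gr X, SY N x 100."""
    return promotees / enrolment * 100

pupil_years_gr1_6 = 1057.00 + 901.37 + 831.79 + 787.11 + 744.26 + 693.15
promotees_gr6 = 667.65

print(round(coefficient_of_efficiency(promotees_gr6, pupil_years_gr1_6), 2))  # 0.8
print(round(years_input_per_graduate(pupil_years_gr1_6, promotees_gr6), 1))   # 7.5
```

A coefficient near 0.80 means roughly 20 percent of pupil-years are absorbed by repetition and dropout; equivalently, the system spends about 7.5 pupil-years per graduate instead of the ideal 6.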
Region
1. Reviews Assessment Validation Forms submitted by the SDOs;
2. Prepares the schedule on the conduct of validation through a memorandum;
3. Validates the accuracy of data and computations in the SBM Assessment Validation Form;
4. Provides technical assistance if necessary;
5. Conducts the exit conference;
6. Awards plaques/certificates and other forms of incentives to schools that have been validated as Levels II and III; and
7. Recommends schools for Level III accreditation to the Central Office (CO).

Scoring Guidelines for SBM Level of Practice

Measuring school performance can help school heads in giving appropriate technical assistance to schools.

For the SBM Level of Practice, a scoring mechanism called the SBM Assessment Form is provided, divided into two categories: Performance Improvement (PI), which is given 60 percent, and Document Analysis, Observation and Discussion (DOD), which is given 40 percent.

Key Performance Indicators (KPIs) that Determine Performance Improvement

Key Performance Indicators (KPIs) refer to a set of quantifiable measures which gauge the performance of an enterprise relative to its objectives, thereby enabling corrective action where there are deviations (Miller, 2016). For the SBM level of practice, Performance Improvement (PI) in terms of access, efficiency, and quality is quantitatively measured through the following KPIs:

1. Access: Percent Increase of Enrolment or Participation Rate
2. Efficiency: Dropout and Repetition Rate
3. Quality: NAT Results (if available) or Promotion Rate

Appendix O. Definition of Performance Indicators (continued)

                                      Gr 1    Promoted  Promoted  Promoted  Promoted  Promoted
                                              to Gr 2   to Gr 3   to Gr 4   to Gr 5   to Gr 6
Cohort with no repetition SY N     1,000.00    824.70    743.74    692.53    645.58    599.05
Cohort repeating once SY N+1          53.92
Cohort repeating twice SY N+2          2.91
Cohort repeating thrice SY N+3         0.16
Cohort repeating four times SY N+4     0.01
Cohort repeating five times SY N+5     0.00
Cohort repeating six times SY N+6      0.00

(Promoted Cohort Gr X = Promoted Cohort Gr X-1 x Promotion Rate Gr X-1;
Repeated Cohort Gr 1 = Cohort Gr 1 x Repetition Rate Gr 1)

Step 4. Add the repeaters in the previous grade level who were promoted to the pupils in the current grade level who repeated.

                                      Gr 1    Promoted  Promoted  Promoted  Promoted  Promoted
                                              to Gr 2   to Gr 3   to Gr 4   to Gr 5   to Gr 6
Cohort with no repetition SY N     1,000.00    824.70    743.74    692.53    645.58    599.05
Cohort repeating once SY N+1          53.92     71.62     81.49     86.96     90.17     85.87
Cohort repeating twice SY N+2          2.91      4.76      6.14      7.11      7.90      7.65
Cohort repeating thrice SY N+3         0.16      0.29      0.40      0.48      0.56      0.55
Cohort repeating four times SY N+4     0.01      0.02      0.02      0.03      0.04      0.04
Cohort repeating five times SY N+5     0.00      0.00      0.00      0.00      0.00      0.00
Cohort repeating six times SY N+6      0.00      0.00      0.00      0.00      0.00      0.00

(Repeated Cohort Gr X = (Repeated Cohort Gr X-1 x Promotion Rate Gr X-1) + (Promoted Cohort Gr X x Repetition Rate Gr X))

Steps 5-7. Calculate the total for each grade level to obtain the pupil-years. Multiply the pupil-years by the respective promotion rate to get the total promotees (including repeaters). Calculate the reconstructed cohort survival rate for each grade level by dividing the Total Promotees Gr X-1 (including repeaters) by the original cohort of 1,000.
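The reconstructed cohort method in the steps above can be sketched in code. This is a hedged illustration using the sample promotion and repetition rates; because the published rates are rounded to two decimal places, the results differ from the printed tables by small fractions.

```python
# Reconstructed cohort method: follow 1,000 Grade 1 entrants through
# Grade 6, allowing up to six repetitions per grade.

prom = [0.8247, 0.9018, 0.9311, 0.9322, 0.9279, 0.9632]  # promotion, Gr 1..6
rep  = [0.0539, 0.0329, 0.0227, 0.0160, 0.0141, 0.0037]  # repetition, Gr 1..6
GRADES, MAX_REPEATS, COHORT = 6, 6, 1000.0

# rows[k][g]: pupils in grade g (0-indexed) who have repeated k times.
rows = [[0.0] * GRADES for _ in range(MAX_REPEATS + 1)]
rows[0][0] = COHORT
for g in range(1, GRADES):  # Steps 2-3: promote the never-repeating cohort
    rows[0][g] = rows[0][g - 1] * prom[g - 1]
for k in range(1, MAX_REPEATS + 1):  # Step 4: promoted repeaters + new repeaters
    for g in range(GRADES):
        promoted = rows[k][g - 1] * prom[g - 1] if g > 0 else 0.0
        rows[k][g] = promoted + rows[k - 1][g] * rep[g]

# Steps 5-7: pupil-years, total promotees, and survival rate per grade.
pupil_years = [sum(rows[k][g] for k in range(MAX_REPEATS + 1)) for g in range(GRADES)]
promotees = [pupil_years[g] * prom[g] for g in range(GRADES)]
survival = [100.0] + [promotees[g] / COHORT * 100 for g in range(GRADES - 1)]

print([round(py, 2) for py in pupil_years])  # close to the Pupil-years row
print([round(s, 2) for s in survival])       # close to the survival-rate row
```

Comparing the output against the sample figures (1,057.00 pupil-years in Grade 1, 87.17 percent survival into Grade 2, and so on) confirms the hand computation to within rounding error.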
Note: Inasmuch as the following prescribed indicators cannot be generated at the school level (EBEIS), the Region opted to use these indicators instead: ACCESS - Percent Increase of Enrolment or Participation Rate using Community Mapping; EFFICIENCY - Dropout Rate and Repetition Rate; QUALITY - the Promotion Rate is preferred, considering that the future administration of the NAT will be by sampling. Should the NAT be restored and other indicators be made available in the EBEIS, the originally prescribed indicators will be used.

Computation for Performance Improvement (PI)

(For the computation of PI, please refer to Appendix C.1 to C.14 of this SBM Manual of Operations.)

Revised SBM Assessment Validation Procedure Tool

The Revised School-Based Management Assessment Tool is guided by the four principles of A Child and Community-Centered Education System (ACCESs). The indicators of SBM practice have been contextualized from the ideals of an ACCESs school system. The unit of analysis is the school system, which may be classified as developing, maturing, or advanced (accredited level).

The SBM practice is ascertained by the existence of structured mechanisms, processes, and practices in all indicators. Teams of practitioners and experts from the district, division, and regional offices shall validate the self-assessment results before a level of SBM practice is established. When a school has sustained Level II for three consecutive years, it becomes a potential candidate for Level III and can be recommended by the Regional Office to the PASBE through the Central Office.

The revised tool is systems-oriented, principle-guided, evidence-based, learner-centered, process-focused, non-prescriptive, user-friendly, collaborative in approach, and results/outcomes-focused. It can be used by the internal and external stakeholders of schools.

Computation for Document Analysis, Observation and Discussion (DOD)

1. The four (4) principles are given weights on the basis of their significance in improving learning outcomes and school operations:

Leadership and Governance                    30%
Curriculum and Learning                      30%
Accountability and Continuous Improvement    25%
Management of Resources                      15%
Total                                       100%

Appendix O. Definition of Performance Indicators (continued)

1.6 Net Enrolment Ratio

The indicator provides a more precise measurement of the extent of participation in primary education of children belonging to the official primary school age.

Total Enrolment Ages 6-11, SY N
------------------------------- x 100
Population Age 6-11, SY N

Where:
  Total Enrolment Ages 6-11, SY N: Total Enrolment (Table B - GESP; Table A2 - PSP)
  Population Age 6-11, SY N: Projected 2002 population from NSO

1.7 Survival Rate to Grade VI/Year IV

The Cohort Survival Rate computes the percentage of a cohort of pupils/students who are able to reach Grade VI/Year IV. It is used to assess the internal efficiency and "wastage" in education. This indicator is vulnerable to migration, and caution should be used in computing it at the school level.

The system adopted the reconstructed cohort method, shown below, in calculating the Cohort Survival Rate:

Step 1. Compute the Promotion and Repetition Rates for a particular area.

                  Gr 1     Gr 2     Gr 3     Gr 4     Gr 5     Gr 6
Promotion Rate   82.47%   90.18%   93.11%   93.22%   92.79%   96.32%
Repetition Rate   5.39%    3.29%    2.27%    1.60%    1.41%    0.37%

Steps 2 & 3. Compute the number of promotees up to Grade 6 using the promotion rates for the respective grade/year levels. Compute the number of pupils/students in Grade/Year 1 who repeat once, twice, and up to six times.
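Participation indicators such as the Net Enrolment Ratio (1.6 above) are straightforward ratios. As an illustration, with made-up figures that are not data from the manual:

```python
# Net Enrolment Ratio: official-age (6-11) enrolment per 100 children of
# that age. The sample figures below are illustrative only.

def net_enrolment_ratio(enrolment_ages_6_11, population_ages_6_11):
    return enrolment_ages_6_11 / population_ages_6_11 * 100

print(round(net_enrolment_ratio(8420, 9150), 2))  # 92.02
```

The Gross Enrolment Ratio is the same ratio with total enrolment of all ages in the numerator, which is why the GER can exceed 100 while the NER cannot.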
Appendix O. Definition of Performance Indicators (continued)

Where:
  Enrolment Gr 1, SY N: Total Grade 1 Enrolment (Table B - GESP; Table A2 - PSP)
  Population Age 6, SY N: Projected 2002 population from NSO

1.4 Net Intake Rate

This indicator gives a more precise measurement of access to primary education of the eligible, primary school-entrance age population than the Apparent Intake Rate.

Enrolment Gr 1, Age 6, SY N
--------------------------- x 100
Population Age 6, SY N

Where:
  Enrolment Gr 1, Age 6, SY N: Total Age 6 Grade 1 Enrolment (Table B - GESP; Table A2 - PSP)
  Population Age 6, SY N: Projected 2002 population from NSO

1.5 Gross Enrolment Ratio

The indicator is used to show the general level of participation in primary education. It is used in place of the Net Enrolment Ratio when data on enrolment by single years of age are not available. It can also be used together with the Net Enrolment Ratio to measure the extent of over-aged and under-aged enrolment. The system generates this indicator up to the level of the legislative districts and above.

Total Enrolment All Ages, SY N
------------------------------ x 100
Population Age 6-11, SY N

Where:
  Total Enrolment All Ages, SY N: Total Enrolment (Table B - GESP; Table A2 - PSP)
  Population Age 6-11, SY N: Projected 2002 population from NSO

2. Each principle has several indicators. Based on the results of the DOD, the evidence is summarized, and an agreed rating is made for each indicator;

3. The items are to be rated by checking the appropriate boxes using the rating scale below:
   0 - No evidence
   1 - Evidence indicates early or preliminary stages of implementation
   2 - Evidence indicates planned practice and procedure are fully implemented
   3 - Evidence indicates practice and procedure satisfy quality standards

4. The ratings given to each indicator shall be consolidated;

5. The number of check marks in each indicator shall be counted and recorded in the summary table;

6. The number of check marks in each column shall be multiplied by the number of points given (1-3);

7. The average rating for each principle shall be determined by dividing the total score by the number of indicators of each principle;

8. The average ratings for each principle shall be recorded in the Summary Table to get the General Average;

9. The rating for each principle shall be multiplied by its percentage weight to get the weighted average rating;

10. The total rating of the principles shall be determined by getting the sum of all the weighted ratings. The result is the DOD rating;

11. The level of practice will be computed based on the criteria below:
    • based on the Performance Improvement (PI); and
    • according to the validated practice using the DOD.

12. The final results, using the scoring criteria described in No. 11, shall be recorded and validated using the SBM Assessment Form, to be signed by the School Head and the Schools Division Superintendent (Division Level).
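The DOD computation in the steps above can be sketched as follows. The indicator ratings are illustrative; the principle weights are those given in the DOD computation (30/30/25/15).

```python
# DOD rating: average the 0-3 indicator ratings per principle,
# then take the weighted sum using the principle weights.

WEIGHTS = {
    "Leadership and Governance": 0.30,
    "Curriculum and Learning": 0.30,
    "Accountability and Continuous Improvement": 0.25,
    "Management of Resources": 0.15,
}

# Indicator ratings gathered during the DOD (illustrative sample).
ratings = {
    "Leadership and Governance": [3, 2, 2, 3],
    "Curriculum and Learning": [2, 2, 3, 2],
    "Accountability and Continuous Improvement": [2, 1, 2],
    "Management of Resources": [3, 2],
}

averages = {p: sum(r) / len(r) for p, r in ratings.items()}
dod_rating = sum(averages[p] * WEIGHTS[p] for p in WEIGHTS)
print(round(dod_rating, 2))  # 2.22
```

The resulting DOD rating is then combined with the Performance Improvement score on the 60:40 split described earlier before the level of practice is determined.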
Description of SBM Levels of Practice

Level I (Developing), 0.50 - 1.49: Developing structures and mechanisms with an acceptable level and extent of community participation and impact on learning outcomes.

Level II (Maturing), 1.50 - 2.49: Introducing and sustaining a continuous improvement process that integrates wider community participation and significantly improves performance and learning outcomes.

Level III (Advanced / Accredited Level), 2.50 - 3.50: Ensuring the production of intended outputs/outcomes and meeting all standards of a system that is fully integrated in the local community and is self-renewing and self-sustaining.

Reference: D.O. 83, s. 2012

SBM Assessment Validation Procedure

Submission of the School Pre-Assessment Result to the Division Level

Step 1:
The School SBM Coordinating Team, headed by the School Head, will conduct the self-assessment. If the school is found qualified, the School SBM Coordinator will endorse the result of the self-assessment, approved by the School Head, to the Division SBM Coordinator for the Division Assessment Validation.

The Division SBM Coordinator will validate the Pre-Assessment Form submitted by the School SBM Coordinator. After validating the result, the Division SBM Coordinating Team shall visit the school and validate its SBM level of practice using the Assessment Validation Form.

The Division SBM Coordinator will endorse the list of qualified schools, with the SBM Assessment Validation Form of each qualified school attached, to the Schools Division Superintendent for endorsement to DepEd Region 10.

Appendix O. Definition of Performance Indicators

1. Performance Indicators

1.1 Gross Enrolment Ratio in Early Childhood Development Programs

This indicator measures the general level of participation of young children in early childhood development programs. It indicates the capacity of the education system to prepare young children for elementary education. The system generates this indicator only up to the level of the legislative districts and above.

Enrolment Pre-sch, SY N
------------------------ x 100
Population Age 4-5, SY N

Where:
  Enrolment Pre-sch, SY N: Total Pre-school Enrolment (Table A - GESP; Table A1 - PSP)
  Population Age 4-5, SY N: Projected 2002 population from NSO

1.2 Percentage of Grade 1 Pupils with Early Childhood Development Program

This indicator measures the level of participation of Grade 1 pupils in ECD programs.

Enrolment with ECD Gr 1, SY N
----------------------------- x 100
Enrolment Gr 1, SY N

Where:
  Enrolment with ECD Gr 1, SY N: Total Grade 1 Enrolment with ECD (Table B - GESP; Table A2 - PSP)
  Enrolment Gr 1, SY N: Total Grade 1 Enrolment (Table B - GESP; Table A2 - PSP)

1.3 Apparent/Gross Intake Rate

The Apparent Intake Rate reflects the general level of access to primary education. It also indicates the capacity of the education system to provide access to Grade 1 for the official school-entrance age population. It is used as a substitute for the Net Intake Rate in the absence of data on new entrants by single years of age. The system generates this indicator up to the level of the legislative districts and above.

Enrolment Gr 1, SY N
---------------------- x 100
Population Age 6, SY N
Appendix N. Regional Coordinating Team (RCT) TA Sheet

Department of Education
Region X
Masterson Avenue, Upper Balulang, Cagayan de Oro City

Regional Coordinating Team (RCT) TA Sheet

Division: __________   School: __________   Date: __________

Columns: Key Areas | Developmental Areas | Technical Assistance (TA)/Proposed | Time Frame | Desired Outcomes | Acknowledged by the School Principal/DFTAT

Rows: A. Access | B. Efficiency | C. Quality | D. Others

Prepared by: ______________________
             Chair/VC/Member

Submission of the List of Qualified Schools to the Regional Level

Step 2:
The Field Technical Assistance Division (FTAD) shall check the SBM Assessment Validation Form and, if it is found in order, endorse the form to the Regional SBM Coordinating Teams, which include at least one member from QuAD, who will ensure the quality of the data submitted.

A Regional Memorandum shall be issued for the SBM Assessment Result, indicating the schedule of the School Validation for the qualified schools, to be facilitated by the FTAD.

The result of the SBM Validation and the schedule of the awarding ceremony will be issued through a Regional Memorandum.

The awardees will receive plaques and certificates of recognition.

SBM Validation Procedure for Document Analysis, Observation and Discussion (DOD)

The SBM Assessment Tool uses evidence to determine a school's level of practice, in which the Document Analysis, Observation and Discussion (DOD) is a means of evaluating the validity or accuracy of the evidence. There are three (3) essential steps in evaluating this evidence:

1. Conduct document analysis

Obtain and assemble all existing evidence related to the indicator being assessed. Evaluate the validity of the evidence against Relevance, Accuracy, Currency, Consistency, and Sufficiency.

• Relevance. The evidence must be appropriate to the indicator being assessed.
• Accuracy. The evidence must be correct.
• Currency. The evidence must be present, existing, or actual.
• Consistency. The evidence must be verifiable and must generate the same results from most of the sources.
• Sufficiency. The evidence must be adequate.

2. Conduct observations to obtain process evidence

Process evidence is obtained by scrutinizing the instructional, leadership, and management styles, methods, techniques, approaches, and activities used by the school community to achieve the SBM goal. Process evidence is used to cross-validate documentary evidence; synthesize the process evidence for the group discussion.

3. Discuss the synthesized documentary and process evidence

Conduct the discussion in a friendly, non-confrontational manner to explain, verify, and clarify the evidence. Practices vary in establishing the level of practice of an indicator. The SBM validator may use either of these approaches:

Integrative. The entire body of evidence for all indicators is assembled first, scrutinized for internal consistency, and finally used as a guide in making a unified decision as to which level of practice an indicator belongs.

Non-integrative. The evidence for each indicator is scrutinized one by one and classified one by one for level of practice.

IMPORTANT: Each validator should capture and document in detail the actual responses and observations on areas for improvement and highlights during the conduct of the DOD. These should be reflected in the form provided.

Who Conducts the DOD?
• School SBM Coordinating Team
• Division SBM Coordinating Team
• Regional SBM Coordinating Team

Principle 3: ACCOUNTABILITY AND CONTINUOUS IMPROVEMENT
3.1 Records of attendance of school heads in the Barangay assemblies/sessions
3.1.2 Records of the integration of the SIP in the Barangay Developmental Plan
3.1.3 School-based accountability system on processes, mechanisms, and tools
3.1.4 Reports of the Review Committee
3.1.5 Financial resources, reports, and liquidations

Principle 4: MANAGEMENT OF RESOURCES
4.1 Resource inventory (classrooms, desks, toilets, WATSAN, equipment)
4.1.1 School Report Card accomplished and reported
4.1.2 Record of gains of the implemented intervention
4.1.3 Annual AIP revision/adjustment conducted
4.1.4 Complete records on fiscal management
4.1.5 School committee for M & E on resource management
4.1.6 School Fiscal Management Council composed of parents, teachers, and community stakeholders
4.1.7 Monthly Financial Statement signed by the Finance Management Committee of the school

(Reference: D.O. 83, s. 2018)
OBSERVATIONS AND RESPONSES TEMPLATE

Write below the actual responses and observations on areas for improvement and highlights during the conduct of the DOD.

SBM Principle: ___________________________

Columns: Date | Situation/Observation | Source (Document/Person Interviewed)

Note: Attach supporting documents where applicable.
Evaluated by: _____________________________
Noted: ___________________________________, SBM Coordinator

1.3.3 Narrative reports of the planning sessions conducted
1.3.4 Revisited SIP with community stakeholders' participation
1.3.5 Minutes of meetings, logbooks, attendance sheets
1.3.6 Progress report of the PIP/SIP
1.4 SIP Implementation Plan: committees for the four components of the SIP, with a clear definition of the roles and functions of their members
1.5 Professional Development Plan for school heads and community leaders based on training needs
1.5.1 Indigenized learning modules and corresponding report
1.5.2 Minutes of meeting on the consultative assembly
1.5.3 Reports of projects implemented
1.6 Baseline data of the school, gaps and needs, and interventions

Principle 2: CURRICULUM AND LEARNING
2.0 Learners of the school community are identified
2.1.1 Localized materials are formulated and utilized
2.1.2 Stakeholders tapped to assist in review classes, reading remediation, etc.
2.1.3 Differentiated instruction, activities, and materials for diverse learners
2.1.4 Assessment conducted and results evaluated and analyzed, with appropriate interventions implemented
2.1.5 Child Protection Policy crafted by the school
2.1.6 Self-learning modules are available
2.1.7 List of children at risk of dropping out and interventions done
Appendix M. SBM Validation Mechanics, Matrix, and Suggested Documents (Artifacts, Evidence as MOVs in every Principle)

Mechanics

Organize members: divide the members according to the number of principles:
• Leadership and Governance
• Curriculum and Learning
• Accountability and Continuous Improvement
• Management of Resources

Matrix

Performance Improvement (60%)
a. Access - percent increase in enrolment/average over 3 years
b. Efficiency - dropout and repetition rates
c. Quality - NAT result/Promotion Rate per year and its average rating

Suggested Documents, Artifacts, Evidence as MOVs in every Principle (40%)

Principle 1: LEADERSHIP AND GOVERNANCE
Indicators:
1.1 ESIP signed by the PTA/SGC representative (approved by the DepEd Division officials)
1.1.1 School Planning Team organized
1.1.2 Revised ESIP incorporated in the Barangay Developmental Plan
1.2 Revised AIP presented to the stakeholders
1.2.1 Stakeholders invited and attendance confirmed
1.2.2 Attendance sheets during meetings duly signed
1.3 School manpower divided and assigned to the four components of the SIP: Access, Efficiency, Quality, Governance
1.3.1 Roles and functions of SBM members by principle
1.3.2 School Project Implementation Plan (PIP) with the participation of stakeholders

Suggested Respondents per SBM Principle during Validation

Principle 1. Leadership and Governance
1.1 School Head/Assistant to the School Head
1.2 School Planning Team
1.3 Department Heads
1.4 Grade Level Coordinators
1.5 School Governing Council Chair/Representative
1.6 GPTA President/Representative
1.7 LGU Representative
1.8 SSG/SPG President/Representative

Principle 2. Curriculum and Instruction
2.1 School Heads
2.2 Master Teachers
2.3 Department Heads
2.4 Grade Level Coordinators/Subject Area Coordinators
2.5 GPTA President/Representative
2.6 Teacher Association President/Representative
2.7 Supreme Student Government/Supreme Pupil Government Representative
2.8 School Governing Council Chair/Representative
2.9 Alumni President/Representative

Principle 3. Accountability and Continuous Improvement
3.1 School Head/Assistant to the School Head
3.2 School Planning Team
3.3 Department Heads
3.4 Grade Level Coordinators
3.5 School Governing Council Chair/Representative
3.6 GPTA President/Representative
3.7 LGU Representative
3.8 SSG/SPG President/Representative
Principle 4. Management of Resources
4.1 School Head
4.2 Supreme Student Government/Supreme Pupil Government Representative
4.3 School Planning Team
4.4 LGU/Barangay Chair/Representative
4.5 School Governing Council Chair/Representative of School BAC Team
4.6 Property Custodian
4.7 Inspectorate Team
4.8 GPTA

Appendix L. Suggested Composition of the Division SBM Coordinating Team

Division SBM Coordinating Team

Chair: SDS
Vice-Chair: ASDS
Team Leader: Chief ES, SGOD
Co-Team Leader: Chief ES, CID
Team Members:
• Division SBM Coordinator
• PSDSs
• Education Program Supervisors
• Senior Education Program Specialists
• Division Engineer
• Division Accountant
• Administrative Officer (AO) V
• Planning Officer
• Education Program Specialists II
• Supply Officer
• Health and Nutrition Personnel
• PDOs
SUGGESTED GUIDELINES IN THE GRANTING OF RECOGNITION AND INCENTIVES

The following are the suggested guidelines in the granting of recognition and incentives:

Level 1
• Plaque of Recognition for the school by the SDO
• Certificate of Recognition for the School Head, the School SBM Coordinating Team, and the external and internal stakeholders by the SDO
• School Head's participation in the annual convention for validated schools
• Provision of instructional/reference materials from partners/stakeholders
• Special fund for developmental activities (e.g., benchmarking, trainings, competitions, research) from the Local School Board (City/Provincial)
• Plus factor in the OPCR/IPCR Forms

Level 2
• Plaque of Recognition for the school by the Regional Office
• Certificate of Recognition for the School Head, the Division SBM Coordinator, and the external and internal stakeholders by the Regional Office
• School Head's participation in the annual convention for validated schools
• Provision of equipment and facilities from partners/stakeholders
• Special fund for developmental activities (e.g., benchmarking, trainings, competitions, research) from the Local School Board (City/Provincial)
• Plus factor in the OPCR/IPCR Forms

Level 3
• Plaque of Recognition for the school by the Central Office
• Certificate of Recognition for the School Head, the Division and Regional SBM Coordinators, and the external and internal stakeholders by the Central Office
• School Head's participation in the annual convention for validated/accredited schools
• Provision of laboratory apparatus, ICT equipment, and other instructional facilities from partners/stakeholders
• Special fund for developmental activities (e.g., benchmarking, trainings, competitions, research) from the Local School Board (City/Provincial)
• Plus factor in the OPCR/IPCR Forms

Appendix K. Steps in Conducting the Post-Conference and Sample Post-Conference Data Capture Form

Suggested Steps for the Post-Conference
• The School Head shall lead the post-conference and assign a documenter.
• The School Head shall present the SBM Assessment Result to the stakeholders.
• The team shall analyze the results and identify the strengths and the areas for improvement of the school using the post-conference data capture form.
• The team shall recommend the technical assistance needed.
• The documenter shall present the results of the post-conference for approval.
• Adjournment.

POST-CONFERENCE DATA CAPTURE FORM

Columns: Issues & Concerns | Identified Areas | Agreed Adjustment | Adjustment (CO / RO / School) | Areas Needing Action | Remarks
PROVISION OF TECHNICAL ASSISTANCE

Technical assistance is any form of professional help, guidance, or support given to the schools by the SBM Coordinating Team so that school heads and teachers can be more effective in the performance of their functions. The giving of technical assistance shall be done in a nurturing manner.

Technical assistance can take three forms:

Information Sharing. Within this area are the policies, guidelines, directions, and instructions of top DepEd management. They are usually delivered through office memoranda/orders, conferences, and referrals.

Capacity Building. This area refers to the development of competencies, or knowledge, skills, and attitudes. More often, this type of technical assistance is delivered through training, orientation, workshops, coaching, or mentoring.

Group and Work Management. This means helping others accomplish outputs or targets based on their work plans. It includes the documentation of lessons learned or best practices that may be shared with other schools through the conduct of meetings/discussions/workshops and LAC sessions.

Utilization of Assessment Results

The assessment results shall be utilized by the SBM Coordinating Team in providing the appropriate technical assistance needed by the schools to move on to the next level.

For the School Level

Assessment results shall be the primary basis for the school's plan of action. Looking into the specific indicators of each principle which did not meet the standard, the stakeholders can plan strategies or measures for improvement.

The DOD Result (Template 2) shows the activities, the assistance needed, and the course/s of action to be undertaken by the schools after the SBM assessment validation. The school then provides a consolidated listing of these courses of action/activities to the SDO for the technical assistance needed.

Appendix J. Division Technical Assistance Plan (Sample)

Department of Education
Region X
Division Technical Assistance Plan (DTAP)

Division: __________          School Year: __________
District: __________          Month/Date: __________
No. of identified schools: __________

Columns: School | Issues/Concerns/Development Areas | Technical Assistance (TA) to be Provided | Time Frame | TA Provider | Means of Verification (MOVs)

Legend: TA - Technical Assistance
Prepared by: __________
School heads must make sure that the activities identified in the Assessment Result Template (Template 3) are reflected in the School Improvement Plan (SIP) and then translated into concrete actions in the Annual Implementation Plan (AIP). SBM areas that cannot be addressed within a year shall also be plotted in the three-year Work and Financial Plan (WFP) of the SIP.

Having a clear grasp of the technical requirements necessary to implement the SIP/AIP provisions, the school head shall exhaust all means to avail of technical assistance from the Division Office. (Refer to Appendix I: SBM Practices Assessment Results Template.)

Division Level

The SBM Assessment Result provides statistical data on the number of schools in the different levels of SBM practice. Below are the steps to be undertaken by the division:
1. The Division SBM Coordinator shall take the lead in consolidating the summary of responses from all the schools in the division.
2. The Division Field Technical Assistance Team (DFTAT) shall identify the developmental areas with reference to the four principles of SBM.
3. The DFTAT develops a Technical Assistance Plan (TAP) to address the identified developmental areas (Template 4).

Appendix I. SBM Practices Assessment Result Template

Level of SBM Practice ________

SBM Principle | Indicator | Actions to be Taken | Time Frame | By Whom | Resources Needed (Human and/or Material)
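The consolidation in step 1 amounts to a tally of validated SBM levels across the division's schools. A minimal sketch follows; the school names and levels are hypothetical sample data, not actual ETT output:

```python
# Illustrative tally for the division-level consolidation (step 1 above).
# The school names and levels below are hypothetical sample data.
from collections import Counter

responses = [
    ("School A", "Developing"),
    ("School B", "Maturing"),
    ("School C", "Developing"),
    ("School D", "Advanced"),
]

# Number of schools at each SBM level of practice in the division.
summary = Counter(level for _, level in responses)
```

A summary of this form is what the DFTAT would examine in step 2 to spot developmental areas.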

Regional Level

The SBM Assessment Results may be used as the basis for technical assistance and for making policy recommendations/proposals to support the courses of action from the SDOs. (Refer to Appendix K: Division Technical Assistance Plan Template.)

References

DepED SBM Manual.pdf - A Manual on the Assessment of School-Based Management Practices. Republic of the Philippines, Department of Education. (n.d.). Retrieved from https://www.coursehero.com/file/27447257/DepED-SBM-Manualpdf/

DO 82, s. 1999 - Signing Authority for the Third Elementary Education Project (TEEP) and Secondary Education Development and Improvement Project (SEDIP) | Department of Education. (n.d.). Retrieved from http://www.deped.gov.ph/orders/do-82-s-1999

DO 83, s. 2012 - Implementing Guidelines on the Revised School-Based Management (SBM) Framework, Assessment Process and Tool (APAT) | Department of Education. (n.d.). Retrieved from http://www.deped.gov.ph/orders/do-83-s-2012

Third Elementary Education Project (English) | The World Bank. (n.d.). Retrieved from http://documents.worldbank.org/curated/en/989331468758996223/Philippines-Third-Elementary-Education-Project

Regional Memo | Department of Education. (n.d.). Retrieved from http://www.deped.gov.ph/regions/region-x/memos?search_api_views_fulltext=196

Republic Act 9155. (2013, September 23). Retrieved from https://www.slideshare.net/juviehayo/republic-act-9155

Republic Act No. 9155 | GOVPH. (2001, August 11). Retrieved from http://www.officialgazette.gov.ph/2001/08/11/republic-act-no-9155/

Appendix H. Suggested Parts of the Program during School Validation

I. Courtesy Call to the School Head
II. Welcome Program (15 minutes only)
• Pambansang Awit
• Invocation
• Welcome Remarks by the School Head
• Overview of the School SBM Best Practices (5-minute video presentation)
• Statement of Purpose
• Presentation of validators per SBM principle by the SBM Coordinating Team Leader
III. Validation Proper
IV. Post-Conference by the SBM Coordinating Team
V. Exit Conference by the SBM Coordinating Team Leader
School-Based Management Manual. (2013, May 25). Retrieved from https://www.slideshare.net/SusanCruzado/school-based-management-manual

The Third Elementary Education Project. (n.d.). Retrieved from http://www.philippinesbasiceducation.us/2013/07/the-third-elementary-education-project.html

All entries are checked for accuracy by the FTAD representative.

Note: Data entries cannot be changed in the Dataset Sheet; changes may be done only by the FTAD Regional Office, upon request.
Graphics

The Graphics sheet of the ETT is the pictorial representation of what is in the School’s Dataset sheet: it summarizes graphically the data contained there.

II. Steps on How to Operate the ETT

A. The ETT is to be operated by the FTAD representative, who acts as the person answerable for the validity of the data entered in the ETT.

• The division SBM coordinator must begin by filling up the cells in the School Dataset that are in blue.

• Once these cells in the Dataset Form are filled up, click the SUBMIT button. The data are then automatically consolidated, tabulated, and graphed in the dashboard, which is found in the link.
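Since the SUBMIT step assumes every required (blue) cell is filled, a completeness check along these lines could precede submission; this is a sketch only, and the field names are illustrative stand-ins drawn from the Dataset Form parts, not the ETT's actual headers:

```python
# Illustrative pre-submission check for one school's row.
# REQUIRED_FIELDS uses illustrative names, not the ETT's actual headers.
REQUIRED_FIELDS = [
    "School I.D.", "Date Entered", "Name of School", "School Level",
    "Schools Division", "Name of School Head",
]

def missing_fields(entry):
    """Return the required fields still blank in one school's row."""
    return [f for f in REQUIRED_FIELDS if not entry.get(f)]

# Hypothetical sample row with one blank cell.
entry = {
    "School I.D.": "123456",
    "Date Entered": "2018-06-01",
    "Name of School": "Sample Elementary School",
    "School Level": "Elementary",
    "Schools Division": "",
    "Name of School Head": "Juan Dela Cruz",
}
```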
APPENDICES

Appendix A. DepEd Order No. 83, s. 2012
Furthermore, there are two groups of links that aid directly in accessing the profiles of the SDOs:

First Group – links to the school’s Division Dataset (the school Dataset Form is found on page 85)

Second Group – links to the school’s Division Dashboard

Example:
Percentage Distribution of Elementary and Secondary Schools in the Schools Division of __________, by SBM Level of Practice, School Year 2018-2019

Percentage Distribution of Elementary and Secondary School Teachers in the Schools Division of __________, by Teachers’ Rank, School Year 2018-2019

Note: Use a password to open the links. The password shall be given only to the designated division SBM Coordinator by the FTAD representative.
VII. DIVISION OF LANAO DEL NORTE
Code: LDN
Total number of schools: 377
LDN-1 : first school listed
LDN-377 : last school listed

VIII. DIVISION OF MALAYBALAY
Code: MBLY
Total number of schools: 89
MBLY-1 : first school listed
MBLY-89 : last school listed

IX. DIVISION OF MISAMIS OCCIDENTAL
Code: MISOC
Total number of schools: 346
MISOC-1 : first school listed
MISOC-346 : last school listed

X. DIVISION OF MISAMIS ORIENTAL
Code: MISOR
Total number of schools: 444
MISOR-1 : first school listed
MISOR-444 : last school listed

XI. DIVISION OF OROQUIETA
Code: ORQT
Total number of schools: 55
ORQT-1 : first school listed
ORQT-55 : last school listed

XII. DIVISION OF OZAMIZ CITY
Code: OZMZ
Total number of schools: 63
OZMZ-1 : first school listed
OZMZ-63 : last school listed

XIII. DIVISION OF TANGUB CITY
Code: TNGB
Total number of schools: 74
TNGB-1 : first school listed
TNGB-74 : last school listed

XIV. DIVISION OF VALENCIA CITY
Code: VLNC
Total number of schools: 67
VLNC-1 : first school listed
VLNC-67 : last school listed

Appendix B. List of SBM Acronyms

School-Based Management (SBM) ACRONYMS

ACCESs A Child & Community-Centered Education System
AIP Annual Implementation Plan
ALS Alternative Learning System
APAT Assessment Process and Tool
BAC Bids and Awards Committee
BESRA Basic Education Sector Reform Agenda
CID Curriculum Implementation Division
CLMD Curriculum and Learning Management Division
CO Central Office
COA Commission on Audit
CR Completion Rate
CSR Cohort-Survival Rate
DepEd Department of Education
DFTAT Division Field Technical Assistance Team
DOD Document-Analysis, Observation and Discussion
DR Drop-out Rate
DTAP Division Technical Assistance Plan
EBEIS Enhanced Basic Education Information System
ECCD Early Childhood Care and Development
ECE Early Childhood Education
EGRA Early Grade Reading Assessment
ELLNA Early Language Literacy and Numeracy Assessment
ESIP Enhanced School Improvement Plan
ESSD Education Support and Services Division
FD Finance Division
FGD Focus-Group Discussion
FMDP Financial Management Development Plan
FTAD Field Technical Assistance Division
GASTPE Government Assistance to Students and Teachers in Private Education
GPTA General Parents-Teachers Association
HRDD Human Resource Development Division
HRDP Human Resource Development Plan

The Master List of Public Elementary and Secondary Schools is taken from the Enhanced Basic Education Information System (EBEIS).

The following are the assigned codes for the schools by division (schools are arranged in alphabetical order):

I. DIVISION OF BUKIDNON
Code: BUK
Total number of schools: 612
BUK-1 : first school listed
BUK-612 : last school listed

II. DIVISION OF CAGAYAN DE ORO CITY
Code: CDO
Total number of schools: 108
CDO-1 : first school
CDO-108 : last school

III. DIVISION OF CAMIGUIN
Code: CAM
Total number of schools: 72
CAM-1 : first school
CAM-72 : last school

IV. DIVISION OF EL SALVADOR CITY
Code: EL_SAL
Total number of schools: 18
EL_SAL-1 : first school
EL_SAL-18 : last school

V. DIVISION OF GINGOOG CITY
Code: GNGOG
Total number of schools: 102
GNGOG-1 : first school
GNGOG-102 : last school

VI. DIVISION OF ILIGAN CITY
Code: ILGN
Total number of schools: 117
ILGN-1 : first school
ILGN-117 : last school
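The coding scheme above (a division code prefix plus a sequential number per alphabetically listed school) can be sketched as follows; the function name is illustrative, while the division codes and school counts are those listed in this section:

```python
# Illustrative sketch of the school-code assignment described above.
# Division codes and school counts are taken from this section.
DIVISION_SCHOOL_COUNTS = {
    "BUK": 612, "CDO": 108, "CAM": 72, "EL_SAL": 18, "GNGOG": 102,
    "ILGN": 117, "LDN": 377, "MBLY": 89, "MISOC": 346, "MISOR": 444,
    "ORQT": 55, "OZMZ": 63, "TNGB": 74, "VLNC": 67,
}

def school_codes(division_code):
    """Codes run from <CODE>-1 (first school listed) to <CODE>-N (last)."""
    count = DIVISION_SCHOOL_COUNTS[division_code]
    return [f"{division_code}-{n}" for n in range(1, count + 1)]
```

For example, school_codes("CAM") yields CAM-1 through CAM-72, matching the Division of Camiguin entry above.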
INSET In-Service Training
IPCRF Individual Performance Commitment Review Form
IPPD Individual Plan for Professional Development
KPIs Key Performance Indicators
KRA Key Result Area
LAC Learning Action Cell
LGU Local Government Unit
LSB Local School Board
M&E Monitoring and Evaluation
MOA Memorandum of Agreement
MOOE Maintenance and Other Operating Expenses
MOU Memorandum of Understanding
MOV Means of Verification
NAT National Achievement Test
NCBTS National Competency-Based Teacher Standards
NGO Non-Governmental Organization
OPCRF Office Performance Commitment Review Form
PBB Performance-Based Bonus
PI Performance Improvement
PPRD Policy Planning and Research Division
QuAD Quality Assurance Division
RAMP Resource Allocation and Mobilization Plan
SBM School-Based Management
SBMF School-Based Management Fund
SBP School Building Program
SDO Schools Division Office
SEF Special Education Fund
SGC School Governing Council
SGOD School Governance and Operations Division
SH School Head
SIP School Improvement Plan
SMEA School Monitoring, Evaluation and Adjustment
SOB School Operating Budget
SOSA State of the School Address
SPDP School Physical Development Plan
SPG Supreme Pupil Government

Validation Results

• School Improvement Performance – 60%

This is determined by three thematic areas:

Access (45%) – The indicator being assessed in this area is the rate of enrolment for the last three (3) consecutive years.

Efficiency (25%) – The indicators being assessed under this area are the Drop-out Rate and the Repetition Rate.

Quality (30%) – The indicator being assessed under this area is the average increase of the NAT/promotion results for the past three years.

• SBM Principles and Practices – 40%

• Leadership and Governance (4 indicators) – 30%
• Curriculum and Learning (7 indicators) – 30%
• Accountability and Continuous Improvement (5 indicators) – 25%
• Management of Resources (5 indicators) – 15%

SBM Level of Practice

This section summarizes the overall results of the Key Performance Indicators and the DOD. It is here that the school’s current SBM level of practice, i.e. Developing (Level I), Maturing (Level II), or Advanced (Level III), is shown.
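The weighted computation just described can be sketched as follows. This is a minimal reading aid, assuming ratings on the 1-3 scale used in the DOD templates and the 1.50/2.50 cut-offs that appear in the Appendix F examples; the authoritative computation remains the one defined under DO 83, s. 2012:

```python
# Illustrative sketch of the weighted SBM rating described above.
# Weights come from this section; the 1.50 / 2.50 cut-offs follow the
# examples in Appendix F. DO 83, s. 2012 defines the official computation.

IMPROVEMENT_WEIGHTS = {"access": 0.45, "efficiency": 0.25, "quality": 0.30}
PRINCIPLE_WEIGHTS = {
    "leadership_governance": 0.30,
    "curriculum_learning": 0.30,
    "accountability_ci": 0.25,
    "resource_management": 0.15,
}

def overall_rating(improvement, principles):
    """Ratings are on the 1-3 scale used by the DOD templates."""
    imp = sum(improvement[k] * w for k, w in IMPROVEMENT_WEIGHTS.items())
    pri = sum(principles[k] * w for k, w in PRINCIPLE_WEIGHTS.items())
    return 0.60 * imp + 0.40 * pri  # 60% performance, 40% principles

def level_of_practice(rating):
    if rating >= 2.50:
        return "Advanced (Level III)"
    if rating >= 1.50:
        return "Maturing (Level II)"
    return "Developing (Level I)"

# Hypothetical sample ratings for one school.
score = overall_rating(
    {"access": 3, "efficiency": 2, "quality": 2},
    {"leadership_governance": 2, "curriculum_learning": 2,
     "accountability_ci": 3, "resource_management": 2},
)
```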
SPPD School Plan for Professional Development
SRC School Report Card
SSG Supreme Student Government
TA Technical Assistance
TAP Technical Assistance Plan
TSNA Teacher’s Strengths and Needs Assessment

School’s Dataset

The School’s Dataset contains all entries inputted by the division SBM coordinator per division. In this sheet, all schools in the fourteen (14) schools divisions are arranged in alphabetical order and given corresponding codes.

Cells in blue must be filled up; that is, they require input from the division SBM Coordinator. The rest of the cells, in orange, must be filled up by the FTAD representative and are therefore protected.

The Dataset Form has the following parts:

1. SBM School Information
• School I.D.
• Date Entered
• Name of School
• School Level
• Schools Division
• Name of School Head (complete name)
• No. of Teacher Is
• No. of Teacher IIs
• No. of Teacher IIIs
• No. of Master Teacher Is
• No. of Master Teacher IIs
• SBM Level of Practice 2018
• Assessment Validation Score
• Assessment Validation Result 2018
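The Dataset Form fields listed above amount to one record per school. A sketch of such a record follows; the Python field names and the sample values are illustrative stand-ins, not the ETT's actual column headers:

```python
# Hypothetical record mirroring the Dataset Form parts listed above.
# Field names are illustrative, not the ETT's actual column headers.
from dataclasses import dataclass

@dataclass
class SchoolDatasetRow:
    school_id: str
    date_entered: str
    name_of_school: str
    school_level: str          # e.g. Elementary or Secondary
    schools_division: str
    school_head: str           # complete name
    teacher_i: int             # No. of Teacher Is
    teacher_ii: int
    teacher_iii: int
    master_teacher_i: int
    master_teacher_ii: int
    sbm_level_2018: str        # SBM Level of Practice 2018
    validation_score: float    # Assessment Validation Score
    validation_result_2018: str

# Hypothetical sample row.
row = SchoolDatasetRow(
    "123456", "2018-06-01", "Sample Elementary School", "Elementary",
    "Camiguin", "Juan Dela Cruz", 10, 4, 3, 2, 1, "Developing", 1.8,
    "Developing",
)
```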
Rationale in the Creation of the ETT

The Electronic Tracking Tool, referred to here as the ETT, was conceptualized in response to the felt need of DepEd Regional Office 10 to have real-time monitoring of School-Based Management (SBM) as currently practiced in all schools in the fourteen schools divisions of DepEd Region 10.

The ETT will facilitate the actual collection, synthesis, and analysis of real-time data. This will allow DepEd Regional Office 10, as well as key stakeholders, to see the current SBM level of practice of their respective schools. In addition, real-time monitoring will allow efficient validation of the SBM level of practice, particularly at the school level. This is crucial in performing in-depth analysis as a basis for policy pronouncements and/or formulations.

Likewise, it would lessen the cost incurred by the Regional Education Program Supervisors as they perform their regular monitoring functions vis-à-vis the SBM program.

It is envisioned that the ETT would be set up on the DepEd Regional Office 10 website to serve as a platform where division SBM Coordinators can directly view their division’s respective data for validation.

Features of the ETT

It is called an “electronic” tracking device since data are accessed by means of a computer or computer network, i.e. through the internet, once it is set up on the DepEd Regional Office 10 website.

The ETT is in Microsoft Office Excel format. It has three components placed in three sheets, namely: Links, Schools Dataset, and Graphics.

Region’s Dashboard

The SBM Region’s Dashboard serves as the ETT point of entry. It is on this sheet that the Regional Director and other educational leaders view the Region’s SBM Level of Practice and the SDOs’ profiles through the posted links.

Appendix C.1. SBM Self-Assessment Form Sample (Main Menu)
SBM Electronic Tracking Tool Manual

Appendix C.2. SBM Self-Assessment Form Sample (Sub-Menu)
Appendix G. SBM Electronic Tracking Tool Manual (ETT)

SBM Electronic Tracking Tool (ETT)

A regional tool spearheaded by the Field Technical Assistance Division (FTAD) for tracking the level of SBM implementation of the different schools in the Region.

Objectives
1. to upgrade practices as provided for in DO 83, s. 2012;
2. to strengthen SBM implementation in all schools, divisions, and in the Region;
3. to develop a region-wide tracking system for all schools in the region on their respective SBM level of implementation;
4. to fast-track data gathering from the field in terms of SBM;
5. to make data available anytime as a basis for the provision of technical assistance and the conduct of research studies for policy making; and
6. to easily spot potential schools for higher accreditation.

Appendix C.3. SBM Self-Assessment Form Sample (Option 1)
11. Click this button to open the DOD assessment and evaluate every indicator based on the means of verification (facts/evidence).

12. A sub-menu will open that gives you a hint on which option is the best combination. It can help you decide whether to proceed to DOD or to give technical assistance.

13. After assessing your DOD principle based on the MOVs, click the back arrow to return to the sub-menu. Then go back to the best option that qualifies for the requirement, which is PI >= 1.50 (60%). Click the option and print the output.

14. Finally, save your file.

Appendix C.4. SBM Self-Assessment Form Sample (Option 1)
10. Accomplish the sub-menu.

Appendix C.5. SBM Self-Assessment Form Sample (Option 2)
8. Click the SBM Validation and DOD Assessment to view the initial result of the Performance Improvement.

9. A sub-menu will open that gives you a hint on which option is the best combination. It can help you decide whether to give technical assistance; if not, qualify any of the options and proceed to DOD if qualified. In this example, Option 1 qualified for the next level: the P.I. is greater than or equal to 2.50, and the rest of the options did not qualify. The cells are automatically filled in black.

Appendix C.6. SBM Self-Assessment Form Sample (Option 2)
5. Finally, enter the names of the validators on the space provided.

6. If the School ID is not found, update the data.

7. Click the School Profile button to enter the data of your school.

Appendix C.7. SBM Self-Assessment Form Sample (Option 3)
4. Accomplish the Thematic Area.

Appendix C.8. SBM Self-Assessment Form Sample (Option 3)
To use the template, load it and enable macros by clicking the yellow warning sign “SECURITY WARNING Macros have been disabled.”

To do that:

1. Click Enable Content.

2. Enter the School ID. If it is not found, update/change it in the maintenance menu below, go back to the previous screen by clicking the back arrow, and enter the School ID again.

3. Click the drop-down list under the Year column and select your baseline year for validation.

Appendix C.9. SBM Self-Assessment Form Sample (Option 4)
Appendix F. Electronic SBM Toolkit User’s Guidelines

Steps to Accomplish the Electronic SBM Validation and DOD Assessment Tool

The user makes sure that the Microsoft Excel 2013 version is installed on the computer; editions lower and higher than 2013 are incompatible. Excel is part of the Microsoft suite of office programs. If you are not sure whether the Excel 2013 edition is installed on your computer, the fastest way to find out is to click the Start button, scroll through the list of installed programs, look for Microsoft Excel 2013, and click it.

The Excel window or screen should be similar to Figure 1.

Appendix C.10. SBM Self-Assessment Form Sample (Option 4)
Appendix E. Validation Plan Template

Appendix C.11. SBM Self-Assessment Form Sample (Option 5)
Appendix C.12. SBM Self-Assessment Form Sample (Option 5)

Principle IV – Management of Resources (15%)
Indicator / Standard | Artifacts | Rating: 1 2 3 | Remarks
Appendix C.13. SBM Self-Assessment Form Sample (Option 6)
Appendix C.14. SBM Self-Assessment Form Sample (Option 6)
Appendix D. Revised SBM Assessment Tool DOD (Sample)

Department of Education
Region X
Zone 1, Upper Balulang, Cagayan de Oro City
Principle I – Leadership and Governance (30%)
Indicator / Standard | Artifacts | Rating: 1 2 3 | Remarks

Principle III – Accountability and Continuous Improvement (25%)
Indicator / Standard | Artifacts | Rating: 1 2 3 | Remarks

Principle II – Curriculum and Learning (30%)
Indicator / Standard | Artifacts | Rating: 1 2 3 | Remarks

(The sample tool repeats this table layout across several pages, one row per indicator under each principle.)
