Should other regions decide to adapt this manual or any portion thereof, they should
seek the approval of management.
Appendix S. DepEd’s Vision and Mission

Appendix R. Region 10 SBM Officers

REGION 10 SBM OFFICERS
Rolly B. Labis, PhD (El Salvador)

Department of Education
Region 10 – Northern Mindanao
Office of the Regional Director

MESSAGE

I am optimistic that this manual will be very useful to everyone for the continuous improvement of basic education in Northern Mindanao.

ALLAN G. FARNAZO
Director IV
Office of the Assistant Regional Director

MESSAGE

The school is the heart of the formal education system. It is where children learn. For years, schools have been pushing hard in their single aim of providing the best possible basic education for all learners. The demands of the time have been the toughest test. The only way to change is to take the challenge, and the most important change in the governance of basic education must occur at the school level.

I commend the Technical Writing Group (TWG) and the SBM Manual Writers for coming up with the Contextualized School-Based Management Manual of Operations spearheaded by the Field Technical Assistance Division (FTAD). This manual will enable all stakeholders to contribute to quality organizational performance for the sake of the Filipino children, the center of this undertaking.

God Bless.
Office of the Chief
Field Technical Assistance Division

MESSAGE

The long wait is over. Finally, this SBM Manual of Operations is here. With its completion, a happy note is resounding. Hear the regional mantra echo once again: Diyes is it!

When the Regional Office 10 felt the need to update SBM implementation by contextualizing it, thus enhancing the contents of DO 83, s. 2012, the newly created Field Technical Assistance Division (FTAD) grappled with where to start, thinking that listening to the schools division offices as they listen to schools could be the way to accomplish the challenge.

It was a grueling but worthwhile feat to walk the miles for this Manual. FTAD can never thank enough Regional Director Allan G. Farnazo and Assistant Regional Director Shirley O. Chatto for all the moral and financial support they have given. The same gratitude goes to the select SBM Coordinators who contributed their piece to the entire work; to Dr. Lourdes G. Tolod, former Schools Division Superintendent, for editing the rough manuscripts submitted at the start; and to Atty. Neil Christian D. Villagonzalo, BEST M&E Specialist, and Mr. Rey O. Macalindong, Project Management/M&E Specialist.

Understandably, this first edition may yet be a work in progress, as the agency's fast-changing work dynamics today may render the contents of this Manual passé tomorrow. With constant feedback, however, succeeding editions can be made to ensure success.

May all the constituents of DepEd Region 10 find this Manual a valuable resource in ensuring that SBM will be better implemented in the field.

Mabuhay po tayong lahat.
Acknowledgment

Atty. Shirley O. Chatto, Assistant Regional Director, for the legal expertise extended;

The writers of this manual for their commitment and expertise: Roberto D. Napere, Jr., PhD; Susan S. Olana, PhD; Para D. Talip; Ariel B. Montecalbo, PhD; Joy C. Mangubat, PhD; Rosalyn M. Lato; Fritzie C. Sillabe; Julieto M. Indonto; Miguelito D. Bendijo; Margarita L. Ruben; Danny A. Asio; Edelina M. Ebora; Ivy T. Jumawan; Aileen A. Zaballero; Esther V. Tabañag, PhD; Roie M. Ubayubay, PhD; Jean S. Macasero, PhD; Mario Esteban C. Arsenal; Imelda D. Pongase; Wenie L. Nahial; Fe D. Arancon; Susan A. Baco; and above all

To the Almighty God, the ultimate source of wisdom and strength, for the unconditional grace and love.
TABLE OF CONTENTS
PAGE
PRELIMINARIES
Title Page 1
Messages 2
Acknowledgment 5
Table of Contents 6
TOPICS
Rationale 9
SBM Framework 10
The Process 15
Basic Steps in the SBM Assessment Process at the School Level 19
and Incentives 30
REFERENCES 33
APPENDICES 35
C. SBM Self-Assessment Form (C.1 to C.14) Sample 41
Rationale
School-Based Management (SBM) Framework

The focus of School-Based Management (SBM) is a learner who is expected to be a functionally literate citizen possessing the essential 21st-century skills of critical thinking and problem solving, collaboration and communication, imbued with the values of self-reliance, productivity, patriotism and service to humanity.
The entire system is guided by the four principles of A Child and Community-Centered Education Systems (ACCESs): leadership and governance, curriculum and learning, accountability and continuous improvement, and resource management. The boundary of the system is a broken line, signifying the readiness of the school to accept inputs and changes from the external environment that may affect it positively or negatively, depending on the response of the school.

The Division, Regional and Central Offices are tasked to work with the schools in this SBM undertaking through the provision of technical assistance, professional and administrative support, and policies to ensure that these are observed, the standards are met and the programs are implemented.

The framework reflects the vision, mission and goals of SBM: to make the community responsible for the education of its children and make the children responsible for building the community.
Appendix Q. SBM Manual of Operations Writers’ Pictorial

3. The minimum rating of the Performance Improvement (PI) to proceed to SBM Document Analysis, Observation, Discussion (DOD) validation is 1.50 (60%). If the school does not reach the minimum rating, the Division Field Technical Assistance Team (DFTAT) shall provide technical assistance.

4. The SDO shall be notified of the schedule for validation of the concerned schools through a regional memorandum.

5. There shall be at least one (1) SBM model school for each of the elementary, secondary, and integrated schools per district, as identified by the SDO.

10. The Regional SBM Coordinating Teams shall provide technical assistance to the divisions that need support.

11. A school shall sustain SBM level II of practice for at least three years before applying for SBM level III validation.
REGION 10 SBM COORDINATING TEAMS

CONSULTANTS

Name: Lourdes G. Tolod, PhD, CESO V
Position: Former Schools Division Superintendent of Cagayan de Oro City; Professor Emeritus, Xavier University
Designation: Chairperson

Regional Director (RD)
SDOs
Schools
DUTIES AND RESPONSIBILITIES OF THE
REGIONAL COORDINATING TEAMS

Chairperson/Vice-Chairperson
Oversees the implementation of the entire SBM process.
Acts on SBM matters upon the recommendation of FTAD.

FTAD
Manages the SBM assessment validation process.
Receives school applications for assessment validation duly endorsed by the SDO.
Assigns the applications to any of the Regional Coordinating Teams (RCTs).
Issues notices to the schools concerned on the scheduled on-site validation.
Receives results from the different RCTs.
Recommends approval for the recognition of Levels II and III schools as well as those for the Level III accreditation.
Informs SDOs on the Assessment Validation Results.
Facilitates the conduct of the Awarding of Plaques/Certificates.
Leads in the conduct of trainings that relate to SBM implementation.
Maintains a database of SBM-validated schools and best practices.
Facilitates continuous improvement in the management of the SBM validation process.

Other Functional Divisions
Engage in the assessment validation processes based on their specialized functions.
Submit results of the assessment validation to the Office of the Regional Director through FTAD, with reports on TA provided (if any), best practices and recommendations for accreditation.

Name / Position/Designation / Division
Edelina M. Ebora, MA / Master Teacher I / Malaybalay City
Florderick S. Velarde, MA / Information Technology Officer I / Lanao del Norte
Rosalyn M. Lato, MA / Senior Education Program Specialist on M&E / Ozamiz City
Joy C. Mangubat, PhD / Education Program Supervisor / Gingoog City
Ivy T. Jumawan, MA / Senior Education Program Specialist on M&E / Lanao del Norte
Mario Esteban C. Arsenal, MA / Senior Education Program Specialist on M&E / Tangub City
Wenie L. Nahial / Senior Education Program Specialist on M&E / Camiguin
Aileen A. Zaballero, MA / Senior Education Program Specialist on M&E / Oroquieta City
Danny A. Asio / Senior Education Program Specialist on M&E / Misamis Oriental

Appendix P. SBM Manual of Operations Writers
WRITERS

The Process
SBM Validation Flowchart

Activities / Details

1. Receives SBM Assessment Validation Form (AVF)
1.0 FTAD Education Program Supervisor receives the SBM Assessment Validation Forms (AVFs) from the Schools Division Offices in hard copies.
1.1 EPS distributes the SBM AVFs in bunches by SDO to the designated Regional Coordinating Teams (RCTs) for review within a week.
1.2 EPS collects the results of the review from the RCTs a day after the one-week review.

2. Prepares the SBM Validation Plan (VP)
2.0 EPS prepares the SBM Validation Plan (VP) that contains the schedule of schools to be visited, to be attached in a memo for the Chief's

1.16 Retention Rate

The Retention Rate determines the degree of pupils/students in a particular school year who continue to be in school in the succeeding year. This indicator is also vulnerable to migration and is not advisable to compute at the school level.

Elementary:
(Enrolment Gr 2-6, SY N / Enrolment Gr 1-5, SY N-1) x 100

Secondary:
(Enrolment Yr 2-4, SY N / Enrolment Yr 1-3, SY N-1) x 100

Where:
Enrolment Gr 2-6, SY N / Enrolment Yr 2-4, SY N = Aug. 31 Enrolment (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Enrolment Gr 1-5, SY N-1 / Enrolment Yr 1-3, SY N-1 = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)
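The Retention Rate is a straightforward ratio; a minimal Python sketch is shown below. The function name and sample figures are illustrative only and are not taken from the manual.

```python
def retention_rate(enrolment_upper_current: float,
                   enrolment_lower_previous: float) -> float:
    """Retention Rate = (Enrolment Gr 2-6, SY N / Enrolment Gr 1-5, SY N-1) x 100.

    For secondary schools, substitute Yr 2-4 (SY N) and Yr 1-3 (SY N-1).
    """
    if enrolment_lower_previous <= 0:
        raise ValueError("previous-year enrolment must be positive")
    return enrolment_upper_current / enrolment_lower_previous * 100


# Illustrative figures: 940 pupils in Gr 2-6 this SY vs 1,000 in Gr 1-5 last SY
print(retention_rate(940, 1000))  # 94.0
```

The same division-then-multiply-by-100 pattern applies to most of the enrolment-based indicators in this appendix.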
5. Reviews and finalizes the consolidated RCT reports for the awarding ceremony
5.0 EPS reviews consolidated results submitted by the different RCTs.
5.1 Prepares a memo for the review and recommendation of the Chief.
5.2 Seeks the approval of the Regional Director.
5.3 Ensures the release and uploading of the memo in regional websites.

6. Makes a Sustainability Plan (SP)
6.0 Chief leads in making the SP.
6.1 Communicates with SDOs through a Region Memo.
6.2 Accomplishes the SP Template:
Objective/s
Activity
Interventions/TA Needed
Persons Involved
Time Frame
Resources Needed
Expected Outcome
MOVs

1.14 Completion Rate

The Completion Rate measures the percentage of grade/year 1 entrants who graduate in elementary/secondary education. It is available only up to the division level and above. Data for grade/year 1 are based on the predecessor of BEIS, the Unified Data Gathering System (UDGS), which did not have any validation procedures and did not monitor the completeness of the data submitted.

Elementary:
(Graduates Gr 6, SY N / Enrolment Gr 1, SY N-5) x 100

Secondary:
(Graduates Yr 4, SY N / Enrolment Yr 1, SY N-3) x 100

Where:
Graduates Gr 6, SY N / Graduates Yr 4, SY N = Previous SY promotees/graduates (Table D - GESP & GSSP; Table A4 - PSP)
Enrolment Gr 1, SY N-5 / Enrolment Yr 1, SY N-3 = based on UDGS division-level data

1.15 Failure Rate

This indicator evaluates the extent of pupils/students who failed a given grade/year level.

Elementary:
(Failures Gr X, SY N / Enrolment Gr X, SY N) x 100

Secondary:
(Failures Yr X, SY N / Enrolment Yr X, SY N) x 100

Where:
Failures Gr X, SY N / Failures Yr X, SY N = Previous SY March 31 Enrolment - Promotees (Table D - GESP & GSSP; Table A4 - PSP)
Enrolment Gr X, SY N / Enrolment Yr X, SY N = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

Failures = Enrolment - Promotees
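The Failure Rate combines two derived counts (Failures = Enrolment - Promotees; Enrolment = March 31 enrolment + dropouts). A minimal sketch, with illustrative figures that are not from the manual:

```python
def failure_rate(march31_enrolment_prev: float,
                 promotees_prev: float,
                 dropouts_prev: float) -> float:
    """Failure Rate for a grade/year level, per the definitions above.

    Failures  = previous SY March 31 enrolment - promotees
    Enrolment = previous SY March 31 enrolment + dropouts
    Rate      = Failures / Enrolment x 100
    """
    failures = march31_enrolment_prev - promotees_prev
    enrolment = march31_enrolment_prev + dropouts_prev
    return failures / enrolment * 100


# Illustrative: 500 enrolled on March 31, 470 promoted, 20 dropouts
print(round(failure_rate(500, 470, 20), 2))  # (500-470)/(500+20) x 100 = 5.77
```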
The Deliberation Process

Deliberation is a process of thoughtfully weighing options, usually prior to voting and/or decision-making. It emphasizes the use of logic and reason, and fair and objective judgment. Group decisions are made after a consensus has been arrived at.

Roles and Responsibilities of the Members in the Deliberation Process

Role: 1. Team leader
Responsibilities:
Convenes team members
Assigns and defines specific roles of each member
Facilitates the deliberation process
Signs the final report of the team

Where:
Enrolment Yr X+1, SY N = Enrolment (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Repeaters Gr X, SY N / Repeaters Gr X+1, SY N = Repeaters (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Enrolment Gr X, SY N-1 = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

1.12 Simple Dropout Rate

The Simple Dropout Rate calculates the percentage of pupils/students who do not finish a particular grade/year level. It does not capture pupils/students who finish a grade/year level but do not enroll in the next grade/year level the following school year.

Elementary:
(Dropouts Gr X, SY N / Enrolment Gr X, SY N) x 100

Secondary:
(Dropouts Yr X, SY N / Enrolment Yr X, SY N) x 100

Where:
Dropouts Gr X, SY N / Dropouts Yr X, SY N = Previous SY Dropouts (Table D - GESP & GSSP; Table A4 - PSP)
Enrolment Gr X, SY N / Enrolment Yr X, SY N = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)
Where:
Promotees Gr X, SY N / Promotees Yr X, SY N = Previous SY Promotees (Table D - GESP & GSSP; Table A4 - PSP)
Enrolment Gr X, SY N / Enrolment Yr X, SY N = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

1.10 Repetition Rate

This is an EFA indicator which determines the magnitude of pupils/students who repeat a grade/year level.

Elementary:
(Repeaters Gr X, SY N / Enrolment Gr X, SY N-1) x 100

Secondary:
(Repeaters Yr X, SY N / Enrolment Yr X, SY N-1) x 100

Where:
Repeaters Gr X, SY N / Repeaters Yr X, SY N = Repeaters (Table A - GESP & GSSP; Table A1 & B1 - PSP)
Enrolment Gr X, SY N-1 / Enrolment Yr X, SY N-1 = Previous SY March 31 Enrolment + Dropouts (Table D - GESP & GSSP; Table A4 - PSP)

Responsibilities of the School Head in SBM Assessment Validation

The School Head (SH) is mainly responsible for the school performance. As prime mover of SBM, the SH shall:
organize the SBM Coordinating Team at the school level;
orient the stakeholders on the policies and guidelines of SBM;
facilitate the crafting of the SBM Work Plan;
lead in the conduct of self-assessment;
facilitate the post-conference; and
submit results to the Division SBM Coordinator.

Qualities of a Validator
Step 3. Conduct self-assessment.
3.1. Compute the Performance Improvement in step 1, which is 60 percent of the SBM assessment (see Appendix H for sample computation);
3.2. Proceed to step 2 for the DOD, which is 40 percent of the SBM assessment;
3.3. Set a schedule for deliberation of the evidences gathered; and
3.4. Invite the representative assigned to every principle to confirm the evidences presented.

Step 4. Proceed to the validation process through interview and observation of classes to validate documented evidences. The DOD process shall be properly handled.

Step 5. Discuss documents, process evidence and summarize data for each principle/indicator. Classify issues, problems, opportunities and others. Each indicator is scored.

Step 6. Conduct a post-conference to deliberate on observations made and data gathered.

Step 7. Prepare a written report by team.

Step 8. Finalize and submit SBM self-assessment results to the Division.

Functions of SBM Coordinating Teams

Division
1. Reviews Self-Assessment Validation Forms submitted by the schools;
2. Prepares the schedule on the conduct of assessment validation;
3. Validates the accuracy of data and computations in the SBM Assessment Validation Form;
4. Provides technical assistance if necessary;
5. Consolidates the list of schools qualified for SBM Assessment Validation;
6. Awards certificates and/or other forms of incentives to schools that have attained level I of SBM practice; and
7. Recommends schools for levels II and III assessment validation.

                                     Gr 1      Gr 2    Gr 3    Gr 4    Gr 5    Gr 6
Pupil-years                          1,057.00  901.37  831.79  787.11  744.26  693.15
Total Promotees (incl. repeaters)      871.70  812.89  774.52  733.76  690.61  667.65
Reconstructed Cohort Survival Rate    100.00%  87.17%  81.29%  77.45%  73.38%  69.06%

1.8 Coefficient of Efficiency

This indicator measures the internal efficiency of the education system. It evaluates the impact of repetition and dropout on the efficiency of the educational process in producing graduates. It is calculated using the Pupil-Years and the Total Promotees (including repeaters) used in calculating the Reconstructed Cohort Survival Rate.

Elementary:
(Total Promotees Gr 6 (including repeaters) / Pupil-Years Gr 1-6) x 6

Secondary:
(Total Promotees Yr 4 (including repeaters) / Pupil-Years Yr 1-4) x 4

Years Input per Graduate

The indicator assesses the number of years it takes for an average pupil/student to graduate from the elementary/secondary level. It is calculated using the Pupil-Years and the Total Promotees/Graduates (including repeaters) used in calculating the Reconstructed Cohort Survival Rate.

Elementary:
Pupil-Years Gr 1-6 / Total Promotees Gr 6 (including repeaters)

Secondary:
Pupil-Years Yr 1-4 / Total Promotees Yr 4 (including repeaters)

1.9 Promotion Rate

The Promotion Rate assesses the extent of pupils/students who are promoted to the next grade/year level. The grade 6/year 4 promotion rate is the graduation rate for the elementary/secondary level. The computation used in the BEIS is slightly different from the official EFA formula, since it utilizes the reported number of promotees rather than computing the promotees using the present enrolment and the previous school year enrolment.

Elementary:
(Promotees Gr X, SY N / Enrolment Gr X, SY N) x 100

Secondary:
(Promotees Yr X, SY N / Enrolment Yr X, SY N) x 100
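The Coefficient of Efficiency and Years Input per Graduate (section 1.8) can be checked against the sample pupil-years table above. A minimal sketch; the function names are illustrative:

```python
# Sample elementary figures from the reconstructed-cohort table (Gr 1..6)
PUPIL_YEARS = [1057.00, 901.37, 831.79, 787.11, 744.26, 693.15]
PROMOTEES_GR6 = 667.65  # total Gr 6 promotees, including repeaters


def coefficient_of_efficiency(promotees_final: float,
                              pupil_years: list,
                              ideal_years: int) -> float:
    """(Total final-grade promotees x ideal duration) / total pupil-years."""
    return promotees_final * ideal_years / sum(pupil_years)


def years_input_per_graduate(promotees_final: float,
                             pupil_years: list) -> float:
    """Total pupil-years invested per graduate produced."""
    return sum(pupil_years) / promotees_final


print(round(coefficient_of_efficiency(PROMOTEES_GR6, PUPIL_YEARS, 6), 3))  # 0.799
print(round(years_input_per_graduate(PROMOTEES_GR6, PUPIL_YEARS), 2))      # 7.51
```

With these sample figures the system spends about 7.51 pupil-years per graduate against an ideal of 6, i.e. an efficiency of roughly 80 percent.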
Region
1. Reviews Assessment Validation Forms submitted by the SDOs;
2. Prepares the schedule on the conduct of validation through a memorandum;
3. Validates the accuracy of data and computations in the SBM Assessment Validation Form;
4. Provides technical assistance if necessary;
5. Conducts exit conference;
6. Awards plaques/certificates and other forms of incentives to schools that have been validated as levels II and III; and
7. Recommends schools for level III accreditation to the Central Office (CO).

Scoring Guidelines for SBM Level of Practice

Measuring the school performance can help and assist the school heads in giving appropriate technical assistance to schools.

For the SBM Level of Practice, a scoring mechanism is provided, called the SBM Assessment Form, divided into two categories: the Performance Improvement (PI), which is given 60 percent, and the Document Analysis, Observation and Discussion (DOD), which is given 40 percent.

Key Performance Indicators (KPIs) that Determine Performance Improvement

Key Performance Indicators (KPIs) refer to a set of quantifiable measures which gauge the performance of an enterprise relative to its objectives, thereby enabling corrective action where there are deviations (Miller, 2016). For the SBM level of practice, the Performance Improvement (PI) in terms of access, efficiency and quality is quantitatively measured through the following KPIs:

1. Access: Percent Increase of Enrolment or Participation Rate
2. Efficiency: Dropout and Repetition Rate
3. Quality: NAT Results (if available) or Promotion Rate

Steps 2 & 3 yield the following table (the never-repeating stream, plus the grade 1 repeater streams):

                                      Gr 1      Promoted  Promoted  Promoted  Promoted  Promoted
                                                to Gr 2   to Gr 3   to Gr 4   to Gr 5   to Gr 6
Cohort with no repetition, SY N       1,000.00  824.70    743.74    692.53    645.58    599.05
Cohort repeating once, SY N+1            53.92
Cohort repeating twice, SY N+2            2.91
Cohort repeating thrice, SY N+3           0.16
Cohort repeating four times, SY N+4       0.01
Cohort repeating five times, SY N+5       0.00
Cohort repeating six times, SY N+6        0.00

(Each promoted-cohort entry = Promoted Cohort Gr X-1 x Promotion Rate Gr X-1; each grade 1 repeater entry = Repeated Cohort Gr 1 x Repetition Rate Gr 1.)

Step 4. Add the repeaters in the previous grade level who were promoted to the pupils in the current grade level who repeated.

                                      Gr 1      Promoted  Promoted  Promoted  Promoted  Promoted
                                                to Gr 2   to Gr 3   to Gr 4   to Gr 5   to Gr 6
Cohort with no repetition, SY N       1,000.00  824.70    743.74    692.53    645.58    599.05
Cohort repeating once, SY N+1            53.92   71.62     81.49     86.96     90.17     85.87
Cohort repeating twice, SY N+2            2.91    4.76      6.14      7.11      7.90      7.65
Cohort repeating thrice, SY N+3           0.16    0.29      0.40      0.48      0.56      0.55
Cohort repeating four times, SY N+4       0.01    0.02      0.02      0.03      0.04      0.04
Cohort repeating five times, SY N+5       0.00    0.00      0.00      0.00      0.00      0.00
Cohort repeating six times, SY N+6        0.00    0.00      0.00      0.00      0.00      0.00

(Each entry = (Repeated Cohort Gr X-1 x Promotion Rate Gr X-1) + (Promoted Cohort Gr X x Repetition Rate Gr X).)

Steps 5-7. Calculate the total for each grade level to obtain the pupil-years. Multiply the pupil-years by the respective promotion rate to get the total promotees (including repeaters). Calculate the reconstructed cohort survival rate for each grade level by dividing the Total Promotees Gr X-1 (including repeaters) by the original cohort of 1,000.
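The reconstructed cohort method can be sketched as code. The implementation below follows the step descriptions and recurrence shown above; the function and variable names are illustrative, and the four-decimal rates introduce small rounding differences against the printed tables.

```python
def reconstruct_cohort(promotion, repetition, cohort=1000.0, max_repeats=6):
    """Reconstructed cohort method (Steps 2-7).

    flow[r][g] = pupils in grade g (0-based) who have repeated r times.
    Returns pupil-years, total promotees (incl. repeaters) and the
    reconstructed cohort survival rate (%) for each grade.
    """
    grades = len(promotion)
    flow = [[0.0] * grades for _ in range(max_repeats + 1)]
    flow[0][0] = cohort
    # Steps 2 & 3: never-repeating stream, plus Gr 1 repeater streams
    for g in range(1, grades):
        flow[0][g] = flow[0][g - 1] * promotion[g - 1]
    for r in range(1, max_repeats + 1):
        flow[r][0] = flow[r - 1][0] * repetition[0]
    # Step 4: promoted repeaters from previous grade + new repeaters here
    for r in range(1, max_repeats + 1):
        for g in range(1, grades):
            flow[r][g] = (flow[r][g - 1] * promotion[g - 1]
                          + flow[r - 1][g] * repetition[g])
    # Steps 5-7: pupil-years, promotees, survival rate
    pupil_years = [sum(flow[r][g] for r in range(max_repeats + 1))
                   for g in range(grades)]
    promotees = [pupil_years[g] * promotion[g] for g in range(grades)]
    survival = [100.0] + [p / cohort * 100 for p in promotees[:-1]]
    return pupil_years, promotees, survival


# Sample rates from Step 1 (Gr 1..6)
PROMOTION = [0.8247, 0.9018, 0.9311, 0.9322, 0.9279, 0.9632]
REPETITION = [0.0539, 0.0329, 0.0227, 0.0160, 0.0141, 0.0037]
py, pr, sr = reconstruct_cohort(PROMOTION, REPETITION)
print(round(py[0], 2), round(pr[5], 2), round(sr[1], 2))
```

Running this reproduces the tables above to within rounding: roughly 1,057 pupil-years in Gr 1, about 668 Gr 6 promotees, and an 87.17% survival rate into Gr 2.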
Note: Inasmuch as the following prescribed indicators cannot be generated at the school level (EBEIS), the Region opted to use the following indicators, namely: ACCESS - Percent Increase of Enrolment or Participation Rate using Community Mapping; EFFICIENCY - Dropout Rate and Repetition Rate; QUALITY - the Promotion Rate is preferred, considering that future administration of the NAT will be by sampling. Should the NAT be restored and other indicators be made available in the EBEIS, then the original prescribed indicators will be used.

Computation for Performance Improvement (PI)

(For the computation of PI, please refer to Appendix C.1 to C.14 of this SBM Manual of Operations.)

The SBM practice is ascertained by the existence of structured mechanisms, processes, and practices in all indicators. Teams of practitioners and experts from the district, division and regional offices shall validate the self-assessment results before a level of SBM practice is established. When the school has sustained level II for three consecutive years, it becomes a potential candidate for level III and can be recommended by the Regional Office to the PASBE through the Central Office.

The revised tool is systems-oriented, principle-guided, evidence-based, learner-centered, process-focused, non-prescriptive, user-friendly, collaborative in approach, and results/outcomes-focused. It can be used by the internal and external stakeholders of schools.

1.6 Net Enrolment Ratio

The indicator provides a more precise measurement of the extent of participation in primary education of children belonging to the official primary school age.

(Total Enrolment Ages 6-11, SY N / Population Age 6-11, SY N) x 100

Where:
Total Enrolment Ages 6-11, SY N = Total Enrolment (Table B - GESP; Table A2 - PSP)

The system adopted the reconstructed cohort method, shown below, in calculating the Cohort Survival Rate:

Step 1. Compute the Promotion and Repetition Rates for a particular area.

                  Gr 1     Gr 2     Gr 3     Gr 4     Gr 5     Gr 6
Promotion Rate    82.47%   90.18%   93.11%   93.22%   92.79%   96.32%
Repetition Rate    5.39%    3.29%    2.27%    1.60%    1.41%    0.37%

Steps 2 & 3. Compute the number of promotees up to grade 6 using the promotion rates for the respective grade/year levels. Compute the number of pupils/students in grade/year 1 who repeat once, twice, up to six times.
Computation for Document Analysis, Observation and Discussion (DOD)

1. The four (4) principles are given weights on the basis of their significance in improving learning outcomes and school operations:

Leadership and Governance                  30%
Curriculum and Learning                    30%
Accountability and Continuous Improvement  25%
Management of Resources                    15%
Total                                     100%
2. Each principle has several indicators. Based on the results of the DOD, the evidence is summarized, and an agreed rating is made for each indicator;

3. The items are to be rated by checking the appropriate boxes using the rating scale below:
0 - No evidence
1 - Evidence indicates early or preliminary stages of implementation
2 - Evidence indicates planned practice and procedure are fully implemented
3 - Evidence indicates practice and procedure satisfy quality standards

4. The ratings given to each indicator shall be consolidated;

5. The number of check marks in each indicator shall be counted and recorded in the summary table;

6. The number of check marks in each column shall be multiplied by the number of points given (1-3);

7. The average rating for each principle shall be determined by dividing the total score by the number of indicators of each principle;

8. The average ratings for each principle shall be recorded in the Summary Table to get the General Average;

9. The rating for each principle shall be multiplied by its percentage weight to get the weighted average rating;

10. The total rating of the principles shall be determined by getting the sum of all the weighted ratings. The result is the DOD rating;

11. The level of practice will be computed based on the criteria below:
based on the performance improvement (PI); and
according to the validated practice using DOD

12. The final results using the scoring criteria as described in no. 11 shall be recorded and validated using the SBM Assessment Form, to be signed by the School Head and the Schools Division Superintendent (Division Level).

Where:
Enrolment Gr 1, SY N = Total Grade 1 Enrolment (Table B - GESP; Table A2 - PSP)
Population Age 6, SY N = Projected 2002 population from NSO

1.4 Net Intake Rate

This indicator gives a more precise measurement of access to primary education of the eligible, primary school-entrance age population than the Apparent Intake Rate.

(Enrolment Gr 1, Age 6, SY N / Population Age 6, SY N) x 100

Where:
Enrolment Gr 1, Age 6, SY N = Total Age 6 Grade 1 Enrolment (Table B - GESP; Table A2 - PSP)
Population Age 6, SY N = Projected 2002 population from NSO

1.5 Gross Enrolment Ratio

The indicator is used to show the general level of participation in primary education. It is used in place of the Net Enrolment Ratio when data on enrolment by single years of age are not available. It can also be used together with the Net Enrolment Ratio to measure the extent of over-aged and under-aged enrolment. The system generates this indicator up to the level of the legislative districts and above.

(Total Enrolment All Ages, SY N / Population Age 6-11, SY N) x 100

Where:
Total Enrolment All Ages, SY N = Total Enrolment (Table B - GESP; Table A2 - PSP)
Population Age 6-11, SY N = Projected 2002 population from NSO
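The DOD consolidation described in steps 6-10 above can be sketched numerically. The per-principle indicator ratings below are hypothetical, used only to illustrate the arithmetic of averaging and weighting:

```python
# Principle weights from the DOD computation (30/30/25/15)
WEIGHTS = {
    "Leadership and Governance": 0.30,
    "Curriculum and Learning": 0.30,
    "Accountability and Continuous Improvement": 0.25,
    "Management of Resources": 0.15,
}


def principle_average(indicator_ratings):
    """Step 7: total score divided by the number of indicators."""
    return sum(indicator_ratings) / len(indicator_ratings)


def dod_rating(averages):
    """Steps 9-10: weight each principle's average, then sum the results."""
    return sum(averages[p] * w for p, w in WEIGHTS.items())


# Hypothetical indicator ratings on the 0-3 scale
averages = {
    "Leadership and Governance": principle_average([2, 2, 3, 2]),            # 2.25
    "Curriculum and Learning": principle_average([2, 3, 2, 2, 2]),           # 2.20
    "Accountability and Continuous Improvement": principle_average([2, 2, 2]),  # 2.00
    "Management of Resources": principle_average([1, 2, 2, 2]),              # 1.75
}
print(dod_rating(averages))  # weighted sum of the four averages
```

With these hypothetical ratings the DOD rating works out to about 2.10, which falls in the Maturing band of the level table.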
Appendix O. Definition of Performance Indicators

1. Performance Indicators

1.1 Gross Enrolment Ratio in Early Childhood Development Programs

This indicator measures the general level of participation of young children in early childhood development programs. It indicates the capacity of the education system to prepare young children for elementary education. The system generates this indicator only up to the level of the legislative districts and above.

(Enrolment Pre-sch, SY N / Population Age 4-5, SY N) x 100

Where:
Enrolment Pre-sch, SY N = Total Pre-school Enrolment (Table A - GESP; Table A1 - PSP)
Population Age 4-5, SY N = Projected 2002 population from NSO

This indicator measures the level of participation of grade 1 pupils in ECD programs.

(Enrolment with ECD Gr 1, SY N / Enrolment Gr 1, SY N) x 100

Where:
Enrolment with ECD Gr 1, SY N = Total Grade 1 Enrolment with ECD (Table B - GESP; Table A2 - PSP)
Enrolment Gr 1, SY N = Total Grade 1 Enrolment (Table B - GESP; Table A2 - PSP)

1.3 Apparent/Gross Intake Rate

The Apparent Intake Rate reflects the general level of access to primary education. It also indicates the capacity of the education system to provide access to grade 1 for the official school-entrance age population. It is used as a substitute for the Net Intake Rate in the absence of data on new entrants by single years of age. The system generates this indicator up to the level of the legislative districts and above.

(Enrolment Gr 1, SY N / Population Age 6, SY N) x 100

Description of SBM Levels of Practice

Level I, Developing (0.50 – 1.49): Developing structures and mechanisms with acceptable level and extent of community participation and impact on learning outcomes.

Level II, Maturing (1.50 – 2.49): Introducing and sustaining continuous improvement process that integrates wider community participation and significantly improves performance and learning outcomes.

Level III, Advanced/Accredited Level (2.50 – 3.50): Ensuring the production of intended outputs/outcomes and meeting all standards of a system fully integrated in the local community that is self-renewing and self-sustaining.

SBM Assessment Validation Procedure

Submission of School Pre-Assessment Result to the Division Level

Step 1:
The School SBM Coordinating Team headed by the School Head will conduct self-assessment. If found qualified, the School SBM Coordinator will endorse the result of the self-assessment, approved by the School Head, to the Division SBM Coordinator for the Division Assessment Validation.

The Division SBM Coordinator will validate the Pre-Assessment Form submitted by the School SBM Coordinator. After validating the result, the Division SBM Coordinating Team shall visit and validate the SBM level of practice of the school using the Assessment Validation Form.

The Division SBM Coordinator will endorse the list of qualified schools, with the attached SBM Assessment Validation Forms, to the Schools Division Superintendent for endorsement to DepEd Region 10.
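The score bands in the Description of SBM Levels of Practice above, together with the 60/40 PI-DOD split described in the scoring guidelines, can be sketched as a small classification helper. The straight weighted combination shown here is an assumption for illustration; the official computation is in Appendix C:

```python
def sbm_level(score: float) -> str:
    """Map a final SBM score to its level of practice, per the bands above."""
    if 0.50 <= score <= 1.49:
        return "Level I (Developing)"
    if 1.50 <= score <= 2.49:
        return "Level II (Maturing)"
    if 2.50 <= score <= 3.50:
        return "Level III (Advanced / Accredited Level)"
    raise ValueError("score outside the 0.50-3.50 scale")


def final_score(pi_rating: float, dod_rating: float) -> float:
    """Weighted combination sketch: PI 60%, DOD 40% (assumed; see Appendix C)."""
    return 0.60 * pi_rating + 0.40 * dod_rating


print(sbm_level(final_score(2.0, 2.5)))  # 2.2 -> Level II (Maturing)
```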
Appendix N. Regional Coordinating Team (RCT) TA Sheet

Department of Education
Region X
Masterson Avenue, Upper Balulang, Cagayan de Oro City

Regional Coordinating Team (RCT) TA Sheet

Division: __________  School: __________  Date: __________

Columns: Key Areas | Developmental Areas | Technical Assistance (TA) Desired by the School/Proposed | Time Frame | Outcomes | Acknowledged (Principal/DFTAT)

A. Access
B. Efficiency
C. Quality
D. Others

Prepared by: ______________________
Chair/VC/Member

Submission of List of Qualified Schools to the Regional Level

Step 2:
The Field Technical Assistance Division (FTAD) shall check the SBM Assessment Validation Form and, if found in order, endorse the form to the Regional SBM Coordinating Teams, which include at least one member from QuAD who will ensure the quality of the data submitted.

A Regional Memorandum shall be issued for the SBM Assessment Result indicating the scheduled School Validation for the qualified schools, to be facilitated by the FTAD.

The result of the SBM Validation and the schedule of the awarding ceremony will be issued through a Regional Memorandum.

The awardees will receive plaques and certificates of recognition.

SBM Validation Procedure of Document Analysis, Observation and Discussion (DOD)

The SBM Assessment Tool uses evidence to determine a school's level of practice, in which the Document Analysis, Observation and Discussion (DOD) is a means of evaluating the validity or accuracy of the evidence. There are three (3) essential steps in evaluating these evidences:

1. Conduct document analysis

Obtain and assemble all existing evidences related to the indicator being assessed. Evaluate the validity of the evidences against Relevance, Accuracy, Currency, Consistency and Sufficiency.

Relevance. The evidence must be appropriate to the indicator being assessed.

Accuracy. The evidence must be correct.
100
25
Currency. The evidence must be present, existing, or actual.

Consistency. The evidence must be verifiable and must generate the same results from most of the sources.

Sufficiency. The evidence must be adequate.

2. Conduct observations to obtain process evidence

Process evidence is obtained by scrutinizing the instructional, leadership and management styles, methods, techniques, approaches and activities used by the school community to achieve the SBM goal. Process evidence is used to cross-validate documentary evidence and is synthesized for the group discussion.

3. Discuss the synthesized documentary and process evidence

Conduct the discussion in a friendly, non-confrontational manner to explain, verify and clarify the evidence. Practices vary in establishing the level of practice of an indicator. The SBM validator may use either of these approaches:

Integrative. The entire body of evidence for all indicators is assembled first, scrutinized for internal consistency, and finally used as a guide in making a unified decision as to the level of practice to which an indicator belongs.

Non-integrative. The evidence for each indicator is scrutinized one by one and likewise classified one by one by level of practice.

IMPORTANT: Each validator should capture and document in detail the actual responses and observations on areas for improvement and highlights during the conduct of the DOD. These should be reflected in the form provided below.

Who Conducts the DOD?
School SBM Coordinating Team
Division SBM Coordinating Team
Regional SBM Coordinating Team

Principle 3: ACCOUNTABILITY AND CONTINUOUS IMPROVEMENT
3.1 Records of attendance of school heads in the Barangay assemblies/sessions
3.1.2 Records of the integration of the SIP in the Barangay Development Plan
3.1.3 School-based accountability system on processes, mechanisms and tools
3.1.4 Reports of the Review Committee
3.1.5 Financial resources, reports and liquidations

Principle 4: MANAGEMENT OF RESOURCES
4.1 Resource inventory (classrooms, desks, toilets, WATSAN, equipment)
4.1.1 School Report Card accomplished and reported
4.1.2 Record of gains of the implemented intervention
4.1.3 Annual AIP revision/adjustment conducted
4.1.4 Complete records on fiscal management
4.1.5 School committee to M&E on resource management
4.1.6 School Fiscal Management Council composed of parents, teachers and community stakeholders
4.1.7 Monthly financial statement signed by the Finance Management Committee of the school

(Reference: D.O. 83, s. 2012)
1.3.3 Narrative reports of the planning sessions conducted
1.3.4 Revisited SIP with community stakeholders' participation
1.3.5 Minutes of meetings, logbooks, attendance sheets
1.3.6 Progress report of the PIP/SIP
1.4 SIP Implementation Plan: committees for the four components of the SIP, with a clear definition of the roles and functions of their members
1.5 Professional Development Plan for school heads and community leaders based on training needs
1.5.1 Indigenized learning modules and corresponding report
1.5.2 Minutes of meeting on the consultative assembly
1.5.3 Reports of projects implemented
1.6 Baseline data of the school, gaps and needs, and interventions

OBSERVATIONS AND RESPONSES TEMPLATE:

Write below the actual responses and observations on areas for improvement and highlights during the conduct of the DOD.

SBM Principle: ___________________________

DATE | SOURCE (Document/Person Interviewed) | SITUATION/OBSERVATION
Appendix M. SBM Validation Mechanics, Matrix and Suggested Documents (Artifacts, Evidence as MOVs in Every Principle)

Suggested Respondents per SBM Principle during the DOD
Principle 4. Management of Resources
4.1 School Head
4.2 Supreme Student Government/Supreme Pupil Government Representative
4.3 School Planning Team
4.4 LGU/Barangay Chair/Representative
4.5 School Governing Council Chair/Representative of the School BAC Team
4.6 Property Custodian
4.7 Inspectorate Team
4.8 GPTA

Appendix L. Suggested Composition of the Division SBM Coordinating Team

Division SBM Coordinating Team

Chair: SDS
Vice-Chair: ASDS
Team Leader: Chief ES, SGOD
Co-Team Leader: Chief ES, CID
Team Members: Division SBM Coordinator
PSDSs
Education Program Supervisors
Senior Education Program Specialists
Division Engineer
Division Accountant
Administrative Officer (AO) V
Planning Officer
Education Program Specialists II
Supply Officer
Health and Nutrition Personnel
PDOs
SUGGESTED GUIDELINES IN THE GRANTING OF RECOGNITION AND INCENTIVES

Appendix K. Steps in Conducting the Post-Conference and Sample Post-Conference Data Captured Form
Appendix J. Division Technical Assistance Plan (Sample)

Department of Education
Region X
Division Technical Assistance Plan (DTAP)

Division: ____________  School Year: ____________
District: ____________  Month/Date: ____________
No. of identified schools: __________

School | Issues/Concerns/Development Areas | Technical Assistance (TA) to be Provided | Time Frame | TA Provider | Means of Verification (MOVs)

Legend: TA - Technical Assistance
Prepared by: ____________

PROVISION OF TECHNICAL ASSISTANCE

Technical assistance is any form of professional help, guidance or support given to the schools by the SBM Coordinating Team for the school heads and teachers so they can be more effective in the performance of their functions. The giving of technical assistance shall be done in a nurturing manner. Technical assistance can take three forms:

Information Sharing. Within this area are policies, guidelines, directions, and instructions of top DepEd management. They are usually delivered through office memoranda/orders, conferences and referrals.

Capacity Building. This area refers to the development of competencies or knowledge, skills, and attitudes. More often, this type of technical assistance is delivered through training, orientation, workshops, coaching or mentoring.

The DOD Result (Template 2) shows the activities, the assistance needed and the course(s) of action to be undertaken by the schools after the SBM assessment validation. The school then provides a consolidated list of these courses of action/activities to the SDO for the technical assistance needed.
School heads must make sure that the activities identified in the Assessment Result Template (Template 3) are reflected in the School Improvement Plan (SIP) and then translated into concrete actions in the Annual Implementation Plan (AIP). SBM areas that cannot be addressed within a year will also be plotted in the three-year Work and Financial Plan (WFP) of the SIP.

Having a clear grasp of the technical requirements necessary to implement the SIP/AIP provisions, the school head shall exhaust all means to avail of technical assistance from the Division Office. (Refer to Appendix I: SBM Practices Assessment Result Template.)

Division Level

The SBM Assessment Result provides statistical data on the number of schools at the different levels of SBM practice. Below are the steps to be undertaken by the division:

1. The Division SBM Coordinator shall take the lead in consolidating the summary of responses from all the schools in the division.
2. The Division Field Technical Assistance Team (DFTAT) shall identify the developmental areas with reference to the four principles of SBM.
3. The DFTAT develops a Technical Assistance Plan (TAP) to address the identified developmental areas (Template 4).

Regional Level

The SBM Assessment Results may be used as a basis for technical assistance and for making policy recommendations/proposals to support the course(s) of action from the SDOs. (Refer to Appendix J: Division Technical Assistance Plan.)

Appendix I. SBM Practices Assessment Result Template

Level of SBM Practice: ________

SBM Principle | Indicator | Actions to be Taken | Time Frame | By Whom | Resources Needed (Human and/or Material)
Appendix H. Suggested Parts of the Program during School Validation

I. Courtesy Call to the School Head
II. Welcome Program (15 minutes only)
    Pambansang Awit
    Invocation
    Welcome Remarks by the School Head
    Overview of the School SBM Best Practices (5-minute video presentation)
    Statement of Purpose
    Presentation of Validators per SBM Principle by the SBM Coordinating Team Leader
III. Validation Proper
IV. Post-Conference by the SBM Coordinating Team
V. Exit Conference by the SBM Coordinating Team Leader

References

DepED SBM Manual.pdf - A Manual on the Assessment of School-Based Management Practices, Republic of the Philippines, Department of Education. (n.d.). Retrieved from https://www.coursehero.com/file/27447257/DepED-SBM-Manualpdf/

DO 82, s. 1999 - Signing Authority for the Third Elementary Education Project (TEEP) and Secondary Education Development and Improvement Project (SEDIP) | Department of Education. (n.d.). Retrieved from http://www.deped.gov.ph/orders/do-82-s-1999

DO 83, s. 2012 - Implementing Guidelines on the Revised School-Based Management (SBM) Framework, Assessment Process and Tool (APAT) | Department of Education. (n.d.). Retrieved from http://www.deped.gov.ph/orders/do-83-s-2012

Third Elementary Education Project (English) | The World Bank. (n.d.). Retrieved from http://documents.worldbank.org/curated/en/989331468758996223/Philippines-Third-Elementary-Education-Project

Republic Act No. 9155 | GOVPH. (2001, August 11). Retrieved from http://www.officialgazette.gov.ph/2001/08/11/republic-act-no-9155/
Graphics

Once the cells in the Dataset Form are filled up, click the SUBMIT button. The data are automatically consolidated, tabulated, and graphed in the dashboard, which is found in the link.

APPENDICES
Appendix A. DepEd Order No. 83, s. 2012

Furthermore, there are two groups of links that aid directly in accessing the profiles of the SDOs.

Example:
Percentage Distribution of Elementary and Secondary Schools in the Schools Division of __________, by SBM Level of Practice, School Year 2018-2019

Note: Use a password to open the links. The password shall be given only to the designated division SBM Coordinator by the FTAD representative.
VII. DIVISION OF LANAO DEL NORTE
Code: LDN
Total number of schools: 377
LDN-1: first school listed
LDN-377: last school listed

VIII. DIVISION OF MALAYBALAY
Code: MBLY
Total number of schools: 89
MBLY-1: first school listed
MBLY-89: last school listed

IX. DIVISION OF MISAMIS OCCIDENTAL
Code: MISOC
Total number of schools: 346
MISOC-1: first school listed
MISOC-346: last school listed

X. DIVISION OF MISAMIS ORIENTAL
Code: MISOR
Total number of schools: 444
MISOR-1: first school listed
MISOR-444: last school listed

XI. DIVISION OF OROQUIETA
Code: ORQT
Total number of schools: 55
ORQT-1: first school listed
ORQT-55: last school listed

XII. DIVISION OF OZAMIZ CITY
Code: OZMZ
Total number of schools: 63
OZMZ-1: first school listed
OZMZ-63: last school listed

XIII. DIVISION OF TANGUB CITY
Code: TNGB
Total number of schools: 74
TNGB-1: first school listed
TNGB-74: last school listed

XIV. DIVISION OF VALENCIA CITY
Code: VLNC
Total number of schools: 67
VLNC-1: first school listed
VLNC-67: last school listed
The Masterlist of Public Elementary and Secondary Schools is taken from the Enhanced Basic Education Information System (EBEIS). The following are the assigned codes for the schools by division (schools are arranged in alphabetical order):

I. DIVISION OF BUKIDNON
Code: BUK
Total number of schools: 612
BUK-1: first school listed
BUK-612: last school listed

II. DIVISION OF CAGAYAN DE ORO CITY
Code: CDO
Total number of schools: 108
CDO-1: first school
CDO-108: last school

III. DIVISION OF CAMIGUIN
Code: CAM
Total number of schools: 72
CAM-1: first school
CAM-72: last school

IV. DIVISION OF EL SALVADOR CITY
Code: EL_SAL
Total number of schools: 18
EL_SAL-1: first school
EL_SAL-18: last school

V. DIVISION OF GINGOOG CITY
Code: GNGOG
Total number of schools: 102
GNGOG-1: first school
GNGOG-102: last school

VI. DIVISION OF ILIGAN CITY
Code: ILGN
Total number of schools: 117
ILGN-1: first school
ILGN-117: last school

Appendix B. List of SBM Acronyms

School-Based Management (SBM) ACRONYMS

ACCESs - A Child & Community Centered Education System
AIP - Annual Implementation Plan
ALS - Alternative Learning System
APAT - Assessment Process and Tool
BAC - Bids and Awards Committee
BESRA - Basic Education Sector Reform Agenda
CID - Curriculum Implementation Division
CLMD - Curriculum and Learning Management Division
CO - Central Office
COA - Commission on Audit
CR - Completion Rate
CSR - Cohort-Survival Rate
DepEd - Department of Education
DFTAT - Division Field Technical Assistance Team
DOD - Document-Analysis, Observation and Discussion
DR - Drop-out Rate
DTAP - Division Technical Assistance Plan
EBEIS - Enhanced Basic Education Information System
ECCD - Early Childhood Care and Development
ECE - Early Childhood Education
EGRA - Early Grade Reading Assessment
ELLNA - Early Language Literacy and Numeracy Assessment
ESIP - Enhanced School Improvement Plan
ESSD - Education Support and Services Division
FD - Finance Division
FGD - Focus-Group Discussion
FMDP - Financial Management Development Plan
FTAD - Field Technical Assistance Division
GASTPE - Government Assistance to Students and Teachers in Private Education
GPTA - General Parents Teachers Association
HRDD - Human Resource Development Division
HRDP - Human Resource Development Plan
INSET - In-Service Training
IPCRF - Individual Performance Commitment Review Form
IPPD - Individual Plan for Professional Development
KPIs - Key Performance Indicators
KRA - Key Result Area
LAC - Learning Action Cell
LGU - Local Government Unit
LSB - Local School Board
M&E - Monitoring and Evaluation
MOA - Memorandum of Agreement
MOOE - Maintenance and Other Operating Expenses
MOU - Memorandum of Understanding
MOV - Means of Verification
NAT - National Achievement Test
NCBTS - National Competency-Based Teacher Standards
NGO - Non-Governmental Organization
OPCRF - Office Performance Commitment Review Form
PBB - Performance-Based Bonus
PI - Performance Improvement
PPRD - Policy Planning and Research Division
QuAD - Quality Assurance Division
RAMP - Resource Allocation and Mobilization Plan
SBM - School-Based Management
SBMF - School-Based Management Fund
SBP - School Building Program
SDO - Schools Division Office
SEF - Special Education Fund
SGC - School Governing Council
SGOD - School Governance and Operations Division
SH - School Head
SIP - School Improvement Plan
SMEA - School Monitoring, Evaluation and Adjustment
SOB - School Operating Budget
SOSA - State of the School Address
SPDP - School Physical Development Plan
SPG - Supreme Pupil Government

Validation Results

School Improvement Performance - 60%

This is determined by three thematic areas:

Access (45%) - The indicator assessed in this area is the rate of enrolment for the last three (3) consecutive years.

Efficiency (25%) - The indicators assessed under this area are the Drop-out Rate and the Repetition Rate.

Quality (30%) - The indicator assessed under this area is the average increase in the NAT/promotion results for the past three years.

SBM Principles and Practices - 40%

Leadership and Governance (4 indicators) - 30%
Curriculum and Learning (7 indicators) - 30%
Accountability and Continuous Improvement (5 indicators) - 25%
Management of Resources (5 indicators) - 15%

SBM Level of Practice

This section summarizes the overall results of the Key Performance Indicators and the DOD. It shows the school's current SBM level of practice: Developing (Level I), Maturing (Level II), or Advanced (Level III).
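The weighting scheme above can be illustrated with a short sketch. This is a minimal illustration only, assuming each thematic area and principle has already been scored on a common 0-100 scale; the scale, the sample scores, and the function name are assumptions for the example, not the official DepEd computation.

```python
# Weights taken from the Validation Results section above.
IMPROVEMENT_WEIGHTS = {"access": 0.45, "efficiency": 0.25, "quality": 0.30}
PRINCIPLE_WEIGHTS = {
    "leadership_governance": 0.30,
    "curriculum_learning": 0.30,
    "accountability_improvement": 0.25,
    "management_resources": 0.15,
}

def overall_rating(improvement: dict, principles: dict) -> float:
    """Combine the two components: 60% School Improvement Performance
    plus 40% SBM Principles and Practices. Each input maps an area to a
    score on an assumed common 0-100 scale."""
    imp = sum(improvement[k] * w for k, w in IMPROVEMENT_WEIGHTS.items())
    pri = sum(principles[k] * w for k, w in PRINCIPLE_WEIGHTS.items())
    return 0.60 * imp + 0.40 * pri

# Hypothetical sample scores for one school:
improvement = {"access": 90, "efficiency": 80, "quality": 85}
principles = {
    "leadership_governance": 88,
    "curriculum_learning": 82,
    "accountability_improvement": 75,
    "management_resources": 70,
}
print(round(overall_rating(improvement, principles), 2))  # 83.7
```

Note that each weight group sums to 1.0, so the result stays on the same 0-100 scale as the inputs.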
SPPD - School Plan for Professional Development
SRC - School Report Card
SSG - Supreme Student Government
TA - Technical Assistance
TAP - Technical Assistance Plan
TSNA - Teacher's Strengths and Needs Assessment

School's Dataset

The School's Dataset contains all the entries inputted by the division SBM coordinator for each division. In this sheet, all schools in the fourteen (14) schools divisions are arranged in alphabetical order and given corresponding codes.

Cells in blue must be filled up; that is, they require input from the division SBM Coordinator. The remaining cells, in orange, are to be filled up by the FTAD representative and are therefore protected.
Rationale for the Creation of the ETT

The Electronic Tracking Tool, referred to here as the ETT, was conceptualized in response to the felt need of DepEd Regional Office 10 for real-time monitoring of School-Based Management (SBM) as currently practiced in all schools in the fourteen schools divisions of DepEd Region 10.

Likewise, it would lessen the cost incurred by the Regional Education Program Supervisors as they carry out their regular monitoring functions vis-à-vis the SBM program.

Region's Dashboard

Appendix C.1. SBM Self-Assessment Form Sample (Main Menu)
SBM Electronic Tracking Tool Manual

Appendix C.2. SBM Self-Assessment Form Sample (Sub-Menu)
Appendix C.3. SBM Self-Assessment Form Sample (Option 1)

Appendix G. SBM Electronic Tracking Tool Manual (ETT)

Objectives

1. to upgrade practices as provided for in DO 83, s. 2012;
2. to strengthen SBM implementation in all schools, divisions, and in the Region;
3. to develop a region-wide tracking system for all schools in the region on their respective SBM level of implementation;
4. to fast-track data gathering from the field in terms of SBM;
5. to make data available anytime as a basis for the provision of technical assistance and the conduct of research for policy making; and
6. to easily spot potential schools for higher accreditation.
Appendix C.4. SBM Self-Assessment Form Sample (Option 1)

11. Click this button to open the DOD assessment and evaluate every indicator based on the means of verification (facts/evidence).

12. A sub-menu will open that gives you a hint on which option is the best combination. It can help you decide whether to proceed to the DOD or to give technical assistance.

13. After assessing your DOD principle based on the MOVs, click the back arrow to return to the sub-menu. Then go back to the best option that qualifies for the requirement, which is PI >= 1.50 (60%). Click the option and print the output.
Appendix C.5. SBM Self-Assessment Form Sample (Option 2)

10. Accomplish the sub-menu.
Appendix C.6. SBM Self-Assessment Form Sample (Option 2)

8. Click SBM Validation and DOD Assessment to view the initial result of the Performance Improvement.

9. A sub-menu will open that gives you a hint on which option is the best combination. It can help you decide whether to give technical assistance; if not, qualify any of the options and proceed to the DOD if qualified. In this example, Option 1 qualified for the next level: its P.I. is greater than or equal to 2.50, and the rest of the options did not qualify. The cells are automatically filled in black.
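The qualification logic the tool applies in steps 8-9 (and the PI >= 1.50 requirement mentioned in step 13) amounts to a simple threshold check, which can be sketched as follows. The option names and PI values below are hypothetical, chosen only to mirror the example in step 9.

```python
def qualifies_for_dod(pi: float, threshold: float = 2.50) -> bool:
    """Return True when an option's Performance Improvement (PI)
    meets the threshold; otherwise the school is routed to
    technical assistance instead of proceeding to the DOD."""
    return pi >= threshold

# Hypothetical PI values for three options, as in the step 9 example:
options = {"Option 1": 2.65, "Option 2": 2.10, "Option 3": 1.95}
for name, pi in options.items():
    verdict = "proceed to DOD" if qualifies_for_dod(pi) else "provide technical assistance"
    print(f"{name}: PI = {pi:.2f} -> {verdict}")
```

Passing a different `threshold` (for example 1.50 for the step 13 requirement) reuses the same check.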
Appendix C.7. SBM Self-Assessment Form Sample (Option 3)

5. Finally, enter the names of the validators in the space provided.

7. Click the school profile button to enter the data of your school.
Appendix C.8. SBM Self-Assessment Form Sample (Option 3)

4. Accomplish the Thematic Area.
Appendix C.9. SBM Self-Assessment Form Sample (Option 4)

To use the template, load it and enable the macros by clicking the yellow warning sign "SECURITY WARNING: Macros have been disabled."

To do that:

1. Click Enable Content.
2. Enter the school ID. If it is not found, update/change it in the maintenance menu below, go back to the previous screen by clicking the back arrow, and enter the school ID again.
3. Click the drop-down list under the year column and select your baseline year for validation.
Appendix C.10. SBM Self-Assessment Form Sample (Option 4)

Appendix F. Electronic SBM Toolkit User's Guidelines

Steps to Accomplish the Electronic SBM Validation and DOD Assessment Tool
Appendix E. Validation Plan Template
Appendix C.12. SBM Self-Assessment Form Sample (Option 5)

Principle IV - Management of Resources (15%)

Indicator/Standard | Artifacts | Rating (1 2 3) | Remarks
Appendix C.13. SBM Self-Assessment Form Sample (Option 6)
Appendix C.14. SBM Self-Assessment Form Sample (Option 6)
Appendix D. Revised SBM Assessment Tool DOD (Sample)

Department of Education
Region X
Zone 1, Upper Balulang, Cagayan de Oro City
Principle I - Leadership and Governance (30%)

Indicator/Standard | Artifacts | Rating (1 2 3) | Remarks

Principle III - Accountability and Continuous Improvement (25%)

Indicator/Standard | Artifacts | Rating (1 2 3) | Remarks
Principle II - Curriculum and Learning (30%)

Indicator/Standard | Artifacts | Rating (1 2 3) | Remarks