
PROFESSIONAL EDUCATION

PrEd 146 & 161n: Assessment in Learning

TERMINOLOGIES
Test
 instrument or technique that measures someone's knowledge
 tool for measuring any quality, ability, skill, or knowledge
 Example: checking students' ability to use a microscope; practical exam in P.E. 11
Measurement
 quantitative description of attributes
 an instrument or tool used to assess human intelligence, achievement, temperament, attitudes, or anything else expressed quantitatively
 Example: students' scores, grades submitted
Assessment
 process of gathering evidence of students' performance over a period of time to determine learning
 rooted in the Latin word assidere, which means "sitting next to another"
 Example: entrance examination, daily quiz
Evaluation
 process of judging the quality of a performance
 a process of systematic collection, analysis, interpretation, appraisal, or judgment of the worth of organized data as a basis for decision-making
 Example: reporting examination results, level of proficiency, pass or fail

LEARNING TARGETS
Learning targets are statements of what learners are supposed to learn and what they can do as a result of instruction. Compared with educational goals, standards, and objectives, learning targets are the most specific and lead to more specific instructional and assessment activities.

TYPES OF LEARNING TARGETS
Knowledge
 refers to factual, conceptual, and procedural information that learners must learn in a subject or content area
 Example: I can explain the role of the conceptual framework in a research study.
Reasoning
 knowledge-based thought processes that learners must learn; it involves the application of knowledge in problem-solving, decision-making, and other tasks that require mental skills
 Example: I can justify my research problem with a theory.
Skills
 use of knowledge and/or reasoning to perform or demonstrate physical skills
 Example: I can facilitate a focus group discussion (FGD) with research participants.
Product
 use of knowledge, reasoning, and skills in creating a concrete or tangible product
 Example: I can write a thesis proposal.
Affect
 refers to affective characteristics that students can develop and demonstrate because of instruction; it includes attitudes, beliefs, interests, and values
 Example: I can appreciate the importance of addressing potential ethical issues in the conduct of thesis research.

APPROACHES TO ASSESSMENT
  TRADITIONAL             AUTHENTIC
  Selecting a Response    Performing a Task
  Contrived               Real-life
  Recall/Recognition      Construction/Application
  Teacher-structured      Student-structured
  Indirect Evidence       Direct Evidence
  Convergent Assessment   Divergent Assessment

TYPES OF TEST
According to kind of learning:
 Achievement test (assesses how well a person has learned specific content or mastered certain skills)
 Aptitude test (measures a person's potential to learn or perform well in future tasks or activities)
According to ability:
 Speed test (focuses on rapid decision-making and efficiency)
 Power test (measures the individual's maximum performance level)
According to interpretation of learning:
 Norm-referenced (compares an individual's performance to that of a group of peers, the norm group)
 Criterion-referenced (measures an individual's performance against predetermined criteria or standards)
According to scoring:
 Objective test (has clear and specific scoring criteria, often in the form of right or wrong answers)
 Subjective test (relies on the judgment or interpretation of a grader)
According to preparation:
 Standardized test (designed to ensure fairness and comparability across test-takers)
 Teacher-made test (created by individual educators to assess specific learning objectives or classroom content)
Other types:
 Intelligence test (measures various aspects of cognitive ability, such as reasoning, problem-solving, and verbal or nonverbal abilities)
 Prognostic test (predicts future performance or outcomes in a particular area, such as academic success or job performance)
 Placement test (used to determine an individual's level of proficiency or readiness for a particular course, program, or job role)

DOMAIN I (COGNITIVE DOMAIN)
Bloom's Taxonomy is a framework used in education to classify educational objectives and cognitive skills into hierarchical levels. It was developed by Benjamin Bloom and his colleagues in the 1950s, with subsequent revisions and adaptations over the years.
Remembering - recalling or recognizing information.
Understanding - grasping the meaning of information.
Applying - using information in new or different contexts.
Analyzing - breaking down information into its component parts and understanding the relationships between them.
Evaluating - making judgments or assessments based on criteria and evidence.
Creating - the highest level; generating new ideas, products, or solutions.

LEVELS OF KNOWLEDGE
Factual knowledge
 specific facts, details, or data points
Conceptual knowledge
 understanding abstract ideas, principles, theories, or frameworks
Procedural knowledge
 knowing how to do something or perform a task
Metacognitive knowledge
 awareness and understanding of one's own cognitive processes

DOMAIN II (PSYCHOMOTOR DOMAIN)
Observing - watching with active mental attention to a physical activity. Illustrative verbs: identify, discern, distinguish, differentiate, describe, relate, select. Sample objective: Pick up non-verbal communication cues.
Imitating - attempting to copy a physical behavior. Illustrative verbs: begin, explain, move, display, proceed, react, show, state, volunteer. Sample objective: Recognize one's limitations.
Practicing - performing a specific activity repeatedly. Illustrative verbs: bend, calibrate, construct, fasten, grasp, manipulate. Sample objective: Operate quickly and accurately.
Adapting - fine-tuning the skill and making minor adjustments to attain perfection. Illustrative verbs: organize, relax, shorten, sketch, write, create, design. Sample objective: Construct a new scheme or sequence.

DOMAIN III (AFFECTIVE DOMAIN)
The affective domain refers to how we deal with situations emotionally, such as feelings, appreciation, enthusiasm, motivation, values, and attitudes. The taxonomy is ordered into five levels as the individual progresses toward internalization, where beliefs or feelings constantly influence or regulate the person's behavior.
Receiving - being aware of something or alert to it and able to listen or pay attention. Illustrative verbs: click, point to, sit, choose, explain, describe, follow, hold, identify, name, reply. Sample objective: Listen to others respectfully.
Responding - showing commitment to respond in some measure to the idea or phenomenon. Illustrative verbs: answer, assist, comply, conform, discuss, greet, help, perform, practice, read, recite, report, write, tell. Sample objective: Participate in a discussion.
Valuing - showing a willingness perceived as valuing or favoring specific ideas. Illustrative verbs: complete, demonstrate, differentiate, explain, follow, invite, join, justify, propose, report, share, study, perform. Sample objective: Show ability to resolve.
Organizing - arranging values into priorities, creating a unique value system by comparing, relating, and synthesizing values. Illustrative verbs: arrange, combine, complete, adhere, alter, defend, explain, formulate, integrate, organize, connect, synthesize. Sample objective: Prioritize time effectively for family.
Characterizing - practicing a value system that controls one's behavior; exhibiting behavior that is pervasive, predictable, and characteristic of the person. Illustrative verbs: act, view, control, listen, execute, modify, perform, revise, solve, verify. Sample objective: Show self-trust when asking.

ASSESSMENT GUIDING PRINCIPLES
1. Assessment of learning is an integral part (sine qua non) of the teaching-learning process.
2. The assessment tool should match the performance objective.
3. The results of assessment must be fed back to the learners.
4. In assessing learning, teachers must consider learners' learning styles and multiple intelligences.
5. To help build a culture of success, it is sound to give some positive feedback along with the not-so-good ones.
6. Emphasize self-assessment.
7. The bell curve mentality must be abandoned.
8. Assessment of learning should never be used as a punishment or as a disciplinary measure.
9. Results of learning assessment must be communicated regularly and clearly to parents.
10. Emphasize real-world application.
11. Assessment works best when it is continuous, ongoing, and cumulative.
12. To ensure reliability of assessment results, use multiple sources.
13. Set your acceptable standard or criterion of success.
14. Outcomes-based assessment focuses on student activities that will still be relevant after formal schooling concludes.
15. Assessment requires attention to supporting student activities.
16. Assessment must be manageable for both teachers and students.
TYPES OF ASSESSMENT

TRADITIONAL ASSESSMENT
Selected Response
 True-False
 Matching Type - perfect, imperfect
 Multiple Choice
Constructed Response
 Completion
 Supply Type
 Essay - restricted, non-restricted/extended
 Enumeration

AFFECTIVE ASSESSMENT
 covers cognitions, affect, behavioral intentions, and evaluation
 Assessment tools:
1. Self-report
 There are varied ways for students to express their affect through self-report. The most common and direct way is a casual conversation or interview. Students can also respond to a written questionnaire or survey about themselves or other students.
2. Rating Scale
 Example: Rating Scale (Attitude towards Mathematics)
 Directions: Put the score on the column for each statement as it applies to you. Use 1 to 5, 1 being the lowest and 5 the highest possible score.
   Statement                                   SCORE
   I am happy during Mathematics class.        _____
   I get tired doing board work and drills.    _____
   I enjoy solving word problems.              _____
3. Thurstone and Likert Scales
 Example:
 Directions: Put a check on the column for each statement that applies to you.
 Legend: SA - Strongly Agree (5), A - Agree (4), U - Undecided (3), D - Disagree (2), SD - Strongly Disagree (1)
   Statement                                   SA   A   U   D   SD
   I am happy during Mathematics class.
   I get tired doing board work and drills.
   I enjoy solving word problems.
4. Semantic Differential (SD)
 Example:
 Mathematics:
   Boring    __ __ __ __ __ __ __ __ __  Interesting
   Important __ __ __ __ __ __ __ __ __  Useless
5. Sentence Completion
 Examples:
   I think Mathematics as a subject is _____________________.
   I like my Mathematics teacher the most because ___________.

CONSTRUCTING PERFORMANCE TASKS
The GRASPS framework is a useful tool for constructing performance tasks in educational settings. It helps teachers design tasks that are authentic and meaningful, allowing students to apply their knowledge and skills in real-world contexts.
Goal - Define the overarching objective of the task.
Role - Specify the role or perspective that students will adopt when completing the task.
Audience - Identify the audience for whom the task is intended.
Situation - Describe the context or situation in which the task takes place.
Product - Determine what students will create or produce as a result of completing the task.
Standard - Clearly articulate the standards or criteria that will be used to evaluate students' performance on the task.
SCORING RUBRICS
(Criteria and Levels of Performance)
General
 can be applied to different tasks
Analytical
 specifies levels of performance for each criterion
 suited to checking long-term projects
Holistic
 assesses performance across multiple criteria as a whole
 suited to quick checking of projects
Task-specific
 used to evaluate a specific task

COMMON RATING SCALE ERRORS
1. Leniency Error
 Occurs when a teacher tends to make almost all ratings toward the high end of the scale, avoiding the scale's low end.
2. Severity Error
 A teacher tends to make almost all ratings toward the low end of the scale. This is the opposite of the leniency error.
3. Central Tendency Error
 Occurs when a teacher hesitates to use extremes and uses only the middle part of the scale.
4. Halo Effect
 Occurs when a teacher lets his/her general impression of the student affect how he/she rates the student on specific dimensions.
5. Personal Bias
 Occurs when a teacher has a general tendency to use inappropriate or irrelevant stereotypes, such as favoring boys over girls.
6. Logical Error
 Occurs when a teacher gives similar ratings to two or more dimensions that the teacher believes to be related when in fact they are not related at all.
7. Rater Drift
 Occurs when raters, whose ratings originally agreed, begin to redefine the rubrics for themselves.

PERFORMANCE-BASED ASSESSMENT
Four types of learning targets used in performance assessment:
Deep Understanding
 The idea is to involve students meaningfully in hands-on activities for extended periods of time so that their understanding is rich and more extensive than what can be attained through more conventional instruction and traditional paper-and-pencil assessments.
Reasoning
 Reasoning is essential in performance assessment as students demonstrate skills and construct products.
Skills
 Psychomotor skills describe clearly the physical action required for a given task. These may be developmentally appropriate skills or skills needed for specific tasks: fine motor skills, gross motor actions, more complex athletic skills, some visual skills, and verbal/auditory skills for young children.
Products
 These are completed works, such as term papers, projects, and other assignments, in which students use their knowledge and skills.

PORTFOLIO ASSESSMENT
Portfolio
 a purposeful collection of student work that demonstrates the student's skills and accomplishments (Belgrad, 2008)
 includes artifacts, reproductions, attestations, and productions
Types:
1. Standards-based portfolio
 Collects evidence that links student achievement to particular learning standards. It focuses on specific standards that are predetermined by the teacher and discussed with the students at the start of the school year.
2. Development / documentation / growth portfolio
 Displays changes and accomplishments related to academic performance over time. The assembled work samples provide evidence of student growth and also give students meaningful opportunities for self-evaluation.
3. Process portfolio
 Shows the steps and/or the results of a completed project or task as its primary goal. It is very useful because the final product does not always show the skills and knowledge that the student used in completing the project.
4. Product portfolio
 Similar to the process portfolio, except that its focus is on the end product rather than on the process by which the product was developed. In this type of portfolio, there is little or no information about the steps used in crafting the product.
5. Showcase / best work portfolio
 Shows the best of the student's work. This type of portfolio is based on the student's personal criteria rather than the criteria of the teacher.
Elements:
1. Cover sheet
2. Table of contents
3. Work samples
4. Dates of all sample works, to document evidence of growth over time
5. Drafts of the written products, or even the seminal attempts at the write-ups for the portfolio, and the revised versions based on the corrected drafts
6. Self-assessment
7. Future goals
8. Others' comments and assessments

TEST DEVELOPMENT AND VALIDATION

CONSTRUCTING PEN-AND-PAPER TESTS
Steps in Planning for a Test
1. Identifying test objectives/lesson outcomes
2. Deciding on the type of objective test to be prepared
3. Preparing a Table of Specifications (TOS)
4. Constructing the draft test items
5. Try-out and validation
Table of Specifications
 A "test map" that guides the teacher in constructing a test; it includes the objectives, the levels of the objectives to be tested, the items, and their corresponding percentages.
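The handout does not include a worked TOS, but the core arithmetic behind one (allocating test items in proportion to each objective's weight) can be sketched as follows. This is a minimal illustration in Python; the objectives, weights, and 50-item test length are assumptions made for the example.

```python
# Minimal sketch of a Table of Specifications (TOS): allocate test items to
# objectives in proportion to their weight (e.g., share of instructional time).
# The objectives, weights, and total item count below are illustrative only.

objectives = [
    ("Recall basic concepts (Remembering)", 0.20),
    ("Explain principles (Understanding)",  0.30),
    ("Solve routine problems (Applying)",   0.30),
    ("Analyze cases (Analyzing)",           0.20),
]
total_items = 50

allocated = []
for name, weight in objectives:
    allocated.append([name, weight, round(weight * total_items)])

# Adjust the last row for rounding so the allocation still sums to the test length.
difference = total_items - sum(row[2] for row in allocated)
allocated[-1][2] += difference

print(f"{'Objective':40} {'Weight':>7} {'Items':>6}")
for name, weight, items in allocated:
    print(f"{name:40} {weight:>6.0%} {items:>6}")
```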
CHARACTERISTICS OF ASSESSMENT METHODS
1. Clear and Appropriate Learning Target
 Assessment can be made precise, accurate, and dependable only if what is to be assessed is explicitly defined.
 Target - denotes something that is observed through the behavior of the students.
2. Appropriateness of Assessment Methods
3. Balance
 A balanced assessment takes advantage of both traditional and alternative assessment.
 A balanced assessment sets targets in all domains of learning or areas of intelligence.
4. Validity
 the extent to which a test measures what it intends to measure
 Face validity
  accomplished by examining the physical appearance of the instrument
 Content validity
  done through a careful and critical examination of the objectives of assessment so that they reflect the curricular goals
 Criterion-related validity / concrete validity
  established statistically, such that the set of scores revealed by the measuring instrument correlates with the scores obtained from another external predictor or measure
  two purposes:
   concurrent validity (the present status of the individual)
   predictive validity (the future performance of the individual)
 Construct validity
  statistically calculated by comparing psychological characteristics and variables that potentially influence scores in a test
  Types:
   Convergent validity (the scores also relate to measures of the same or a closely related trait)
   Divergent/discriminant validity (only the intended trait is described, not other, unrelated traits)
5. Reliability
 consistency of the scores obtained
 Methods of establishing reliability:
  Test-retest method
  Equivalent forms
  Split-half method
  Scorer consistency
  Internal consistency (Kuder-Richardson 21)

 Pearson r Formula (used to correlate two sets of scores, e.g., pretest and posttest in the test-retest method; reproduced in the sketch at the end of this reliability section):

  r = [ΣXY/N - (ΣX/N)(ΣY/N)] / [√(ΣX²/N - (ΣX/N)²) × √(ΣY²/N - (ΣY/N)²)]

 Example:
  Pretest (X)   Posttest (Y)   X²    Y²    XY
      5              8         25    64     40
      5              9         25    81     45
      6              6         36    36     36
      6              5         36    25     30
      7              9         49    81     63
      7             10         49   100     70
      8              6         64    36     48
      8              8         64    64     64
      9              9         81    81     81
      9             10         81   100     90
  ΣX = 70, ΣY = 80, ΣX² = 510, ΣY² = 668, ΣXY = 567, N = 10
 Solution:
  r = [567/10 - (70/10)(80/10)] / [√(510/10 - (70/10)²) × √(668/10 - (80/10)²)]
    = (56.7 - 56) / (√2 × √2.8)
    = 0.7 / 2.366
    ≈ 0.30
 Interpretation: Questionable reliability (0.50 or below on the scale that follows).

 Scale:
  0.90 and above = excellent reliability
  0.81 - 0.90 = very good for a classroom test
  0.71 - 0.80 = good for a classroom test
  0.61 - 0.70 = somewhat low
  0.51 - 0.60 = suggests need for revision of the test
  0.50 or below = questionable reliability

 Spearman-Brown Formula
  reliability of test = 2(r_half) / (1 + r_half)
  Where: r_half = reliability (correlation) of the half test
 Example:
  Compute the Spearman-Brown reliability index if the correlation between the odd and even scores is 0.84.
 Solution:
  reliability of test = 2(0.84) / (1 + 0.84) = 1.68 / 1.84 ≈ 0.913
 Interpretation: Excellent reliability.
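The two computations above can be reproduced with a short script. This is a minimal sketch in plain Python; the function names are mine, and the data come directly from the examples above (the pretest/posttest scores and r_half = 0.84).

```python
from math import sqrt

# Pearson r between two sets of scores (pretest X, posttest Y from the example).
def pearson_r(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum(xi * yi for xi, yi in zip(x, y)) / n - mean_x * mean_y
    sd_x = sqrt(sum(xi ** 2 for xi in x) / n - mean_x ** 2)
    sd_y = sqrt(sum(yi ** 2 for yi in y) / n - mean_y ** 2)
    return cov / (sd_x * sd_y)

# Spearman-Brown correction: full-test reliability from the half-test correlation.
def spearman_brown(r_half):
    return 2 * r_half / (1 + r_half)

pretest  = [5, 5, 6, 6, 7, 7, 8, 8, 9, 9]
posttest = [8, 9, 6, 5, 9, 10, 6, 8, 9, 10]

print(round(pearson_r(pretest, posttest), 2))   # 0.3   -> questionable reliability
print(round(spearman_brown(0.84), 3))           # 0.913 -> excellent reliability
```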
 Kuder-Richardson Formula 21 (KR-21)

  KR21 = [K / (K - 1)] × [1 - M(K - M) / (K × s²)]

  Where: K = number of items on the test
         M = mean of the test scores
         s² = variance of the test scores
 Example:
  A 50-item test was administered to a group of 20 students. The mean score was 35 while the standard deviation was 5.5. Compute the KR-21 index of reliability.
 Solution:
  Given: K = 50 items; M = 35; s = 5.5 (s² = 30.25); 20 examinees
  KR21 = [50 / (50 - 1)] × [1 - 35(50 - 35) / (50 × 30.25)]
       = (50/49) × (1 - 525/1,512.5)
       = 1.0204 × 0.6529
       ≈ 0.67
 Interpretation: Somewhat low reliability (0.61 - 0.70 on the scale above).

ITEM ANALYSIS AND VALIDATION
1. Difficulty index
 also known as the p-value
 describes how easy or difficult a test item is
 Formula: Df = (Pu + PL) / 2
  Where: Df = difficulty index
         Pu = proportion of the upper group who got the item correct
         PL = proportion of the lower group who got the item correct
 Scale:
  Range         Interpretation         Action
  0.00 - 0.20   Very difficult         Reject
  0.21 - 0.40   Difficult              Revise
  0.41 - 0.60   Moderately difficult   Retain
  0.61 - 0.80   Easy                   Revise
  0.81 - 1.00   Very easy              Reject
2. Discrimination index
 describes whether an item discriminates between the students who belong to the lower group and those in the upper group
  (-) = more from the lower group got the item correct
  (+) = more from the upper group got the item correct; in general, (+) = retain, (-) = reject
 Formula: Ds = Pu - PL
  Where: Ds = discrimination index
         Pu = proportion of the upper group who got the item correct
         PL = proportion of the lower group who got the item correct
 Scale:
  Range             Interpretation                                   Action
  -1.00 to -0.50    Can discriminate, but the item is questionable   Reject
  -0.49 to 0.45     Non-discriminating                               Revise
  0.46 to 1.00      Discriminating                                   Retain

Difficulty Index vs. Discrimination Index
  Difficulty index (Df)   Discrimination index (Ds)   Decision
  Reject                  Any decision                REJECT
  Revise                  Reject                      REJECT
  Revise                  Revise                      REVISE
  Revise                  Retain                      REVISE
  Retain                  Reject                      REJECT
  Retain                  Revise                      REVISE
  Retain                  Retain                      RETAIN
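A companion sketch for KR-21 and the item-analysis indices. The KR-21 call uses the summary statistics from the example above; the upper- and lower-group proportions (0.80 and 0.30) are hypothetical, since the handout gives no item-level data.

```python
# KR-21 internal-consistency estimate from test-level statistics.
# k = number of items, mean = mean score, sd = standard deviation of the scores.
def kr21(k, mean, sd):
    variance = sd ** 2
    return (k / (k - 1)) * (1 - (mean * (k - mean)) / (k * variance))

# Item-analysis indices from the proportions of the upper and lower groups
# answering the item correctly.
def difficulty_index(p_upper, p_lower):
    return (p_upper + p_lower) / 2

def discrimination_index(p_upper, p_lower):
    return p_upper - p_lower

print(round(kr21(k=50, mean=35, sd=5.5), 2))   # 0.67 -> somewhat low

# Hypothetical item: 80% of the upper group and 30% of the lower group got it right.
print(difficulty_index(0.80, 0.30))            # 0.55 -> moderately difficult, retain
print(discrimination_index(0.80, 0.30))        # 0.5  -> discriminating, retain
```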
SCORE DISTRIBUTIONS
 Negatively skewed distribution - most scores are high (scores cluster at the high end).
 Positively skewed distribution - most scores are low (scores cluster at the low end).
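The handout only names the two directions of skew. As an illustration, one common way to check the direction from summary statistics (not given in the handout) is Pearson's second skewness coefficient, 3(mean - median) / s; the score sets below are made up for the example.

```python
from statistics import mean, median, stdev

# Pearson's second skewness coefficient: 3 * (mean - median) / standard deviation.
# Negative -> negatively skewed (scores pile up at the high end);
# positive -> positively skewed (scores pile up at the low end).
def pearson_skew(scores):
    return 3 * (mean(scores) - median(scores)) / stdev(scores)

easy_test = [70, 85, 88, 90, 92, 93, 94, 95]   # most scores high -> negative skew
hard_test = [10, 12, 13, 15, 18, 20, 35, 60]   # most scores low  -> positive skew

print(round(pearson_skew(easy_test), 2))
print(round(pearson_skew(hard_test), 2))
```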
STATISTICS IN EDUCATIONAL ASSESSMENT

MEASURES OF CENTRAL TENDENCY

MEAN
 the average
 affected by outliers
 for interval and ratio data
 Formula (ungrouped data): x̄ = Σx / n
  Where: Σx = sum of the item values
         n = total number of items
 Formula (grouped data): x̄ = Σfx / n
  Where: x = midpoint of the class interval
         f = frequency of the class interval
         n = total frequency
EXAMPLE:
Ungrouped:
 The grades in Algebra of 10 students are 82, 85, 79, 78, 89, 87, 88, 89, 75, and 77. What is their mean grade?
 Solution:
  x̄ = Σx / n = (82 + 85 + 79 + 78 + 89 + 87 + 88 + 89 + 75 + 77) / 10 = 829 / 10 = 82.9, or about 83
 Interpretation:
  Most of the students' grades are close to 83 points.
Grouped:
 Calculate the mean grade of 50 students in Physics.
  Interval   Frequency (f)   Midpoint (x)      fx
  90 - 94          7              92          644
  85 - 89         13              87        1,131
  80 - 84         16              82        1,312
  75 - 79          8              77          616
  70 - 74          6              72          432
  Total           50                        4,135
 Solution:
  x̄ = Σfx / n = 4,135 / 50 = 82.7, or about 83
 Interpretation:
  Most of the students' grades are close to 83 points.
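A minimal sketch of both mean computations, using the Algebra grades and the Physics frequency table from the examples above:

```python
# Mean of ungrouped data: sum of the scores divided by the number of scores.
algebra_grades = [82, 85, 79, 78, 89, 87, 88, 89, 75, 77]
mean_ungrouped = sum(algebra_grades) / len(algebra_grades)
print(round(mean_ungrouped, 1))   # 82.9

# Mean of grouped data: sum of (class midpoint x frequency) divided by total frequency.
physics_table = [  # (class midpoint x, frequency f)
    (92, 7), (87, 13), (82, 16), (77, 8), (72, 6),
]
total_f = sum(f for _, f in physics_table)
mean_grouped = sum(x * f for x, f in physics_table) / total_f
print(round(mean_grouped, 1))     # 82.7
```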
MEDIAN
 the middle score
 less affected by outliers
 an ordinal statistic
 Formula (ungrouped data):
  If n is odd, the median is the middle score: Median = X((n+1)/2)
  If n is even, the median is the point halfway between the two middle scores: Median = (X(n/2) + X(n/2+1)) / 2
  Where: n = total number of scores
 Formula (grouped data):
  Md = L + [(N/2 - F) / f] × i
  Where: Md = median
         L = exact lower limit of the interval containing the median
         F = sum of the frequencies below L
         f = frequency of the interval containing the median
         N = number of cases
         i = class width / class size
EXAMPLE:
Ungrouped (odd number of scores):
 Scores of 11 students in their 1st long exam in PrEd 146 are: 88, 81, 75, 95, 98, 88, 86, 72, 80, 90, 92.
 Solution:
  Array: 72, 75, 80, 81, 86, 88, 88, 90, 92, 95, 98
  Median = X((n+1)/2) = X((11+1)/2) = X6 = 88 (the 6th score)
 Interpretation:
  There are five scores less than or equal to 88 points and five scores greater than or equal to 88 points.
Ungrouped (even number of scores):
 Scores of students in their 2nd long exam in PrEd 146:
 Solution:
  Array: 115, 118, 119, 120, 120, 124, 125, 127, 127, 128
  Median = (X(n/2) + X(n/2+1)) / 2 = (X5 + X6) / 2 = (120 + 124) / 2 = 122
 Interpretation:
  Half of the students got 122 points and below, while the other half got 122 points and above.
Grouped:
 Students' scores in Math 11 during the midterm exam:
  Scores       f    <cf   Class Boundaries
  120 - 122    2     40   119.5 - 122.5
  117 - 119    2     38   116.5 - 119.5
  114 - 116    2     36   113.5 - 116.5
  111 - 113    4     34   110.5 - 113.5
  108 - 110    5     30   107.5 - 110.5
  105 - 107    9     25   104.5 - 107.5
  102 - 104    6     16   101.5 - 104.5
   99 - 101    3     10    98.5 - 101.5
   96 - 98     4      7    95.5 - 98.5
   93 - 95     2      3    92.5 - 95.5
   90 - 92     1      1    89.5 - 92.5
  N = 40
 Solution:
  Since N/2 = 40/2 = 20, the median class is the interval whose cumulative frequency is closest to, but not less than, 20 (the 105 - 107 class).
  Given: L = 104.5, F = 16, f = 9, N = 40, i = 3
  Md = L + [(N/2 - F) / f] × i = 104.5 + [(20 - 16) / 9] × 3 = 104.5 + 1.33 ≈ 105.8, or about 106
 Interpretation:
  Half of the students got 106 points and below, while the other half got 106 points and above.

MODE
 the most frequent score
 may not occur in a data set
 a nominal statistic
 Formula (ungrouped data):
  The mode is the score in a set of scores that occurs most frequently.
 Formula (grouped data):
  The mode is taken at the midpoint of the class interval with the largest or highest frequency:
  Mo = L + [(fmo - f1) / (2fmo - f1 - f2)] × i
  Where: L = lower class boundary of the modal class
         i = class size
         fmo = frequency of the modal class
         f1 = frequency of the class one step lower than the modal class
         f2 = frequency of the class one step higher than the modal class
 Types:
  Unimodal - a score distribution that consists of one mode.
  Bimodal - a score distribution that consists of two modes.
  Trimodal - a score distribution that consists of three modes. It is also considered multimodal - a score distribution that consists of more than two modes.
EXAMPLE:
Ungrouped:
 In the set of scores (3, 5, 5, 7, 8, 8, 8, 10), the mode is 8 because it occurs more often than any other score.
 Raw scores: 97 93 90 95 91 92 98 88
             90 90 91 94 96 90 90 87
 Here, the value 90 occurs five times, more frequently than any other value. Hence, the mode is 90.
Grouped:
  Scores      f          Class Boundaries
  40 - 44     4          39.5 - 44.5
  45 - 49     5          44.5 - 49.5
  50 - 54     8  (f1)    49.5 - 54.5
  55 - 59    10  (fmo)   54.5 - 59.5
  60 - 64     7  (f2)    59.5 - 64.5
  65 - 69     6          64.5 - 69.5
  70 - 74     5          69.5 - 74.5
 Solution:
  Mo = L + [(fmo - f1) / (2fmo - f1 - f2)] × i = 54.5 + [(10 - 8) / (2(10) - 8 - 7)] × 5 = 54.5 + (2/5)(5) = 54.5 + 2 = 56.5, or about 57
 Interpretation:
  It is estimated that about 22% (10 out of 45) of the observations are equal to about 57 points; this is the single most common score.
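A short sketch of the median and mode computations above: the ungrouped cases use Python's statistics module, and the grouped median applies the interpolation formula with the values from the Math 11 table.

```python
from statistics import median, mode

exam1 = [88, 81, 75, 95, 98, 88, 86, 72, 80, 90, 92]          # odd n -> middle score
exam2 = [115, 118, 119, 120, 120, 124, 125, 127, 127, 128]    # even n -> mean of middle two
print(median(exam1))   # 88
print(median(exam2))   # 122.0

raw = [97, 93, 90, 95, 91, 92, 98, 88, 90, 90, 91, 94, 96, 90, 90, 87]
print(mode(raw))       # 90

# Grouped median: Md = L + ((N/2 - F) / f) * i, using the Math 11 midterm table
# (median class 105-107: L = 104.5, F = 16, f = 9, N = 40, i = 3).
def grouped_median(L, F, f, N, i):
    return L + ((N / 2 - F) / f) * i

print(round(grouped_median(L=104.5, F=16, f=9, N=40, i=3), 1))   # 105.8
```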

MEASURES OF DISPERSION / VARIABILITY
(Higher value = higher variability/dispersion)

RANGE
 the simplest measure
 affected by extreme scores
 Formula: R = HS - LS
  Where: R = range, HS = highest score, LS = lowest score
 Example:
  What is the range of the following group of scores: 1, 3, 4, 5, 5, 6, 7, 8, 8, 9?
  R = HS - LS = 9 - 1 = 8
 Interpretation:
  The difference between the highest score and the lowest score is 8.

VARIANCE
 indicates how spread out the scores are around the middle of the distribution
 Formula (ungrouped data): s² = Σ(x - x̄)² / (n - 1)
 Formula (grouped data): s² = Σf(x - x̄)² / (n - 1)
  Where: s² = variance
         x = individual score (or class midpoint, for grouped data)
         x̄ = mean
         f = class frequency
         n = total number of cases
EXAMPLE:
Ungrouped:
 Scores of students in a PrEd 146 quiz: 9, 2, 5, 4, 12, 7, 8, 11, 9, 3, 7, 4, 12, 5, 4, 10, 9, 6, 9, 4
 Solution:
  Step 1. Work out the mean: x̄ = Σx / n = 140 / 20 = 7
  Step 2. For each score, subtract the mean and square the result: Σ(x - x̄)² = (9 - 7)² + (2 - 7)² + (5 - 7)² + ... + (4 - 7)² = 178
  Step 3. Divide the sum of the squared differences by n - 1: s² = 178 / (20 - 1) = 178 / 19 ≈ 9.37, or about 9
 Interpretation:
  The average squared deviation of the scores from the mean is approximately 9 points.
Grouped:
 The following observations show the ages of 50 employees working in two wholesale centers. Find the variance.
  Scores      f    x     fx    x - x̄            (x - x̄)²   f(x - x̄)²
  40 - 44     4   42    168   42 - 55 = -13      169         676
  45 - 49     7   47    329   47 - 55 = -8        64         448
  50 - 54    14   52    728   52 - 55 = -3         9         126
  55 - 59    11   57    627   57 - 55 = 2          4          44
  60 - 64     8   62    496   62 - 55 = 7         49         392
  65 - 69     6   67    402   67 - 55 = 12       144         864
  Total      50        2,750                               2,550
 Solution:
  x̄ = Σfx / n = 2,750 / 50 = 55
  s² = Σf(x - x̄)² / (n - 1) = 2,550 / (50 - 1) = 2,550 / 49 ≈ 52.04, or about 52
 Interpretation:
  The average squared deviation of the ages from the mean is approximately 52.

STANDARD DEVIATION
 the square root of the variance
 especially useful for normal distributions
 Formula (ungrouped data): s = √[Σ(x - x̄)² / (n - 1)]
 Formula (grouped data): s = √[Σf(x - x̄)² / (n - 1)]
EXAMPLE:
Ungrouped:
 Scores of students in a PrEd 146 quiz: 9, 2, 5, 4, 12, 7, 8, 11, 9, 3, 7, 4, 12, 5, 4, 10, 9, 6, 9, 4
 Solution (refer to the solution for the ungrouped variance):
  s = √[Σ(x - x̄)² / (n - 1)] = √(178 / 19) = √9.37 ≈ 3.06
 Interpretation:
  On average, the scores deviate from the mean by approximately 3 points.
Grouped:
 Using the same frequency table of employee ages above:
  s = √[Σf(x - x̄)² / (n - 1)] = √(2,550 / 49) = √52.04 ≈ 7.21
 Interpretation:
  On average, the ages deviate from the mean by approximately 7.
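A minimal sketch of the range, variance, and standard deviation for the quiz scores, and of the grouped variance from the employee-age table above:

```python
from math import sqrt
from statistics import variance, stdev

quiz = [9, 2, 5, 4, 12, 7, 8, 11, 9, 3, 7, 4, 12, 5, 4, 10, 9, 6, 9, 4]

print(max(quiz) - min(quiz))          # range = 10
print(round(variance(quiz), 2))       # sample variance = 9.37
print(round(stdev(quiz), 2))          # sample standard deviation = 3.06

# Grouped variance: s^2 = sum(f * (x - mean)^2) / (n - 1), with x = class midpoint.
ages = [(42, 4), (47, 7), (52, 14), (57, 11), (62, 8), (67, 6)]  # (midpoint, frequency)
n = sum(f for _, f in ages)
mean = sum(x * f for x, f in ages) / n
s2 = sum(f * (x - mean) ** 2 for x, f in ages) / (n - 1)
print(round(s2, 2))                   # 52.04
print(round(sqrt(s2), 2))             # 7.21
```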
COEFFICIENT OF VARIATION
 used to compare different sets of data (a sketch appears at the end of this section)
 Formula: CV = (s / x̄) × 100%
  Where: s = standard deviation, x̄ = mean
 Example:
  Sex       s       x̄
  Male     10.05    85
  Female    2.55    85
  Male:   CV = (s / x̄) × 100% = (10.05 / 85) × 100% ≈ 11.82%
  Female: CV = (s / x̄) × 100% = (2.55 / 85) × 100% = 3%
 Interpretation:
  The females' scores are more homogeneous than the males' scores.

SCORES AND THEIR POSITION
Quartile
 Divides the distribution into 4 equal parts, with one-fourth of the data values in each part.
 This means that about 25% of the data falls at or below the first quartile (Q1), 50% of the data falls at or below the second quartile (Q2), and 75% falls at or below the third quartile (Q3). (Notice that Q2 is also the median.)
 Formula (ungrouped data): Qi = the (iN/4)th item
 Formula (grouped data): Qi = L + [(iN/4 - <cf) / f] × C
Decile
 Divides the distribution into 10 equal parts.
 There are 9 deciles, such that 10% of the distribution is equal to or less than decile 1 (D1), 20% of the distribution is equal to or less than decile 2 (D2), and so on. (Notice that D5 is also the median.)
 Formula (ungrouped data): Di = the (iN/10)th item
 Formula (grouped data): Di = L + [(iN/10 - <cf) / f] × C
Percentile
 Divides the distribution into 100 equal parts.
 In the same manner, there are 99 percentiles, such that 1% of the scores are less than the first percentile (P1), 2% of the scores are less than the second percentile (P2), and so on. (Notice that P50 is also the median.)
 Formula (ungrouped data): Pi = the (iN/100)th item
 Formula (grouped data): Pi = L + [(iN/100 - <cf) / f] × C
Where: i = position of the given quantile
       L = lower class boundary of the quantile class
       C = class interval (class size)
       <cf = cumulative frequency of the class interval preceding the quantile class
       f = frequency of the quantile class
       N = total number of frequencies

EXAMPLE:
(Ungrouped)
 Consider the following raw scores: 46, 45, 20, 20, 19, 47, 19, 19, 19, 19, 18, 45, 48, 50. Find Q1, D3, and P75.
 Solution:
  Array of scores: 18, 19, 19, 19, 19, 19, 20, 20, 45, 45, 46, 47, 48, 50 (N = 14)
 For the quartile:
  Q1 = the (1)(14)/4 th item = the 3.5th position, i.e., halfway between the 3rd and 4th items = (19 + 19) / 2 = 19
 Interpretation:
  25% of the students got 19 points and below, while 75% of the students got 19 points and above.
 For the decile:
  D3 = the (3)(14)/10 th item = the 4.2th position, i.e., between the 4th and 5th items = (19 + 19) / 2 = 19
 Interpretation:
  30% of the students got 19 points and below, while 70% of the students got 19 points and above.
 For the percentile:
  P75 = the (75)(14)/100 th item = the 10.5th position, i.e., halfway between the 10th and 11th items = (45 + 46) / 2 = 45.5, or about 46
 Interpretation:
  75% of the students got about 46 points and below, while 25% of the students got about 46 points and above.

(Grouped)
  Heights of Employees (inches)   Number of Employees   <cf
  61 - 63                                  2              2
  64 - 66                                  5              7
  67 - 69                                 12             19
  70 - 72                                 15             34
  73 - 75                                  8             42
  76 - 78                                  5             47
  79 - 81                                  3             50
  N = 50
 Find Q1, D5, and P78.
 For the quartile:
  Since iN/4 = (1)(50)/4 = 12.5, the quartile class is the one whose <cf is greater than or equal to, but closest to, 12.5 (the 67 - 69 class).
  Q1 = L + [(iN/4 - <cf) / f] × C = 66.5 + [(12.5 - 7) / 12] × 3 = 66.5 + 1.375 ≈ 67.9, or about 68
 Interpretation:
  25% of the employees have heights of about 68 inches and below, while 75% have heights of about 68 inches and above.
 For the decile:
  Since iN/10 = (5)(50)/10 = 25, the decile class is the one whose <cf is greater than or equal to, but closest to, 25 (the 70 - 72 class).
  D5 = L + [(iN/10 - <cf) / f] × C = 69.5 + [(25 - 19) / 15] × 3 = 69.5 + 1.2 = 70.7, or about 71
 Interpretation:
  50% of the employees have heights of about 71 inches and below, while the other 50% have heights of about 71 inches and above.
 For the percentile:
  Since iN/100 = (78)(50)/100 = 39, the percentile class is the one whose <cf is greater than or equal to, but closest to, 39 (the 73 - 75 class).
  P78 = L + [(iN/100 - <cf) / f] × C = 72.5 + [(39 - 34) / 8] × 3 = 72.5 + 1.875 ≈ 74.4, or about 74
 Interpretation:
  78% of the employees have heights of about 74 inches and below, while 22% have heights of about 74 inches and above.
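A sketch of the coefficient of variation and of the grouped quantile interpolation used above. The same formula serves quartiles, deciles, and percentiles; only the target cumulative count changes. The helper functions are mine; the class boundaries and frequencies come from the heights-of-employees table.

```python
# Coefficient of variation: CV = (s / mean) * 100%.
def coefficient_of_variation(s, mean):
    return s / mean * 100

print(round(coefficient_of_variation(10.05, 85), 2))   # 11.82 (males)
print(round(coefficient_of_variation(2.55, 85), 2))    # 3.0   (females)

# Grouped quantile: Q = L + ((target - cf_below) / f) * C, where target = i*N/4,
# i*N/10, or i*N/100 for quartiles, deciles, and percentiles respectively.
# Classes are (lower class boundary, frequency) from the heights table above.
classes = [(60.5, 2), (63.5, 5), (66.5, 12), (69.5, 15), (72.5, 8), (75.5, 5), (78.5, 3)]
C = 3  # class size

def grouped_quantile(classes, C, target):
    cf_below = 0
    for lower_boundary, f in classes:
        if cf_below + f >= target:          # this is the quantile class
            return lower_boundary + (target - cf_below) / f * C
        cf_below += f

N = sum(f for _, f in classes)
print(round(grouped_quantile(classes, C, 1 * N / 4), 1))     # Q1  ~ 67.9
print(round(grouped_quantile(classes, C, 5 * N / 10), 1))    # D5  = 70.7
print(round(grouped_quantile(classes, C, 78 * N / 100), 1))  # P78 ~ 74.4
```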

GRADING, REPORTING AND FEEDBACKING

GRADING PURPOSES
1. Enhancing students' learning
2. Reports to parents/guardians
3. Administrative and guidance uses
 Describe grading procedures to students at the start of the class.
 Clarify that grades will be based on achievement of the intended learning outcomes only. Obtain valid evidence.
 Do not lower an achievement grade for particular misbehaviors.

BASICS OF GRADING
 DepEd Order No. 8, s. 2015 - Policy Guidelines on Classroom Assessment for the K to 12 Basic Education Program
 Formative Assessment
 Summative Assessment
 Computation of Summative Assessment for Grades 1 to 12

K TO 12 GRADING SYSTEM
Descriptors of Learner's Progress
 Below 75   Did Not Meet Expectations
 75 - 79    Fairly Satisfactory
 80 - 84    Satisfactory
 85 - 89    Very Satisfactory
 90 - 100   Outstanding
Minimum grade = 60 → transmuted to 75
Kindergarten:
 No numerical grades
 Use of checklists, anecdotal records, and portfolios

REPORTING OF GRADES AT THE END OF THE SCHOOL YEAR
Grade 1 to 10
 The average of the quarterly grades produces the end-of-year grade.
 The general average (GA) is computed by dividing the sum of all final grades by the total number of learning areas (a short sketch of this computation appears after this section).
Grade 11 to 12
 The average of the quarterly grades produces the semestral grade.
 GA = sum of all final grades / number of learning areas

BASICS OF REPORTING
Reporting
 the process of collecting, analyzing, and presenting information in a structured format to convey important details to a specific audience
For a Good Parent-Teacher Conference
 Be positive in approach. Have a listening ear.
 Be objective.
 Don't project an "omniscient" image.
 Encourage parents to participate and share information.
 Practice good communication and relation skills.
 Don't talk about other students.
 End with an encouraging note.
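A minimal sketch of the general-average arithmetic described above. The learning areas and grades are made up for illustration; the full transmutation and weighting tables of DepEd Order No. 8, s. 2015 are not reproduced here.

```python
# General Average (GA) = sum of all final grades / number of learning areas.
# The learning areas and grades below are illustrative only.
final_grades = {
    "Filipino": 88,
    "English": 90,
    "Mathematics": 85,
    "Science": 87,
    "Araling Panlipunan": 91,
}

general_average = sum(final_grades.values()) / len(final_grades)
print(round(general_average, 2))   # 88.2

# Grades 11 to 12: the semestral grade of a subject is the average of its two quarterly grades.
q1, q2 = 86, 90
semestral_grade = (q1 + q2) / 2
print(semestral_grade)             # 88.0
```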
BASICS OF FEEDBACKING
 Feedbacking, or providing feedback, is a fundamental aspect of
communication and growth in various contexts, including professional,
educational, and personal settings.
 Feedback is information given to a person about their actions, behaviors, or
performance. It can be positive, negative, or neutral and is intended to
provide guidance, correction, motivation, or reinforcement.

Purpose:
Improvement:
 Feedback helps individuals understand what they are doing well and
areas where they can improve.
Motivation:
 Positive feedback can boost morale and motivation, while
constructive feedback can inspire change and growth.
Clarification:
 Feedback clarifies expectations and helps individuals align their
actions with desired outcomes.
Relationship Building:
 Constructive feedback fosters trust and strengthens relationships by
promoting open communication and understanding.

Types of Feedback:
Positive Feedback:
 Acknowledges and reinforces desirable behaviors or outcomes. It
highlights strengths and accomplishments, encouraging repetition of
those behaviors.
Constructive Feedback:
 Addresses areas for improvement or development. It focuses on
specific behaviors or actions and provides suggestions for
enhancement.
Negative Feedback:
 Points out behaviors or actions that are not meeting expectations. It
should be delivered constructively to encourage improvement rather
than discourage.

Components of Effective Feedback:


Specific:
 Feedback should be specific, focusing on particular behaviors or
actions rather than generalizations.
Timely:
 Feedback is most effective when delivered promptly, allowing
individuals to reflect on their actions while the details are fresh.
Balanced:
 Feedback should include both positive aspects and areas for
improvement, maintaining a balanced perspective.
Objective:
 Feedback should be based on observable facts and behaviors rather
than assumptions or personal biases.
Actionable:
 Feedback should provide clear suggestions or steps for improvement,
enabling individuals to take concrete actions.
Respectful:
 Feedback should be delivered with respect and empathy, considering
the recipient's feelings and perspectives.

Feedback Delivery:
Choose the Right Time and Place:
 Provide feedback in a private setting and at a time when the recipient is
receptive.
Use a Constructive Tone:
 Deliver feedback in a constructive and non-threatening manner, focusing
on growth and development.
Encourage Dialogue:
 Foster open communication by inviting the recipient to share their
perspective and discuss potential solutions.
Follow-Up:
 Follow up on feedback to track progress, provide additional support, and
reinforce positive changes.
