
MATCHING LEARNING TARGETS AND ASSESSMENT METHODS

5 STANDARDS OF QUALITY ASSESSMENT to inform sound instructional decisions
1. Clear purpose – "Why are you assessing?"
2. Clear learning targets – "What do you want to assess?"
3. Sound assessment design – "How are you going to assess?"
4. Effective communication of results
5. Student involvement in the assessment process

TWO CONCEPTS TO FAMILIARIZE:
• LEARNING TARGETS/OUTCOMES and DOMAINS
• ASSESSMENT METHODS or ASSESSMENT TASKS

STEPS IN A STUDENT OUTCOMES ASSESSMENT (ANDERSON ET AL., 2015)
1. Create learning outcome statements
2. Design teaching/assessments to achieve these outcome statements
3. Implement teaching/assessment activities
4. Analyze data on individual and aggregate levels
5. Re-assess the process

LEARNING TARGETS/OUTCOMES and DOMAINS
"The BALANCE in assessment"

DEFINING "LEARNING TARGET"
• A description of performance that includes what learners should know and be able to do.
• It contains the criteria used to judge student performance.
• It is derived from national and local standards.
• It is similar to that of a learning outcome.

LEARNING DOMAINS
COGNITIVE DOMAIN (KNOWLEDGE-BASED) – "What do I want learners to know?"
PSYCHOMOTOR DOMAIN (SKILLS-BASED) – "What actions do I want learners to be able to perform?"
AFFECTIVE DOMAIN (VALUES, ATTITUDES, AND INTERESTS) – "What actions do I want learners to think or care about?"

TAXONOMY OF LEARNING DOMAINS (COGNITIVE)
"What do I want learners to know?"

TAXONOMY OF ANDERSON, KRATHWOHL ET AL. (2001), from highest to lowest level:
CREATING
EVALUATING
ANALYZING
APPLYING
UNDERSTANDING
REMEMBERING

IDENTIFYING LEARNING OUTCOMES
EXAMPLE OF BLOOM'S TAXONOMY: ART OF QUESTIONING

We'll use the tale "The Three Little Pigs" as the subject.

• Remembering. Describe the place where the pigs lived.
• Understanding. Summarize the story of the three little pigs.
• Applying. Build a theory of why only the third pig decided to build a brick house.
• Analyzing. Outline the actions of the pigs, and decide how you would act in the same events.
• Evaluating. Assess what would happen if the three little pigs acted differently.
• Creating. Write a poem, song, or skit to describe the whole story in a new form.

A second example uses triangles in geometry.

• Remembering. What can you tell me about the first triangle?
The students provide any information they know about the mathematical concept.

• Understanding. What makes the first triangle a right triangle?
The students use the information they already know about triangles to rightly identify a specific triangle.

• Applying. Based on what you know about right triangles, why is the second triangle not a right triangle?
The students apply the information they already know about triangles to differentiating one triangle from another based on their characteristics.

• Analyzing. How is the first triangle similar to a rectangle?
The students compare the characteristics of a right triangle with those of a rectangle.

• Evaluating. How would you prove that all right triangles fit in a circle, with each vertex (or corner) of the triangle touching the circle?
The students extend their understanding of triangles by proving a well-known theorem of geometry (see Thales' Theorem).

• Creating. How could you use the right triangle to design our next project?
The students integrate information they know and understand about the right triangle into designing a new project.

SEQUENCING
Domain: Cognitive

Topic A: Quadratic Equations
A. Solve quadratic equations by factoring.
B. Describe a quadratic equation.
C. Compare the four methods of solving quadratic equations.
D. Differentiate a quadratic equation from other types of equations in terms of form and degree.
E. Formulate real-life problems involving quadratic equations.
F. Examine the nature of roots of a quadratic equation.

Topic B: Mechanical Energy
A. Decide whether the total mechanical energy remains the same during a certain process.
B. Create a device that shows conservation of mechanical energy.
C. State the law of conservation of energy.
D. Explain energy transformation in various activities or events.
E. Perform activities to demonstrate conservation of mechanical energy.
F. Determine the relationship among the kinetic, gravitational potential, and total mechanical energies of a mass at any point between maximum potential energy and maximum kinetic energy.
Considering…

Learning outcome: design an experiment to determine the factors that affect the strength of an electromagnet
Revised Bloom's Taxonomy: Creating
Marzano & Kendall's Taxonomy: Knowledge Utilization

Sample test item: Which of the following factors does NOT affect the strength of an electromagnet?
A. Diameter of the coil
B. Direction of the windings
C. Nature of the core material
D. Number of turns in the coil

Are students able to attain the learning outcome?
ANSWER: NO (a selected-response item cannot show that students can design an experiment)

TAXONOMY OF LEARNING DOMAINS (PSYCHOMOTOR)
"What actions do I want learners to be able to perform?"

TAXONOMY OF SIMPSON (1972)
Progressive levels
1. Perception
2. Set
3. Guided Response
4. Mechanism (Basic Proficiency)
5. Complex Overt Response (Expert)
6. Adaptation
7. Origination

TAXONOMY OF DAVE (1975)
Behavior levels
1. Imitation
2. Manipulation
3. Precision
4. Articulation
5. Naturalization

TAXONOMY OF HARROW (1972)
Coordination levels
1. Reflex Movements
2. Fundamental Movements
3. Perceptual Abilities
4. Physical Abilities (Fitness)
5. Skilled Movements
6. Non-Discursive Communication

[Figure: Simpson (Progression), Dave (Behavior), and Harrow (Coordination) merged into the Simplified & Re-organized Levels]

TAXONOMIES OF SIMPSON, DAVE, AND HARROW (SIMPLIFIED & RE-ORGANIZED LEVELS)
1. Observing
2. Imitating
3. Practicing
4. Adapting

SEQUENCING: PSYCHOMOTOR

Topic C: Basic Sketching
A. Watch how tools are selected and used in sketching.
B. Create a design using combinations of lines, curves, and shapes.
C. Draw various lines, curves, and shapes.
D. Set the initial drawing position.

SPECIAL TYPE OF TAXONOMY: S.O.L.O. Taxonomy (Biggs & Collis, 1982)
Structure of the Observed Learning Outcome – a means of classifying learning outcomes in terms of their complexity, enabling us to assess students' work in terms of its quality, not of how many bits of this and of that they have got right.

SPECIAL TYPE OF TAXONOMY: Systems of knowledge (Marzano & Kendall, 2007)

TAXONOMY OF LEARNING DOMAINS (AFFECTIVE)
"What actions do I want learners to think or care about?"

TAXONOMY OF KRATHWOHL ET AL. (1964)
1. Receiving
2. Responding
3. Valuing
4. Organizing
5. Internalizing Values (Characterization)

"People with goals succeed because they know where they are going." – Earl Nightingale

ASSESSMENT METHODS or ASSESSMENT TASKS
"The APPROPRIATENESS in assessment"

ASSESSMENT METHOD
1. SELECTED-RESPONSE FORMAT
• MULTIPLE CHOICE
• TRUE OR FALSE
• MATCHING TYPE
• INTERPRETIVE
2. CONSTRUCTED-RESPONSE FORMAT
• BRIEF-CONSTRUCTED
• ORAL QUESTIONING
• PERFORMANCE TASKS
• ESSAY ITEMS
3. TEACHER OBSERVATIONS
4. STUDENT SELF-ASSESSMENTS

MATCHING LEARNING TARGETS WITH ASSESSMENT METHOD

TYPES:
1. KNOWLEDGE AND SIMPLE UNDERSTANDING
2. DEEP UNDERSTANDING AND REASONING
3. SKILLS
4. PRODUCTS
5. AFFECT
(1-4 Categories of Learning)

CONSTRUCTIVE ALIGNMENT
Learning target: teaching-learning activities and assessment tasks activate the same verbs as in the learning target.
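The "same verbs" idea behind constructive alignment can be sketched as a toy check: does the assessment task's leading verb sit at the same Bloom level as the learning target's verb? The verb lists and the two helper functions below are illustrative assumptions, not an established tool, and real verb lists are far larger.

```python
# Toy sketch of a constructive-alignment check: the assessment task should
# activate a verb at the same Bloom level as the learning target.
# The verb sets below are a small illustrative sample only.
BLOOM_VERBS = {
    "remembering":   {"describe", "list", "state"},
    "understanding": {"summarize", "explain", "classify"},
    "applying":      {"solve", "demonstrate", "use"},
    "analyzing":     {"compare", "differentiate", "outline"},
    "evaluating":    {"assess", "judge", "prove"},
    "creating":      {"design", "formulate", "compose"},
}

def bloom_level(statement):
    """Return the Bloom level of the statement's leading verb, or None."""
    verb = statement.lower().split()[0]
    for level, verbs in BLOOM_VERBS.items():
        if verb in verbs:
            return level
    return None

def is_aligned(target, task):
    """Aligned if both statements activate a verb at the same Bloom level."""
    level = bloom_level(target)
    return level is not None and level == bloom_level(task)

print(is_aligned("Formulate real-life problems involving quadratic equations",
                 "Design a word problem that is solved by a quadratic equation"))  # True
print(is_aligned("Formulate real-life problems involving quadratic equations",
                 "List the four methods of solving quadratic equations"))          # False
```

The second call fails the check for the same reason as the electromagnet example above: a Remembering-level task cannot evidence a Creating-level target.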
In reference to Garavalia, Marken, & Sommi (2003), a tip in choosing an assessment method: determine the role of the assessment (placement, feedback, diagnosis, intervention?) and consider:
• Nature of the task
• Level of cognitive processing
• Context of the assessment

QUESTIONS TO PONDER:
• What evidence of learning should be gathered?
• What mental processes should students demonstrate?
• How would the assessment be carried out?
• What is the format?
• How long will the assessment take?
• Are there systems in place and resources available for this assessment?
• How will the assessment results be interpreted?

MAIN PRINCIPLES OF HIGH-QUALITY ASSESSMENT
1. VALIDITY
2. RELIABILITY
3. PRACTICALITY & EFFICIENCY
4. ETHICS

VALIDITY
Validity pertains to the accuracy of the inferences teachers make about students based on the information gathered from the assessment (McMillan, 2007; Popham, 2011).

An assessment is valid if it measures a student's actual knowledge and performance with respect to the intended learning outcomes and NOT something else (McMillan, 2007; Popham, 2011).

WHAT IS INTENDED TO MEASURE

CRITERION-RELATED EVIDENCE
- Concurrent Validity
- Predictive Validity (mock exam and board exam scores should correlate)
- Statistical tool: Pearson correlation coefficient (r), or Spearman's rank order correlation
- Degree to which test scores agree with an external criterion
- Examines the relationship between an assessment and another measure of the same trait
- 3 types of criteria: achievement test scores; ratings, grades, and other numerical judgements made by the teacher; career data
• Divided into concurrent (other criteria assessed simultaneously) and predictive (predicting future or past events) sub-areas
• Deals with whether the assessment scores obtained for participants are related to a criterion outcome measure
• For example, for predictive validity: do SAT scores predict post-secondary performance?

CONTENT-RELATED EVIDENCE
- Face validity (psychology enters)
- Instructional Validity (clear directions)
- Table of Specifications
• Deals with whether the assessment content and composition are appropriate given what is being measured (e.g., does the test reflect the knowledge/skills required to do a job or demonstrate that one grasps the course material?)
• For example, is there an appropriate representation of questions from each topic area on the assessment that reflects the curriculum being taught?
• Related to, but not to be confused with, "face validity"

CONSTRUCT-RELATED EVIDENCE
- 3 types: Theoretical, Logical, and Statistical
- 2 methods: Convergent and Divergent/Discriminant
- Statistical tool: Multitrait-Multimethod Matrix
- Convergent validity shows you whether two tests that should be highly related to each other are indeed related.
- Divergent (Discriminant) validity shows you whether two tests that should not be highly related to each other are, indeed, unrelated.
• Deals with whether the assessment is measuring the correct construct (trait/attribute/ability/skill)
• For example, is this human biology exam actually measuring human biology constructs?

THREATS TO VALIDITY (Miller, Linn, & Gronlund, 2009)
1. Unclear test directions
2. Complicated vocabulary and sentence structure
3. Ambiguous statements
4. Inadequate time limits
5. Inappropriate level of difficulty of test items
6. Poorly constructed test items
7. Mismatch between test items and learning outcomes
8. Short test
9. Improper arrangement of items
10. Identifiable patterns of answers

SUGGESTIONS for enhancing validity (McMillan, 2007)
• Ask others to judge the clarity of what you are assessing.
• Check to see if different ways of assessing the same thing give the same result.
• Sample a sufficient number of examples of what is being assessed.
• Prepare a detailed table of specifications.
• Ask others to judge the match between the assessment items and the objectives of the assessment.
• Compare groups known to differ on what is being assessed.
• Compare scores taken before instruction to those taken after instruction.
• Compare predicted consequences to actual consequences.
• Compare scores on similar, but different, traits.
• Provide adequate time to complete the assessment.
• Ensure appropriate vocabulary, sentence structure, and item difficulty.
• Ask easy questions first.
• Use different methods to assess the same thing.
• Use the assessment only for its intended purpose.

MAIN PRINCIPLES OF HIGH-QUALITY ASSESSMENT: RELIABILITY

RELIABILITY: consistency of performance. Reliability indicates the extent to which scores are free from measurement error.

IF A TEST IS VALID, THEN IT IS RELIABLE.

SOURCES OF RELIABILITY EVIDENCE
A. STABILITY (test-retest)
B. EQUIVALENCE (parallel forms, pre- & post-test)
C. INTERNAL CONSISTENCY (split-half, Cronbach's alpha & Kuder-Richardson 20/21)
D. SCORER OR RATER CONSISTENCY (inter-rater, Spearman's Rho or Cohen's Kappa)
E. DECISION CONSISTENCY

CAUSES OF MEASUREMENT ERROR
1. Examinee-specific factors
- Fatigue
- Boredom
- Lack of motivation
- Momentary lapses of memory
- Carelessness
2. Test-specific factors
- Insufficient directions
- Ambiguous questions
- Intentional trick questions
3. Scoring factors
- Inconsistent grading systems
- Carelessness
- Computational errors
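Two of the statistical tools named in these notes, Pearson's r (criterion-related evidence, e.g. mock exam vs. board exam) and Cronbach's alpha (internal consistency), can be computed directly. A minimal sketch using only the Python standard library; all scores below are made-up data for illustration:

```python
# Sketch: Pearson's r for predictive validity and Cronbach's alpha for
# internal consistency, computed with the standard library only.
from statistics import mean, variance

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    ssx = sum((a - mx) ** 2 for a in x)
    ssy = sum((b - my) ** 2 for b in y)
    return cov / (ssx * ssy) ** 0.5

def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of per-item score lists,
    all covering the same students in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per student
    return (k / (k - 1)) * (1 - sum(variance(i) for i in items) / variance(totals))

# Predictive validity: do mock-exam scores track board-exam scores? (made-up data)
mock  = [78, 85, 62, 90, 70, 88]
board = [75, 89, 65, 92, 68, 85]
print(round(pearson_r(mock, board), 2))   # → 0.96, a strong positive correlation

# Internal consistency: three items answered by five students (made-up data)
items = [[2, 4, 3, 5, 1],
         [3, 5, 3, 4, 2],
         [2, 5, 4, 5, 1]]
print(round(cronbach_alpha(items), 2))    # → 0.95
```

In practice these would be computed over real class data with a statistics package; the point here is only to make the two formulas concrete.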
TYPES OF RELIABILITY
- INTERNAL RELIABILITY: assesses the consistency of results across items within a test
- EXTERNAL RELIABILITY: gauges the extent to which a measure varies from one use to another

WAYS to improve the reliability of assessment results, by Nitko and Brookhart (2011)
• Lengthen the assessment procedure whenever practical by providing more time, more questions, and more observation.
• Broaden the scope of the procedure by assessing all the significant aspects of the largest learning performance.
• Improve objectivity by using a systematic and more formal procedure for scoring student performance. A scoring scheme or rubric would prove useful.
• Use multiple markers by employing inter-rater reliability.
• Combine results from several assessments, especially when making crucial educational decisions.
• Provide sufficient time for students to complete the assessment.
• Teach students how to perform their best by providing practice and training and by motivating them.
• Match the assessment difficulty to students' ability levels by providing tasks that are neither too easy nor too difficult, and by tailoring the assessment to each student's ability level when possible.
• Differentiate among students' performance rankings by selecting assessment tasks that distinguish or discriminate the best from the least able students.

VALIDITY AND RELIABILITY
Note that some of the suggestions on improving reliability overlap with those concerning validity. That is because reliability is a precursor to validity. However, it is important to note that high reliability does not ensure a high degree of validity.

MAIN PRINCIPLES OF HIGH-QUALITY ASSESSMENT: PRACTICALITY AND EFFICIENCY
• Familiarity with the Method
• Time required
• Ease in Administration
• Ease of Scoring
• Ease of Interpretation
• Cost

MAIN PRINCIPLES OF HIGH-QUALITY ASSESSMENT: ETHICS

• FAIRNESS (transparency)
* Some concern/s:
Pop quizzes
Test-taking skills
Familiarity of the test (for teachers)

• OPPORTUNITY TO LEARN
* Some concern/s:
Imbalanced / inadequate instructional approaches

• PREREQUISITE KNOWLEDGE AND SKILLS
* Some concern/s:
Lack of foundational background of knowledge and skills

• AVOIDING STEREOTYPING
* Some concern/s:
Stereotyping – generalization about a group of people based on inconclusive observations of a small sample of the same group.
* Alleviating stereotype threats:
• Be careful in asking questions about topics related to a student's demographic group.
• Place measures of maximal performance at the beginning of assessments, before giving less formal self-report activities that contain personal topics or information on academic functioning.
• Do not describe tests as diagnostic of intellectual capacity.
• Determine if there are mediators of stereotype threat that affect test performance.
• Consider the possibility of stereotype threat when interpreting the test scores of susceptible, typecast individuals.

• AVOID BIAS (offensiveness; unfair penalization)
* Concern/s:
Forms of assessment bias: offensiveness, unfair penalization, and disparate impact.
Offensiveness happens if test-takers get distressed, upset, or distracted by how an individual or a particular group is portrayed in the test.
Unfair penalization harms student performance due to test content: not because items are offensive, but because the content caters to particular groups of the same economic class, race, gender, among others, leaving other groups at a disadvantage.

• ACCOMMODATE SPECIAL NEEDS
* Concern/s:
Insensitivity of teachers to students with disabilities
* Six categories of accommodation:
1. Presentation
2. Response
3. Setting
4. Timing
5. Scheduling
6. Others, if there are…
* Three elements to consider in accommodating special needs:
1. Nature and extent of the learner's disability
2. Type and format of assessment
3. Competency and content being assessed

• RELEVANCE
* Concern/s:
Irrelevant assessment
* Criteria for achieving quality assessments (Killen, 2000):
• Assessment should reflect the knowledge and skills that are most important for students to learn.
• Assessment should support every student's opportunity to learn things that are important.
• Assessment should tell teachers and individual students something that they do not already know.

• ETHICAL ISSUES
* Concern/s:
Sensitive questions (sexuality, family problems w/o consent…)
In testing:
o possible harm to the participants
o confidentiality of results
o deception in regard to the purpose and use of the assessment
o temptation to assist students in answering tests or responding to surveys

STANDARDS-BASED ASSESSMENT into the K-12 CURRICULUM GUIDE (DepEd Order No. 21, s. 2019, prescribed under Republic Act No. 10533)

DEFINING "STANDARDS-BASED ASSESSMENT"…
An approach that compares student achievement to predetermined educational content and performance standards or learning outcomes. These standards define what the students need to know and what the students can do within a specified time frame.

IMPLEMENTING STANDARDS-BASED ASSESSMENT
A. Identify a key fact or important body of knowledge, the essential content and concepts;
B. Identify the indicators (i.e., evidence) that students will show when the concept or content has been understood;
C. Choose a collection of assessments that will allow students to demonstrate the indicators;
D. Identify the proficiency of the student with respect to a standard using a scale or rubric;
E. Communicate with the learner in order to provide meaningful feedback. This feedback should provide information on how the learner could improve proficiency;
F. Repeat instruction, assessment, and feedback until the student achieves a predetermined level of mastery.

The K to 12 Basic Education Curriculum
The standards-based approach and the constructive alignment of curriculum are both clearly applied in the K to 12 Basic Education Curriculum (BEC).

Key Stage Standards – set the goal for what the students should know and be able to do after completing the key stages of the K to 12 BEC:
• Kinder to Grade 3
• Grades 4-6
• Grades 7-10
• Grades 11-12

Core Learning Area Standards – set the goal for what the students should know and be able to do in a specific learning area (such as Science or Math) for the entire duration of the K to 12 BEC.

Grade Level Standards – set the goal for what the students should know and be able to do in the specific core learning areas after completing each grade level (K1, Grade 1, etc.).

Content Standards – set the goal for what the students should know after instruction on a specific subject matter.

Performance Standards – set the goal for what the students should be able to do after instruction on a specific subject matter content.

Learning Competency – defines the set of knowledge, skills, and abilities required to achieve the Content and Performance Standards.

UNPACKING LEARNING COMPETENCIES
The final step in the design of a standards-based curriculum is unlocking the competency standards. This is done by deconstructing competency standards into smaller tasks so that it is easier to measure student achievement. Each task is then matched with an instructional objective.

WHAT DO WE MEAN BY "UNPACKING"?
It is the process of deconstructing student learning outcomes into component parts/competencies to identify key lifelong transferable learning skills and the types of learning experiences, activities, tasks, and assessments that align with those outcomes.

Learning competencies need to be unpacked. This means that learning competencies, or what we sometimes call learning outcomes, are to be deconstructed into component parts.

UNPACKING THROUGH…
5P's
• PURPOSE
• PREPARATION
• PITCH
• PACE
• PROGRESS

ABCD's
• AUDIENCE
• BEHAVIOR
• CONDITION
• DEGREE OF MASTERY

UNPACKING THROUGH 5P's

Purpose – If a lesson is to be taught, there must be a good reason for teaching it.
• What are you teaching? Why are you teaching this?
• Where does it fit into the curriculum/schemes of learning?
• How will it benefit the pupils?

Preparation
• Do you have the right resources?
• Is the classroom fit for purpose?
• Do you need any special arrangements for the lesson?

Pitch – The pitch of the lesson must ensure that all pupils can access the materials.
• Describe the type and range of differentiation required.
• Identify the range of 'levelness' and what this looks like.
• Identify how extensions might be made if necessary (higher or lower).

Progress – You and your children must know that progress has been made.
• How will you know that progress has been made?
• When will pupils refer to and reflect on the learning objectives?
• How do you know progress has been made?

UNPACKING THROUGH ABCD's
The ABCDs of Writing Instructional Objectives, a.k.a. learning targets/outcomes (Gahasan, undated)

The ABCD method of writing objectives is an excellent guide to formulating instructional objectives:
"A" for audience
"B" for behavior
"C" for conditions
"D" for the degree of mastery needed

Example: Given a set of ten sentences written in the future tense, the student will be able to rewrite the sentences in present progressive tense with no errors in tense or tense contradiction.

Let us check if the elements are complete:
Audience – the student
Behavior – rewrite the sentences in present progressive tense
Conditions – given a set of ten sentences written in the future tense
Degree of mastery – no errors in tense or tense contradiction

Content (Topic): Characteristics of solids, liquids, and gases

Content Standards (what the learners should know after the learning period): The learners should be able to demonstrate an understanding of the different ways of sorting materials and describing them as solid, liquid, or gas based on observable characteristics.

Performance Standards (what skills the learners should be able to demonstrate, or what products they should be able to make, after the learning period): The learners should be able to group common objects found at home and in school according to solids, liquids, and gases.

Learning Competencies (observable behaviors expected from the learners after the learning period to be able to cope with the content and performance standards): The learners should be able to:
a. describe different objects based on their characteristics (e.g., shape, weight, volume, ease of flow);
b. classify objects and materials as solid, liquid, and gas based on some observable characteristics;
c. describe ways on the proper use and handling of solid, liquid, and gas found at home and in school.

Let us try to deconstruct, or unpack, the first identified learning competency. What should the learners know to be able to describe objects and materials? They should have information on what matter is and on the properties of matter.

Using the ABCDs of writing objectives as a guide, the learning objectives are:
• Given a list of concepts, the learners should be able to identify them correctly as matter or non-matter.
• Given a list of objects and materials and a checklist, the learners should be able to fill out correctly the presence of the identified properties.