
Assessing Student Learning
Dr Khairul Jamaludin
“Used with skill, assessment can
motivate the reluctant, revive the
discouraged, and thereby increase,
not simply measure, achievement.”
- Chappuis et al. (2014)
What will we discuss?
• What is assessment?
• Types of assessment
• Quality classroom assessment
• Setting up classroom assessment
• Selecting assessment method
• School-based assessment
• Your presentation!
• Conclusion
What is assessment?
Ministry of Education Malaysia (2018)
“…process of collecting information about pupils’ progress in the
classroom. The on-going assessment is planned, implemented and
reported by respective teachers. This process is ongoing to enable
teachers to determine the pupils’ mastery level.”

Wiggins (1993)
“…is a comprehensive, multifaceted analysis of performance; it must
be judgment-based and personal.”

Chappuis et al. (2014)
“… (1) gather accurate information about student achievement, and
(2) use the assessment process and its results effectively to improve
achievement”
Assessment vs Evaluation
“Assessment is the act of collecting information about individuals
or groups of individuals in order to better understand them.”

“The assessment process involves using a range of strategies to
make decisions regarding instruction and gathering information
about student performance or behavior in order to diagnose
students’ problems, monitor their progress, or give feedback for
improvement.”

“Evaluation is a judgment regarding the quality or worth of the
assessment results. This judgment is based on multiple sources of
assessment information.”

“Evaluation thus involves placing a ‘value’ on the collection.”
What say you?
“Everybody is a genius. But if you judge a fish by its ability to
climb a tree, it will live its whole life believing that it is stupid.”
As a teacher you have….
• Identifying, clarifying, and teaching to valued learning targets
• Designing or selecting high-quality assessment items and tasks
• Devising high-quality scoring keys, guides, and rubrics
• Planning and managing both formative and summative assessments in the classroom
• Using assessment results to plan further instruction
• Offering descriptive feedback during learning
• Designing assessments so that students can self-assess and set goals
• Tracking student achievement along with other relevant data
• Setting up a system so students can track and share their progress
• Calculating grades that accurately represent student achievement at the time they are assigned
What is quality classroom assessment?
• They are designed to serve the specific information needs of intended user(s).
• They are based on clearly articulated and appropriate achievement targets.
• They accurately measure student achievement.
• They yield results that are effectively communicated to their intended users.
• They involve students in self-assessment, goal setting, tracking, reflecting on, and sharing their learning.
Keys to effective assessment
• Clear Purpose
• Clear Targets
• Sound Assessment Design
• Effective Communication
• Student Involvement
Clear purpose
• to gather information about student learning that will inform instructional decisions
• need regular information about what each student has and has not yet learned
• decide what comes next in student learning within lessons or when we diagnose problems
• intended to support student learning, that is, to help students learn more
Clear targets
• start the assessment process with a clear sense of the learning to be assessed: the achievement expectations we hold for our students, the content standards at the focus of instruction
• when our learning targets are clear to us as teachers, the next step is to ensure they are also clear to students
• students’ chances of success improve when they start out with a vision of where they are headed
Sound Assessment Design
• selecting an assessment method capable of reflecting the intended target
• after we have chosen a method, we develop it with attention to three other quality criteria:
• it must sample well by including just enough exercises to lead to confident conclusions about student achievement
• it must build the assessment of high-quality items, tasks, or exercises accompanied by proper scoring schemes
• every assessment situation brings with it its own list of things that can go wrong and that can bias the results or cause them to be inaccurate
Effective Communication
• An assessment’s results must be communicated to the intended user(s) in a timely and understandable way.
• Communication of formative assessment information provides the kind of descriptive feedback learners need to grow.
• Communication in a summative assessment context leaves all recipients understanding the sufficiency of student learning, such as when we convert summative assessment information into grades that accurately reflect achievement at a point in time.
Student Involvement
• Students decide whether the learning is worth the effort required to attain it.
• Students decide whether they believe they are capable of reaching the learning targets.
• Students decide whether to keep learning or to quit working.
• It is only when students make these decisions in the affirmative that our instruction can benefit their learning.
FIGURE 1.3 Keys to Quality Classroom Assessment

Key 1: Clear Purpose
• Who will use the information?
• How will they use it?
• What information, in what detail, is required?

Key 2: Clear Targets
• Are learning targets clear to teachers?
• What kinds of achievement are to be assessed?
• Are these learning targets the focus of instruction?

Key 3: Sound Design
• Do assessment methods match learning targets?
• Does the sample represent learning appropriately?
• Are items, tasks, and scoring rubrics of high quality?
• Does the assessment control for bias?

Key 4: Effective Communication
• Can assessment results be used to guide instruction?
• Do formative assessments function as effective feedback?
• Is achievement tracked by learning target and reported by standard?
• Do grades communicate achievement accurately?

Key 5: Student Involvement
• Do assessment practices meet students’ information needs?
• Are learning targets clear to students?
• Will the assessment yield information that students can use to self-assess and set goals?
• Are students tracking and communicating their evolving learning?

Chappuis et al. (2014)
Setting up targets
Knowledge Targets
• Knowledge targets represent the factual information, procedural knowledge, and conceptual
understandings that underpin each discipline.
Reasoning Targets
• Reasoning targets specify thought processes students are to learn to do well within a range of subjects.
Skill Targets
• Skill targets are those where a demonstration or physical skill-based performance is at the heart of the
learning.
Product Targets
• Product targets describe learning in terms of artifacts where creation of a product is the focus of the
learning target. With product targets, the specifications for quality of the product itself are the focus of
teaching and assessment.
Disposition Targets
• Disposition targets refer to attitudes, motivations, and interests that affect students’ approaches to
learning. They represent important affective goals we hold for students as a byproduct of their educational
experience.
While engaged in this step, take the
following into account:
• If a target is at the knowledge level,
all underlying targets will be at the
knowledge level. There will be no
reasoning, skill, or product
components.
• Reasoning targets will have
knowledge components, but they do
not require skill or product
components.
• Skill targets always have knowledge
underpinnings. They usually require
reasoning as well.
• Product targets will require
knowledge and reasoning, and in
some cases might be underpinned by
skill targets as well.
Methods

Selected Response
• Multiple choice
• True/false
• Matching
• Fill-in-the-blank questions

Written Response
• Short answer items
• Extended written response items

Performance Assessment
• Performance task
• Performance criteria

Personal Communication
• Questions during instruction
• Interviews and conferences
• Participation
• Oral exams
• Student journals and logs
Selected response assessment
• Multiple choice
• True/false
• Matching
• Fill-in-the-blank questions

Chappuis et al. (2014)


Selected response assessment

Guideline: Keep wording simple and focused. Aim for the lowest possible reading level. Good item writing first and foremost represents an exercise in effective written communication.
• Right: What are the poles of a magnet called? a. Anode and cathode b. North and south c. Strong and weak d. Attract and repel
• Wrong: When scientists rely on magnets in the development of electric motors, they need to know about poles, which are?

Guideline: Ask a full question in the stem. This forces you to express a complete thought in the stem or trigger part of the question, which usually promotes students’ understanding.
• Right: What was the trend in interest rates between 1950 and 1965? a. Increased only b. Decreased only c. Increased, then decreased d. Remained unchanged
• Wrong: Between 1950 and 1965 a. Interest rates increased. b. Interest rates decreased. c. Interest rates fluctuated greatly. d. Interest rates did not change.

Guideline: Eliminate clues to the correct answer either within the question or across questions within a test. When grammatical clues within items or material presented in other items give away the correct answer, students get items right for the wrong reasons.
• Wrong: All of these are an example of a bird that flies, except an a. Ostrich b. Falcon c. Cormorant d. Robin
*(The article “an” at the end of the stem requires a response beginning with a vowel. As only one is offered, it must be correct.)

Chappuis et al. (2014)


Continued…

• Do not make the correct answer obvious to students who have not
studied the material.
• Highlight critical, easily overlooked words (e.g., NOT, MOST,
LEAST, EXCEPT).
• Have a qualified colleague read your items to ensure their
appropriateness. This is especially true of relatively more
important tests, such as big unit tests and final exams.
• Double-check the scoring key for accuracy before scoring.

Chappuis et al. (2014)


Written assessment

Examples of short answer items:
• Describe two differences between fruits and vegetables.
• List three causes of the Spanish-American War.
• What will happen if this compound is heated? Why will that happen?

Examples of extended written response items:
• Evaluate two solutions to an environmental problem. Choose which is better and explain your choice.
• What motivates (the lead character) in (a piece of literature)?
• Interpret polling data and defend your conclusions.
• Describe a given scientific, mathematical, or economics process or principle.

Examples of points approach:
• The cycle describes the sequence of reactions by which cells generate energy.
• It consumes oxygen.
• It produces carbon dioxide and water as waste products.
• It converts ADP to energy-rich ATP.

Example of the “rubric” approach:
• the criteria used for comparison
• the accuracy of evidence brought to bear
• the strength of the argument for the supremacy of one over the other

Chappuis et al. (2014)
Written assessment: quality guideline
Quality of the Items
• Is written response the best assessment method for this learning
target?
• Do items call for focused responses?
• Is the knowledge to be used clearly indicated?
• Is the reasoning to be demonstrated (if any) clearly indicated?
• Is the item itself written at the lowest possible reading level—will all
students understand what they are to do?
• Will students’ level of writing proficiency in English be adequate to
show you what they know and can do?
• Is there anything in the item that might put a group of students at a
disadvantage regardless of their knowledge or reasoning level?
• Are there enough items to provide a defensible estimate of student
learning on intended targets?
Written assessment: quality guideline
Quality of the Scoring Guide(s)
• For the knowledge aspect of the response, is it clear how points will be
awarded? If a task-specific rubric is used, does the item clearly call for the
features described in the highest level of the rubric?
• For the reasoning portion of the response (if any), does the rubric capture
the essence of high-quality thinking at the highest level? Does it identify
flaws in reasoning at the lower levels?
• Does the scoring guide sufficiently represent the intent of the learning
target(s)?

Scoring Considerations
• Is the total number of items to be scored (number of items on the
assessment times number of students responding) limited to how many the
rater(s) can accurately assess within a reasonable time?
• If the scoring guide is to be used by more than one rater, have raters worked
together to ensure consistent scoring?
Performance assessment
assessment based on observation and judgment

Examples of demonstrations (reflecting skill targets) include the following:
• Playing a musical instrument
• Carrying out the steps in a scientific experiment
• Speaking a foreign language
• Reading aloud with fluency
• Repairing an engine
• Working productively in a group

Examples of products (reflecting product targets) include:

• Term paper
• Lab report
• Work of art
• Wood shop creation

Chappuis et al. (2014)


Designing a good performance-based assessment

R Role: Writers must imagine themselves as fulfilling specific roles (for
example, as tour guides or scientists or critics) when they write.

A Audience: Writers must always visualize their audiences clearly and
consistently throughout the writing process. If they don’t, the writing
will fail.

F Format: Writers must see clearly the format that the finished writing
should have, whether brochure, memo, letter to the editor, or article in
a magazine.

T Topic: Writers have to select and narrow their topics to manageable
proportions, given their audiences and formats.

S Strong verb: the verb we use in the task itself. What is the purpose
for the writing?

Smith (1990)
Strong verbs list
[Figure: list of strong verbs for performance task writing]
Chappuis et al. (2014)

Examples
[Figures: worked examples of performance-based assessment tasks]
Chappuis et al. (2014)
Personal comm. assessment
Gathering information about students through personal communication:
• Asking questions during instruction
• Interviewing students in conferences
• Listening to students as they participate or perform in class
• Giving examinations orally
• Having students keep journals and logs

Chappuis et al. (2014)


Personal comm. assessment
Personal communication options can assess knowledge, reasoning, and
those skill targets requiring oral communication, such as speaking a foreign
language and participating in group discussions.

FIGURE 8.2 Personal Communication Options

Format: Instructional Questions and Answers
• Description: Teacher poses questions for students to answer or discuss. Students pose questions and respond to each other.
• Primary Use: Formative
• Target Types: K, R

Format: Class Discussions
• Description: Students engage in a discussion. Can be either teacher-led or student-led.
• Primary Use: Formative or Summative
• Target Types: K, R, S

Format: Conferences and Interviews
• Description: Teacher meets with student to talk about what students have learned and have yet to learn.
• Primary Use: Formative
• Target Types: K, R

Format: Oral Examinations
• Description: Teacher plans and poses questions for individual students to respond to.
• Primary Use: Summative
• Target Types: K, R, S

Format: Journals and Logs
• Description: Students write to explain, explore, and deepen their own understanding; teacher and/or other students respond.
• Primary Use: Formative
• Target Types: K, R

Chappuis et al. (2014)


Personal comm. assessment
How to maximize?
• Ask questions that elicit summaries or key points of the learning.
• Encourage students to interact with each other’s responses rather than looking to the teacher as their only responder.
• Teach students the question stems that elicit different patterns of reasoning for whatever content they are studying. Have them use question stems in small- or large-group discussions.
• Model the response patterns that you’d like to see from students. For example:
  • Speculate on a given topic. This encourages students to explore ideas and understand that uncertainty is a normal stage in the thinking process.
  • Reflect on topics. For example, say, “I sometimes wonder . . .” This encourages students to explore the topic rather than to seek a single quick answer.
  • Cheerfully admit when you don’t have an answer and model what to do about it. Follow “I’m not sure” with “What could we do to find out?” Sometimes a class member or two will be able to answer a question you can’t, in which case invite students to weigh in.

Chappuis et al. (2014)


Record Keeping: Tracking Student Learning
FIGURE 9.2 Deciding What to Keep Track of, What to Report, and How to Report It

SUM TOTAL OF EVERYTHING STUDENTS DO IN SCHOOL/CLASSROOM

Diagnostic and Practice Events (Track: Teacher and/or Student)
• In-class work: exercises, problems, tasks
• Homework that is for practice
• Trial, feedback, and revision
• Quizzes and other formative assessments

SELECTION OF MOST VALUED ITEMS FOR REPORTING PURPOSES (Track & Report)
Academic Progress
• Learning gains
• Improvement over time
• Specific strengths and areas needing work
Skills of Independence and Cooperation
• Work habits
• Attendance
• Cooperation/Group skills
• Homework completion
• Organization skills
• Behavior
• Academic honesty

SELECTION OF ACHIEVEMENT ITEMS FOR GRADING PURPOSES (Record & Grade)
• Periodic assessments
• Final exams and papers
• Reports/Projects
• Culminating demonstrations of learning

Source: Adapted from Ken O’Connor, unpublished workshop materials, 2001. Adapted by permission.
How do you choose an assessment method that reflects the target?
Target-method Match (Stiggins & Chappuis, 2011)
School-based assessment
Components of SBA (MOEM, 2016):
1. Formulate statements of intended learning outcomes
2. Develop/select assessment measure
3. Create experiences leading to outcomes
4. Discuss & use assessment results to improve results / learning
What if…
[Bar chart: Student performance for Student A and Student B at the TOV, mid-year, and final-year assessment points; values shown include 35, 42, 60, 80, 84, and 92]
School-based assessment
• Determine learning outcomes to support & motivate students
• Incorporate different people (teacher, parents, peers)
• Allow students to get support, enhancing learning in class & from home
• Help identify development of academic & working skills
• Comprehensive, systematic, continuous, diagnostic & integrative teacher-directed assessment procedure
Characteristics of SBA
• Involves the teacher from beginning to end
• Identifies and develops suitable assessment tasks
• Collects a number of samples of students’ performances over a period of time
• Conducted by students’ own teacher
• Students are more active during the assessment
• Stimulates continuous evaluation and adjustment of the teaching and learning
Rationales of SBA
• Continuously assess in a pressure-free environment
• Reduce reliance on one-off public exams
• Improve reliability of assessment
• Reflect actual abilities & standards
• Reinforce students’ autonomy & independent learning
• Allow immediate and constructive feedback
Steps in conducting SBA
1. Identifying curriculum: selecting learning, school priorities & context for learning
2. Sequencing learning: planning learning experiences & teaching strategies to respond to students’ needs
3. Developing assessment: planning a variety of assessments to collect comprehensive & meaningful evidence
4. Making judgement: considering how judgment will be made
5. Using feedback: considering how & when to provide feedback
Let’s hear from you
SBA in Malaysia?

Academic:
• Centralized assessment
• In-school assessment*

Non-academic:
• Physical, sports & cocurricular activities assessment
• Psychometric assessment
How is it done?

Formative:
• Continuously assess students’ development for learning (assessment for learning).
  Example: questioning during lessons to assess students’ knowledge, skills and values; results will be used to design or improvise the next lessons.
• Continuously assess students’ development as part of learning (assessment as learning).
  Example: reflective & peer review. Reflective: mind map, portfolio, reflection. Peer review: rating, grading, discussion.

Summative:
• Assess students’ learning achievement (assessment of learning).
  Example: summative assessment such as a final presentation or a final exam.
PRINCIPLES OF SBA/CBA
• Designed & developed by the subject teacher
• Criterion-based assessment: content standard, learning standard, etc.
• Systematically documented: design – implement – record – analysis – improvement
• Focus on individual development
• Consist of both summative and formative assessment
• Assess all aspects of students’ learning
• Use more than one source to assess
• Require follow-up actions
• Encourage self-assessment & peer review
SKILLS TO ASSESS
• Knowledge
• Thinking skills
• Kinesthetic skills
• Positive attitudes
• Language skills
• Arithmetic skills
• Creativity
• Learning skills
• Social skills
• Health & intelligence
• Manipulative skills
• Practical skills
• Moral values
Assessment is…
• collecting information about pupils’ progress
• planned, implemented and reported by respective teachers
• ongoing to enable teachers to determine the pupils’ mastery level

SBA should be...
• Pressure free
• Reflective of actual abilities & standards
• Allowing immediate and constructive feedback
THANK YOU
“You are rewarding a teacher poorly if you remain always a pupil.” (Friedrich Nietzsche)
