
Classroom Assessment: A Review
ASSERTIONS
As formators, teachers are invited daily to make sound decisions in order to bring out the best in students.

Aside from the in-school, curriculum-bound, here-and-now setting, learners also operate in the context of local and international societies as well as the spiritual realm.
ASSERTIONS
Transformed learners are skilled in knowing, doing, being, and living together.

The transformative process could be initiated and enhanced through assessment.
Objectives of Presentation
Explain the role of measurement and evaluation in the
teaching-learning process.
Differentiate measurement, evaluation, and types of
assessment.
Discuss the different principles of high-quality assessment.
Share experiences and reflect on one’s practices in making
tests.
Model of the Teaching-Learning Process

Phases of instruction:
- Pre-instructional activities
- Instructional activities
- Post-instructional activities
Model of the Teaching-Learning Process

Start of an instructional sequence →
Choose the objectives of instruction →
Think about and use what you know about the learning process
(and determine the characteristics of the students you will teach) →
Choose and carry out methods of teaching →
Evaluate instruction →
End of an instructional sequence
Model of the Teaching-Learning Process

What is intended? (Objective) → What is taught? (Lesson) → What is measured? (Achieved Outcome)
Classroom Assessment

It is the process of collecting, interpreting, and using information to help teachers make better decisions.
Components of Classroom
Assessment
1. Purpose – Why am I doing this assessment?
2. Measurement – What techniques should I use to gather information?
3. Evaluation – How will I interpret the results?
4. Use – How will I use the results?
PURPOSE
◦Will your assessment deliberately improve student performance, or simply audit it?
◦Does the assessment motivate students to learn?
◦Does the assessment provide a realistic estimate of what the students are able to do?
MEASUREMENT
It is the use of a device or instrument to measure students' achievement, skills, attitudes, intelligence, personality, or anything else that can be expressed quantitatively.
EVALUATION
◦It is the process of summing up the
results of measurements or tests,
giving them some meaning based
on value judgment.
◦It is a systematic process of
determining the extent to which
instructional objectives are
achieved by the students.
Types of Evaluation

1. Placement – Determines students' entry behavior (what they already know). Examples: pre-test, aptitude test, readiness test.
2. Formative – Provides feedback on whether students are accomplishing the objectives. Examples: quizzes, oral questioning (recitation).
3. Diagnostic – Determines areas of weakness that cannot be assessed by formative evaluation. Example: diagnostic tests.
4. Summative – Determines if students have satisfied the instructional objectives. Example: summative test.
Use of Assessment Data

Grading

Diagnosis

Instruction
Trends in Classroom Assessment

From → To
Sole emphasis on outcomes → Assessment of process
Isolated skills → Integrated skills
Isolated facts → Application of knowledge
Paper-and-pencil tasks → Authentic tasks
Single correct answer → Many correct answers
Secret standards/criteria → Public standards/criteria
Individuals → Groups
After instruction → During instruction
Little feedback → Considerable feedback
Objective tests → Performance-based assessments
Principles of
High Quality
Assessment
1. Clear & Appropriate
Learning Targets
◦A sound assessment begins with clear and appropriate targets.

◦Targets should reflect what the students should know and be able to do.
1. Clear & Appropriate Learning Targets
LEARNING DOMAINS
Cognitive
Complexity of thought from recall of information to
the use of it in new forms and decisions

Affective
Dispositions, values, & other emotional
behaviors

Psychomotor
Overt behaviors demonstrated in relevant contexts
1. Clear & Appropriate
Learning Targets
Basic process skills:
- observing
- comparing/classifying
- communicating
- asking questions
- making models
- measuring & recording data
- inferring
1. Clear & Appropriate
Learning Targets
Integrated thinking skills:
- stating a problem
- formulating a hypothesis
- designing a procedure to test the hypothesis
- collecting data
- finding patterns
- drawing conclusions
- communicating results
- applying findings to new situations
1. Clear & Appropriate
Learning Targets
Sample targets at increasing cognitive levels:
- "Describe the process of planning for a medium-scale business"
- "Describe the planning process you employed for your medium-scale business planning…"
- "Assess the relevance of the principles of medium-scale business"

Cognitive process dimensions: Remember, Understand, Apply, Analyze, Evaluate, Create
Knowledge dimensions: Factual, Conceptual, Procedural, Metacognitive
1. Clear & Appropriate
Learning Targets
LEARNING TARGETS: knowledge, reasoning, skills, products, and affects
1. Clear & Appropriate
Learning Targets
All domains of intelligence:
• verbal/linguistic
• logical/mathematical
• visual/spatial
• musical
• kinesthetic
• social/interpersonal
• intrapersonal
• naturalist
• existential
2. Appropriateness of Assessment Methods
SELECTED RESPONSE
◦ multiple choice
◦ binary choice (e.g. true/false)
◦ matching
◦ interpretive
2. Appropriateness of Assessment Methods
CONSTRUCTED RESPONSE
◦ Brief Constructed Response
short answer, completion test, label a diagram, show your work
◦ Performance-based tasks
Products, e.g., paper, project, poem, portfolio, video/audio tape, exhibition, reflection, journal, graph, table, illustration
Skills, e.g., speech, demonstration, dramatic reading, debate, recital, enactment
◦ Essay
restricted response, extended response
◦ Oral Questioning
(informal, examination, conference or interviews)
2. Appropriateness of
Assessment Methods
TEACHER OBSERVATION
- Formal
- Informal
SELF-REPORT INVENTORIES
- Attitude survey
- Sociometric device
- Questionnaires
- Inventories
3. Validity
The quality of a test that
assures measurement of what
it is supposed to measure.
TYPES OF VALIDITY

Face Validity – Validity that relies on the physical attributes of the test.

Content Validity – The extent to which an assessment procedure adequately represents the content of the assessment domain being tested.

Instructional Validity – The extent to which the test measures the instructional objectives.

Criterion-Related Validity – The degree to which performance on an assessment procedure accurately predicts a student's performance on an external criterion.

Construct Validity – The extent to which a test can measure an unobservable trait or behavior. It can be determined through theoretical explanations, logical analysis, and statistical procedures.
3. Validity

Diagram: test items are sampled from the assessment domain; excellent representativeness means the items adequately cover that domain.
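One informal way to check this kind of representativeness is to tally test items against a table of specifications. The objectives and item tags below are hypothetical, chosen only to illustrate the idea:

```python
from collections import Counter

# Hypothetical table of specifications: the objectives that define the
# assessment domain, and the objective each test item was written to target.
domain_objectives = ["recall facts", "apply concepts", "analyze data", "evaluate claims"]
item_targets = ["recall facts", "recall facts", "apply concepts",
                "analyze data", "evaluate claims", "apply concepts"]

# Count how many items address each objective, and flag uncovered objectives.
coverage = Counter(item_targets)
missed = [obj for obj in domain_objectives if coverage[obj] == 0]

print("Items per objective:", dict(coverage))
print("Objectives with no items:", missed or "none")
```

An objective with zero items signals a gap in content validity; a lopsided count signals over-sampling of one part of the domain.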
4. Reliability
The consistency of test items
in terms of getting the same
score for a student taking the
same test several times within
the period when traits are not
expected to have changed.
4. Reliability

Chart: one student's scores (on a 50–100 scale) across repeated administrations of the same test; a reliable test yields consistent scores.
Reliability is related to ERROR:

Observed Score = True Ability ± Error

INTERNAL ERROR: health, mood, motivation, anxiety, fatigue, general ability, test-taking skills

EXTERNAL ERROR: directions, luck, item ambiguity, uncomfortable space, sampling of items, test interruptions, scoring, observer bias
To enhance reliability, the following
suggestions are to be considered:
◦ Use a sufficient number of items or tasks (other things being equal, longer tests are more reliable).
◦ Use independent raters or observers who provide similar scores on the same performances.
◦ Make sure the assessment procedures and scoring are as objective as possible.
◦ Continue assessment until results are consistent.
◦ Eliminate or reduce the influence of extraneous events or factors.
◦ Use shorter assessments more frequently rather than fewer but longer assessments.
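The first suggestion (longer tests are more reliable, other things being equal) is quantified by the classical Spearman-Brown prophecy formula; a minimal sketch:

```python
def spearman_brown(r, length_factor):
    """Predicted reliability when a test's length is multiplied by
    length_factor, assuming the added items are of comparable quality."""
    return (length_factor * r) / (1 + (length_factor - 1) * r)

# Doubling a test whose reliability is 0.60:
print(round(spearman_brown(0.60, 2), 2))  # -> 0.75
```

The gain diminishes as reliability rises, so lengthening a test helps most when its current reliability is modest.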
5. Fairness
A fair assessment is one that provides all students an equal opportunity to
demonstrate achievement and yields scores that are comparably valid from one
person or group to another.
Be fair in terms of ….

◦Students' knowledge of learning targets and assessments

◦Opportunity to learn

◦Prerequisite knowledge and skills

◦Avoiding teacher stereotypes and bias


6. Positive Consequences
Ask yourself…….
How will assessment affect student motivation?
Will students be more or less likely to be meaningfully
involved?

How will the assessment affect my teaching?

What will the parents think of my assessment?


7. Practicality and Efficiency
Consider….
Familiarity with the method
Time required
Complexity of administration
Ease of scoring and interpretation
Cost effectiveness
What are your
practices in assessing
your students?
Salamat po! Agyaman! (Thank you!)