
Designing Assessment Tasks
by Sheryl Satorre-Estella, PhD
University of the Visayas – Pardo
May 14, 2019
What is Assessment?

Assessment is the ongoing process of gathering, analyzing, and reflecting on evidence to make informed and consistent judgments to improve future student learning.
In layman’s language, how is the process of assessment described?

• Plan it!
• Do it!
• Check it!
• Revise it!
• Repeat it!
Uses of Assessment

1. Planning, conducting, and evaluating instruction
2. Diagnosing students’ difficulties
3. Placing students
4. Providing feedback (formative)
5. Grading and evaluating learning (summative)
3 Main Purposes for Assessment

• Assessment for Learning (AfL) occurs when teachers use inferences about student progress to inform their teaching. (formative; embedded in the teaching–learning activities, or TLAs)

• Assessment as Learning (AsL) occurs when students reflect on and monitor their progress to inform their future learning goals. (formative)

• Assessment of Learning (AoL) occurs when teachers use evidence of student learning to make judgments on student achievement against goals and standards. (summative; occurs at the end of the process, task, or period)
Formal vs. Informal Assessment

Formal Assessment Methods
• planned in advance of their administration
• lack spontaneity
• typically occur at the end of instruction
• students are aware of these methods
• examples include chapter tests, final exams, graded homework, etc.

Informal Assessment Methods
• more spontaneous; less obvious
• typically occur during instruction
• examples include teacher observations and questions
Qualitative vs. Quantitative Assessment

Quantitative Assessment Methods
• yield numerical scores
• major types include teacher-constructed tests, standardized tests, checklists, and rating scales

Qualitative Assessment Methods
• yield verbal descriptions of characteristics
• main types include teacher observations, anecdotal records, and informal questions
Formative vs. Summative

Formative Evaluation
• decision making that occurs during instruction for purposes of making adjustments to instruction
• more of an evaluation of one’s own teaching rather than of students’ work
• may be based on formal or informal methods

Summative Evaluation
• occurs at the end of instruction (e.g., end of chapter, end of unit, end of semester)
• typically used for administrative decisions (e.g., assigning grades, promoting/retaining students)
• based solely on formal assessment methods
Standardized vs. Nonstandardized Assessment

Standardized Assessment Methods
• administered, scored, and interpreted in identical fashion for all examinees
• purpose is to allow educators to compare students from different schools, states, etc.
• examples include SAT, GRE, ITBS, CAT, PRAXIS

Nonstandardized Assessment Methods
• typically made by teachers for classroom use
• purpose is to determine the extent to which subject matter is being taught and learned
Norm-Referenced vs. Criterion-Referenced Assessment

Norm-Referenced Assessment Methods
• show where an individual student’s performance lies in relation to other students
• standardized tests are usually norm-referenced
• results are quantitative
• student performance is compared to a norm group

Criterion-Referenced Assessment Methods
• compare student performance to pre-established criteria or objectives
• results are quantitative, qualitative, or both
• also known as mastery, objectives-referenced, or competency tests
Traditional vs. Alternative Assessment

Traditional Assessment Methods
• procedures such as pencil-and-paper tests and quizzes
• only one correct response to each test item
• easily and efficiently assess many students simultaneously
• encourage memorization of facts, etc.

Alternative Assessment Methods
• more appropriate for hands-on, experiential learning
• include authentic assessment (involving real application of skills beyond the instructional context)
Objective vs. Subjective Assessment

Objective Assessment Methods
• “objective” refers to the method of scoring (no judgments)
• contain only one correct answer
• examples: multiple-choice, true-false, matching items
• also known as structured-response, selected-response, or teacher-supplied items

Subjective Assessment Methods
• scoring involves teachers’ subjective judgments
• several possible correct responses, or a single correct response with several ways to arrive at that answer
• examples: short-answer and essay items
• also known as open-ended, constructed-response, or supply-type items
Steps in Designing an OBE-Based Assessment Task

1. Choose the right assessment task/method.
2. Choose the right student activities to complete the assessment task/method.
3. Create the scoring or grading criteria.
1. Choose the right assessment task or method.

• Is the assessment task aligned with the subject intended learning outcome?
• Does the assessment task reflect the relative importance of the subject intended learning outcome?
• Is the assessment task realistic for the student?
• Is the assessment task measurable?
• Are the resources needed to carry out the assessment task available?
2. Choose the right student activities to complete the assessment task/method.

The verb in the subject intended learning outcome provides the clue to the kinds of student activities in the assessment task.
Bloom’s Level 1: Remember

ILO verbs: recall, recognize, identify

Objective test items such as fill-in-the-blank, matching, labeling, or multiple-choice questions that require students to:

• recall or recognize terms, facts, and concepts
Bloom’s Level 2: Understand

ILO verbs: interpret, exemplify, summarize, infer, compare, explain

Assessments such as papers, exams, problem sets, class discussions, or concept maps that require students to:

• summarize readings, films, or speeches
• compare and contrast two or more theories, events, or processes
• classify or categorize cases, elements, or events using established criteria
• paraphrase documents or speeches
• find or identify examples or illustrations of a concept or principle
Bloom’s Level 3: Apply

ILO verbs: apply, execute, implement

Activities such as problem sets, performances, labs, prototyping, or simulations that require students to:

• use procedures to solve or complete familiar or unfamiliar tasks
• determine which procedure(s) are most appropriate for a given task
Bloom’s Level 4: Analyze

ILO verbs: analyze, differentiate, organize, attribute

Assessments such as case studies, critiques, labs, paper exams, projects, debates, or concept maps that require students to:

• discriminate or select relevant and irrelevant parts
• determine how elements function together
• determine bias, values, or underlying intent in presented material
Bloom’s Level 5: Evaluate

ILO verbs: evaluate, check, critique, assess

Assessments such as journals, diaries, critiques, problem sets, product reviews, or studies that require students to:

• test, monitor, judge, or critique readings, performances, or products against established criteria or standards
Bloom’s Level 6: Create

ILO verbs: create, generate, plan, produce, design

Assessments such as research projects, musical compositions, performances, essays, business plans, website designs, or set designs that require students to:

• make, build, design, or generate something new
3. Create the scoring or grading criteria.
Methods of Grading SILOs (Subject Intended Learning Outcomes)

1. Direct Grading*
2. Indirect Grading*

*requires a rubric
Using Rubrics

• A rubric is a scoring tool that lays out the specific expectations for a performance task.
• Rubrics divide a performance task into its component parts and provide a detailed description of what constitutes acceptable and unacceptable levels of performance for each of those parts.
• Rubrics can be used for grading a large variety of tasks: discussion participation, laboratory reports, portfolios, group work, oral presentations, role plays, and more (Stevens and Levi, 2005).
2 Vital Components of a Rubric

1. Criteria
2. Scale – describes how well or poorly any given task has been performed (e.g., Very Good, Good, Fair, Needs Improvement)
Rubric Title:
Assessment Task:
SILO:

              Scale Level 1    Scale Level 2    Scale Level 3    Score
Criterion 1
Criterion 2
Criterion 3
Criterion 4

Feedback:
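To make the template concrete, here is a minimal sketch (not from the slides) of how such a rubric could be represented and scored. The criterion names, the scale labels, and the ratings are hypothetical examples.

```python
# Minimal sketch: representing and scoring a rubric like the template above.
# Criterion names, scale labels, and ratings are hypothetical examples.

SCALE = {1: "Needs Improvement", 2: "Good", 3: "Very Good"}

def score_rubric(ratings):
    """Average the scale levels assigned to the rated criteria."""
    if not ratings:
        raise ValueError("rubric has no rated criteria")
    for criterion, level in ratings.items():
        if level not in SCALE:
            raise ValueError(f"{criterion}: invalid scale level {level}")
    return sum(ratings.values()) / len(ratings)

# Example: rating a four-criterion rubric for an oral presentation task.
ratings = {"Content": 3, "Organization": 2, "Delivery": 3, "Visual aids": 2}
print(f"Score: {score_rubric(ratings):.2f} out of {max(SCALE)}")  # 2.50 out of 3
```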
Direct Grading

Grading the overall SILOs
↓
Grading criteria (using rubrics) for each individual SILO
↓
Derive final grade
Example: Database System subject

1. contrast traditional file-based systems and database systems in terms of efficiency in data manipulation, information access, and security
2. explain the different data models as a basis for designing an information system
3. apply a relational database model to design the database for a particular information system
4. design a normalized database for the intended information system
5. construct the appropriate SQL statements to solve SQL query problems
Grade of 70% – 79% (SILOs 1–3 demonstrated):

1. contrast traditional file-based systems and database systems in terms of efficiency in data manipulation, information access, and security
2. explain the different data models as a basis for designing an information system
3. apply a relational database model to design the database for a particular information system

Grade of 80% – 89% (SILOs 1–4 demonstrated):

1. contrast traditional file-based systems and database systems in terms of efficiency in data manipulation, information access, and security
2. explain the different data models as a basis for designing an information system
3. apply a relational database model to design the database for a particular information system
4. design a normalized database for the intended information system

Grade of 90% – 100% (SILOs 1–5 demonstrated):

1. contrast traditional file-based systems and database systems in terms of efficiency in data manipulation, information access, and security
2. explain the different data models as a basis for designing an information system
3. apply a relational database model to design the database for a particular information system
4. design a normalized database for the intended information system
5. construct the appropriate SQL statements to solve SQL query problems
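Read as a whole, the three bands map how many of the five SILOs a student has demonstrated to a grade range. A minimal sketch of that mapping follows; the function name and the handling of fewer than three SILOs are assumptions, since the slides start at the 70%–79% band.

```python
# Sketch: map the number of SILOs demonstrated (out of the five listed
# above) to the grade band from the slides. Handling of fewer than three
# SILOs is an assumption; the slides only show bands starting at three.

BANDS = {3: "70% - 79%", 4: "80% - 89%", 5: "90% - 100%"}

def grade_band(silos_demonstrated):
    if not 0 <= silos_demonstrated <= 5:
        raise ValueError("expected a count between 0 and 5")
    if silos_demonstrated < 3:
        return "below 70% (fewer than three SILOs demonstrated)"
    return BANDS[silos_demonstrated]

print(grade_band(4))  # 80% - 89%
```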
Indirect Grading

Grading the assessment tasks which are aligned with the SILOs
↓
Grading criteria (using rubrics) for each individual assessment task
↓
Derive final grade
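To illustrate the indirect route, here is a minimal sketch that derives a final grade from rubric-scored assessment tasks, each aligned with a SILO. The task names, weights, and scores are hypothetical, and the weighted-average rule is an assumption; the slides do not prescribe one.

```python
# Sketch: indirect grading derives the final grade from scores on
# assessment tasks aligned with SILOs. Task names, weights, and scores
# are hypothetical; the weighted average is an assumed combining rule.

tasks = [
    # (task, aligned SILO, weight, rubric score as a percentage)
    ("ER-design project",  "SILO 3", 0.4, 88.0),
    ("Normalization exam", "SILO 4", 0.3, 75.0),
    ("SQL query problems", "SILO 5", 0.3, 92.0),
]

def final_grade(tasks):
    total_weight = sum(w for _, _, w, _ in tasks)
    return sum(w * s for _, _, w, s in tasks) / total_weight

print(f"Final grade: {final_grade(tasks):.1f}%")  # Final grade: 85.3%
```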
References:

• http://www.aaia.org.uk/pdf/Publications/AAIA%20Pupils%20Learning%20from%20Teachers'%20Responses.pdf
• http://www.aaia.org.uk/pdf/Publications/AAIAformat4.pdf
• http://www.aaia.org.uk/pdf/asst_learning_practice.pdf
• http://community.tes.co.uk/forums/t/300200.aspx
• http://www.schoolhistory.co.uk/forum/lofiversion/index.php/t7669.html
• www.harford.edu/irc/assessment/FormativeAssessmentActivities.doc
• Black, P. et al. (2003). Assessment for Learning. Maidenhead: Open University Press.
• Black, P. et al. (2002). “Working inside the black box”. London: nferNelson.
• Black, P. and Wiliam, D. (1998). Inside the Black Box. London: nferNelson.
• Assessment Reform Group (2002). Testing, Motivation and Learning. Cambridge: The Assessment Reform Group.
• Assessment Reform Group (1999). Assessment for Learning. Cambridge: The Assessment Reform Group.
• Angelo, T. A. and Cross, K. P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers. San Francisco: Jossey-Bass.
• Southern Illinois University Edwardsville. Several CATs online: http://www.siue.edu/~deder/assess/catmain.html
• Bresciani, M. J. (September 2002). The relationship between outcomes, measurement, and decisions for continuous improvement. National Association for Student Personnel Administrators, Inc. NetResults E-Zine. http://www.naspa.org/netresults/index.cfm
• Bresciani, M. J., Zelna, C. L., and Anderson, J. A. (2004). Techniques for Assessing Student Learning and Development in Academic and Student Support Services. Washington, D.C.: NASPA.
• Ewell, P. T. (2003). Specific Roles of Assessment within this Larger Vision. Presentation given at the Assessment Institute at IUPUI. Indiana University–Purdue University Indianapolis.
• Maki, P. (2001). Program review assessment. Presentation to the Committee on Undergraduate Academic Review at NC State University.
• Bresciani, M. J. (2006). Outcomes-Based Undergraduate Academic Program Review: A Compilation of Institutional Good Practices. Sterling, VA: Stylus Publishing.
• Bresciani, M. J., Gardner, M. M., and Hickmott, J. (in press). Demonstrating Student Success in Student Affairs. Sterling, VA: Stylus Publishing.
• NC State University, Undergraduate Academic Program Review (2001). Common Language for Assessment. Retrieved September 13, 2003: http://www.ncsu.edu/provost/academic_programs/uapr/process/language.html
• Palomba, C. A. and Banta, T. W. (1999). Assessment Essentials: Planning, Implementing and Improving Assessment in Higher Education. San Francisco: Jossey-Bass.
• University of Victoria, Counselling Services (2003). Learning Skills Program: Bloom’s Taxonomy. Retrieved September 13, 2003: http://www.coun.uvic.ca/learn/program/hndouts/bloom.html
• Anderson, L. and Krathwohl, D. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman.
• Biggs, J. (2003). Teaching for Quality Learning at University (2nd ed.). Buckingham: Open University Press/Society for Research into Higher Education.
• Bloom, B. S. (Ed.) (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals. Susan Fauer Company, Inc., pp. 201–207.
• Jackson, N., Wisdom, J., and Shaw, M. (2003). Using Learning Outcomes to Design a Course and Assess Learning. The Generic Centre: Guide for Busy Academics. York: Higher Education Academy. Available at http://www.heacademy.ac.uk/assets/York/documents/resources/resourcedatabase/id252_Guide_for_Busy_%20Academics_Using_Learning_Outcomes_to_Design.rtf (accessed 6 September 2008).
• Krause, K. L. (2005). Engaged, inert or otherwise occupied: Understanding and promoting student engagement in university learning communities. Paper presented at Sharing Scholarship in Learning and Teaching: Engaging Students, James Cook University. Available at http://www.cshe.unimelb.edu.au/pdfs/Stud_eng.pdf (accessed 6 September 2008).
• Ramsden, P. (2003). Learning to Teach in Higher Education. London, UK: Kogan Page.
• St Edward’s University, Center for Teaching Excellence (2004). Task-Oriented Question Construction Wheel, based on Bloom’s Taxonomy. Available at http://www.stedwards.edu/cte/files/BloomPolygon.pdf (accessed 6 September 2008).
• Stevens, D. D. and Levi, A. J. (2005). Introduction to Rubrics. Sterling, VA: Stylus Publishing.
• Why should assessments, learning objectives, and instructional strategies be aligned? https://www.cmu.edu/teaching/assessment/basics/alignment.html
Questions?

You can find me at sheryl.satorre@uc.edu.ph
