
TEST ADMINISTRATION AND REPORTING

 Preparing students
 Test assembly
 Test administration
 Analysis of student responses
 Reporting
PREPARING STUDENTS

 Assess maximum, not typical, performance of the student
 Give students enough information about the assessment:
 Time (pop quizzes do not assess maximum performance)
 Test conditions (timed, speeded, take-home)
 Test content
 Emphasis of the test
 Scoring
 Test use
Poor test administration conditions can interfere with students demonstrating their full potential and put at risk the accuracy and usefulness of the data (ODE, 2014).
TEST ASSEMBLY

 Purpose of the assessment being assembled
 Range of content/learning progression to be covered
 Distribution of test items
 Type the test and distribute it fairly
 Organization of test items
 Place easier items first  minimizes test anxiety and improves validity
 Give clear directions for answering each item
 Provide a balance of item types, content coverage, and difficulty levels
 Make sure adequate time is allowed
TEST ADMINISTRATION CONSIDERATIONS

 Time of day
 Time allotted
 Proximity to teaching of targeted content
 Other issues
ANALYSIS OF STUDENT RESPONSES

 Score student responses
 Group students into upper-, middle-, and lower-scoring groups
 Did a group of students have particular difficulties? What might be done about this?
 Compute item difficulty (percent of students passing the item, or average score)
 Analyze responses for the least and most difficult items, as well as items that most students answered correctly
 Distribution of scores per item
 Does the item function differently for different groups of students?
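The scoring steps above (computing item difficulty as the percent of students passing each item, and splitting students into upper-, middle-, and lower-scoring groups) can be sketched in code. This is a minimal illustration, not part of the original slides: the response matrix, function names, and 0/1 scoring are all hypothetical assumptions.

```python
# Illustrative classroom item analysis (hypothetical data and names).
# Rows = students, columns = items, scored 1 = correct, 0 = incorrect.

def item_difficulty(responses):
    """Proportion of students passing each item (higher = easier)."""
    n_students = len(responses)
    n_items = len(responses[0])
    return [sum(row[j] for row in responses) / n_students
            for j in range(n_items)]

def score_groups(responses):
    """Split students into upper-, middle-, and lower-scoring thirds
    by total score."""
    ranked = sorted(responses, key=sum, reverse=True)
    third = len(ranked) // 3
    return ranked[:third], ranked[third:-third], ranked[-third:]

# Hypothetical responses: six students, four items
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
]

difficulty = item_difficulty(responses)          # e.g. item 1 passed by 5 of 6
upper, middle, lower = score_groups(responses)   # three groups of two students
```

An item that no one passes (difficulty 0.0) or that behaves very differently for the upper and lower groups is a candidate for the closer review the slide describes.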
REPORTING

 Reports should provide information about student achievement relative to the learning target (ODE, 2014).
BIBLIOGRAPHY

 Nitko, A. J., & Brookhart, S. (2007). Educational assessment of students. Upper Saddle River, NJ: Pearson Education, Inc.
 McMillan, J. H. (2007). Classroom assessment: Principles and practice for effective standards-based instruction (4th ed.). Boston: Pearson/Allyn & Bacon.
 Oregon Department of Education. (2014, June). Assessment guidance.
 Wilson, M. (2005). Constructing measures: An item response modeling approach. New York: Psychology Press, Taylor & Francis Group.
 Wilson, M., & Sloane, K. (2000). From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2), 181-208.
 Smarter Balanced Assessment Consortium. (2012, April). General item specifications.
CREATIVE COMMONS LICENSE

Introduction to Test Administration and Reporting PPT by the Oregon Department of Education and Berkeley Evaluation and Assessment Research Center is licensed under a CC BY 4.0.

You are free to:


 Share — copy and redistribute the material in any medium or format
 Adapt — remix, transform, and build upon the material

Under the following terms:


 Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes
were made. You may do so in any reasonable manner, but not in any way that suggests the licensor
endorses you or your use.
 NonCommercial — You may not use the material for commercial purposes.
 ShareAlike — If you remix, transform, or build upon the material, you must distribute your
contributions under the same license as the original.

Oregon Department of Education welcomes editing of these resources and would greatly appreciate being able to learn from the changes made. To share an edited version of this resource, please contact Cristen McLean, cristen.mclean@state.or.us.
