
TEST ADMINISTRATION AND REPORTING

Preparing students
Test assembly
Test administration
Analysis of student responses
Reporting
PREPARING STUDENTS

 Assess maximum, not typical, performance of the student
 Give students enough information about the assessment:
   Time
     Pop quizzes do not assess maximum performance
   Test conditions (timed, speeded, take-home)
   Test content
   Emphasis of the test
   Scoring
   Test use

Poor test administration conditions have the potential to interfere with
students demonstrating their full potential and put at risk the
accuracy and usefulness of the data (ODE, 2014).
TEST ASSEMBLY

Purpose of the assessment being assembled
Range of content/Learning Progression to be covered
Distribution of test items
 Type the test and distribute it fairly
Organization of test items
 Place easier items first to minimize test anxiety and improve validity
 Give clear directions for answering each item
 Provide a balance of item types, content coverage, and difficulty levels
Make sure adequate time is allowed
TEST ADMINISTRATION CONSIDERATIONS

Time of day
Time allotted
Proximity to teaching of targeted content
Other issues
ANALYSIS OF STUDENT RESPONSES

 Score student responses
 Group students into upper-, middle-, and lower-scoring groups
 Did a group of students have particular difficulties? What might be done about this?
 Compute item difficulty (percent of students passing the item, or average score); see the sketch after this list
 Analyze responses for the least and most difficult items, as well as items most students could do well on
 Distribution of scores per item
 Does the item function differently for different groups of students?
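A minimal sketch of the item-level computations above, assuming dichotomously scored (0/1) responses; the score matrix, the thirds-based grouping, and the simple upper-minus-lower discrimination index are illustrative assumptions rather than part of the original slides.

import numpy as np

# Hypothetical scored responses: 8 students x 4 items (1 = correct, 0 = incorrect).
scores = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
])

# Item difficulty: proportion of students answering each item correctly.
difficulty = scores.mean(axis=0)

# Group students by total score and compare the upper and lower thirds.
totals = scores.sum(axis=1)
order = np.argsort(totals)          # students sorted from lowest to highest total
third = len(order) // 3
lower = scores[order[:third]]       # lower-scoring group
upper = scores[order[-third:]]      # upper-scoring group

# Simple discrimination index: difference in proportion correct
# between the upper and lower groups, item by item.
discrimination = upper.mean(axis=0) - lower.mean(axis=0)

for i, (p, d) in enumerate(zip(difficulty, discrimination), start=1):
    print(f"Item {i}: difficulty = {p:.2f}, discrimination (upper - lower) = {d:+.2f}")

Plotting each item's score distribution and repeating the upper/lower comparison for other groupings of students (for example, demographic subgroups) gives a rough first check on whether an item functions differently for different groups.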
REPORTING

Reports should provide information about student achievement
relative to the learning target (ODE, 2014).
BIBLIOGRAPHY

 Nitko, A. J., & Brookhart, S. (2007). Educational assessment of students. Upper Saddle River, NJ: Pearson Education.
 McMillan, J. H. (2007). Classroom assessment: Principles and practice for effective standards-based instruction (4th ed.). Boston: Pearson/Allyn & Bacon.
 Oregon Department of Education. (2014, June). Assessment guidance.
 Wilson, M. (2005). Constructing measures: An item response modeling approach. New York: Psychology Press, Taylor & Francis Group.
 Wilson, M., & Sloane, K. (2000). From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2), 181-208.
 Smarter Balanced Assessment Consortium. (2012, April). General item specifications.
CREATIVE COMMONS LICENSE

Introduction to Test Administration and Reporting PPT by the
Oregon Department of Education and Berkeley Evaluation and Assessment Research
Center is licensed under a CC BY 4.0 license.

You are free to:


 Share — copy and redistribute the material in any medium or format
 Adapt — remix, transform, and build upon the material

Under the following terms:


 Attribution — You must give appropriate credit, provide a link to the license, and
indicate if changes were made. You may do so in any reasonable manner, but not in any way that
suggests the licensor endorses you or your use.
 NonCommercial — You may not use the material for commercial purposes.
 ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions
under the same license as the original.

Oregon Department of Education welcomes editing of these resources and would greatly
appreciate being able to learn from the changes made. To share an edited version of this
resource, please contact Cristen McLean, cristen.mclean@state.or.us.
