
IMPROVING TEST ITEMS: TEST ITEM ANALYSIS

Nilo Macawile Eder, PhD


Summary of Topics:
 Introduction: Looking Back
 ITEM ANALYSIS
◦ Uses of Item Analysis
◦ Types:
 Quantitative Item Analysis
 Difficulty Index
 Discrimination Index
 Qualitative Item Analysis
THE BASIC TEACHING MODEL

Instructional Objectives → Entering Behavior → Instructional Procedures → Performance Assessment

Figure 1. The basic teaching model and the feedback loops for performance assessment
ROLE OF THE TEACHER IN FORMAL
EDUCATIVE PROCESS:

 Identify the educational objectives sought
 Determine the educational experiences the learner must have to achieve the objectives
 Know the learners well enough so that you can
design and order their experiences according to
their varied characteristics
 Evaluate the degree to which the desired changes
in learner behavior have taken place
Assessing the
Performance of the
Students
- one of the most important functions of a
teacher
Steps/Processes:
• Designing assessment tools
• Packaging and reproducing the test
• Administering the test
• Checking, scoring and recording
• Return and give feedback
• Improving the test items
Designing Assessment Tools
• Write instructional objectives
• Prepare a Table of Specifications
• Write test items that match the instructional objectives
PACKAGING AND REPRODUCING THE TEST
 Put items of the same format together
 Arrange test items from easy to difficult
 Give proper spacing for each item for easy reading
 Keep questions and options on the same page
 Place the illustrations near the options
 Check the answer key
 Check the directions of the test
 Provide space for name, date, and score
 Proofread and reproduce

Administering the Test
Before administering the test
Induce a positive test-taking attitude
Inform the students about the purpose of the test
Give oral directions before distributing the test
Give the test-taking rules: guessing, skipping, and the like are prohibited
Inform students of the length of time allowed (time started and time finished written on the board); give a warning before the end of time
Administering the Test
Before administering the test (cont.)
Tell students how to signal or call your attention if
they have questions
Tell students how papers will be collected
Tell students what to do when they are done with the
test
Rotate the method of distributing papers
Make sure the students are comfortable
Remind the students to write their names
If test has more than one page, have each student
check to see that all pages are there.
ADMINISTERING THE TEST
During Examination:
 Avoid giving instructions or talking while the exam is going on
 Avoid giving hints
 Monitor to check students’ progress and discourage cheating
 Give warnings if students are not pacing their work appropriately
 Make notes of any questions asked – these can be used when revising test items
 Collect test papers uniformly to save time and avoid misplacing them
ADMINISTERING THE TEST
After the Test:
 Check immediately
 Score and record immediately
 Return promptly
 Discuss test items with the students (post
discussions)
ANALYZING THE TEST
Item Analysis
A technique to determine the quality of a test item,
i.e. whether a test item is good, needs improvement, or
needs to be discarded

“A ‘postmortem’ is just as necessary in classroom assessment as it is in medicine.”
- Lewis Aiken (1997)
Uses of Item Analysis
Item analysis data provide a basis for:
 efficient class discussion of the test results
 remedial work
 general improvement of classroom instruction
 increased skill in test construction
 constructing a test bank
Kinds of Item Analysis
Quantitative Item Analysis
◦ Difficulty index
◦ Discrimination index
◦ Analysis of response options
Qualitative Item Analysis
◦ A process by which the teacher or an expert carefully proofreads the test before it is administered, to check for typographical errors, to avoid grammatical clues that may give away the correct answer, and to ensure that the reading level is appropriate.
Difficulty Index

It is the proportion of the number of students in the upper and lower groups who answered an item correctly.
Difficulty Index

Df = n / N, where

Df = difficulty index
n = number of students in the upper and lower groups who answered the item correctly
N = total number of students in the upper and lower groups who took the test
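As a sketch, the formula above can be computed in a few lines; the function name and argument names are illustrative, not from the source.

```python
def difficulty_index(correct_upper, correct_lower, group_size):
    """Df = n / N: n is the number of correct answers in the upper
    and lower groups combined; N is the total number of students
    in both groups (assumed equal in size here)."""
    n = correct_upper + correct_lower
    N = 2 * group_size
    return n / N

# e.g., 11 correct in the upper group, 4 in the lower, 20 per group
print(difficulty_index(11, 4, 20))  # 0.375
```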
Level of Difficulty of an Item

Index Range     Difficulty Level
0.00 – 0.20     Very Difficult
0.21 – 0.40     Difficult
0.41 – 0.60     Average/Moderate Difficulty
0.61 – 0.80     Easy
0.81 – 1.00     Very Easy
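The difficulty levels can be encoded as a small lookup; this sketch follows the ranges in the table above.

```python
def difficulty_level(df):
    """Map a difficulty index (0.0 to 1.0) to its difficulty level."""
    if df <= 0.20:
        return "Very Difficult"
    if df <= 0.40:
        return "Difficult"
    if df <= 0.60:
        return "Average/Moderate Difficulty"
    if df <= 0.80:
        return "Easy"
    return "Very Easy"

print(difficulty_level(0.375))  # Difficult
```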
Discrimination Index

It is the power of the item to discriminate between students who scored high and those who scored low on the test; i.e., it is the power of the item to distinguish students who know the lesson from those who do not.
Discrimination Index

Di = (Cu – Cl) / D, where

Di = discrimination index
Cu = number of students selecting the correct answer in the upper group
Cl = number of students selecting the correct answer in the lower group
D = number of students in either group
Level of Discrimination

Index Range       Discrimination Level     Action
0.19 and below    Poor item                Should be discarded or revised
0.20 – 0.29       Marginal item            Needs some revision
0.30 – 0.39       Reasonably good item     Needs improvement
0.40 – 1.00       Very good item           Retain
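The discrimination formula and the action table can be sketched together; names are illustrative.

```python
def discrimination_index(correct_upper, correct_lower, group_size):
    """Di = (Cu - Cl) / D, where D is the number of students in
    either group (groups assumed equal in size)."""
    return (correct_upper - correct_lower) / group_size

def discrimination_action(di):
    """Suggested action per the level table above."""
    if di < 0.20:
        return "Poor item: discard or revise"
    if di < 0.30:
        return "Marginal item: needs some revision"
    if di < 0.40:
        return "Reasonably good item: needs improvement"
    return "Very good item: retain"

di = discrimination_index(11, 4, 20)
print(discrimination_action(di))  # Reasonably good item: needs improvement
```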
Analysis of Response Options

Examining the effectiveness or attractiveness of the distracters (incorrect options) to those who do not know the correct answer
Distracter Analysis
Distracter – a term used for the incorrect options in a multiple-choice test
Using quantitative item analysis, one can determine whether the distracters are effective or not
Item analysis can identify non-performing test items, but it seldom indicates what exactly is wrong with the item
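One common check can be sketched as follows: a distracter is usually considered effective when it attracts more lower-group than upper-group students, and suspect when nobody chooses it. The function and sample counts below are illustrative.

```python
def check_distracters(upper, lower, key):
    """upper/lower map option -> count of students choosing it;
    key is the correct option. Flags each distracter."""
    report = {}
    for option in upper:
        if option == key:
            continue  # only distracters are analyzed
        u, l = upper[option], lower[option]
        if u + l == 0:
            report[option] = "unchosen: consider replacing"
        elif l > u:
            report[option] = "effective"
        else:
            report[option] = "suspect: attracts the upper group"
    return report

upper = {"A": 3, "B": 4, "C": 11, "D": 2, "E": 0}
lower = {"A": 4, "B": 8, "C": 4, "D": 4, "E": 0}
print(check_distracters(upper, lower, "C"))
```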
Distracter Analysis
Possible reasons why students fail to get the correct answer:
The content was not taught properly in class
The item is ambiguous
The correct answer is not among the given options
The item has more than one correct answer
The item contains grammatical clues that mislead the students
Distracter Analysis
Possible reasons why students fail to get the correct answer (cont.):
The student is not aware of the content
The students were confused by the logic of double negatives
The student failed to study the lesson
Distracter Analysis
Miskeyed Item
- more students from the upper group choose an incorrect option than the keyed answer
Guessing Item
- students from the upper group spread their choices about equally among the given alternatives
Distracter Analysis
Reasons for guessing the answer:
The content of the test was not discussed in class or in the text
The item is very difficult
The question is trivial

Ambiguous Item
- students from the upper group choose an incorrect option and the keyed answer in roughly equal numbers
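The three patterns above (miskeyed, guessing, ambiguous) can be turned into rough heuristics on the upper group's choices. The thresholds below are illustrative assumptions, not from the source; real judgments should still be made by inspecting the item.

```python
def flag_item(upper_counts, key):
    """upper_counts maps option -> count in the upper group;
    key is the correct option. Returns a tentative flag."""
    key_count = upper_counts[key]
    top_wrong = max(c for o, c in upper_counts.items() if o != key)
    if top_wrong > key_count:
        return "possibly miskeyed"      # a wrong option beats the key
    spread = max(upper_counts.values()) - min(upper_counts.values())
    if spread <= 1:
        return "possible guessing"      # choices spread about equally
    if top_wrong == key_count:
        return "possibly ambiguous"     # a wrong option ties the key
    return "no flag"

print(flag_item({"A": 8, "B": 3, "C": 2, "D": 2}, "B"))  # possibly miskeyed
```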
Steps in Item Analysis
Arrange the scores from highest to lowest
Separate the scores into an upper group and a lower group
◦ Class of 30 students: divide into 2 equal groups
◦ Top 27% in the upper group and lowest 27% in the lower group
◦ Other splits: top 30% and bottom 30%, or 33% in each group
Steps in Item Analysis
Count the number of students who chose each alternative in the upper group and in the lower group for each item, and record the information
Compute the difficulty index and the discrimination index
Analyze the responses to each distracter
Make an analysis of each item
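The first two steps (sorting and splitting the scores) might look like this; the 27% fraction is the one named in the steps, and the function name is illustrative.

```python
def split_groups(scores, fraction=0.27):
    """Sort scores from highest to lowest, then take the top and
    bottom fraction of the class as the upper and lower groups."""
    ordered = sorted(scores, reverse=True)
    k = max(1, round(len(ordered) * fraction))
    return ordered[:k], ordered[-k:]

# a class of 30: divide into two equal groups (fraction = 0.50)
upper, lower = split_groups(range(1, 31), fraction=0.50)
print(len(upper), len(lower))  # 15 15
```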
Template for recording:
(Ex. B is the correct response)
Item No. 1, 50 students; 30% is 15

Options              A    B*   C    D    E
Upper Group (n=15)   2    10   1    1    1
Lower Group (n=15)   4    6    2    2    1
Example 1. A class is composed of 40 students. Divide the group into two. For item number 1, option C is the correct answer. Given the data, make an analysis of the test item.

Options              A    B    C*   D    E
Upper Group (n=20)   3    4    11   2    0
Lower Group (n=20)   4    8    4    4    0
ANALYSIS FOR ITEM # 1
 Difficulty Index
Df = 15/40 = 0.375 or 37.5%
 Discrimination Index
Di = (11 – 4)/20 = 0.35 or 35%
 Analysis:
 The item is difficult
 It has positive discrimination (more students in the upper group got the correct answer); reasonably good item, needs improvement
 Retain options A, B, and D – these attract more students in the lower group
 Decision: Retain the test item but change option E
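The whole Example 1 computation can be reproduced in a few lines (a sketch; variable names are illustrative):

```python
# Example 1 data: 40 students split into halves of 20, option C keyed
upper = {"A": 3, "B": 4, "C": 11, "D": 2, "E": 0}
lower = {"A": 4, "B": 8, "C": 4, "D": 4, "E": 0}
key, group_size = "C", 20

df = (upper[key] + lower[key]) / (2 * group_size)  # difficulty index, 15/40
di = (upper[key] - lower[key]) / group_size        # discrimination index, 7/20

print(f"Df = {df}, Di = {di}")  # Df = 0.375, Di = 0.35
```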
Exercise 1: A class is composed of 50 students. Use 27% to get the upper and lower groups. Make an analysis of the test item given the data.

Options       A    B*   C    D    E
Upper Group   3    7    1    1    2
Lower Group   2    6    3    0    3
Exercise 2: A class is composed of 30 students. Analyze the test item given the following results.

Options       A*   B    C    D    E
Upper Group   7    3    1    2    2
Lower Group   1    4    3    4    3
Exercise 3: A class is composed of 50 students. Use the 27% to get the upper and the lower groups. Analyze the item given the following data:

Options       A    B    C    D*   E
Upper Group   3    2    2    4    3
Lower Group   3    1    1    7    2
Exercise 4: Make an analysis of an item with the following data. (N=40)

Options       A    B    C*   D    E
Upper Group   2    3    3    4    8
Lower Group   4    4    3    5    4
Exercise 5: Make an item analysis for a test item given the following data. (N=40)

Options       A    B    C    D*   E
Upper Group   2    2    7    7    2
Lower Group   2    4    5    6    3
Exercise 6: Make an analysis of an item with the following data.

Options       A    B    C    D    E*
Upper Group   5    4    3    4    4
Lower Group   5    4    4    3    3
References
Gabuyo, Yonardo A. (2012). Assessment of Learning I (Textbook and Reviewer). Manila: Rex Book Store.
Garcia, Carlito D. (2008). Measuring and Evaluating Learning Outcomes. Manila: Books Atbp. Publishing.
Hetzel, Susan M. (1997). “Basic Concepts of Item Analysis.” Texas A&M University.
THANK YOU FOR
LISTENING
